Friday, November 4, 2011

Highlights from SenSys 2011

ACM SenSys 2011 just wrapped up this week in Seattle. This is the premier conference in the area of wireless sensor networks, although lately it has embraced a range of other technologies, including sensing on smartphones and micro-air vehicles. It's an exciting conference that brings together a number of different areas.

Rather than a full trip report, I wanted to quickly write up two highlights of the conference: The keynote by Michel Maharbiz on cybernetic beetles (!), and an awesome talk by James Biagioni on using smartphone data to automatically determine bus routes and schedules.

Keynote by Mich Maharbiz - Cyborg beetles: building interfaces between the synthetic and the multicellular

Mich is a professor at Berkeley and works at the interface between biology and engineering. His latest project involves adding a "remote control" circuit to a live insect -- a large beetle -- allowing one to control the flight of the insect. Basically, they stick electrodes into the beetle's brain and muscles, and a little microcontroller mounted on the back of the insect sends pulses to cause the insect to take off, land, and turn. A low-power radio on the microcontroller lets you control the flight using, literally, a Wii Mote.

Oh yes ... this is real.

There has been a lot of interest in the research community in building insect-scale flying robots -- the Harvard RoboBees project is just one example. Mich's work takes a different approach: let nature do the work of building the flyer, but augment it with remote control capabilities. These beetles are large enough that they can carry a 3 gram payload, can fly for kilometers at a time, and live up to 180 days.

Mich's group found that by sending simple electrical pulses to the brain and muscles they could activate and deactivate the insect's flying mechanism, causing it to take off and land. Controlling turns is a bit more complicated, but by stimulating certain muscles behind the wings they can cause the beetle to turn left or right on command.

They have also started looking at how to tap into the beetle's sensory organs -- essentially implanting electrodes behind the eye and antennae -- so it is possible to take electrical recordings of the neural activity. And they are also looking at implanting a micro fuel cell that generates electricity from the insect's hemolymph -- essentially turning its own internal fuel source into a battery.

Mich and I were actually good friends while undergrads at Cornell together. Back then he was trying to build a six-legged, insect-inspired walking robot. I am not sure if it ever worked, but it's kind of amazing to run into him some 15 years later and see he's still working on these totally out-there ideas.

EasyTracker: Automatic Transit Tracking, Mapping, and Arrival Time Prediction Using Smartphones
James Biagioni, Tomas Gerlich, Timothy Merrifield, and Jakob Eriksson (University of Illinois at Chicago)


James, a PhD student at UIC, gave a great talk on this project. (One of the best conference talks I have seen in a long time. I found out later that he won the best talk award - well deserved!) The idea is amazing: use GPS data collected from buses to automatically determine both the routes and the schedule of the bus system, and give users real-time predictions of expected arrival times on each route. All the transit agency has to do is install a GPS-enabled cellphone in each bus (without even labeling which bus it is, or which route it will be taking - routes change all the time anyway). The data is collected and processed centrally to automatically build the tracking system for that agency.

The system starts with unlabeled GPS traces and extracts the routes as well as the locations and times of stops. They use kernel density estimation with a Gaussian kernel to "clean up" the raw traces and produce clean route information, along with some clever statistical analysis to throw out bogus route data.
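As a rough illustration of the idea (not the paper's actual implementation), a kernel density estimate spreads each GPS fix over nearby grid cells with a Gaussian kernel, and thresholding the resulting density keeps the cells that many traces agree on while discarding outlier fixes. All names and parameter values here are made up for the sketch:

```python
import math

def gaussian_kde_grid(points, cell=0.0005, bandwidth=0.001):
    """Estimate point density on a lat/lon grid using a Gaussian kernel.

    `points` is a list of (lat, lon) GPS fixes; `cell` and `bandwidth`
    are illustrative values in degrees, not taken from the paper.
    """
    density = {}
    reach = int(3 * bandwidth / cell)  # beyond 3 sigma the kernel is ~0
    for lat, lon in points:
        ci, cj = round(lat / cell), round(lon / cell)  # nearest grid cell
        for di in range(-reach, reach + 1):
            for dj in range(-reach, reach + 1):
                glat, glon = (ci + di) * cell, (cj + dj) * cell
                d2 = (glat - lat) ** 2 + (glon - lon) ** 2
                w = math.exp(-d2 / (2 * bandwidth ** 2))
                key = (ci + di, cj + dj)
                density[key] = density.get(key, 0.0) + w
    return density

def clean_route(points, threshold=2.0, cell=0.0005, bandwidth=0.001):
    """Keep only grid cells whose density clears the threshold; the
    surviving cells trace the consensus route, dropping GPS outliers."""
    density = gaussian_kde_grid(points, cell, bandwidth)
    return {c for c, w in density.items() if w >= threshold}
```

A lone outlier fix contributes too little density to clear the threshold, while cells visited by many traces survive - which is the "clean up" effect described above.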

To do stop extraction, they compute a point density estimate with thresholding at each GPS location, which yields clusters at points where buses tend to stop. This will produce a bunch of "fake" stops at traffic lights and stop signs; the authors decided to err on the side of too many stops rather than too few, so they consider this an acceptable tradeoff.
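A minimal sketch of density-based stop extraction, assuming each fix carries a speed estimate (the grid size, speed cutoff, and count threshold are invented for illustration, not taken from the paper):

```python
from collections import defaultdict

def extract_stops(fixes, cell=0.0002, max_speed=0.5, min_count=5):
    """Find candidate stops from (lat, lon, speed_mps) GPS fixes.

    Grid cells that accumulate many near-stationary fixes are places
    where buses tend to halt. Cell size and thresholds are illustrative.
    """
    counts = defaultdict(int)
    for lat, lon, speed in fixes:
        if speed < max_speed:  # near-stationary fix
            counts[(round(lat / cell), round(lon / cell))] += 1
    # Threshold the density: each surviving cell is a candidate stop.
    # Traffic lights survive too -- the "fake stops" the paper accepts.
    return [(i * cell, j * cell) for (i, j), c in counts.items()
            if c >= min_count]
```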

To extract the bus schedule, they look at the arrival times of buses on individual days and use k-means clustering to determine the "centroid time" for each stop. This works fine for the first stop on a route (which should be close to the true schedule). For downstream stops the data ends up being too noisy, so instead they compute the mean travel time to each downstream stop.
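A toy version of this step: one-dimensional k-means recovers the "centroid times" at the first stop, and downstream stops get those centroids shifted by the mean observed travel time. The function names and parameters are my own, not the paper's:

```python
def kmeans_1d(times, k, iters=50):
    """Cluster scalar arrival times (e.g. minutes past midnight) with
    1-D k-means; the sorted centroids are the inferred scheduled times."""
    pts = sorted(times)
    centroids = [pts[i * len(pts) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for t in pts:
            nearest = min(range(k), key=lambda c: abs(t - centroids[c]))
            clusters[nearest].append(t)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

def downstream_schedule(first_stop_times, k, mean_travel):
    """Schedule at a downstream stop: first-stop centroids shifted by
    the mean observed travel time to that stop (the workaround for
    noisy downstream arrival data)."""
    return [t + mean_travel for t in kmeans_1d(first_stop_times, k)]
```

For example, observed first-stop arrivals clustered around 8:00am and 8:30am yield two centroids, and a stop twelve minutes down the route gets those centroids plus twelve.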

Another challenge is labeling buses: you need to know which bus is coming down the road toward you. For this, they use a history of GPS traces from each bus and build an HMM to determine which route the bus is currently serving. Since buses change routes all the time, even during the same day, this has to be tracked over time. Finally, for arrival time prediction, they use the previously computed travel times between stops to estimate when the bus is likely to arrive.

I really liked this work and the nice combination of techniques used to take some noisy and complex sensor data and distill it into something useful.

6 comments:

  1. The beetle stuff is super cool. I'm sure others will think of this as well, but isn't attaching a bunch of wires to forcibly control a beetle a bit far in the animal cruelty dept? I guess ethics don't apply to beetles, but it's just a wee bit creepy.

  2. Ben - First of all, Mich's group of course has all of the required animal research approvals to do what they are doing. As you well know, a wide range of research projects use animals in some form, including primates and other mammals, doing things that are perhaps more disturbing than what Mich's group does. What you are really asking is the broader question of what the role of animals in scientific research should be, and where we draw the line between what is acceptable and what is not.

    Mich addressed this question directly in his talk, and asked the audience to think about their own values and whether this kind of work makes them comfortable or not. There is not an obvious answer. Mich's assertion is that these beetles operate much more like machines than what we normally think of as animals with feelings and emotions. And - perhaps jokingly - he reminded us that after a day of experiments on a given beetle, they are "placed in a terrarium with plenty of food and members of the opposite sex" for the rest of their lives. This is not to excuse what he does, but I think it's important to wrestle with this question directly.

    I think the deeper question is how far do we, as a society, let this kind of work go. Say we could do this to birds, to gerbils, to cats, to small monkeys, to humans. Where do we draw the line? Some would say that we should not be doing this kind of research at all, which is a fine answer. Personally I think there are tremendous things to be gained through this line of work. Imagine being able to cure Parkinson's, or epilepsy, or paralysis with an implant that could stimulate muscles based on "artificial" signals. To get there we are going to have to tackle some difficult ethical questions.

  3. Mich's work is also featured in the December 2010 issue of Scientific American:
    http://www.scientificamerican.com/article.cfm?id=cyborg-beetles. I read the article in SA first, and it's fascinating to see that what our WSN community does can be cool enough to be featured there.

  4. Regarding EasyTracker: I have not seen the talk, but from your description, I feel that this is akin to figuring out a nice way to swim when your hands and feet are tied. It is so easy to use a little more technology to make the work much easier. Why bother? I must be missing something here.

  5. Anon re: EasyTracker: The idea is to have a cheap and easy-to-install solution for transit agencies that does not require the drivers to do *anything*. You just mount the cell phone in the bus and you're done.

  6. I have to say, this year's SenSys might be better named MobiSenSys... I submitted a paper on "traditional sensor networks" and it was rejected. I was disappointed at the time, but after studying the reviews and checking this year's program I somehow understand why it was rejected.
    I do find good papers, but I'm curious: will the trend shown in this year's SenSys continue in the future - namely, replacing sensor motes with smartphones (making it more like MobiSys)?
    This year's SenSys reminds me of your talk at EWSN several years ago, which (if I remember correctly) was about whether sensor networks would remain attractive.

    I like doing research in sensor networks; though many researchers have shifted their focus to mobile phones, I still see lots of problems left there. I had thought SenSys would be the first venue to submit my papers to, but now I don't know whether I should submit to SenSys next year, as my ongoing work has nothing to do with smartphones.

