Monday, August 24, 2009
Postdoc openings on the RoboBees project
The Harvard RoboBees project (which I blogged about previously) has postdoc openings in the areas of wireless sensor networks, embedded computing, swarm robotics, and biologically-inspired multiagent systems. Check out the job posting here for more details.
Thursday, August 20, 2009
Welcome to Stephen Chong and Krzysztof Gajos
Last year was a huge success for faculty hiring at Harvard CS -- we added three new faculty members to our ranks, two of whom are starting this fall. (Yiling Chen started last year -- she works at the intersection of Computer Science and Economics.) Fortunately, we managed to do the search before the present economic unpleasantness, so I'm pleased to welcome Stephen Chong and Krzysztof Gajos to Harvard.
Stephen got his Ph.D. from Cornell and works in the area of programming languages and security. His work on the Swift system (published in SOSP'07) allows one to build secure Web applications where the client- and server-side code are automatically partitioned from a single, unified program written using the Jif variant of Java, which incorporates support for information flow in the programming model. This is a very practical approach to providing information flow support in a real system.
Krzysztof got his Ph.D. from the University of Washington and is the first HCI person we've hired at Harvard. We've actually been looking to hire in this area for some time, but never found someone we really liked -- until Krzysztof. He is a great match for the kind of multidisciplinary work we do here. Among other things, he developed the SUPPLE system, which automatically generates user interfaces for users with motor disabilities -- a nice combination of HCI and machine learning work.
Harvard grad students will be glad to know that Stephen and Krzysztof are both teaching graduate seminars this term.
(On a related note, I am delighted to have a new name-spelling challenge on my hands. It took me a couple of years to learn how to spell Mema Roussopoulos' surname without looking it up -- I am resisting the urge to create a macro for Krzysztof. This reminds me of my colleague from Berkeley, Rob Szewczyk, who once cheekily explained to me that his name is spelled exactly as it is pronounced.)
Wednesday, August 19, 2009
WhiteFi: Wi-Fi-like networking in the UHF White Spaces
This week our paper, joint with Microsoft Research, on White Space Networking with Wi-Fi Like Connectivity was presented at SIGCOMM 2009, where it won the best paper award. This paper lays the foundations for the first Wi-Fi-like network operating in the UHF white spaces (that is, the portions of the TV spectrum unoccupied by TV channels, wireless mics, and other devices). There's been some press on this work from Technology Review, Engadget, and other sites. My student, Rohan Murty, gave the talk. He is pictured to the right, apparently wearing the UHF antenna on his head -- I am not sure whether this improves his mental capacity. (Update 8/24/09: The slides are now available.)
By way of background, in 2008 the FCC issued a ruling allowing unlicensed devices to operate in the UHF white spaces, under certain restrictions. Opening up this spectrum for unlicensed wireless networks is a huge opportunity -- for example, UHF devices would achieve much longer range than networks operating in the 2.4 GHz and 5 GHz ISM bands. There's been a lot of recent research on establishing individual links in the UHF white spaces, but to our knowledge nobody has proposed a network design allowing multiple clients to communicate via an access point. That's where WhiteFi comes in.
Networking in the UHF white spaces raises a number of interesting challenges. The first is that the spectrum is fairly fragmented, and we can make use of variable-width channels (unlike the standard 20 MHz channels used by existing 802.11 networks). This makes AP discovery more difficult, since there are many combinations of center frequencies and channel widths that would require scanning.
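To get a feel for why discovery is harder, here is a back-of-the-envelope enumeration of the candidate (center frequency, channel width) combinations a scanner would have to consider. This is just an illustration, not code from WhiteFi; the channel range (US TV channels 21-51, 6 MHz each) and the candidate widths (5, 10, and 20 MHz) are assumptions made for the sketch.

```python
# Back-of-the-envelope sketch (not WhiteFi code): count the (center frequency,
# channel width) combinations an AP scan would have to consider in the UHF band.
# Assumptions: US TV channels 21-51, 6 MHz each; candidate widths 5/10/20 MHz.

TV_CHANNEL_MHZ = 6                      # each US TV channel is 6 MHz wide
FIRST_CH, LAST_CH = 21, 51              # channels typically usable by white space devices
LOW_EDGE_MHZ = 470 + (FIRST_CH - 14) * TV_CHANNEL_MHZ   # lower edge of channel 21 (512 MHz)

def candidate_centers(width_mhz):
    """Center frequencies (MHz) for a channel of the given width that fits
    inside a run of contiguous TV channels between FIRST_CH and LAST_CH."""
    span = -(-width_mhz // TV_CHANNEL_MHZ)   # TV channels the width spans (ceiling)
    centers = []
    for ch in range(FIRST_CH, LAST_CH - span + 2):
        low_edge = LOW_EDGE_MHZ + (ch - FIRST_CH) * TV_CHANNEL_MHZ
        centers.append(low_edge + span * TV_CHANNEL_MHZ / 2)
    return centers

total = sum(len(candidate_centers(w)) for w in (5, 10, 20))
print(f"{total} (center, width) combinations to probe, versus "
      f"{LAST_CH - FIRST_CH + 1} fixed-width channels")
```

Even under these simple assumptions, a variable-width scan has roughly three times as many candidates to check as a fixed-width one, which is why making each probe cheap matters so much.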
The second is that, by FCC mandate, a white space device must avoid interfering with any "primary users" of the spectrum. TV channels are relatively easy to avoid, given that they don't tend to come and go (although a mobile device would need to determine when it is coming in range of a new station). It turns out that wireless microphones also operate in this band, and of course you can't predict when one might be turned on. This requires the use of channel sensing to rapidly determine the presence of a wireless mic and mechanisms for switching an access point and any associated clients over to a new channel when interference is detected.
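To make that idea concrete, here is a minimal, self-contained sketch of the vacate-on-detection behavior. The sensing and client-notification steps are stubbed out with hypothetical placeholders (microphone_detected, notify_clients); WhiteFi's actual sensing and switching protocol is described in the paper and is considerably more involved.

```python
# Minimal, self-contained sketch of the vacate-on-detection loop described
# above. microphone_detected() and notify_clients() are hypothetical stand-ins
# for real spectrum sensing and AP-to-client signaling; they are not WhiteFi's
# actual API or protocol.
import random

FREE_CHANNELS = [21, 27, 33, 41]        # assumed currently-unoccupied TV channels

def microphone_detected(channel):
    """Stand-in for sensing the channel with the SDR."""
    return random.random() < 0.05       # pretend a mic switches on occasionally

def notify_clients(new_channel):
    print(f"AP: moving all associated clients to channel {new_channel}")

def ap_step(current_channel):
    """One iteration of the AP's sensing loop: vacate if a primary user appears."""
    if microphone_detected(current_channel):
        new_channel = random.choice([c for c in FREE_CHANNELS if c != current_channel])
        notify_clients(new_channel)
        return new_channel
    return current_channel

channel = FREE_CHANNELS[0]
for _ in range(100):                    # simulate 100 sensing intervals
    channel = ap_step(channel)
```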
In WhiteFi, the key idea is to use a software-defined radio to scan the physical RF channel and use an efficient algorithm for performing AP discovery without performing a full decode on the signal. The SIFT technique (described in the paper) is a simple time-series analysis of the raw samples from the SDR that quickly determines if there is an AP operating at the chosen center frequency, as well as its probable channel width. The SDR is also used to detect incumbents. WhiteFi also includes algorithms for assigning channels to APs based on spectrum availability, as well as for handling disconnections due to interference or station mobility.
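The gist of that kind of time-domain analysis can be illustrated with a short sketch: threshold the amplitude envelope of the raw samples to find transmission bursts, then use the burst durations to guess the channel width, since a narrower channel stretches each packet out in time. This is a simplification for illustration only, not the SIFT algorithm from the paper; the threshold and the reference packet duration are assumptions.

```python
# Illustrative sketch only -- not the SIFT algorithm from the paper. Idea:
# threshold the amplitude envelope of raw SDR samples to find transmission
# bursts, then guess the channel width from burst durations (halving the
# width roughly doubles a packet's airtime). The threshold and the 20 MHz
# reference packet duration are assumptions.
import numpy as np

REF_PACKET_US = 300.0                   # assumed typical packet airtime at 20 MHz

def burst_durations_us(samples, sample_rate_hz, threshold):
    """Durations (microseconds) of each above-threshold burst in the capture."""
    active = np.abs(samples) > threshold
    edges = np.flatnonzero(np.diff(active.astype(np.int8)))
    if active[0]:
        edges = np.r_[0, edges]                 # capture started mid-burst
    if active[-1]:
        edges = np.r_[edges, active.size - 1]   # capture ended mid-burst
    starts, ends = edges[0::2], edges[1::2]
    return (ends - starts) / sample_rate_hz * 1e6

def guess_channel_width(samples, sample_rate_hz, threshold=0.1):
    """Return the candidate width (MHz) whose expected packet duration best
    matches the median burst duration, or None if nothing is transmitting."""
    durations = burst_durations_us(samples, sample_rate_hz, threshold)
    if durations.size == 0:
        return None
    expected = {20: REF_PACKET_US, 10: 2 * REF_PACKET_US, 5: 4 * REF_PACKET_US}
    median = np.median(durations)
    return min(expected, key=lambda w: abs(expected[w] - median))
```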
Going forward, we are continuing to collaborate with Microsoft Research and are developing a white space testbed here at Harvard that will allow us to experiment with these ideas at larger scales. Ranveer Chandra, Thomas Moscibroda, and Victor Bahl from the Microsoft Networking Research group are all involved in this effort.
Thursday, August 13, 2009
RoboBees - A Convergence of Body, Brain, and Colony
I'm part of a team that was recently awarded a $10M NSF "Expeditions in Computing" grant for a project to develop an autonomous colony of robotic bees. This is a big effort headed up by Prof. Rob Wood at Harvard, with a team of 11 researchers in computer science, engineering, and biology. The project title is RoboBees: A Convergence of Body, Brain, and Colony, and you can check out the preliminary project website here. I'm very excited about this project, as it will open up a lot of research directions for programming complex behaviors in a coordinated swarm of tiny aerial robots.
The press release from Harvard describes the project as follows:
A multidisciplinary team of computer scientists, engineers, and biologists at Harvard received a $10 million National Science Foundation (NSF) Expeditions in Computing grant to fund the development of small-scale mobile robotic devices. Inspired by the biology of a bee and the insect’s hive behavior, the researchers aim to push advances in miniature robotics and the design of compact high-energy power sources; spur innovations in ultra-low-power computing and electronic “smart” sensors; and refine coordination algorithms to manage multiple, independent machines. [...]

Now, what is interesting is that the release never explicitly mentions the central theme of the project -- that is, to build a colony of robotic bees -- nor the title of the project ("RoboBees"). Apparently the PR machine at Harvard got nervous about some aspect of this and, despite the NSF's large investment in our project, decided it was better to tone down the language. Baffling.
Cutting through the PR fog, you can read the full description of the project on our website, as well as the award description from the NSF, which makes it pretty clear what we plan to do.