I've been doing a lot of thinking lately about the role of academic computer science research vis-à-vis the state of the art in industry. When I was in grad school at Berkeley, we were building "big systems" -- the 200+ node cluster that I did all my research on was roughly the same size as sites like Inktomi and Google at the time. Since then, industry has scaled up by orders of magnitude, leaving academics in the dust. These days it's not clear that it makes sense for a university research group to work on systems problems that try to approximate industry: not only do we lack the resources, we simply have no idea what the real problems are at that scale. My friend and colleague Steve Gribble told me that after doing a sabbatical at Google, he decided there's no way a university group could compete with what they're doing.
So what should academics be spending their time on? Too many academic systems researchers, I think, are doing "industry research in the small," with project horizons on the order of 2-3 years, rather than pursuing radical departures from the state of the art. Is this the right approach? In a large department, it might be possible to approximate industry; David Patterson's PARLab at Berkeley is an example. This takes a tremendous amount of industry funding, though, and it's not scalable in the long run -- and there are not many people like Patterson who can pull that off. So what are the rest of us to do?
Ideally, academics should be pushing the envelope of computer science well beyond where industry is looking today. My research at Harvard has focused on radically new computing platforms -- mostly wireless sensor networks -- and our RoboBees project is pretty out there too. The downside is that industry isn't as interested in these problems, so it's harder to get funding when you're not working on something on their critical path. The other problem is that when you're working on sci-fi, it's more difficult to have impact on the real world, unless you're willing to wait many years.
DARPA could be our savior. When I started my faculty job in 2003 I was told that getting money from DARPA would be no problem, but during the Tether years it mostly dried up. As a result, nearly all of my research at Harvard has been funded by relatively small NSF grants plus various industry gifts. I am very hopeful that, with Peter Lee heading up the new TCTO office, we'll see bolder initiatives from DARPA to bring back the good old days.
In the meantime, I guess I could find some bugs to squash in the Linux kernel...