
Will DARPA save computer science?

I've been doing a lot of thinking lately about the role of academic computer science research vis-à-vis the state of the art in industry. When I was in grad school at Berkeley, we were building "big systems" -- the 200+ node cluster that I did all my research on was roughly the size of sites like Inktomi and Google at the time. Since then, industry has scaled up by orders of magnitude, leaving academics in the dust. These days it's not clear that it makes sense for a university research group to work on systems problems that try to approximate industry: not only do we lack the resources, we simply have no idea what the real problems are at that scale. My friend and colleague Steve Gribble told me that after doing a sabbatical at Google, he decided that there's no way a university group could compete with what they're doing.

So what should academics be spending their time on? Too many academic systems researchers, I think, are doing "industry research in the small", with project horizons on the order of 2-3 years, not things that are radical departures from the state of the art. Is this the right approach? In a large department, it might be possible to approximate industry; David Patterson's PARLab at Berkeley is an example. This takes a tremendous amount of industry funding, though, and it's not scalable in the long run -- and there are not many people like Patterson who can pull that off. So what are the rest of us to do?

Ideally, academics should be pushing the envelope of computer science well beyond where industry is looking today. My research at Harvard has focused on radically new computing platforms -- mostly wireless sensor networks -- and our RoboBees project is pretty out there too. The downside is that industry isn't as interested in these problems, so it's harder to get funding from industry when you're not working on problems that are on their critical path. The other problem is that when you're working on sci-fi, it's more difficult to have an impact on the real world, unless you're willing to wait for many years.

DARPA could be our savior. When I started my faculty job in 2003 I was told that getting money from DARPA would be no problem, but during the Tether years it mostly dried up. As a result nearly all of my research at Harvard has been funded by relatively small NSF grants plus various industry gifts. I am very hopeful that with Peter Lee heading up the new TCTO office that we'll see bolder initiatives from DARPA to bring back the good old days.

In the meantime, I guess I could find some bugs to squash in the Linux kernel...


  1. David Oppenheimer, April 26, 2010 at 6:38 PM

    Matt, I agree with you. Also, I'd observe that it has been almost exactly ten years since Rob Pike wrote the famous "Systems Software Research Is Irrelevant" talk. Maybe it's time for you to come out with an updated version?

    BTW I think companies are being very short-sighted if it's true that "it's harder to get funding from industry when you're not working on problems that are on their critical path." I would think that industry would be *primarily* interested in off-the-beaten-path research that isn't just duplicating stuff they're already working on internally. Of course, it still has to be relevant to their business, so maybe the difficulty is finding projects that fall into that middle region.

  2. Actually, Microsoft has been very good about funding our "far out" research. We've also gotten some funding from other sources, but since I don't work on conventional systems problems it's always a bit of a hard sell...

  3. I agree with this post. Most of the top systems conferences are full of papers that deal with "hot problems" currently faced by industry (e.g., issues with data center management). I wonder whether it is because conferences are less open to more radical, long-term ideas, or just that very few researchers are doing this kind of research.

  4. I work in an industry research lab. I wish I could work on a project with a 2-3 year horizon.


