This month's Communications of the ACM features an article, "Google's Hybrid Approach to Research," by Alfred Spector, Peter Norvig, and Slav Petrov. Since this is a topic I've blogged about here before, I thought I'd provide a quick pointer to the article:
http://cacm.acm.org/magazines/2012/7/151226-googles-hybrid-approach-to-research/fulltext
Overall I think the article does a nice job of summarizing Google's approach. The key takeaway is that Google doesn't separate its research and engineering activities: most "research" at Google happens during the day-to-day work of building products.
The benefit of this model is that it's easy to have real-world impact, and the pace of innovation is fairly rapid, meaning research results get translated into products quickly. The possible downside is that you don't always get a chance to fork off long-term (multi-year) projects that will take a long time to translate into a product. However, there are exceptions to this rule -- things like Google Glass, for example -- and plenty of things I can't talk about publicly. It is true that Google tends not to do "pure academic" research just for the purpose of publishing papers. We could have a healthy debate about whether this is good or bad, but I'll leave that for the comments...
Thursday, June 21, 2012
Sunday, June 17, 2012
Startup University
The academic research process is incredibly inefficient when it comes to producing real products that shape the world. It can take decades for a good research idea to turn into a product -- and of course most research never reaches this phase. However, I don't think it has to be that way: we could greatly accelerate the research-to-product pipeline if we could fix the academic value system and funding model.
Here's the problem: Some of the smartest people in the world have spent their entire careers building throwaway prototypes. I sure never built anything real until I moved to Google, after nearly ten years of college and grad school, and seven years as a faculty member. And by "real," I don't just mean a prototype that we developed for a couple of years and then threw away as soon as the papers got published. In effect, I "wasted" millions of dollars in funding, and countless man-years of development effort by my students and lab staff -- apart from a bunch of papers, nothing of practical value came out of my entire academic research career. (Maybe I'm being a little hard on myself, but let's take this as a given for sake of argument.) And I don't think my lack of real-world impact is at all unusual in a university setting.
What would the world be like if all of this hard work had actually translated into real, shipping products that people could use? How could we change the structure of academic research to close the gap between playing in the sandbox and making things real?
The plight of the academic is that there is often no direct way to translate ideas into reality -- you don't have the resources to do it at the university, and the academic process forces you to bounce between ideas every few years, rather than sticking it out to turn something into a product. In theory, academics are supposed to patent their ideas, and companies are supposed to come along, license the patents, and turn them into real products. However, I am not aware of a single project from a computer science department that has ever been commercialized through this route. This approach is more commonplace in fields like biotech, but in computer science it is rarely done.
A far more common (and successful) approach is for academics to spin out their own startups. However, this involves a high degree of risk (potentially career-ending for pre-tenure faculty), and many universities do not structure their sabbatical and leave policies to make it easy to do. Most universities also make starting a company painfully difficult when it comes to IP ownership and licensing, and they often force the academic's research to be dissociated from their commercial activities. As a result, you get a bunch of super smart academics who play it safe and stay within their tenured faculty jobs, subsisting on grants and rarely commercializing their work. This means that a lot of great ideas never get beyond the prototype phase.
What I'd like to see is a university with a startup incubator attached to it, taking all of the best ideas and turning them into companies, with a large chunk of the money from successful companies feeding back into the university to fund the next round of great ideas. This could be a perpetual motion machine to drive research. Some universities have experimented with an incubator model, but I'm not aware of any cases where this resulted in a string of successful startups that funded the next round of research projects at that university.
Typically, when a startup spins off, the university gets a tiny slice of the pie, and the venture capitalists -- who fill the much-needed funding gap -- reap most of the benefits. But why not close the air gap between the research lab and the startup? Allow the faculty to stay involved in their offspring companies while keeping their research day job? Leverage the tremendous resources of a university to streamline the commercialization process -- e.g., use of space, equipment, IT infrastructure, etc.? Allow students to work at the startups for course credit or work-study without having to quit school? Maintain a regular staff of "serial entrepreneurs" who help get new startups off the ground? Connect the course curriculum to the fledgling startups, rather than teaching based on artificial problems? One might joke that some universities, like Stanford, effectively already operate in this way, but this is the exception rather than the rule.
It seems to me that bringing together the university model with the startup incubator would be a great benefit both for spinning out products and doing better research.