Yesterday's announcement of the Apple iPad -- basically an iPhone with a larger screen and no camera -- ushered in a new era for OS design. That is, an OS that was originally designed for a phone (the "iPhone OS", which is itself a stripped-down version of Mac OS X) is now being ported over to larger, much more capable devices. I have no doubt that the iPhone OS will be very successful on the iPad. It would not surprise me to see it running on laptops and desktops in the near future.

This simplified OS eliminates most of the complexity that makes people hate computers: the quagmire of configuration options, keeping up with upgrades, and of course the constant battle against viruses and malware. When I first saw the iPad, I immediately recognized that this is going to be the perfect computer for "non-computer users", like my dear mother-in-law, who pretty much only uses her Windows PC to read email and surf the web. (I'm not sure she's ever saved a file to the hard drive.) But it's also going to be the perfect computer for those of us who just want something that works. Like the original Macintosh, the iPad raises the bar for user-centric computer system design. It is elegant in its minimalism and simplicity.
Still, this trend of dumbing down the OS raises some interesting questions. The iPad OS lacks many features that operating systems have had for decades: multitasking, an open application API and development tools, multiple protection domains -- heck, it doesn't even have a proper filesystem. Arguably, it is this lack of features that makes the iPhone and iPad platforms so attractive to application developers and users alike. This trend suggests very strongly that the feature-rich, complex OSs that we love so much are going to look too baroque, top-heavy, and expensive to survive in a field where the "OS" is little more than GUI gloss over a pretty basic system. (Something tells me that in a few years we're going to get iPad OS v2.0, which builds in some of these "advanced" features that date back to the 1960s.)
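For readers who haven't done systems programming, here is a minimal C sketch -- my own illustration, assuming a conventional Unix-like OS rather than anything in the iPhone OS -- of what two of those features look like in practice: preemptive multitasking and separate protection domains. The kernel schedules the parent and the forked child concurrently, and each gets its own address space, so neither can scribble on the other's memory.

    /* Toy example, not iPhone OS code: on a conventional Unix-like system,
     * fork() creates a second process in its own protection domain, and the
     * kernel multitasks the two of them. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        int counter = 0;        /* lives in this process's address space */

        pid_t pid = fork();     /* create a second process / protection domain */
        if (pid < 0) {
            perror("fork");
            return EXIT_FAILURE;
        }

        if (pid == 0) {
            /* Child: its copy of counter is private, so this assignment is
             * invisible to the parent. */
            counter = 42;
            printf("child %d: counter = %d\n", (int)getpid(), counter);
            return EXIT_SUCCESS;
        }

        /* Parent: runs concurrently with the child under the kernel scheduler. */
        waitpid(pid, NULL, 0);
        printf("parent %d: counter = %d (unchanged)\n", (int)getpid(), counter);
        return EXIT_SUCCESS;
    }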
Basically, what I'm saying is that the iPad would appear to make most current OS research irrelevant. Discuss.
(Update: Check out this petition that claims that the iPad's DRM is "endangering freedom." I'm not sure I'd go that far, but an interesting perspective nonetheless.)
Wow...CS nerd snipe. :)
Consider that perhaps you're projecting preconceived notions of what a computer is onto the iPad. Maybe it shouldn't be thought of as your desktop but rather as a special-purpose device. Your car is in many ways just as much of a computer system, with processors, software, etc., but you don't bemoan the irrelevance of OS advancement just because you don't know how to context-switch your car. ;)
OS research is still very much relevant, but which parts are relevant where is very much dependent on context. With cloud computing and the increased use of shared infrastructure, parts of the computing world are going back to a time-sharing-ish world instead of a one-PC-per-person world. In that world, resource sharing, isolation, etc., topics one usually thinks of when one thinks about an "operating system," are still very apropos even if they seem less so in battery life-conscious single-application devices.
I would also note that people seem to get too caught up in focusing on the operating system. Any system is a chain of tools and agents that stretches from the mental model of the programmer, to the compiler tools, to the computing and user environment, to the OS, and down to the hardware. There's no commandment that says certain properties have to be enforced at certain levels in all situations. Certain things we think of the OS as doing can often just as easily be accomplished (or accomplished with a different set of tradeoffs) somewhere else, e.g., the compiler toolchain. Apple chose a route where safety is maintained not only by the OS but by a limited API and a by-hand vetting process. For their environment, maybe that works. In others it might not.
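To make that last point concrete, here is a toy C sketch (my own illustration, nothing specific to Apple's actual toolchain or vetting process): the same "this string must not be modified" property, enforced once by the OS's memory protection at runtime and once by the compiler before the program ever runs.

    #include <stdio.h>

    int main(void) {
        /* OS-enforced: string literals usually live in a read-only mapping,
         * so un-commenting the write below compiles cleanly but is stopped
         * at runtime by the memory-protection hardware (typically SIGSEGV). */
        char *os_guarded = (char *)"immutable";
        /* os_guarded[0] = 'X'; */

        /* Toolchain-enforced: declare the pointer const and the compiler
         * rejects the very same write at build time, no OS involvement. */
        const char *compiler_guarded = "immutable";
        /* compiler_guarded[0] = 'X'; */

        printf("%s / %s\n", os_guarded, compiler_guarded);
        return 0;
    }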
The Web browser is now the OS. This is a good thing - it means that we have yet another chance to reinvent everything from the ground up!
Nice analogy: iPad OS : regular OS :: automatic transmission : manual transmission.
http://daringfireball.net/2010/01/various_ipad_thoughts
Well, I agree with what is said above. I was really interested in buying an iPad, but now a tablet with no multitasking seems useless. While I need a tablet mainly for internet, email, and reading books, I still might want to use it for more now and then.
As for cloud OSs, I am working on software that enables cloud computing (distributed service managers, virtualization), but I wouldn't want all my data and OS to go to a cloud... I prefer to carry it around on my laptop.
Not only is the iPad a dumbed-down Mac; to a lesser extent, so is Windows 7. Windows 7 does the same things as Vista but uses fewer resources -- the first operating system from Microsoft to use fewer resources than its predecessor. It seems we have reached a point where the hardware has topped out relative to what most people actually do. The next innovations will occur in network speeds, because going network-centric is the only way content can be controlled. Steve Jobs sees the future, and he is taking control of it out of our hands.