Here was Rick Rashid's summation of the past and future of operating systems:
| out with the old | in with the new |
| --- | --- |
| run programs | event execution |
| file focused | database focused |
| implicit data definition | self-describing data |
| explicit resource management | automatic resource management |
| program centric | task focused |
| system behaves same for all | system models user behavior and responds |
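The "implicit data definition" versus "self-describing data" row is the easiest to make concrete. Here's a minimal Python sketch of the contrast; the field names and binary layout are my own illustration, not anything from Rashid's talk:

```python
import json
import struct

# Implicit data definition: the bytes are meaningless without
# out-of-band knowledge of the layout (here, an int then a float).
packed = struct.pack("<if", 42, 98.6)

# A reader must already know the exact format string to decode it.
number, temp = struct.unpack("<if", packed)

# Self-describing data: field names travel with the values, so a
# generic reader can interpret the record without prior agreement.
record = json.dumps({"number": 42, "temp": 98.6})
decoded = json.loads(record)

print(number, decoded["number"])
```

The struct-packed record is smaller and faster, but only a reader that shares the format string can use it; the JSON record carries its own field names, which is the trade Rashid's table is pointing at.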
Sounds great. But I couldn't help wondering whether we've really mastered phase one. I bought a new PC the other week. It came with Windows XP, which I'd never used. I decided to try it, and of course a few days later I was on the phone with an HP tech-support guy dealing with my PhotoSmart printer. He dragged a bunch of skeletons out of the closet, including the [386Enh] section of system.ini, if anybody remembers that. In the end he showed me how to use msconfig to selectively disable and reorder boot-time services, and we "solved" the problem.
After a while we realized that we'd both been playing these games for 20 years. I wish we could say of today's systems that resource management is explicit and deterministic, but it doesn't really feel that way. If we haven't mastered explicit and deterministic, are we really ready for event-driven and probabilistic?
Cory is talking right now about the triumph of non-optimized systems. "Machines that are optimized for one purpose end up doing just that one thing," Cory says. This fails to account for unintended consequences. "When you are reliable, and when you optimize, you close the door on innovation," he says.
This is an issue of scale, of course. We want emergent behaviors to arise from loosely coupled systems. At global scale, they can. But at local scale, reliability is an innovation that's still waiting to happen. It would be nice if the small pieces that are loosely joined worked well.
Former URL: http://weblog.infoworld.com/udell/2002/05/15.html#a241