Systems Software Research is Irrelevant

by Rob Pike, in Editorials - Sat, Aug 5th 2000 23:59 PDT
The ubiquity of PC hardware has been mirrored in the software market, where a handful of operating systems are firmly entrenched and new systems find it difficult to attract interest to themselves. In today's editorial, Rob Pike of Bell Laboratories gives his reflections on the state of systems software research, drawing on his own experiences working on Plan 9 and Inferno.
Copyright notice: All reader-contributed material on freshmeat.net is the property and responsibility of its author; for reprint rights, please contact the author directly.
A Polemic
This talk is a polemic that distills the pessimistic side of my feelings about systems research these days. I won't talk much about the optimistic side, since lots of others can do that for me; everyone's excited about the computer industry. I may therefore present a picture somewhat darker than reality.
However, I think the situation is genuinely bad and requires action.
Definitions
Systems:
Operating systems, networking, languages; the things that connect programs together.
Software:
As you expect.
Research:
Primarily academic research -- universities and a few industrial labs.
Is:
Now, not ten years ago, and I hope not in another ten years.
Irrelevant:
Does not influence industry.
Thesis
Systems software research has become a sideline to the excitement in the computing industry.
When did you last see an exciting non-commercial demo?
Ironically, at a time when computing is almost the definition of innovation, research in both software and hardware at universities and much of industry is becoming insular, ossified, and irrelevant.
There are many reasons, some avoidable, some endemic.
There may be ways to improve the situation, but they will require a community-wide effort.
A Field in Decline

"Who needs new operating systems, anyway?" you ask. Maybe no one, but then that supports my thesis.
"But now there are lots of papers in file systems, performance, security, Web caching, etc.," you say. Yes, but is anyone outside the research field paying attention?
Systems Research's Contribution to the Boom
A high-end workstation:

              1990                      2000
  Hardware    33 MHz MIPS R3000         600 MHz Alpha or Pentium III
              32 megabytes of RAM       512 megabytes of RAM
              10 Mbps Ethernet          100 Mbps Ethernet
  Software    Unix, X, Emacs, TCP/IP    Unix, X, Emacs, TCP/IP, Netscape
  Language    C, C++                    C, C++, Java, Perl (a little)
Hardware has changed dramatically; software is stagnant.
Where is the Innovation?
Microsoft, mostly. Exercise: Compare 1990 Microsoft software with 2000.
If you claim that's not innovation, but copying, I reply that Java is to C++ as Windows is to the Macintosh: an industrial response to an interesting but technically flawed piece of systems software.
If systems research was relevant, we'd see new operating systems and new languages making inroads into the industry, the way we did in the '70s and '80s.
Instead, we see a thriving software industry that largely ignores research, and a research community that writes papers rather than software.
Linux
Innovation? New? No, it's just another copy of the same old stuff.
OLD stuff. Compare program development on Linux with Microsoft Visual Studio or one of the IBM Java/Web toolkits.
Linux's success may indeed be the single strongest argument for my thesis: The excitement generated by a clone of a decades-old operating system demonstrates the void that the systems software research community has failed to fill.
Besides, Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure.
What is Systems Research these days?
Web caches, Web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications.
Mostly, though, it's just a lot of measurement, a misinterpretation and misapplication of the scientific method.
Too much phenomenology: invention has been replaced by observation. Today we see papers comparing interrupt latency on Linux vs. Windows. They may be interesting, they may even be relevant, but they aren't research.
In a misguided attempt to seem scientific, there's too much measurement: performance minutiae and bad charts.
By contrast, a new language or OS can make the machine feel different, give excitement, novelty. But today that's done by a cool Web site or a higher CPU clock rate or some cute little device that should be a computer but isn't.
The art is gone.
But art is not science, and that's part of the point. Systems research cannot be just science; there must be engineering, design, and art.
What Happened?
A lot of things:
PC
Hardware became cheap, and cheap hardware became good. Eventually, if it didn't run on a PC, it didn't matter because the average, mean, median, and mode computer was a PC.
Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines). No more. A major source of interesting problems and, perhaps, interesting solutions is gone.
Much systems work also revolved around making stuff work across architectures: portability. But when hardware's all the same, it's a non-issue.
Plan 9 may be the most portable operating system in the world. We're about to do a new release, for the PC only. (For old time's sake, we'll include source for other architectures, but expect almost no one will use it.)
And that's just the PC as hardware; as software, it's the same sort of story.
Microsoft
Enough has been said about this topic. (Although people will continue to say lots more.)
Microsoft is an easy target, but it's a scapegoat, not the real source of difficulty.
Details to follow.
Web
The Web happened in the early 1990s and it surprised the computer science community as much as the commercial one.
It then came to dominate much of the discussion, but not to much effect. Business controls it. (The Web came from physicists and prospered in industry.)
Bruce Lindsay of IBM: HDLC ~= HTTP/HTML; 3270s have been replaced by Web browsers. (Compare with Visicalc and PC.)
Research has contributed little, despite a huge flow of papers on caches, proxies, server architectures, etc.
Standards
To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ...
A huge amount of work, but if you don't honor the standards, you're marginalized.
I estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards.
At another level, instruction architectures, buses, etc. have the same influence.
With so much externally imposed structure, there's little slop left for novelty.
Even worse, commercial companies that "own" standards, such as Microsoft and Cisco, deliberately make standards hard to comply with, to frustrate competition. Academia is a casualty.
Orthodoxy
Today's graduating PhDs use Unix, X, Emacs, and TeX. That's their world. It's often the only computing world they've ever used for technical work.
Twenty years ago, a student would have been exposed to a wide variety of operating systems, all with good and bad points.
New employees in our lab now bring their world with them, or expect it to be there when they arrive. That's reasonable, but there was a time when joining a new lab was a chance to explore new ways of working.
Narrowness of experience leads to narrowness of imagination.
The situation with languages is a little better -- many curricula include exposure to functional languages, etc. -- but there is also a language orthodoxy: C++ and Java.
In science, we reserve our highest honors for those who prove we were wrong. But in computer science...
Change of scale
With so many external constraints, and so many things already done, much of the interesting work requires effort on a large scale. Many person-years are required to write a modern, realistic system. That is beyond the scope of most university departments.
Also, the time scale is long: from design to final version can be five years. Again, that's beyond the scope of most grad students.
This means that industry tends to do the big, defining projects -- operating systems, infrastructure, etc. -- and small research groups must find smaller things to work on.
Three trends result:
Don't build, measure. (Phenomenology, not new things.)
Don't go for breadth, go for depth. (Microspecialization, not systems work.)
Take an existing thing and tweak it.
I believe this is the main explanation of the SOSP curve.
Unix
New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture -- and some do -- the first thing to build is the Unix emulation layer.
How can operating systems research be relevant when the resulting operating systems are all indistinguishable?
There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn't believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding).
A victim of its own success: portability led to ubiquity. That meant architecture didn't matter, so now there's only one.
Linux is the hot new thing... but it's just another Unix.
Linux -- the Academic Microsoft Windows
The holy trinity: Linux, gcc, and Netscape.
Of course, it's just another orthodoxy.
These have become icons not because of what they are, but because of what they are not: Microsoft.
But technically, they're not that hot. And Microsoft has been working hard, and I claim that on many (not all) dimensions, their corresponding products are superior technically. And they continue to improve.
Linux may fall into the Macintosh trap: smug isolation leading to (near) obsolescence.
Besides, systems research is doing little to advance the trinity.
Startups
Startups are the dominant competition for academia for ideas, funds, personnel, and students. (Others are Microsoft, big corporations, legions of free hackers, and the IETF.)
In response, government-funded and especially corporate research is directed at very fast "return on investment".
This distorts the priorities:
Research is bent towards what can make big money (IPO) in a year.
The horizon is too short for long-term work. (There go infrastructure and the problems of scale.)
Funding sources (government, industry) perceive the same pressures, so there is a vicious circle.
The metric of merit is wrong.
Stanford now encourages students to go to startups because successful CEOs give money to the campus. The new president of Stanford is a successful computer entrepreneur.
Grandma
Grandma's online.
This means that the industry is designing systems and services for ordinary people. The focus is on applications and devices, not on infrastructure and architecture, the domain of systems research.
The cause is largely marketing, the result a proliferation of incompatible devices. You can't make money on software, only hardware, so design a niche gimmick, not a Big New Idea.
Programmability -- once the Big Idea in computing -- has fallen by the wayside.
Again, systems research loses out.
Things to Do
Startups are too focused on short time scale and practical results to try new things. Big corporations are too focused on existing priorities to try new things. Startups suck energy from research. But gold rushes leave ghost towns; be prepared to move in.
"Why do you use Plan 9?"
Go back to thinking about and building systems. Narrowness is irrelevant; breadth is relevant: it's the essence of system.
Work on how systems behave and work, not just how they compare. Concentrate on interfaces and architecture, not just engineering.
Be courageous. Try different things; experiment. Try to give a cool demo.
Funding bodies: fund more courageously, particularly long-term projects. Universities, in turn, should explore ways to let students contribute to long-term projects.
Measure success by ideas, not just papers and money. Make the industry want your work.
Things to Build
There are lots of valid, useful, interesting things to do. I offer a small sample as evidence. If the field is moribund, it's not from a lack of possibilities.
Only one GUI has ever been seriously tried, and its best ideas date from the 1970s. (In some ways, it's been getting worse; today the screen is covered with confusing little pictures.) Surely there are other possibilities. (Linux's interface isn't even as good as Windows!)
There has been much talk about component architectures but only one true success: Unix pipes. It should be possible to build interactive and distributed applications from piece parts.
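The pipe model named above can be illustrated concretely. The following is a minimal sketch (assuming a POSIX system with the standard `tr`, `sort`, and `uniq` tools available): a word-frequency counter assembled entirely from existing piece parts, connected by nothing more than byte streams.

```python
import subprocess

# Build the classic word-frequency pipeline from off-the-shelf filters:
#   tr ' ' '\n' | sort | uniq -c
# Each stage knows nothing about the others; pipes are the component glue.
text = b"to be or not to be\n"

tr = subprocess.Popen(["tr", " ", "\n"],
                      stdin=subprocess.PIPE, stdout=subprocess.PIPE)
sort = subprocess.Popen(["sort"],
                        stdin=tr.stdout, stdout=subprocess.PIPE)
uniq = subprocess.Popen(["uniq", "-c"],
                        stdin=sort.stdout, stdout=subprocess.PIPE)

# Close the parent's copies of the intermediate pipe ends so that
# end-of-file propagates down the pipeline.
tr.stdout.close()
sort.stdout.close()

tr.stdin.write(text)
tr.stdin.close()

out = uniq.communicate()[0]
tr.wait()
sort.wait()
print(out.decode())
```

None of the three programs was written with the others in mind, yet they compose into a new application; that pluggability is exactly what the component-architecture talk has failed to reproduce at a higher level.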
The future is distributed computation, but the language community has done very little to address that possibility.
The Web has dominated how systems present and use information: the model is forced interaction; the user must go get it. Let's go back to having the data come to the user instead.
System administration remains a deeply difficult problem. Unglamorous, sure, but there's plenty of room to make a huge, even commercial, contribution.
Conclusions
The world has decided how it wants computers to be. The systems software research community influenced that decision somewhat, but very little, and now it is shut out of the discussion.
It has reached the point where I doubt that a brilliant systems project would even be funded, and if funded, wouldn't find the bodies to do the work. The odds of success were always low; now they're essentially zero.
The community -- universities, students, industry, funding bodies -- must change its priorities.
The community must accept and explore unorthodox ideas.
The community must separate research from market capitalization.
Copyright © 2000 Lucent Technologies Inc. All rights reserved.
Rob Pike (rob@plan9.bell-labs.com), well known for his appearances on "Late Night with David Letterman", is also a Distinguished Member of Technical Staff at Bell Laboratories in Murray Hill, New Jersey, where he has been since 1980, the same year he won the Olympic silver medal in Archery. In 1981 he wrote the first bitmap window system for Unix systems, and has since written ten more. He was a principal designer and implementor of the Plan 9 and Inferno operating systems. With Bart Locanthi he designed the Blit terminal; with Brian Kernighan he wrote The Unix Programming Environment and The Practice of Programming. A shuttle mission nearly launched a gamma-ray telescope he designed. He is a Canadian citizen and has never written a program that uses cursor addressing.
 Comments
[»] I agree
by Francesco Belletti - Oct 7th 2004 04:36:40
I think most people do not understand what Mr. Pike really intended to say.
The problem was "systems research", while a lot of people replied that "there's no need for a new OS"; but I don't think that systems research can only be writing a new OS. I think that writing a new OS to replace Linux or Windows (or any other OS you like) is NOT systems research.
Other people say that there's a need for a lot of specific applications. Mr. Pike clearly said that this is not systems research. Maybe it's only "optimization".
I think that there's a need to do a lot of research on the philosophy of building... everything: applications, OSes, networks. Today there's too much detail. Current programming languages do not provide the tools to get a "wide view" of a system. What I want to say is that there's a need for more "structure".
Object-oriented programming languages do not provide sufficient abstraction, because logical blocks are connected by bad interfaces, because code reusability is used too much or too little, and because I want to manage "ideas" rather than the boring and constraining details.
The need for "structure" involves everything: programming languages, OSes, applications, etc.
I think that in this field there's a lot of research to do. Current systems work, so people don't see the need for something better... until something better is available.
(I'm sorry: my English is not very good!)
[»] Maybe it's time than to move on to other fields.
by Bas Burger - Aug 16th 2002 11:14:23
I can be short about this: the industry is maturing...
One comment on Linux: because of ongoing research by many people, it is superior to what Unix ever was...
Also, is it a negative thing that this field is no longer interesting in the eyes of the writer?
I see a lot of projects by independent people on various subjects...
I also think this person looks down a bit on "normal people" using areas that before were his domain only; I get the feeling he feels invaded by all these common people, not special ones like he is...
:)
[»] So what?
by scottg.net - Oct 28th 2000 13:29:05
My car still runs an internal combustion engine, developed long before I was born. It's continually tweaked and gets better gas mileage etc. over time. Eventually it'll be replaced by something less polluting, but until it becomes a problem, there isn't much incentive to replace it.
I'd like to see truly revolutionary stuff coming out of research, but much of the funding is going to solve current "problems", or what "we" think are current problems.
The point is, we want fast access to the web NOW. So research goes into web caching etc. to provide for those needs because the market demands it. What are you driving these days? I'll bet whatever it is has an internal combustion engine...
[»] Rob Pike on Innovation -- Right On
by Jax - Aug 29th 2000 12:13:25
As a former Forth programmer (Forth having become more or less roadkill under the onslaught of the vastly-less-suitable-for-embedding C, C++ and Java), I find the illustrious Rob Pike's article right on. Of course, there still exist heterodox systems to play with. IBM VM/ESA and IBM AS/400 come to mind.
[»] "Research" vs. "Pure Research on Operating Systems"
by Joseph S. D. Yao - Aug 24th 2000 12:47:04
Rob is always interesting to read, listen to, or talk to. And often controversial.
But first you have to understand WHAT he is talking about.
Only a few commentators have made any attempt to distinguish pure from applied research. Pure research is seeking knowledge for its own sake. This is, despite commercial bias against it, a GOOD thing. Who knew what a transistor could do when it was invented? Who knew what complex numbers could do? There is a story - possibly apocryphal - that a leading pure mathematician who had spent his whole life studying complex numbers learned that someone had found a USE for them, and promptly changed his field of study. I only label that story as possibly apocryphal because I do not have a name to go with it. It certainly describes the attitude that SOME pure researchers have towards the difference between pure and applied research.
Applied research is anything that has an application. Most of the research mentioned by many of the responders is, in fact, applied research. This is the sort of research that any corporate entity will gladly pay you academic research ["slave"] rates to perform. It gives them the kind of quicker return on investment that they love. Only the more farsighted companies, and those that could afford it, ever sponsored pure research. And AT&T is no more [or only a weakened shadow of itself], while IBM and Ford Aerospace ... I don't know what they are doing. That leaves academia and the remains of Lucent Bell Labs. Bellcore was the half of the late Bell Labs that was supposed to do applied research.
Computer "science" per se IS pure research. I cringe every time I see curricula with such courses as "Computer Science 101 - Fortran". Or C, or C++, or Modula - it doesn't matter. That's PROGRAMMING. It has as much to do with computer science as learning to add does to engineering a skyscraper or doing quantum physics research - which is to say, it may provide a foundation for understanding the subject, but I would not call a class in 1+1 "Engineering 101" or "Quantum Physics 101". It isn't even SOFTWARE ENGINEERING which, as Rob correctly says, is rarely seen anywhere these days. Software Engineering is the engineering of software in much the same way that the physical engineering disciplines engineer parts of the physical world. As Gerald Weinberg said in the 1970s, and it is still true - it was quoted at a hacker convention recently - "If we built our buildings the same way that we build our software, the first woodpecker to come along would destroy our civilization."
Which is not to say that I totally agree with Rob's bleak assessment. He has always been one to go to extremes in his pronouncements, I have assumed to provoke a reaction from a complacent community. ;-) But I think that his words should be heard and taken to heart by all. Perhaps, if you have influence with those giving grants or considering research projects [which do NOT all have to be kernel projects; that's just Rob's own pet] to think about the kind of pure research about which he is talking. I don't think we all need to find new and incompatible OS's on which to work [but remember when working on the Incompatible TimeSharing System was a mark of honour?] - we certainly can benefit from others' previous work. But the light of pure research certainly needs a lot more fuel!
[»] No vacuum of ideas; we just need to get to work
by baccala@freesoft.org - Aug 10th 2000 12:43:24
Let's face it - computers suck. They're useful, but very hard to use. There's a lot of work to be done in almost every facet of CS - operating systems, networking, file systems, programming environments, etc.
That doesn't mean that there's no room for hot new ideas, but right now implementing them requires putting up with all the other problems.
Looking for a hot new idea? Try either of these papers:
http://www.freesoft.org/Essays/robotics.htm
http://www.freesoft.org/Essays/smartcar.htm
And if you'd rather fix some existing problems than chase after some kind of bleeding-edge technology, I'd be happy to discuss this document, too :-)
http://www.freesoft.org/bbaccala/TODO
[»] because of what they are not ...
by jeremy - Aug 9th 2000 13:54:08
Without having to become a pro-Microsoft advocate, I think that most people would agree that Linux's main appeal comes from the fact that it developed on the fringe without any commercial help. It attracts users/developers who are willing to get it to work. Linux's future success and survival will depend on its widespread adoption by end users. It's important to note that, for them to adopt Linux, the novelty of something non-Microsoft that requires larval time to set up and get working just won't be enough.
[»] The Language Defines the OS
by Jesper Frickmann - Aug 7th 2000 16:03:21
I have also been struck by how similar many OS's are. The Mach kernel, for instance, is said to be completely different from UNIX because of its microkernel approach, and heated discussions, like the famous one between Linus Torvalds and Andy Tanenbaum, have been had about whether microkernels are better than monolithic kernels like Linux. But when you look at it, much of the Mach research and its applications have dealt with making UNIX servers. The messaging system is based on procedure calls, just like other UNIX system calls, and threads in Mach follow the same process model as UNIX, with a kernel and a user-space stack. My perception is that all OS's written in C will be more or less similar to UNIX, because C and UNIX are made for each other. And this is not necessarily bad, as long as we like *NIX systems!

But I believe that if one really wants to make a radically new system, one must also design a language to match it. My "little" pet project is therefore to first write a compiler for my new language and then use it to create a working system. I expect to release an alpha about 2030!!

The new language is object based. That means no stack: if we need an activation record, it must be allocated in the object. New processors have 32-64 registers, and most of the time everything can fit in them. This solves the problems with kernel and user-space stacks, stack detachment/attachment, memory use in massively multithreaded systems, etc.

Moreover, I want to make a distributed system with global shared virtual memory. A pointer to an object is thus usable from the entire system. If a thread takes a segmentation fault, the kernel checks whether the object at the faulting virtual address contains a capability to enter a new protection domain. If it does, the thread is transferred to that domain and execution continues. The receiving object must then perform authentication. This makes remote messaging completely transparent, and objects can be reached in other user spaces or in the kernel space.

This will be a system built on object-oriented programming from the ground up, as opposed to, e.g., Mac OS X, where objects are a high-level abstraction. That's my humble proposal for something new!
[»] True, but there is reason for optimism.
by Bill Rugolsky - Aug 7th 2000 12:26:25
First off, whether one agrees with Rob Pike or not, do not dismiss him. His work on Unix/Plan9/Inferno/... speaks for itself. The Bell Labs research group consistently produces powerful, compact solutions, and Rob Pike's contributions are legion. Compare Limbo/Dis/Inferno to Java/JVM/JavaOS. I have an enormous amount of respect for Rob Pike.
I mostly agree with the points made. But the gloom and doom is overstated. Research is no longer the exclusive province of MIT, CMU, Stanford, Bell Labs, Xerox PARC, and IBM. The world has changed.
Pike says:
"There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn't believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding)."
This, I say without reservation as a Unix user since 1978, is entirely the fault of AT&T and the subsequent Unix license holders. (The lawyers, not Pike!) It did not have to be this way. The commercialization and fracture of Unix in the 1980's killed operating systems research. Mainstream Unix has not improved since the days of BSD Unix. Sure, Ritchie created Streams, and it was good, but SysV inherited bastardized, all-things-to-all-people STREAMS. Then the SysV people ignored the whole Unix philosophy and created SysV IPC. Yuck! Other than that, there have been few significant changes in Unix.
Until GNU/Linux mindshare exploded, the trend was for *any* academic research in computer science to be routed through the university intellectual property lawyers so that they could try to make a buck off of it. And a lot of it was being done on NT, under non-disclosure.
One can't have it both ways: either one treats OS research as an academic discipline, shares results, and builds on the work of others, or one patents and commercializes everything. The same is true of most other scientific and artistic inquiry. Lawyers are a friction that retards innovation.
"With so many external constraints, and so many things already done, much of the interesting work requires effort on a large scale. Many person-years are required to write a modern, realistic system. That is beyond the scope of most university departments."
This is true. But today, in addition to operating systems, we have hosted operating environments: Java, Inferno, Mozilla, ActiveX, ... These *can* be handled by one person. (Hell, look at the Java VM -- one could do something better as a class project. :-) One thing that would enormously aid kernel work is the completion of the Plex86 project, and its extension to all common architectures (PowerPC, SPARC, Alpha, MIPS, StrongARM). Want to lend your considerable talents to a worthwhile project, Mr. Pike? Jump in and help bring the Plex86 (www.plex86.org) team up to speed building a free VM system. People will be much more inclined to work on Plan 9, or GNU Hurd, EROS, TUNES, etc., if they can run them in a window on Linux, crash them at will, move files back and forth, etc. Linux developers have another solution at the moment -- a user-mode port of Linux to its own API. The port will probably get merged into the source tree some time in 2.5.x.
"Linux's success may indeed be the single strongest argument for my thesis: The excitement generated by a clone of a decades-old operating system demonstrates the void that the systems software research community has failed to fill."
"Besides, Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure."
I remember in college how I was eager to do *real* physics research. I took Quantum Field Theory early in my undergraduate career, then marched into the office of one of the bright lights of that decade and asked for something to work on. What did he say? "Well kid, can you do QCD calculations yet? No? Well come back when you can and we can talk."
What's the point of this example? Well, one has to crawl before one can walk. Students re-implemented Unix because it was well-understood. Don't denigrate them for doing the exercises at the back of the book. Linux has proven to be maintainable and extensible, which is more than can be said for IRIX, say.
Collaborative, evolutionary development is still in its infancy. Sustaining it requires that stakeholders use what they develop. (Yes, eat their own dog food. :-) That means the system must be usable for everyday, real-world tasks. People contribute what they can, when they can -- few individuals (unlike you) can design and write an operating system from scratch. For most, that means reusing existing freely available POSIX software.
Again, the free software movement had to build its own infrastructure. GNU Hurd had an ambitious design, and ran into significant problems. Linux took a very conservative design, and produced something usable. The Linux kernel is evolving in interesting directions (towards Plan 9!): pseudo-filesystems are proliferating, like procfs, devfs, shmfs, usbfs, ...; Al Viro has implemented multi-mount and union mount, and has laid the groundwork for mount traps. If he gets his way in 2.5.x, Linux will allow per-process namespaces. Linux already has a threading model built on a Plan 9-like clone(). POSIX-thread brain-damage lives out in userland, as a library. Increasingly, POSIX compatibility is done in glibc (as it was supposed to be done in HURD). Perhaps some day we can get rid of those unsafe system calls.
This is being done in an *evolutionary* way -- e.g., the Unix security model (esp. suid) and per-process namespaces interact in difficult ways. To implement secure applets, one wants to turn off the system calls that manipulate other namespaces, such as socket(). All of this is being worked on -- Linux is adopting the best practices of other systems.
Even so, Linux is not the be-all-and-end-all of OS's, and it is not intended to be.
"But technically, they're not that hot. And Microsoft has been working hard, and I claim that on many (not all) dimensions, their corresponding products are superior technically. And they continue to improve."
"There has been much talk about component architectures but only one true success: Unix pipes. It should be possible to build interactive and distributed applications from piece parts."
One thing that Microsoft can be credited with is an attempt to build components. Unfortunately, they tied it to Microsoft-compiler virtual tables, and forgot security. It is also butt-ugly, as are many of their APIs. These are difficult things to fix later.
What makes the pipe model work is the common currency: files and streams of bytes, newline-terminated text. We have yet to provide a "currency" at a higher level that provides the facilities of pipes and command lines (namely pluggability and configurability). This is partly a language problem: as long as programs are linear streams of text in multiple languages that are difficult to parse and compose, it is difficult to move above this low-level currency. My best guess is that the answer lies in more reflective, introspective systems, a la TUNES. Today we have scripting (in Scheme, Python, and other languages).
Microsoft is making an effort in this area with Intentional Programming. An explicit goal is to be able to import existing (C, Fortran, ...) code and make good use of it. I have no idea whether their work is worthwhile. But the need to capture intention, not syntactic artifact, is clearly one of the things holding back the development of better tools. Direct-manipulation GUIs tend to convey even less intention than CLIs: at least CLIs can be parameterized with variables and scripted.
"To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ..."
As you and your colleagues so elegantly wrote in "The Hideous Name" and "The Use of Name Spaces in Plan 9", the proliferation of namespaces in Unix and other operating systems and tools has been one of their greatest failings. And yet committees everywhere are proliferating new namespaces and rotten protocols and APIs. There is only one solution to this, and it is to reuse the work of others. As you know, this is not as great an issue with Plan 9, say, as it can use the resources of another system (say Linux or NT) as if they were local. Again, if something like Plex86 were available, it could reuse them *in the same box*.
I hope that the team at Bell Labs will become more engaged with open projects. Lead, and others will follow. Linus has shown how -- make people stakeholders. Read linux-kernel and you see that despite strong disagreement and nasty flamewars the same people continue to participate. Why? Because they are stakeholders in Linux.
You folks are expert architects and engineers -- create a framework in which others can develop an expanding economy of tools. Allow people to become stakeholders. Help people see your vision. You can design much more than you can build alone. Bell Labs did this in the seventies. It can do it again. Without the lawyers this time. What happened to Inferno is a disaster. Don't repeat that mistake.
Regards,
Bill Rugolsky
rugolsky@ead.dsa.com
[»] Operating Systems Research
by Jacob Hallén - Aug 7th 2000 10:55:35
I find it interesting that so many of the commenters fail to understand what Mr. Pike is saying.
- He is not praising Microsoft. He says that the fact that they have a comparatively high rate of innovation reflects the poor state of OS research.
- He is not talking about application innovation and research. There is plenty of that going around.
The current crop of operating systems is flawed in many ways: they have performance problems, they have reliability problems, and - most important - they make some things very hard to do. For instance, they make component software building so hard that it isn't done at all.
A few of these problems have been solved in rather elegant ways in the operating system Plan 9, which Mr. Pike has had a fairly major role in developing. He has thus shown that it is indeed possible and desirable to invest a lot more in basic operating systems research, since it would yield very interesting and useful results. However, Lucent's Bell Labs seems less than keen to go on investing in the field, and the universities are not up to the job.
Humanity would benefit greatly if we found better ways of doing things before we invest heavily in doing them badly. In the OS arena we are still at a very basic level, and quality research would in all likelihood lead to enormous improvements in the way we do things.
[»] New operating Systems
by moodfarm - Aug 7th 2000 04:10:41
I think the point has been missed about the embedded OS market: very often, in applying a new processor, a new OS will be evolved to handle it and be tailored to the end product. I have been involved with this a number of times and know of many more instances.
In this case, OS research and application happen in the marketplace and not the academic world; it is not always prudent to publish an OS that has a finite life and usefulness.
[»] Hey, a system is a system...
by Francois Isabelle - Aug 6th 2000 21:35:53
System research is irrelevant? Ok.
I think you should understand that at this point, innovations in a field such as operating systems cannot be driven by research alone; there is a cultural path to follow, and the origin of a mutation will probably be UNIX. But what is a system?
A system, whether biological or electrical, needs to fulfill some defined functions. UNIX has the advantage of providing a well-known interface for very important things such as interprocess communication and memory allocation. Some modules are still needed to control peripherals... These concepts might disappear in the future, but computer hardware uses memory devices and CPUs, and this is not about to change. Some new approaches (taken from distributed systems architecture), like I2O, are being implemented now: that's a way to make improvements, slowly but surely. By the way, clustering and similar techniques are now common, and I think the real improvement in systems will probably rely more on inter-system relations than on single-system rework; that's how nature reached current complexity levels! A word on interfaces:
But still, a system must remain usable. That's why some primitive interface like the console terminal shall remain, or be replaced, let's say, by a voice recognition/synthesis system. A nice way to remain usable and to facilitate the user's life might be to present a well-known interface; that's why we'll see user shells for many years to come. And don't let anyone tell you the Linux interface is not as good as Windows's: the Linux interface is what you want it to be; don't confuse a poorly written or badly configured window manager or shell with Linux.
[»] Irrelevance is only the first requirement!
by notWiser - Aug 6th 2000 17:54:18
This article has triggered many thoughts in my mind. By way of background: I have a Ph.D. in theoretical physics, but after a few years in the field I decided that I (1) LOVED physics as a passion and an obsession, but (2) HATED research as a practical career path. So I committed research suicide by choosing perishment over publication and was (more or less) happy ever after.
I still do research, but on my own terms. I think that the terms under which he conducts research are the key to the author's unhappiness. What, exactly, does a researcher feel when getting his/her paycheck? I don't know about others, but I am totally unable to perform $5,000 or $10,000 worth of research in a fixed time frame. (More precisely: when I was doing research, I usually felt I had done about $0 worth; some other times, about $1,000,000, but I couldn't prove it.)
To me, the purpose of research is insight. Sometimes you get it, sometimes you don't. In any case, it's always foolish to pay for it. You are not paying for insight; you are paying for research articles. That, to me, is the fundamental reason why research is by definition a lousy career. Either your employer doesn't know what he is doing and you are fooling him (and feeling bad about it), or your employer has an ulterior motive that is not directly related to the depth of your insights but more to things like the number of graduate students you are sponsoring, etc. (and you also feel bad about it). So: get a real job (I don't mean this insultingly -- I mean, get a job where the output can be measured more objectively), think about stuff in your spare time, and -- like the Linux kernel -- publish the results when they are ready. Trust me, you'll know when they are.
[»] All rhetoric, no substance.
by JEDIDIAH - Aug 6th 2000 17:14:06
This is certainly an interesting subject that needs to be addressed. However, I don't think this author is the one to do it. He merely recycles a lot of tired pro-Microsoft rhetoric.
His comparison of Microsoft software today and Microsoft software 10 years ago does not demonstrate the "innovation" of Microsoft but rather its sloth. Apple, Commodore, and Atari all had more "industrially relevant" user interfaces in 1990. Microsoft's "progress" is more a matter of playing catch-up.
That brings me to my next computing-rhetoric pet peeve: I don't use non-Microsoft systems because they lack the Microsoft name; I use them because they suit me, and have suited me better, sooner, than MSFT chose to address my needs.
This includes MacOS, GEM (atari st) and Unix.
I am a Linux user now because I was a Unix user back when Microsoft's flagship product was a program loader that forced the end user to do manual memory management and Microsoft's GUI product was not yet really ready for actual use. Unix simply did things that Microsoft product didn't. This was true for both the user interface and the core OS. The same was true for MacOS and GEM.
Windows is not an "industrial response" to MacOS that somehow works better; it is merely the predominant vendor finally addressing a new technology, and being able to lag in doing so because of the network effects -- in terms of device support and third-party software support -- that tend to be the biggest drag on any new OS, regardless of its pedigree.
Java is highly derivative of C++ and in many ways quite unsuitable. I doubt this is the analogy the author was attempting to draw between Windows and MacOS, but it is the one my experience would tend to lead me to. Another interesting twist of this analogy, one I don't think the author quite intended but which is implicit in the reality of how things are rather than in the associated PR: Java and Microsoft both promise an environment where "compatibility" need not be an issue, but neither entirely delivers. Java implementations have their little quirks, and there are enough variants of Windows (and configuration permutations) to make neither as promising as they claim.
I would love GNU to break down the compatibility barriers enough such that in future it is a lot easier for new and strange systems to break into the systems software marketplaces.
Some of us actually care about our tools and how effective they are: versus them being or not being Brand Foo.
[»] Well, I'm for hire :)
by korpiq - Aug 6th 2000 07:48:20
"There has been much talk about component architectures but only one true success: Unix pipes. It should be possible to build interactive and distributed applications from piece parts. "
This is something I have been playing around with a bit. Would be interesting to throw a year of full-time research into it. Anybody willing to pay?^)
A basic recipe:
- functional approach (see www.haskell.org or such)
- minimize a "standard interface set" for all components
(think of tied data types in Perl)
- provide a framework for component implementations
- provide a CLI and a GUI for binding components together
[»] I disagree with a twist
by Mark Veltzer - Aug 6th 2000 07:20:02
I disagree with the author strongly.
The reason is quite contrary to what you'd think:
I think systems programming should be a small corner of CS,
and not of major importance.
Do we really want to write operating systems?
It's as boring as hell!
I do not mean to offend anyone. Operating systems are a bunch of solutions to a very simple problem: giving the world (of programming and others...) a platform on which to do REAL programming. There are many things to innovate in OS design, but no matter how you look at it, most of the code will be a boring hell, as it will be dedicated to actual drivers for hardware. An OS is a framework where programmers implement drivers. Sounds funny, but it is. I am not flaming anyone who writes operating systems. I think you are all saints. I really do. You are freeing other programmers for real work. I admire people who use their talent for the good of others, and systems programmers are certainly such people, since they receive very little gratitude because what they do usually works behind the scenes.
Now, what do I mean by real CS work?
- Approximating good solutions to NP complete problems.
- Implementing high level attempts at AI solutions to problems.
- Actually implementing the hordes of algorithms out there in real computer languages, in a way that other programmers (ones who do end-user software modules) can use.
- Producing very high-level OO and other languages (which are NOT OS-related, e.g. Python, Smalltalk, etc.). Implementing such frameworks is no less an art than systems programming.
- Producing very high level class libraries which enable software manufacturing. This is very similar in concept to systems programming as a good class library is a framework for programmers.
- Producing high level optimization systems for systems that so far have been optimized by humans.
- Producing research platforms for Physics, Chemistry, Biology, Math, Medicine etc..
- Producing data storage platforms to free programmers from the ongoing concern of "can I store this information and retrieve it efficiently?"
Systems programming, compared to all of these, is just the pen and pencil. Now you have to use your system to write the software.
In terms of bulk we already see that the bulk of the system is no longer produced by systems programmers but rather object modelers.
Regarding Linux and Plan-9.
I have to admit I haven't tried Plan-9. I am certain that it is a high-quality system. The only question I have is this:
Why should I care ?
Even if I am a computer scientist most of my work would be about solving difficult problems (in terms of complexity and algorithms) and not a more general and subtle interface to my video card...
Linux is great in that it is a good development platform for CS, while MS Windows is not... If we all accept the fact (and I'm sure not a lot of people will doubt it) that most of the code in the world will eventually be written in high-level languages like Java, Python, Smalltalk, Eiffel, and the like (after we get rid of the ugly C++...), then most CS will be done in those languages. And here is the twist: this indifference to platform by most of the code is what will make systems programming flourish (not too much, as this is not central to CS, as I have stated). People will be free again to try to implement new systems. Thanks to Linux, APIs will once again be documented, and a systems programmer will only need to supply a known number of interfaces. I admit that this is a tough job, but once it is done he'll get all the software that runs on Linux. For example, look at the success the *BSDs are having due to Linux. You may claim that they are no different from Linux in terms of CS, and I will agree, but still they present variations in kernel algorithms and a testing ground for OS algorithms. Take the Hurd as yet another example of a totally different design which will supply a libc-type interface and may yet reach millions of users! At least the Hurd people know what interface to implement (compared to the dreaded MS Windows...).
[»] Stagnation and ivory tower computing
by Blake Friesen - Aug 6th 2000 03:54:39
Hah! The idea that ivory-tower computing is useless outside of academia is absurd. Where do you think all the core ideas that your practical computing is based on come from? The most well-known of these (the mouse, the GUI, and Ethernet, from Xerox PARC) were initially never sold commercially. When was the last major software innovation? The computer industry is horribly stagnant due to the death grip of Wintel.
New GUI ideas would be nice. I've thought of some sort of GUI with piped data - different I/O streams with graphical displays and controls perhaps....?
[»] Agree; go larger
by Jeremy Wohl - Aug 6th 2000 01:07:17
I've had thoughts in the same vein for many years, but I think Pike underestimates the issue by narrowing it to systems research. I would level a coincident argument against most fields of computer science, where something less than research is occurring.
I suppose the factors cited (principally money) have made lazy whatever work gets published and venued. But I suspect the problem is far larger and the reasons more insidious -- that is, complex; it's why I left academia.
Here's hoping we're at the end of this (20-year?) cycle.
[»] Computing for its own sake.
by Jon Frisby - Aug 5th 2000 23:23:49
Ivory tower trash. Computing for its own sake, pure and simple. My personal favorite piece is the whining about startups doing "practical" things instead of "new" things. LOL!
If you reject the notion of computing for its own sake, what does that leave you with? Computing to *accomplish* something. There is really one fundamental argument against this whole polemic: Grandma. Yes, Grandma is a reason systems research has died. That's *good*. It means that computers are finally useful outside of the ivory tower.
What has the ivory tower done for us? Microsoft and the rest of the commercial computer industry brought computers into over half the homes in the USA. The Internet is something my grandma uses because commercial companies have built it into something *useful* to her. Sure, academia invented TCP/IP and let it languish for 20 years. Netscape made it a must-have for PCs everywhere.
Is (academic) systems research dead? Yes. Do I care? No. The ivory tower has yet to produce something good enough to be useful outside of academia. Let industry do the research and produce something *useful*.
-JF
[»] I strongly disagree.
by buckrogers - Aug 5th 2000 20:50:45
I own a toolbox. In that toolbox is a hammer, much like any other hammer, probably like the one in your toolbox. Funny how no one does any research into new hammer designs anymore.
55 years ago, people were building computers for the very first time in order to break each other's ciphers. Since then we have learned a lot of new things. But once you learn something, you don't keep doing basic research in that area; you take what you have learned and give it to the engineers so that they can build actual real things with it that common people can actually use.
Linux is the fruit of the tree of knowledge that was grown over 40 years by numerous scientists and great thinkers. It would be foolish to think that because people are actually using knowledge in a constructive way that they are somehow impeding the growth of more knowledge.
This frees the scientist to go study new areas. Even here you say that people have failed, but I disagree. I think that the 80/20 rule applies to science as well as any other human endeavor: the first 80 percent of anything is easy, but the last 20 percent takes 80% of the time.
So, while it may take a hundred years to learn 80% of what we are ever going to learn about computers, it will take another millennium before we truly master everything about even the simplest computer.
I look at where computers were in the seventies, when you had to toggle in binary and read the results from incandescent lights, at the photographic-quality displays we have now, and at where we will be in just a few short years.
In another 50 years it will be impossible to know if the person you are talking to on the other end of the phone line is a human or a computer.
In a thousand years computers may very well surpass us in intelligence.
[»] What's new and what's limiting
by JosiahBurroughs - Aug 5th 2000 19:36:18
One of the first things that I believe limits people writing new and better OSes is the massive amount of hardware out there. If I want to write a brand-new Innovative(tm) OS and I want it to gain widespread acceptance, then I will have to have printer drivers, video drivers, sound-card drivers, DVD drivers, SCSI drivers, etc. That's more work than most people can do themselves. Even corporations have a hard time with this. If I heard correctly, Microsoft keeps a giant room full of old and abandoned hardware just so their latest OS can be sure to run the 2400-baud modem I have sitting in a bag by my 386.
Only a few operating systems have overcome this hurdle (I'll get back to this), and Linux is still having problems with it. I would have been using Linux full time two years ago if my video card and sound card had worked fully. I had to finally get a new soundcard and replace my evil 56k winmodem with an extra 33.6 I found lying around.
Certain systems have an advantage when it comes to that problem. The Amiga, way back in 1985(?), started with brand-new hardware that it could write drivers for; that is maybe one of the main reasons for the success of the system, in that you could "write once, run on any Amiga", and I believe it is one of the major reasons game companies prefer console systems.
The next hurdle, as you stated, is all the standards running about. If you want a new OS, then you'd better have a web browser that supports most common features; having the web browser requires TCP/IP support, which requires a modem/network card you have to write drivers for.
Even C is a major hurdle that people would have to overcome to create a truly new system. The new Amiga, while it supports C and is mostly POSIX compliant, is being drowned out by complaints of incompatibility because POSIX threads don't work, even though the new Amiga has a far more innovative threading system that works well across networks or with multiple PCs, somewhat like Plan 9.
The scariest thing I have noticed so far during the recreation of the Amiga is that the first thing people do when they get a new OS is try to port all their favorite apps. bzip, tar, gzip, zip, and lha were some of the first things to go onto the CVS server. One person even commented on the uselessness of the system so far: "If I want to compress files, then I have the best OS in the world, but if I want to do anything else, I'm out of luck." I'm afraid that they are just recreating Unix. Of course, as you said, they are already hard at work on that Unix Emulation Layer!
There still seems to be some research going into language design and compiler technology. Python is a good, easy-to-learn language that is supposed to be fairly innovative. As one of the discussion points on Advogato.org pointed out, a lot of what used to be library functions is being built into the languages, making them easier to use and faster development tools if you know the language well. The example given on Advogato was a comparison of development time between 1990 and now. I forget what the program was, but it took the original author two weeks to write something that a Perl programmer whipped up in less than an hour. Granted, there were memory concerns in 1990 that the Perl programmer could basically ignore (he ended up using something like 20 times as much memory). For another Amiga example, their new low-level language, a mix between C and assembly, is pretty innovative for what they are trying to use it for: a portable low-level language that translates quickly and easily to other processors.
You also ignored some of the work being done on user-level applications. Much has stayed the same, such as word processors: people still use vi and Emacs, and Microsoft Word still looks like the 1993 version of Lotus 123 or whatever it was called. But there have been grand strides in what a Web browser is forced to do, which you just wrote off as it still being "Netscape", while calling Windows innovative even though it still has the name "Windows". Compare the HTML of 1990 to the HTML/CSS/XML of today, and suddenly a "simple" browser will take three years to write from scratch. You also ignore things like ICQ, or the newer Jabber, which seems to redefine instant messaging, and other programs that are popular but easily overlooked: minor improvements on old concepts like Napster, Winamp (can your favorite music player play over 20 types of files?), even Outlook.
On the orthodoxy issue, a lot of that may change with the next batch of PhDs coming out. I grew up on DOS/Windows, migrated over to Linux for research reasons, and still seek out new operating systems to try for whatever innovative features they might have. I've never been satisfied with the choice of languages out there. I started on Pascal, moved to C, then started drifting every which way: Visual Basic, then a major revolt from that to Perl; I even did some looking at dead or little-used languages like Lisp, Forth, and Smalltalk. I'm now working on learning all the great features of Python.
All this complaining will yield little of a solution, though. The TUNES project (www.tunes.org) is hard at work comparing all the languages, judging them on their various aspects, trying to come up with something great. They almost have a working high-level language (Slate) and are still working on the low-level language to replace C. It's hard to find a solution that you can get the industry to step over to. I'd love to start a company that could start from scratch and do things "The Right Way"(tm): hardware, software, everything. What should we do, have a planned software/hardware obsolescence day every 20 years to create a new batch of stuff that is bigger, smarter, better?
There's my $23 (2 cents isn't enough for this rant),
Josiah Burroughs
[»] Correct and Incorrect
by Kar - Aug 5th 2000 18:29:03
I beg to differ too. All have to agree that the way research takes place is different. In the 70s there was no standard OS, and people experimented with various OSs just to explore the possibilities. Now, I think we are much clearer about what we want. Hence, in order to continue from where we are, people are adopting the *nix model to start from.
If you think there is no OS research going on, then what is the OSKit? Who are the people using it? Yes, I have to agree that not many outside academic circles look at these things, but there is research in that direction.
I think what has changed from the 70s is the direction in which research is done. It is not that research has stopped in the systems area.
On the other side, for a student aspiring to do research in this very area like me, it is disheartening to hear such comments from the very people who are heading such research.
KAR.
[»] Is Systems Software Irrelevant? Yes, but only temporarily
by Julio Cartaya - Aug 5th 2000 14:42:45
Rob Pike's points are valid, but I would like to point at a combination of contributing factors to explain why:
(1) Research is motivated by practical needs, and most of today's pressing needs can be met by systems that resulted from previous research, even if there is plenty of room for improvement.
(2) Economic pressures on corporations are more intense today than they were when systems like UNIX were actively researched.
(3) Perhaps the real challenges in software are so difficult that we do not have "killer ideas" on how to get started. Just to name a few: natural (visual and oral) human-computer interfaces, sentient systems, systems with initiative, systems capable of self-administration, systems that can cooperate spontaneously...
(4) As knowledge fields mature, new results are harder to achieve and lots of talented people just start looking elsewhere until some new ideas make the field "hot" again.
[»] Really like Plan-9, Opened me up to Rob Pikes points presented here
by Chris Kennedy - Aug 5th 2000 12:57:10
I recently tried Plan-9. I have been using Linux since 1995 and have tried about every operating system I could. Plan-9 was proof to me of what this article is trying to point out: that not much change has occurred in OSes at all since UNIX. I have also come to similar conclusions about the reasons behind this, although I must say that Linux is much more viable to me as the current tool to get the job done. I think this is complicated by the business environment today being stuck with Windows, so a lot of talented people are sucked into the "I only work with Microsoft products" syndrome. I don't think Linux is the only answer; I really think Linux should adopt the networking ability Plan-9 has demonstrated. I also think window managers like 8 1/2 (too bad Plan-9 is not here to display that properly :) and rio/acme are very amazing in their goals compared to any others out there. But I feel there is a need to tie anything new into the current, or else no one will dare adopt it early on. From reading many of the Plan-9 documents, I think it is really an amazing thing that Bell Labs is using Plan-9 in production already, which shows the true talent of people like Rob Pike there. I think more people should look into Plan-9, even if Richard Stallman says the license is evil, since some mind expansion needs to occur in this field. I still emphasize the importance of Linux for its performance and functionality, without which I don't know if I would be here writing this today.
[»] Operating Systems?
by Richard Clark - Aug 5th 2000 11:06:02
I beg to differ. I think it is not that there is no more "systems" research, but that systems have changed. We have created something, in Linux (and maybe Microsoft has an analog), which represents an operating system that works. It's not brilliant -- I'm sure Linus and Bill Gates would happily admit that -- but it works. The magic is that so much of what was formerly "operating system" territory, such as his example of BSD pipes, video APIs, and GUIs, has become application territory: witness CORBA, OpenGL, and that beautiful madness that is the X Window System.
Integration with the OS now is not, except in some dubious performance-inspired cases, integration with the kernel; it's an add-on made possible by clever and appropriate system design.
This is, in my humble opinion, the most fantastic thing that could have happened. No longer does a new "operating system" creator need to write device drivers for every Ethernet card, figure out all the peculiarities of the arcane architecture that is x86, or try to figure out the vagaries of SCSI. They have a solid base, the Linux/BSD/* kernel, which does all that, which has a team of dedicated people keeping it up to date, making performance improvements that no startup would ever be able to make, and forcing you to worry about only one thing: interfacing with the layer that gives you the ultimate system abstraction.
Witness many of the new operating systems being made, such as Athena (http://www.rocklyte.com/) and similar, that use the Linux kernel to worry about that kind of stuff while they get on with making the GUI/API/WhateverElseConstitutesAnOS work in a solid environment.
I think that systems research is still happening, but it is not happening on the metal, and it is not happening an awful lot in the "systems research" departments of academia. The efforts of Linux, the GNU team, and the Free/OpenBSD groups have brought the concept of systems design down to the programmers in the field, and they've gone out and solved a vast number of problems, in dozens of different ways.
Why do all operating systems look like Unix? They don't. All computers look like Unix, the operating systems look like Debian, they look like OpenBSD, they look like Athena and BeOS, and that is where the research is.
[»] Author is correct and incorrect
by Frank V. Castellucci - Aug 5th 2000 10:41:43
He is correct that he has a dark vision.
But it is clear from his definitions that he has not even done his own research, which is somewhat paradoxical for this document.
Where are the CORBA references? Where are the Semantic Web references? Yeah, if this short-sighted mood is running through the community, then it is they who will be left behind to churn out vertical domain utilities and useless inventions.
[»] I would be impressed ...
by Michael T. Babcock - Aug 5th 2000 10:37:33
I would be impressed, and surprised, if anyone started a research project on the design of a completely new operating system, potentially dependent on a new type of interface device, that broke all the current rules. PCs broke the client-server rules. Windows with DirectX, etc. has broken more rules. Why not start over again and break them all? Keyboards basically suck as an I/O device for computing outside of typing letters. I understand that many people in many offices do type letters for a living, but the mouse is their only 'assistant' in making their computer work. Call it something new. Do it differently. If it's too expensive to implement, think of IBM's "I can see the need for a few computers in the world" comments and realise that by the time it's ready for market, it'll be cheap.
[»] When I was your age, we used to do fundamental OS research.
by hari - Aug 5th 2000 10:37:15
Oh come on! This guy's arguments don't make any sense whatsoever. It is laughable to say that there is no research going on because people have standardized on Unix/Linux. I am amazed that this guy has given MS a clean bill of health in this regard. At the risk of seeming to protest too much, let me draw your attention, gentle reader, to the following points:
1) Research dollars for fundamental OS research have dried up because none of the funding bodies see any point in throwing money at a project which is doomed to commercial failure from day one. This is because MS owns the OS space, not because Unix is a niche player.
2) The gap in pay between industry and academia is so large that all but a handful of top researchers have left the universities. In terms of percentages AND sheer numbers, the CS talent in the universities and other research institutions is a ghost of its former self.
3) In the CS research world, despite the author's claims to the contrary, there have been several experimental OSs developed to demonstrate specific OS concepts. These OSs were merely proofs of concept that illustrated how one particular problem could be solved with a new design. The end goals of such efforts are research journal publications. Such experimental OSs are never developed to any state of utility except to the project guide and his graduate students. The author seems resentful that the world has not beaten a path to the door of these OS authors. Well, a step towards solving the problem would be for these experimental OSs to actually do something useful, like supporting those "externally imposed standards" the author is so contemptuous of.
4) Necessity is the mother of invention. When hardware was slow, it was important for the software and OS to be as slim and compact as possible. With machines so powerful these days, that is no longer the case. Who cares if the OS uses up 75% of system resources? Certainly not the average user. For that matter, even above-average users, like graduate students running number-crunching simulations, don't really care. The only people who seem to care about OS efficiency are the high-end nuclear/weather/fluid-mechanics boys (they have supercomputers anyway) and the gamers. But the gamers are doomed to be on MS Windows anyway, and they don't seem to have any fundamental objection to that.
Hari.
[»] Sounds about right
by B!nej - Aug 5th 2000 09:26:08
This is really an interesting point, and I'll add a little of my own opinion to it. It seems to me that there are two kinds of software packages which get put together:
Software which does something we could already do, but does it better or differently. Call this expansion software. It gets bigger, brighter, and better, but it still does the same thing.
Software which does something we've never done with a computer before, or does something we've done before in a completely different way. Call this innovation software.
Now it seems that we get a lot of expansion and not much innovation, and GNU, Linux, and many others are the great expanders right now. GNU's stated aim (AFAIK) is to replace UNIX. It's doing it bigger, better, and smarter, and it's giving people freedom, but it's not innovating, it's just expanding. I love GNU, but it's not going to be a bright new world if that's all there is. (Ditto MS, if you want a balanced argument.)
The problem as I see it is that many people want one platform, one standard, one programming language, one interface, and one system for everything. Now, before anyone raves about all the choice out there (remember, I've seen it too :)), take a step back and ask, "How many are really different?" Think about it. Much of the software being made for the Linux market is just an expansion of Windows and UNIX software.
Now would be a good time for some people to get together, take a step back, and walk off at a right angle to the way we're all facing. Get your blinkers off.
[»] OS is not hot.
by Tei - Sep 3rd 2003 08:40:19
Actually, writing a new OS is not hot. The market looks full. You can't push a new OS and have much success. Nowadays the interesting code is written in the application layer and in the document layer (web docs). The OS is no longer the hot stuff! And, of course, hardware is not hot. Live with it, or die. Of course, some people still design new ideas and advanced features: cool features, bizarre features. But not everyone wants to drive an experimental flying car; most want a normal combustion car. An OS today has to interface with tons of non-geek people who don't have the skills to switch their OS and don't want to re-learn how to use a computer. (My English is crap.)