SandeepMathew wrote: Linux is becoming Windows-ish... I meant with respect to some features, not the UI. E.g., earlier Linux was a plain monolithic kernel; now it has support for kernel modules. It has become more modular, like Windows...
At best that's a very passing resemblance. It's like saying two people look alike because they both have brown eyes. It just isn't enough.
Kernel Colonel: I thought Mac OS X was a microkernel... It is something I read in Silberschatz. Thanks for the information... I don't know much about the details of the Mac OS X kernel.
The textbooks seem to be frequently wrong about which modern OSes are microkernels, but I think OS X is especially susceptible to this myth because of its association with Mach.
According to a survey, most OS crashes occur due to faulty device drivers. This is where the exokernel comes in: moving driver code to user space should give more stability. In principle there should be fewer switches from user mode to kernel mode... but it seems that during the implementation the same problems will crop up again.
I think you're conflating exokernels and microkernels. The ideas are somewhat orthogonal. The exokernel concept can be implemented as a microkernel, but it need not be. Xok, for example, has kernel-mode disk and network drivers, but they are extremely bare-bones and only provide the ability to multiplex disk blocks and packets, respectively. The network driver is especially interesting in that it uses filters sent to it from user space that are expressed in a declarative language designed just for this purpose.
Alboin wrote: Really, language-based protection? I like the idea that the programmer should know, to a certain extent, what he is doing. If he needs a babysitter, then he shouldn't be programming to begin with.
You clearly haven't spent any time out there in the real world of software development. People make mistakes all the time. The "babysitting" you speak of has other names -- "unit testing", "code reviews", and if you're lucky, "compiler errors". Guess which one is the cheapest and most effective way to find problems before your code is released into the wild?
To give a concrete example, the project I'm working on right now is being developed entirely in C++. The core team has 5 developers, including me, averaging about 5 or 6 years of C++ experience each (I'm at the high end with over 10). Almost everyone on the team has put segfault-causing bugs into the code, repeatedly. Except me, of course, because I've been burned so many times that I'm extra careful now.
These are capable people who have been around C and C++ for a while, and problems like these still happen. It is really, really difficult to focus on solving the problem at hand when you're constantly battling low-level issues like memory management and type safety.
(Disclaimer: I know it's possible to write very abstract C++ using libraries like Boost and STL... this project I'm working on is significantly lower-level than that (lots of raw buffer manipulation and data conversion) and must be portable to *nixes where these libraries don't work well, if at all.)
I like speed.
What makes you think language-based protection is necessarily slow? It does not imply interpretation, or JIT, or reflection, or any of that stuff.
I'd rather have a crash every few years than have a slow system. (I've only had 2 crashes on my Linux box thus far; both of which were due to Beryl trashing my display.)
I'd rather have a fast system that doesn't crash, and I think so would most of the rest of the computer-using populace.
Crazed123 wrote:And I like the idea of not rewriting decades worth of work on compilers just because some jackass has started the Church of the Safe Language and actually thinks he can compile C into something safe.
The way you go on, you'd think someone was asking you, personally, to rewrite decades' worth of work on compilers. If other people want to do it, or are getting paid to do it, relax and let them try!
He can't. The machine model implied by the C spec is inherently and implicitly unsafe!
Ditto on C++, Objective C, Object Pascal, assembler, and anything else with pointers.
That's why there are other languages.
The big problem I see with language-based protection is not speed, or the extra compiler R&D, but the lack of choice of languages it seems to imply for application developers. That more than anything else makes me skeptical of the idea, just like the idea of having a universal common language run-time... some languages are just too different for it to make sense.