Cruise Report 3: New Species Discovered at Sea

Doc expands on insights he gained while hanging with A-list kernel maintainers on last month's Linux Lunacy Geek Cruise.


On the Linux Lunacy Geek Cruise last month, I had the pleasure of
getting some very educational hang-time with 2.6 kernel maintainer
Andrew Morton and filesystem guru Ted Ts'o. I came away not only
with enormous respect for Andrew, Ted and the kernel crew, but with
an even deeper realization that Linux is not an OS in the sense we've
been taught to think about one by the commercial marketplace.

Meaning, Linux isn't a product.

Linux is a project, of course. Specifically, it's an open-source
development project. But that's a technical explanation, a noun
compound. As a description, "project" doesn't convey the vast
difference in nature between Linux and commercial platforms such as
Microsoft's Windows and Apple's MacOS. Even when Linux is packaged
into a distro, it's still radically different. That difference is
greater than the one between product and project, commercial and
noncommercial, closed and open. It's a difference in nature itself.

In the past, I've characterized Linux's nature as that of a building
material. Furthermore, I've characterized it as natural building
material, similar to wood or granite. I've said it occurs in human nature,
produced by human thought and work. I still believe all that is true.
But now I believe it goes deeper than that.

This is what I learned from Andrew and Ted. After sitting in on their
sessions (which ran many hours--the cruise was something of a crash
course at sea), I realized that Linux's nature, as a building
material, is akin to that of a species.

As I write this, the creators of commercial OSes are laboring to
anticipate forms of work that nobody is doing yet. Even OS X, which
sits on a base of BSD, is being altered constantly to anticipate
purposes nobody is pursuing yet. It's the same with Windows developers working
on Vista, the Windows successor due for release next year. Theirs
are products that practice Marketing 101 for consumer electronics:
create something that mothers a need. For Apple it might be HD video
editing. For Microsoft it might be multilayered Web services. For
Linux it might be...

How about..."Stamping out bugs"?

That was Andrew Morton's answer to a "What gets you excited?"
question on the boat. His answer was a huge clue for me, because I realized
Andrew's job is almost entirely reactive rather than proactive.

I used to wonder why Linus, when he spoke on earlier cruises, would
answer so many questions with "I don't care", and then explain why he
didn't care by saying "That's user space". At the time I thought this
simply was a way for Linus to confine the scope of his interests and keep distractions
away. Which it was, of course. But it was also something much sharper
than that, something that is becoming clearer to me now.

User space is where commercial products live. Right now, on the
desktop and the laptop, most of the world's users are working on
Windows or OS X. Andrew doesn't doubt that Linux will get there too--eventually.
But he also isn't holding his breath waiting, because he
knows evolution moves at an infuriatingly slow pace.

When I told him about my experiences with different distros on my IBM
ThinkPad T40, and why I ended up going with the Novell (SUSE)
desktop, he dismissed my personal example as a misleading one.
Instead, he put the matter in the deep perspective of evolutionary time:

The server space is much simpler. Basically what you have is the
kernel, a C library and, on top of that, a bunch of proprietary and
legacy resource stuff which is all tightly controlled by the
organization implementing the server.

With a desktop there are many many many more components. And it's
supposed to run on lots and lots and lots of different sorts of
hardware. So the problem is much more complex, both from an
engineering point of view and also from a human point of view.

However, if the desktop were more commercially important, then the
resources would be there, the teams would be there. To some extent I
was hoping that this would come about as Novell switched over--because
Novell has a lot of trading desks and things like that. But
I doubt if Linux, for a top-tier knowledge worker like yourself, is
an area where Novell intends to be aggressive for the next five years.
I think you're a classic case of one of Linux's perception problems, in
that, basically, a machine like (yours) is the last frontier,
in terms of complexity, and the demands you put on it. Connectivity,
plug and play...that's tough...

The way it should proceed for the desktop side should be in simple
areas. The kiosk situation. Point of sale. The trading desks, the
call center desks, those sorts of things. Once we get up to 15%
market share there and become commercially significant,
companies will start investing more in it. Then we'll do really well
in that space and move up to the next tier. Office workers who run a
couple of applications. Nothing free-wheeling yet.

And then more resources start flowing in, and we hit 15%
market share in that space. And we move up another tier at a time.

Eventually we should be able to address this sort of machine. But I
think you're trying to leap too many phases there. For commentators
such as yourself, the expectations are too high. On the plus side,
writers will come along, again and again, and see how much better we've
got. Things are improving.

Kernel development is not about Moore's Law. It's about natural
selection, which is reactive, not proactive. Every patch to the
kernel is adaptive, responding to changes in the environment as well
as to internal imperatives toward general improvements on what the
species is and does.

We might look at each patch, each new kernel version, even the
smallest incremental ones, as a generation slightly better equipped
for the world than its predecessors. Look at each patch submission--or
each demand from a vendor that the kernel adapt to suit its needs in
some way--as input from the environment to which the kernel might adapt.
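Concretely, a kernel patch is just a unified diff: the recorded difference between one "generation" of a file and the next, applied to the tree. A sketch with placeholder files and contents (nothing here is real kernel source):

```shell
# Placeholder files standing in for two "generations" of a source file.
rm -rf /tmp/gen1 /tmp/gen2
mkdir -p /tmp/gen1 /tmp/gen2
echo "schedule(): version 1" > /tmp/gen1/sched.c
echo "schedule(): version 2, bug stamped out" > /tmp/gen2/sched.c

# The patch records the difference between the generations
# (diff exits nonzero when the files differ, hence the || true).
diff -u /tmp/gen1/sched.c /tmp/gen2/sched.c > /tmp/fix.patch || true

# Applying it adapts the older tree to match the newer one.
patch -s /tmp/gen1/sched.c < /tmp/fix.patch
diff -q /tmp/gen1/sched.c /tmp/gen2/sched.c && echo "adapted"
```

Each applied patch leaves behind a slightly fitter tree, which is the whole generational mechanism in miniature.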

We might look at the growth of Linux as that of a successful species
that does a good job of adapting, thanks to a reproductive cycle that
shames fruit flies. Operating systems, like other digital life forms,
reproduce exuberantly. One cp command or Ctrl-d, and you've got a
copy, ready to go--often into an environment where the species
might be improved some more, patch by patch. As the population of the
species grows and more patches come in, the kernel adapts and
improves.
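That reproductive cycle is literal, not a figure of speech. A minimal sketch, using placeholder paths rather than a real kernel tree:

```shell
# Placeholder directory standing in for a source tree.
rm -rf /tmp/linux-src /tmp/linux-copy
mkdir -p /tmp/linux-src
echo "kernel source (placeholder)" > /tmp/linux-src/README

# One cp command and the "organism" has reproduced, byte for byte.
cp -r /tmp/linux-src /tmp/linux-copy
diff -r /tmp/linux-src /tmp/linux-copy && echo "identical copy"
```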

These adaptations are reactive more often than proactive. This is
even, or perhaps especially, true for changes that large companies want.
Companies such as IBM and HP, for example, might like to see proactive changes made to
the kernel to better support their commercial applications.

Several years ago, I had a conversation with a Microsoft executive who
told me that Linux had become a project of large commercial vendors,
because so many kernel maintainers and contributors were employed
by those vendors. Yet Andrew went out of his way to make clear,
without irony, that the symbiosis between large vendors and the Linux
kernel puts no commercial pressure on the kernel whatsoever. Each
symbiote has its own responsibilities. To illustrate, he gave the
case of one large company application:

The [application] team don't want to implement [something] until it's
available in the kernel. One of the reasons I'd be reluctant to
implement it in the kernel is that they haven't demonstrated that
it's a significant benefit to serious applications. They haven't done
the work to demonstrate that it will benefit applications. They're
saying, "We're not going to do the work if it's not in the kernel".
And I'm saying, "I want to see that it will benefit the kernel if we put it
in".

He added, "On the kernel team we are concerned about the long-term
viability and integrity of the code base. We're reluctant to put
stuff in for specific reasons where a commercial company might do
that." He says there is an "organic process" involved in vendor
participation in the kernel.

Earlier this year, I had a conversation with IBM's Dan Frye in which he
said the same thing. He also said that it had taken IBM a number of years to learn
how to adapt to the kernel development process rather than vice versa.
Andrew explains:

Look for example at the IBM engineers that do work on the kernel.
They understand [how it works] now. They are no longer IBM engineers
that work on the kernel. They're kernel developers that work for IBM.
My theory here is that if IBM management came up to one of the kernel
developers and said, "Look, we need to do that", the IBM engineer
would not say, "Oh, the kernel team won't accept that". He'd say "WE
won't accept that". Because now they get it. Now they understand the
overarching concern we have for the coherency and longevity of the
code base.

Given that now these companies have been at it sufficiently long,
they understand what our concerns are about the kernel code base. If
IBM needs a particular feature, the [company] can get down and put it in the
kernel. Just as they would for AIX. There are some constraints about
how they do that, however, and they understand that.

But it has to be good for the kernel. And good for supporting, as
Andrew puts it, "serious applications".

Kernel space is where the Linux species lives. User space is where
Linux gets put to use, along with a lot of other natural building
materials. The division between kernel space and user space is similar
to the division between natural materials and stuff humans make out of
those materials.

More clues came from the language being used. Tree. Fork. Branch.
Bug. Viability. Longevity. Even kernel is a term from nature.

Years ago I got to talking with Jackson Shaw, then of Microsoft,
about embedded operating systems. When he asked me to name the three
top applications for embedded Linux, I countered with, "Name the three
top applications for lumber." I enlarged my point by adding, "Linux
grows on trees." Rather than argue with that, Jackson ran with it,
coming to the conclusion, "Linux doesn't grow on trees. It IS trees."

After hanging out with these wise men for a week on a boat, I'm
starting to understand a bit more about how those trees evolve.
Resources

IT Conversations - Doc Searls: DIY-IT

"Cruising the Kernel with Andrew, Ted and the Gang, Part I"

"Cruising Without a Bruising"

Doc Searls is Senior Editor of Linux Journal, for which he writes the Linux for Suits column. He also presides over Doc Searls' IT Garage, which is published by SSC, the publisher of Linux Journal.


Comments


Development of OS Linux


Linux has many functions unavailable to Windows users, such as remote administration and other areas. But the lack of software holds back Linux's wider adoption. I hope this situation changes in the future and that Linux becomes more widely used.

The same old thing.


They perpetuate the attitude that X11 (or any GUI) is an application. Literally just an application.

And it's their prerogative. But it contributes to a lack of end-user focus in the GNU/Linux world. Almost any user concern meets with the "I don't care" attitude that percolates among the most capable in the community. That is, unless you're a programmer working on server software.

GNU/Linux has pretty icons now. But it's not taking common and essential user scenarios into account with any level of consistency.

We always hear how Microsoft has become anti-vendor, killing off its own developer base (Lotus, WordPerfect and Netscape are examples). Yet on our side of the fence, the Linux OS distributor has become king. Users are expected to get all their applications and drivers from "the repository", and the familiar (and valid!) scenario of visiting an app vendor's website, downloading an installer, installing it and then expecting AT LEAST one icon available to use the new software simply is not honored. Making this possible is not only not a priority, it's sometimes actively frowned upon for no reason or for BS reasons. And they are BS, as Mac OS X has proven!

And do I need to bring up drivers?

Stop whining that MS products don't conform to certain technical standards and create a standard for the user already.

Jackson Shaw at Vintela


I suppose that's the Jackson Shaw who now works for Vintela?

This is a company which helps MCSEs use Windows to do what Linux does better. I suppose somebody has to do it :-)

Not a common LJ article


But I liked it though. It really opened up my own point of view.
