An Interview with Bjorn Ekwall
Welcome to the unexpurgated version of Linux Journal's Linux Kernel Who's Who. If you haven't yet seen our June 2000 issue, which features 40 profiles of some of the kernel's pioneers (hackers like Lars Wirzenius, Pauline Middlelink and, of course, Linus Torvalds), make sure you get a copy from your nearest newsstand or from the Linux Journal web site. If you have already read the profiles, then our unexpurgated versions of the original interviews, which were e-mailed to each major contributor to the Linux kernel, may reveal a few surprises and a lot more detail.
We'll be posting the original interviews here on the Linux Journal web site over the next several weeks. So sit back and enjoy a few words from some of the folks who helped make Linux possible!
Linux Journal: How did you first learn about Linux? What were you doing in your own life at the time?
Bjorn Ekwall: Once upon a time, way, way back in the dark ages (i.e., in the late 1970s), I started using UNIX as a co-sysadmin at the university. I liked the concept so much that I decided I definitely wanted to have such a system all for myself, at home, cheaply.
The Apples and PCs available at that time were just toys, and the CP/M-copy named MS-DOS was not at all what I was looking for; "ugly hack" was my reaction then, as it is now (even when it's called MS-Windows).
I was then also a partner in a company that imported computer kits from SwTPC (6800, 6809 and 68000-based). To be honest, this might have biased me against these "new-fangled upstarts" too ...
I was looking for quality, not easy money. I'm naïve enough to believe that the money will find you if you focus on quality, even if the time lag might be large sometimes. So far, that plan has worked tolerably well ...
I left the university with my Ph.D. in the middle of the 1980s, and switched to working as a "UNIX expert" full time. The proprietary Unices available lacked the freedom (and the source) I had learned to love so much from the beginning. So, in my spare time (hah!), I continued to collect and build the parts I needed to reach my goal: my own UNIX machine, with full source.
A big task and little available time made for slow progress ...
I knew about Minix, but it was too small a subset for me to give up my own work. Nor did I really like the license. In late 1992, I spent some time, as usual, looking through the postings in the Usenet newsgroup comp.sources.unix. In a comment/follow-up, I first saw someone mentioning "Linux" and I started to search for more information. With the worthy assistance of ftp-mail, I ended up downloading the floppy images of an early Slackware distribution. I think it was with the 0.99.3 kernel.
It had all the sources! It even had X11! It fit neatly into my new 386SX/25 laptop with 5MB of memory! It was close enough to UNIX for me to give in completely!
LJ: What part of Linux were you personally interested in and working on? How are you still involved with Linux development?
Bjorn: I already owned my own commercially based UNIX system (System/V.3 on a Motorola 68k VME), so I definitely wanted to network my new Linux machine with it. Problem: there was no space for a network card in the laptop, and PCMCIA had hardly been invented yet. The only networking potentially available was with a "dongle" adapter connected to the parallel port.
So, I bought a D-Link DE-600. It didn't have a driver for Linux, of course, so I built one.
I had good help from the Crynwr packet drivers (written in assembler for DOS), released by Russell Nelson, when I tried to understand the inner workings of the DE-600. Let me tell you that the cycle of think, edit, compile kernel, reboot, test/crash, reboot requires a lot of patience (and time!) when all you have is a 386SX/25 with just 5MB of memory! I think I went through that cycle a hundred times, at least. Most people would call that crazy. I just consider myself stubborn.
When the driver was in a state where I thought it would be usable to others, not just myself, I posted it on tsx-11. It got included in the official kernel soon thereafter by the then-resident 'net boss, Fred van Kempen. It might have been kernel 0.99.5 or so. Then the e-mails really started coming in. Most commentators were happy, some had problems. I fixed the problems. Some people had bought Xircom parallel port adapters and asked me to write a driver for that one, too. There was no information available on how it worked. Even Russ Nelson had failed in getting Xircom to change their mind. So I didn't. Then some people started mailing me about a new D-Link adapter: DE-620. It cost more money and was a lot "sexier" than the old DE-600.
I thought: okay, it seems to be quite an improvement, and the price is tolerable, too ... and D-Link has released adapter information before. But do I really want to do another couple hundred cycles of think, edit, kernel compile, reboot, test/crash, reboot? No, I don't.
Then lightning hit my house and burned most of my networking equipment to cinders. Okay, I got the message!
So, I bought a D-Link DE-620. It didn't have a driver for Linux, of course. So I decided to build one, again. I got some info from D-Link, indirectly, about the DE-620, in the form of the beginnings of a DOS packet driver. In assembler. Since I'm really quite a lazy person, I tried to find a way to shorten the development cycle. The Linux kernels now had the initial support for kernel-loadable modules. I think it was originally put there for support of ftape, and I decided to try that method for my new driver. I'm glad I did.
I asked for alpha/beta testers on the Linux kernel mailing list, and got a very good set of victims. Joshua Kopper, especially, helped me chase D-Link to get their official approval for releasing my driver for Linux. When we found the right people, there were no problems. Ah, those were good times indeed! With the team evenly spread over the globe, there was always someone awake and doing tests and sending cheerful comments.
The support for kernel-loadable modules helped me a lot, but I felt it could be improved and extended. I added some features I thought it was lacking, and started to modify other network drivers to see if my success with kernel-loadable modules was repeatable. It was.
I sent a snapshot of my kernel patches and the modified tools to Linus for comments. He answered by putting the whole package up for ftp, together with the official kernel sources. Hmmm; I admit I was a bit flattered. A lot, actually.
Suddenly, I had been promoted to an official kernel developer!
LJ: What was most important to you about Linux? What's the very best thing about Linux?
Bjorn: The openness in the acceptance of new ideas and the ease of getting quick and high-quality feedback is definitely the most important thing about Linux, as far as I'm concerned. The basic rule "show me the code!" is the key, since it keeps in check those who have only opinions and no solutions.
LJ: How important was the GNU project, and how did the GNU Hurd factor into your thinking? Should Linux be known as GNU/Linux?
Bjorn: The GNU project has always been quite important to me, especially since it focused on making implementations of the "true UNIX" tools freely available. I looked at the Hurd project for quite some time, but since I'm a rather pragmatic person, the theoretical experiment was not interesting enough for me. In my opinion, the Hurd project also suffered from a lack of the open spirit the Linux community has.
My personal goal was to have access to a powerful environment that I could modify and add to at will. Since UNIX (and thus Linux) is quite close to my ideal, the freely available GNU tools and the freely available Linux kernel is a good match. Valuable additions are the BSD tools and the X Window system and all the other goodies developed independently over time. For me, Linux is the kernel (and, by implication, some parts of libc). GNU is a set of free tools that runs on any reasonable UNIX kernel.
So Linux is Linux and GNU is GNU. There is no need to add a GNU prefix to the name of the kernel, since the kernel was developed outside of the GNU project. If the GNU project wants to use the Linux kernel, then that's okay. It is true that RMS (Richard Stallman) has declared that the goal of the GNU project was to create a complete UNIX-compatible system, including a kernel. That does not mean the GNU project had the monopoly on the idea, and that all free kernels by default belong to the GNU project.
I've said that to RMS, and did get a sensible answer once.
LJ: What was it like to be working with others over the Internet at a time when several computer luminaries thought that organizing successful software development over the Internet was difficult, if not impossible? Did you realize how revolutionary this approach was?
Bjorn: I have never really cared much for what other people, "luminaries" or not, have had to say about what is possible. Such people have been proven wrong so many times before, and they will be, again and again. If you have an idea or if you hear about someone else having an idea that intrigues you, go for it. The only thing you risk losing is your own time, but even then, you have gained a bit of insight into something you didn't know before.
The Internet has made it much easier to find those other people who think more or less like you. The "kick" you get out of solving problems and hammering out designs together with such people is huge. The results are sometimes impressive, too. I'm not so sure there is something really "revolutionary" about this. Ad hoc cooperation by like-minded people has always been a powerful method for developing something new. The "evolutionary" thing with the Internet is the ease and speed with which you can find those other people, and of course, the interaction is so much faster and easier than ever before. TCP/IP is definitely faster than the printing press was, or ever will be.
LJ: What are you doing with your life now? What's a typical day like in your life? How do you find time for work and Linux, and how do you balance free software with the need to make a living (or the desire to become rich)? What do you do for fun?
Bjorn: I have had a lot of spin-off effects from my Linux work, even in my commercial work. It has also given me a lot of valuable connections. Quite a lot of what I am doing at the moment (and in the foreseeable future) is based on Linux. Since 1996, I have been working in several places around the globe, and I will continue to do that. At the moment, one of my primary tasks is a deep involvement in a dot-com in England.
I try to combine my other interests, traveling and music, with my work. I usually concentrate intensely on what I'm doing, which means my workdays tend to be rather long, as do my travels :-). I do "have a life", which definitely includes my two daughters, now aged 9 and 12. We have a lot of fun (when I'm not working, that is).
LJ: Who do you think other than Linus has had the most influence over the Linux community, and why?
Bjorn: It all depends on your definition of "the Linux community". If you talk about the Linux kernel, then there are several key developers who have a very significant influence. Just follow the Linux kernel mailing list for a while, and the current set of main "influencers" quickly becomes apparent. This set of people has changed somewhat from the early days, which is exactly the way it should be in an open environment. I think I could name several dozen, Alan Cox especially, who have had a very significant effect on kernel development.
For an extended definition of the Linux community, one that's not just the kernel development group, important contributors have been (and are) people like Matt Welsh, who headed the Linux Documentation Project, and Eric Raymond, who describes the meta project of Linux, i.e., the Open Source Bazaar. The reason I think such contributions are important is that they make us all aware that software is not worth anything in itself unless it is useful and understandable. Documentation is therefore a method where a developer has to think about how to transfer the "insight" of the software to others, especially the users.
The description of the Open Source "method" makes it easier for a developer to understand the importance and power that communicating openly with other developers has.
LJ: What do you think is the most important addition or change that is needed by Linux in order for it to succeed further? In what direction does Linux development need to go? Where is Linux's future the brightest? What is the #1 biggest threat to Linux today?
Bjorn: Linux development needs to go where the developers lead it. That's a Zen-like answer, isn't it?
You shouldn't even try to predict what unconstrained developers can achieve; you will almost always be wrong. Instead, you will almost always be amazed and overwhelmed. The only "threat" to Linux is if the world should decide not to communicate openly, especially if it were to stop using open protocols. Linux would survive even that, but its expansion might slow down, at least for a short time.
LJ: How do you feel about Linux's current popularity? Would you have preferred it stayed contained in the hacker community? Would it have survived on the fringes?
Bjorn: I really like the almost exponential rise of the popularity of Linux-based systems, since a good product deserves success and people deserve a good product. Linux would have survived without this "explosion", but it's more fun this way ...
LJ: Would it have survived without the IPOs and financial backing? What impact has the commercialization of Linux had? How do you feel about Linux profiteering and the people who make millions off of other people's volunteered efforts?
Bjorn: I have no problem whatsoever with people making money from Linux-related activities, as long as Linux stays open, which it will. The "popularity explosion" would quite likely not have happened without financial backing. The "hard-core hackers" would have been involved independently of this explosion, but the end users would not.
LJ: How can Linux compete with Microsoft in the desktop sector, and will we be able to hold the commercial sector if we don't take the desktop, as well? Can we take the desktop without ruining the spirit of Linux by dumbing it down? Where will our next areas of growth and expansion be?
Bjorn: To be completely honest, I don't really care. If it's important to other people, then they will make sure Linux is a strong and worthy competitor in whatever area they feel is important to them. It's not important enough for me to spend any significant time on. I'm only interested in getting access to an environment that fills my needs, which is what my Linux-based system does. If I need something completely new in my environment, then I will build it. If that is useful for other people, then that's a nice side effect.
LJ: How do you feel about commercial applications being written for Linux, and proprietary software and protocols in general? Do you run Linux more for philosophical reasons or practical reasons? If something that appeared to be better came along, would people jump ship? Conversely, would we stay with Linux even if it somehow degenerated, took a wrong turn, or stopped progressing?
Bjorn: If people want to create closed proprietary applications, they should. If people want to buy those applications, then both the seller and the buyer will (hopefully) be happy. I don't have a problem with that, as long as it is also possible to create open alternatives. What I would have a problem with is if someone tried to lock competitors out and customers in by creating closed proprietary protocols or file formats, or cheating by other means, such as patenting more or less obvious software algorithms.
Personally, I will almost always prefer an open alternative, since then I can learn new things, and especially since I can fix problems and extend the functionality when and how I decide to. I know what I want and expect from my system, and as long as the Linux-based distributions fill my need for a high-quality, open, UNIX-compatible environment, I will stay on. What the rest of the world decides to do is not really that important to me. It will not have a strong impact on my decisions.
LJ: Do you think the community should support only open-source/free software? How would the community survive hard times if there were a lag or down time in the continuing success of the open-source methodology? Is the free software philosophy strong enough and with enough adherents to pull us through?
Bjorn: The community consists of individuals who have the power and responsibility to make their own choices. If anyone wants to support closed software, let them. Those who feel otherwise will create open software. For some reason, I come to think about Pandora's box: once open, it can't be closed again. So unless you outlaw the Internet and the process of thinking, there is no way of stopping this. It's way past critical mass.
LJ: How do you feel about the different licenses: GPL, LGPL, QPL, etc.?
Bjorn: I admit I do not always fully subscribe to the FSF "political" agenda, since I accept that some software is closed and proprietary. My principles are:
I demand open protocols, APIs and file formats, since I want to decide what communicates with what and how.
I want open tools, since I can then adjust them to do what I want.
I like open applications, since I can then fix bugs without having to wait for the next release.
This means I can live with closed applications as long as they use open protocols, APIs and file formats. I'm a bit ambivalent about the "GPL virus", which means I prefer the LGPL for libraries. It should be possible to translate your "business edge" into applications without having to reveal everything to everyone. On the other hand, I like to learn new things by reading the source.
So, if there is a closed application which I feel a great need to modify, and no suitable open-source equivalent, then I will quite likely spend time re-writing it. I will also quite likely make the source freely available. The license I will use will probably be some version of the open-source licenses. The bottom line is that the people who do the release should decide for themselves what license to use. It's nobody else's business.
LJ: Is there a world outside of computers? Are you ever afraid that you'll wake up one day and feel you have wasted your life in front of a computer?
Bjorn: There is definitely a world outside of computers, and I try to enjoy it as much as possible. I have never felt a need to waste any significant part of my time regretting my past actions. I have enjoyed most of what I have done, and have learned something useful from the rest.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
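The "find the .log files and search each one" example above can be sketched in a couple of lines of shell. The directory and the search pattern here ("ERROR") are placeholders chosen for illustration, not taken from the article:

```shell
# Default to /home, as in the article's example; override with LOGDIR=... if desired.
LOGDIR="${LOGDIR:-/home}"

# find locates every .log file under $LOGDIR and hands the batch to grep;
# grep -l prints only the names of the files that contain the pattern.
find "$LOGDIR" -name '*.log' -exec grep -l 'ERROR' {} +
```

The `-exec ... {} +` form passes many filenames to a single grep invocation, which is the "stringing tools together" idea in its most compact form; a pipe through xargs would work equally well.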
Cron traditionally has been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here; just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide