The Long View on Linux, Part 1
I met Phil in 1990, not long after I met my wife, Joyce. It was kind of a package deal. Joyce collected interesting friends, and Phil was near the top of her list. She had met him earlier in Seattle while she was dating another UNIX geek. Both shared the curse of knowing they were smarter than pretty much everybody else (although Joyce is a bit more modest about it), along with a raft of common interests including travel, business, food and constant repartee spiced with affectionate insults.
Joyce always stayed at Phil's house when she visited Seattle (we live in the Bay Area). And, the fact that he can say "I've been sleeping with your wife longer than you have" and I'll buy him a beer, explains a lot about the man. Remarks like this are pure Phil. He knows where the line is, so he crosses it. Just one thing that makes him a great geek.
Phil is also a polymath. He's not just good at a lot of stuff - he's downright brilliant.
Take auto mechanics. The guy personally keeps a three-decade-old Mercedes diesel on the road, plus an electric car he made himself out of an old Volkswagen Rabbit.
He's an electronics wizard who knows a frightening amount about broadcast electronics. Being a former obsessive radio freak, I thought I knew a thing or two about radio engineering. That was until Phil and I got to talking about his transmitter-hunting days as a teenage geek in Los Angeles. As a kid growing up in New Jersey, I used to go down to the Meadowlands with a transistor radio to solve the mystery of which sets of giant towers belonged to which monster AM radio stations. Meanwhile, Phil and his buddies played go-find-it games with little home-brew transmitters they'd hide around the LA basin, sometimes playing tricks like rigging the things up to railroad tracks, so the locus of radiation would be twenty miles long. Advantage: Phil.
Last summer at our house, Phil fixed an FM transmitter kit that had been vexing me for a year. It took him about three minutes, during which he explained what was wrong with not only my tools, but with the transmitter design, starting with the chip that did most of the work. His own home-brew transmitter (of his own design, I am sure) was better. How could I argue? Why bother?
It's worse with computers, because that's Phil's business. He's been around computing longer than most humans have been on earth. He also knows a significant trend when he sees one, which is why he turned his attention to Linux before most of the world even noticed the Net.
In '93 and '94, he flattered me by including my e-mail address in a list that explored the idea of doing a free-software magazine. I followed as best I could (for a geek who was definitely not in the same caste as the other listees). But we were all surprised when Phil suddenly decided to do a magazine on Linux. What the hell was Linux? Phil knew.
In '94, when the 1.0 kernel came out, so did Linux Journal. I was amazed to see how steadily it grew, and how (as usual) Phil seemed to be right about Linux's inevitable triumph in the marketplace. Even though it was free, business would find it more useful than most of the costly stuff it competed with, he said, just like it did with the Net. And, as with the Net, commercial development was inevitable.
In early '99, Phil showed me Star Office running on a Linux laptop with KDE. It looked like such a Windows-killer that the possibilities blew my mind. Then he asked me to join Linux Journal to expand coverage of business issues, so I did. (This was right around the same time as I started co-writing The Cluetrain Manifesto, which was kind of a double-whammy career move for me.)
Last August, Red Hat went public. Cobalt, VA Linux and Andover followed. Billions of dollars in new Linux wealth were created in less than two seasons. And then, it melted like snow at the end of winter. The mainstream press, especially the old PC rags, have been tarring Linux as last year's fad. But, as Phil is quick to point out, the sum of Linux companies together has a lot more value than it did a year ago. Again, the long view. Since we're both 52, it's one I understand.
I interview a lot of CEOs who are hot for some newsy reason. But interviewing Phil makes sense for other reasons. One, he's a major Linux figure whose modesty has kept him out of his own pages. Another is that he has a perspective that comes from something more than transient good fortune. He was smart to begin with, but he's learned from experience, and he's had more than just about everybody else.
Doc Searls: You were an old hand at UNIX when Linus was still a kid messing around with a Commodore VIC-20. What can you tell us about UNIX that seems to be getting forgotten as the Linux industry moves into enterprises where the most familiar computing worlds are Microsoft NT and Windows?
Phil Hughes: My first UNIX exposure was in 1980, when I convinced the company I worked for that a UNIX system and the C language would solve all our problems. This was with an engineering-based company that manufactured semiconductor test equipment.
What we needed was a software development environment where a handful of engineers and programmers were writing code to control the equipment we were designing. The reason UNIX was the right answer is that we needed a cooperative environment where multiple users could share files; we needed compilers and assemblers; and we needed the building blocks to put together some support tools, such as programs that could reformat and download code into the machines we were building.
Someone, possibly Dennis Ritchie, described UNIX as a toolbox full of tools. When you were presented with a problem, you would divide it up into a set of smaller problems and then see if there were tools available to solve each smaller problem. It wasn't uncommon to write a little C code, maybe an awk or sed script, and a few calls to utilities such as sort and uniq and glue the whole thing together with a shell script.
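That glue-it-together style can be sketched in a few lines of shell. The log contents and file name below are invented purely for illustration; the point is the shape of the pipeline, not the specific data:

```shell
#!/bin/sh
# Hypothetical example: given a colon-delimited log, report how
# often each user appears, most frequent first.
printf 'alice:login\nbob:login\nalice:logout\n' > /tmp/demo.log

# cut pulls out the user field, sort groups identical users,
# uniq -c counts each group, and sort -rn puts the biggest count first.
cut -d: -f1 /tmp/demo.log | sort | uniq -c | sort -rn
```

Each tool solves one small problem, and the pipe is the glue. If the requirement changes — say, only counting logins — you swap in a `grep login` stage rather than rewriting a monolithic program.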
This is much different from today's GUI approach, where you find one huge monolithic program and try to coax it into doing what you want. Maybe you can -- but, if you can't, you are at a dead end.
Doc: This is the stuff I learned (which might be an exaggeration, but let's pretend) in the excellent "Intro to Linux" class I took at SGI a few weeks ago. What amazed me was that the toolbox nature of UNIX made both creating tools and solving problems extremely easy. Yeah, it was certainly a chore to learn shell commands, but it was a revelation to learn how easy it is to make and move directories, to view and change permissions, to rename and move a file at the same time. Same thing with searching with grep and regular expressions. Sure, you can do sophisticated searches with the GUI Find command in KDE, GNOME and the PC operating systems, but the speed, flexibility and innate sophistication just aren't there.
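The basics Doc lists fit in a handful of commands. This is a sketch using a throwaway directory (the paths and file contents are arbitrary choices for illustration):

```shell
#!/bin/sh
# Make a directory, parent directories included.
mkdir -p /tmp/demo/docs
echo 'hello world' > /tmp/demo/docs/draft.txt

# Change permissions; view them with ls -l.
chmod 644 /tmp/demo/docs/draft.txt

# Rename and move the file in one step.
mv /tmp/demo/docs/draft.txt /tmp/demo/final.txt

# Regular-expression search: ^hel+o matches "hello" at the start of a line.
grep -E '^hel+o' /tmp/demo/final.txt
```

Every one of these is a single short command, and each composes with the others — which is exactly the flexibility a GUI Find dialog can't offer.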
Phil: This is always an issue when we hire new people. They can be productive faster if we hand them Star Office, but investing in them and teaching them shell commands, vi, troff and such, makes them much more productive in the long run. The best example is in editorial, where everyone uses vi to prepare articles for layout. It would take so much longer if they used a word processor instead of vi. People like Darcy who have been with us a long time are proof of this.
Doc: I'm wondering about two things: 1) how long it takes people to learn Linux basics such as the ones you just listed, and 2) how much people who are not geeks feel empowered to solve their own problems. Some context: I've noticed in most businesses that people tend to learn five things each about Microsoft Office: how to format a document, how to do sums in a spreadsheet, how to build a slide, how to write and send e-mail and how to call a tech for help when the system fails.
Phil: Some of the people we have hired were UNIX people, so they had all the background they needed. The majority, however, didn't. Those who were most successful at learning Linux started out in jobs where they could be fairly productive with a minimum of Linux knowledge. Darcy is an example. She started here five years ago as a shipping person. She needed to learn only our proprietary database system and e-mail, as far as computers were concerned. As she learned more, she got to do more.
We see few non-geeks who decide they can solve their own problems on a serious level. Some have shown an interest in learning a lot more. This is why we virtually gave some systems to a few employees.
Doc: So GUI computing is secondary at Linux Journal. For the most part, the company runs in command-line mode.
Phil: Yes, but I'm not saying that GUI computing is bad, just that it isn't always the solution. My big concern is that it is changing the way people think. Rather than looking at a problem and logically addressing it, I see people deciding that a spreadsheet is the solution to every problem because all they understand is spreadsheets.
Or, to go back to a common analogy, everything looks like a nail if the only tool you have is a hammer. You can keep adding features to your hammer, but you still have a hammer.
Doc: I see it as something like auto mechanics. You can do so much more if you know how the computer really works. And too few people do, just as too few people know how a car works.
Case in point. A few days ago, a friend dropped us off at the airport in my old Subaru wagon. On the way home, she panicked as it gradually lost power and finally seemed to blow up, with big bad noises and white smoke billowing out from under the hood. It was a harrowing story as she told it, shaded by her certainty that the motor had exploded. I've never been a professional mechanic, but for years I drove nothing but cars that required a toolbox in the trunk, and I've done a lot of problem-solving work on a lot of cars. When I listened carefully to her report, it was clear that we were dealing with a cooling issue here. Sure enough, I found a broken plastic part in the middle of a hose line between the heater and the rest of the cooling system. So, I took the short tube that survived from the broken plastic part, shoved it into the ends of the two hoses, clamped them down and had the mother back on the road again in about 20 minutes. The difference is that my friend didn't understand how cars worked, or even how to fix simple problems. She never even looked at the water temperature gauge on the dashboard.
I think a UNIX jock looks at the OS in the same way a mechanically inclined driver looks at a car. If problems develop, chances are they're exposed and fixable.
Phil: It's exactly the same thing. You can even continue down this same line with the car. You don't have to understand physics to drive a car, but if you do, you will be better prepared to deal with situations such as when you are driving on slick pavement.
Doc: I think you also understand when something is a major failure, or a problem you can keep an eye on or even overlook. Something that blew my mind, as a GUI guy, about Linux (and UNIX) was that it was capable of sustaining all kinds of program and other failures, and carrying on anyway. Failures are rocks in a wide stream of processes that just keep flowing. To follow the automotive metaphor, in a sealed-hood system such as Windows or Macintosh, minor failures bring the whole thing down. The fact that Windows finally evolved enough to handle multitasking did nothing to reduce the number of what NASA calls Crit-One failures: fatal problems there's no way around. I'm told Windows 2000, actually the latest NT, is better at this, but there is no way logically (it seems to me) that it can compete with Linux for reliability, simply because too much of it is off-limits to mechanics. It's in the secret bits, sealed in concrete.
Yet the PC press doesn't ever talk about this, simply (I think) because they're used to driving fancy cars with automatic transmissions and sealed hoods. The UNIX world -- including the whole Net, and the way it works -- is a whole that cannot be understood in terms of a few personal parts, no matter how fancy they are.
Phil: NT was a new system written from the ground up. Thus, it has some advantages in that it could be written to address problems with older MS products. But, there are two problems. The first is compatibility. NT has a lot of stuff in it to make it almost compatible with Windows 9x. That means it has to support some of the shortcomings of Windows 9x. Second, it has an amazing number of new lines of code. That guarantees a certain number of bugs.
Doc: One of the virtues of a Linux box running a GUI such as KDE or GNOME is that you can go into terminal mode and work in the command line. You can't do that with Windows. And you can't do that with a Macintosh, although the next version of the Mac GUI will sit on open-source BSD with a Mac kernel and command-line access. If you want, you can open a shell, get in there and do real computing stuff. I'll be interested in seeing if that makes any difference in the marketplace. Apple is already quietly running most of its heavy servers on the new OS. This may be the result of Steve Jobs spending ten years in the UNIX world at NeXT.
Phil: You can get a shell on a Windows box, but there are many fewer capabilities and a lot less sophistication than with Linux. I seldom use Windows, but every time I do, the first thing I do is bring up a shell in case I want to do anything real, which I define as removing a file, moving a file, copying a file ... you get the idea. Note that the most common thing I end up doing is copying files to floppy disks so I can take them to a Linux system and use tools such as sed and awk to work on them. vi isn't a problem; I always put vi on Windows boxes.
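The sed-and-awk cleanup Phil describes might look like the sketch below. The file name and contents are invented for illustration; the recurring real-world wrinkle is that files from Windows carry DOS carriage returns that UNIX tools need stripped:

```shell
#!/bin/sh
# Hypothetical file carried over from a Windows box (note the \r\n line endings).
printf 'name,qty\r\nwidget,3\r\n' > /tmp/from_windows.csv

# sed strips the DOS carriage returns (GNU sed understands \r in a pattern);
# awk then skips the header line and prints the first comma-separated field.
sed 's/\r$//' /tmp/from_windows.csv | awk -F, 'NR > 1 { print $1 }'
```

Two small, general-purpose tools, one pipe — the same toolbox philosophy, applied to a file that originated on an entirely different operating system.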
Doc: I remember Linux Journal growing out of an e-mail conversation -- which I was improbably part of -- about starting a free-software magazine. A couple of questions here: 1) who else was in that original group? and 2) what exactly happened?
Phil: In 1993, I had an idea to do a free-software magazine. While SSC was doing pocket references for UNIX and UNIX-related programs, I decided I wanted to do this magazine independent of SSC. There were six or seven of us involved in talking about the original idea. Early players included Arnold Robbins, author of some of SSC's products; Laurie Tucker, a friend in Atlanta; Gerry Hammons, a longtime friend and co-worker from years before; David Keppel, a UW computer science grad student; and Melinda McBride, another co-worker from a previous job.
I had set up a mailing list so we could all keep in touch. One day, I realized that to do a good free-software magazine, you would have to be like Consumer Reports and have no advertising. I did a little arithmetic, and realized that we were about $9,999,900 short of the $10 million I estimated it would take to start the magazine. I posted to our list what I initially thought was a joke: why don't we just cover Linux?
Everyone thought it was a great idea. So, I thought about it seriously and agreed. It had balance: it wouldn't cost much to do, but there wasn't much of a market either. Not sure if that was good balance, but it was balance.
I posted some questions on comp.os.linux, the only Linux Usenet newsgroup at the time, and received very favorable responses. Thus, it looked like we had an audience, so I set the wheels in motion.
Things looked pretty good, and we were almost ready to dive in full-force when some disasters struck. The most significant was that my friend Gerry died. He was doing some programming for the infrastructure we needed, plus he was going to invest. It was obvious that we needed to put the project on hold, and I made another Usenet post to let people know there was a very large bump in the road.
At the time, Bob Young had started a publication called New York Unix. Someone saw the post about our problems, and gave it to Bob. Bob contacted me, and suggested that he could be publisher if I could take on the task of editor. We decided to go for it.
After two issues, it was clear to both Bob and me that this wasn't the right relationship. We split the responsibilities, with him assuming the debt for printing already done and me taking on the subscription obligation -- the 926 readers who had paid for 10 to 12 more issues of the magazine. I rolled Linux Journal into SSC, and we ran with it.
Doc: So it's true you gave Bob Young his start with Linux? When I talked to Bob a couple of weeks ago, he gave you all kinds of credit for getting his Linux career started. When I told him "Phil says he taught you how to spell Linux," he said, "It's true! I owe a lot to Phil. He even gave me one of my favorite metaphors: that closed source is like a car with the hood welded shut." (Note: I have this conversation on tape, so I can use the original verbiage, which is close to this. He also offered to help promote your next book, whatever it might be.)
Phil: While he gives me credit for the hood metaphor, I really don't remember using it. But, I do like it.
Doc: That's like the line that produced the name for The Cluetrain Manifesto: "the clue train stopped there four times a day for ten years and they never took delivery." The guy who told me that has no memory of saying it.
Anyway, who's still involved?
Phil: Laurie Tucker is on the SSC staff, with the title Special Projects. She edits articles, writes code and is currently responsible for the LJ Buyer's Guide. Arnold Robbins wrote for us for a while, but we couldn't afford to pay what he was worth at the time so he went on to other things. David Keppel got his Ph.D. and now works for Transmeta.
Doc: One of the most interesting things to me about Linux Journal is how it seems to be written, to a large extent, like Linux itself. Many of the features and columns come from readers -- members of the Linux community. And most of these are people solving real problems. I get a sense with both Linux and Linux Journal that we're all working together to raise a barn.
Phil: It has exactly that feeling. In the early days, many authors were surprised that we actually paid them to write for us. Some just donated the money to the Free Software Foundation or other projects.
Doc: Y'know, the e-world is full of all these new acronyms around commerce: B2B for business-to-business, B2C for business-to-consumer, B2E for business-to-enterprise. None have dialogue to them. All are just new names for the old conveyor-belt model of business we've had ever since our ancestors with craft surnames like Smith, Miller, Farmer and Baker got pulled out of their shops and given a job with an industrial supplier in a mine or a factory. Business has been out of touch with real markets and real customers for a good two centuries. And most of what I read in the business press, which is itself a huge new industry, is about leveraging that model: taking what we know about industry -- this conveyor belt, a "value chain" from the few to the many -- and installing it in the technology world which, frankly, geeks built ... and not for corporate purposes.
So I have an acronym for Linux Journal: G2G for geek-to-geek. Because that's the model that is making the new world, and the great irony is that the New Economy folks haven't got a clue about it. Which makes it that much more subversive.
Phil: This isn't new. Remember, UNIX was born because 30 years ago, a couple of geeks wanted to play a computer game. I wonder if 30 years from now, most people will not even know that Linux was a student project. Actually, I wonder how many don't know that now.
Doc: How do you see the free/open movement today? How has it changed? Is it for the better?
Phil: There are two things here: open and free. We had open and free software on mainframes in the '60s and '70s. If you bought a mainframe, it came with an operating system complete with source code. When you were shelling out millions of dollars for the computer, or more commonly, leasing a multi-million-dollar computer system, why not give you the code? It wasn't like you would go to Radio Shack, buy another computer and copy the OS.
A combination of the price drop in hardware, generic hardware and Amdahl Corporation forced a change. First, Gene Amdahl, who had designed IBM's mainframes, started his own company to make machines like the IBM mainframes. By "like", I mean they had the same instruction set and could, therefore, run the same operating system. IBM now had to unbundle and charge for copies of the OS so they could make money off the sales to Amdahl users.
Doc: I've never heard this analysis before. It's really fascinating. What you're saying is that, historically at least, software really did want to be free. And Bill Gates invented an industry that didn't need to be there. This is what Neal Stephenson suggests in his great little book, In the Beginning was the Command Line.
Phil: Yes. The OS was something the computer manufacturer had to include with their hardware. Without it, they couldn't sell the hardware.
Doc: You think IBM is going back in that direction with Linux? They're communicating rather clearly that they don't much give a shit any more about selling OSes, maybe because that was never the idea in the first place. They sell iron. Why not sell iron that's easier to deploy because it runs a universal OS?
Phil: Yes, I do. IBM abandoned their web server in favor of Apache. I think they know people don't want to buy an OS -- they want to buy a solution. Actually, I think they knew this when the PC came out. I feel that Microsoft managed to confuse the issue and IBM fell for it for a while.
The first cheap and generic computer was the IBM PC. Other manufacturers jumped on the bandwagon to make inexpensive clones. That meant the cost of the OS once again could not be bundled into the cost of the hardware. Besides the end of "free" in terms of price, the openness of the past was gone because there was direct competition.
The GPL is an attempt to force freedom back into the mix. It is only one example, with the BSD license being another. Each has its advantages and disadvantages, but my point is that we used to have this freedom; because the hardware changed, we now have to do something different in order to get free -- both in terms of price and freedom -- back into computing.
Explaining the BSD license to a businessperson is fairly easy. They will remain skeptical, but it is easy to tell them you can get something for free, do whatever you want with it and then sell it.
Doc: Because you have to credit only the originator, no?
Phil: Exactly. You can build a proprietary product, and don't have to pass along your additions or improvements.
The GPL is a lot harder. They don't understand why it makes sense to make the changes and then give them away.
Doc: Or that they are expected to give away whatever changes they make. In other words, not to act like they own it exclusively.
Phil: In a business sense and out of context, it doesn't. But, that context is the issue. It's the same as IBM giving you the OS with the hardware; you need to show that businessperson how giving away the OS will sell something else. Sorta like giving away a keychain at the car dealer, and then hoping you buy a car to go with it.
As each new vendor enters the Linux space, you need to educate them. You need to show them success stories. Eventually, they will get it.
Doc Searls is Senior Editor of Linux Journal