Linux on Alpha: A Strategic Choice
“Leyenooks?,” I asked, “What is that?” I must admit that I was skeptical. Although the young man in front of me seemed amicable enough, it was hard to imagine that he headed up an effort to create a freeware Unix-like operating system. However, Kurt Reisler was enthusiastic about him, and after ten years of association with Kurt as the chairman of the Unix Special Interest Group (UNISIG) of the Digital Equipment Computer Users' Society (DECUS), instinct told me to go along with his ideas. That is why I asked my management to fund Linus Torvalds' first trip to DECUS in New Orleans (spring, 1994), and to fund some equipment at the show to demonstrate Linux.
I had my doubts about this funding as Kurt struggled to get Linux installed on that first PC, but after some able assistance from Linus, he did get it working. I had my first look at the operating system running and in less than ten minutes I had convinced myself that “this was Unix enough for me.” Instinct told me, “this is good.”
Later that week Linus joined a few of us for a ride on the Natchez, a steamboat that plies the Mississippi River. As we rode up and down the river I started thinking about what Linux might mean for the educational community, and what it might mean for Digital.
Twenty-five years ago I was a student at a university in Philadelphia. Although we had a large computer system, it was kept behind glass doors, and batch jobs on computer cards were passed through a narrow opening in the wall, with printouts coming back over a 24-hour period.
Trying to learn operating systems design on such a system meant using an emulator, and the process of using that emulator through punched cards was really painful, about like having a root canal without anesthetic.
Fortunately, at the same school was a little minicomputer lab, and in that lab were three small PDP-8 machines from Digital. It was on these machines, with the aid of some free architecture books given to me by the Digital salesman, and some free software that came from DECUS, that I really started to learn about how computer systems worked. I always remembered that lab, those machines, and that Digital salesman.
Years later, after working on large IBM mainframes, heading the department at a small two-year technical college (using Digital's equipment once again), and learning Unix at Bell Laboratories (on Digital's VAX machines) I had an offer to work with Digital's Unix group in Nashua, New Hampshire. I took that offer, in hopes of being able to contribute to the same environment that had helped me learn computers in those early years.
Working in the Unix group, I often heard about universities, colleges, and even high schools that wanted to use Unix to teach computer science. Despite the origins of Unix as a research tool, and the vast contributions to Unix made by the University of California Berkeley and other schools, the licensing terms of our product did not make it easy to share source code.
As a commercial Unix system, we license technology from a variety of companies and integrate that technology into our sources. Some licensing agreements required us to keep the source code private unless the requesting customer had a license agreement directly with the supplier of the technology. Over time, this meant that to get all of the sources to our Unix products, fifteen separate licenses were necessary, at a cost of thousands of dollars, and even then the sources were restricted to a “need to know” basis and were not for consumption by curious students.
A second issue was cost. Schools had been moving towards PCs and Macintosh computers over the years, mostly because of the low cost of the hardware, operating system, and applications. While these machines were fine to do reports on, or to do other types of “application” work, the lack of sources to the operating system, networking, and compilers made them less useful for teaching operating system design. Workstations, on the other hand, tended to use more expensive components, larger amounts of main memory, and were generally outside the price-band of most schools trying to teach computer science to large numbers of students.
As I stood on the deck of the Natchez, several thoughts ran through my mind. I knew that Digital was developing some low-cost Alpha single-board computers which used industry standard buses (PCI and ISA). I also knew that the Alpha (with its 64-bit architecture) had unique capabilities for doing computer science research into large address-space utilization. The Alpha processor was a true RISC system, which would test the portability of the Linux kernel, and the availability of Linux on Alpha would help develop new concepts for better using the Alpha's blazing speed, currently 1 billion instructions per second (BIPS), even in our commercial Unix product. So I asked Linus if he had ever considered doing a port to the Alpha. “Yes,” he said, “but the Helsinki office of Digital has been having problems locating a system for me, so I may have to do the PowerPC instead.”
My fellow employees tell me that I howled like a wounded hound at that point, and (I find this hard to believe) dropped my Hurricane (a fine New Orleans drink). It was then that I knew I had to help get Linux on Alpha.
The next day I flew back to New Hampshire, and that morning I was on the phone to Bill Jackson, a marketing manager in our Personal Worksystems Group. I explained the situation, and why I felt this was a good thing for Digital. Bill and I had known each other for a long time, and just as I had faith in Kurt Reisler, Bill had faith in me. “maddog,” he said, “I only have a Jensen (a code name for an early Alpha workstation, which had an EISA bus), but it has 96MB of main memory, Ethernet, and 2.5GB of SCSI disk space.” “Throw in a CD-ROM drive and you have a deal,” I said (being a tough negotiator), “my cost center will pay the shipping.”
I still had to really formalize the agreement with Linus. Fortunately he was attending the summer USENIX in Boston, so I took the paperwork loaning him the computer system down to Boston. “How long is the loan?” asked Linus (while munching on a hot dog). “As long as you need it,” I replied, “or until we can get you an even faster system.”
The next week the system was shipped to Helsinki, via the Digital office there.
About the same time I heard about a group of engineers inside of Digital who were also working to port Linux to Alpha. I got them a system identical to the one I had obtained for Linus.
Since Digital was now somewhat in the “Linux market”, I thought it was time to formally write up the value of Linux to Digital, and to give some real thought as to why Digital should support a porting effort. I also had to think of how Linux systems on Alpha might affect the sales of Digital's own product, DEC OSF/1 (since renamed Digital Unix to reflect having been branded as “Unix” by X/Open, Inc.).
I quickly concluded that there were markets for Linux on Alpha, and these markets share some (but not all) of the same characteristics:
- They need access to the source code
- They have more time and manpower than money
- They do not need huge numbers of commercial applications (yet)
I also found a market that would probably not want Linux (yet), and these are some of their characteristics:
- They resemble my Mom & Pop (computer illiterate, and proud of it)
- They want some entity to guarantee the operating system
- They are mostly dependent on commercial applications
The markets here are not all black and white. For example, one market that is typically thought of as a “Linux market” is the “computer hobbyist” market. To a lot of commercial computer vendors, this market uses older PCs, cast off from other applications, to fuel a “hobby”, much like the Radio Amateurs of ARRL fame. However, if you really look at this market, you see some of these people buying very sophisticated gear, trying to reach “an edge”. This can be an interesting (but relatively small) market.
Another (much larger) market is the computer science education market. Universities, colleges, and even grade schools teach students to interact with computers, and many teach computer science. With Linux as the operating system, and either PCs or low-cost Alpha processors as the platform, these customers can now actively teach computer science, with access to source code for students to modify and try on their individual systems. Research in computer science (particularly with large address spaces, or with RISC instruction sets) can easily be facilitated with Linux, and the copyleft licenses encourage free exchange of the research results.
The fallout of this is that larger systems (funded, perhaps, by research grants) may also be sold. Or the purchaser of 100 Linux desktop systems might appreciate a server machine to hold the students' files, do the printing, handle mail, etc. It makes sense that the server machine be either of the same architecture as the clients, or at least have data compatibility on the binary level. Intel machines and Alpha machines are both “little endian”, and even if the Intel architecture is only 32-bit, it is relatively easy to make them data compatible.
Another market where Linux would shine is turnkey systems—places where a large number of systems would be purchased, mostly for one application, such as point-of-sale terminals, or for large user-written applications which have to run on a large number of discrete systems. In this case the savings in operating system license royalty payments might pay for a system programmer to do the integration and support of Linux on system boxes. Also, since there tend to be fewer applications to run on these turnkey systems, it might be easier to stay with one version of Linux, versus having to keep upgrading to newer versions.
Finally, there is another side to the freeware operating system market, and that is what the commercial software developers can glean from what the freeware people develop. By looking at what features the Linux community puts into their systems, commercial systems can improve by emulating the design decisions made in freeware operating systems. Not every design decision will be followed, but certainly some good ideas have already been seen in Linux, and should be studied by developers of commercial code. We no longer have the luxury of re-inventing the wheel.
I am often asked how Digital Unix (Digital's commercial Unix product) would fit with Linux on Alpha. I see no conflict. Some people want such features as certifiable C2 security, “cluster” style systems (multiple systems working together as a single image), SMP scalable to large numbers of processors, soft realtime support (which I keep asking Linus to put into Linux), large log-based file systems, etc. These features (and more) are all things that Digital Unix has today, and which may or may not show up in Linux in the future, depending on what the Linux developers find “interesting”.
On the other hand, I believe that Digital should continue to work to make Digital Unix more and more compatible with Linux and NetBSD (which has also been ported to the Alpha processor, and is available across the network), and provide diskless and dataless support for both Intel and Alpha Linux systems. Already there are some interesting possibilities, since Linux on Alpha can run binaries statically linked on Digital Unix systems, and I assume that the same may be true of taking statically linked Linux binaries and running them on Digital Unix systems. By working on binary compatibility between the two operating systems, Digital would help facilitate a large number of applications that would run on both operating systems.
Digital's present plans do not include shipping a CD-ROM created by Digital with Linux “Alphabits” on it. We feel that the current companies and groups are doing a fine job, and we will work with those groups to have the Alphabits put on their distributions. Likewise, we will continue to put our contributions out on the Internet for the Linux community to use. Digital's goal is simply to have the best, easiest, and fastest hardware to host the Linux operating system, whether it be Intel-based PCs or systems using the Alpha 64-bit processor.
Jon “maddog” Hall is the Senior Manager of the Unix Software Group at Digital Equipment Corporation. He has been with Digital for 12 years, all in the Unix group, and for four years before that he was a senior Unix systems administrator at Bell Labs. He can be reached via e-mail as firstname.lastname@example.org
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
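A minimal sketch of the find-plus-grep combination described above. The demo directory and the search string "ERROR" are illustrative stand-ins (the text's example used /home and "a particular entry"), not anything specified by the webinar:

```shell
# Set up a small demo tree (a stand-in for /home in the text)
mkdir -p /tmp/demo_home/user1 /tmp/demo_home/user2
echo "INFO: started"    > /tmp/demo_home/user1/app.log
echo "ERROR: disk full" > /tmp/demo_home/user2/sys.log

# Find every .log file and search each one for a particular entry;
# -H makes grep print the filename even when only one file matches
find /tmp/demo_home -name '*.log' -exec grep -H 'ERROR' {} +
# prints: /tmp/demo_home/user2/sys.log:ERROR: disk full
```

The `-exec … {} +` form hands all the found files to a single grep invocation, which is exactly the "string tools together" style the paragraph describes.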
Cron traditionally has been considered another such a tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers.
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn’t consider the total cost of ownership, and it doesn’t consider the advantage of real processing power, high-availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future.