Linux in the Trenches
I still remember the lone dollar prompt springing to life on the monitor of a lowly 80386sx in my office as it booted Linux for the first time. The appearance of that single dollar sign has proven to be possibly one of the most seminal moments in the history of the Roger Maris Cancer Center, and of my professional career. The birth of this free operating system in an environment previously dominated by commercial software was to have profound ramifications on both the quality and efficiency of the care our staff provides to our patients.
I had watched with interest the early announcements Linus made on the networks about a Unix clone he was writing called Linux. Consumed by work and development pressures, I did not allow myself the indulgence of playing around with the new offering for fear of engaging in yet another eternal timesink. Eventually, overcome by curiosity, I decided to take the plunge and investigate Linux. Our experiences have far exceeded our expectations; Linux certainly is a wonderful timesink...
The actual incentive and decision to move toward a free/open systems approach to our information processing needs occurred at a much earlier time. I had been recruited in 1988 to serve on a team whose goal was to develop a comprehensive cancer care facility. The only perk I asked for was an 80386-based computer, since my instincts told me that the advent of this microprocessor into the personal computer market was going to revolutionize desktop computing.
Our medical director, Dr. Paul Etzell, was intrigued by my request and wanted to know why I thought this. I told him, "Because it can multi-task and handle large amounts of memory the way it is supposed to." He was excited and told me that he had always dreamed of a specialized patient information system for his cancer center. Armed with this mandate, a 16MHz 80386sx ALR, and a Xenix 2.3.3 development system, we set out to design new solutions to problems in the delivery of outpatient cancer care. We deliver care that allows patients to have their treatment and still lead a somewhat 'normal' lifestyle.
That was in 1988. The Roger Maris Cancer Center has now grown to be a major regional treatment and referral center. Medical care, chemotherapy, and radiation therapy are provided to approximately 125 patients each day with a diagnosis of either cancer or hematologic (blood) diseases. A total of 13 physicians deliver cancer care supported by a staff of approximately 100 people and 25 Linux workstations. The Cancer Center is a distinct service line of the MeritCare Medical Group, which consists of approximately 250 physicians and 2000+ employees supporting one central clinic, a 400-bed hospital and 25 regional clinics. For everyone with aspirations for Linux in the commercial environment, we are about as commercial as things get.
When the first Linux dollar prompt appeared, our Cancer Center was not naive about free software. I had already ported a number of the GNU tools to Xenix in an attempt to produce a suitable development environment. By this time our Cancer Center was supported by two 33MHz 80386dx machines (Gateway 2000s), the ALR having been mercifully put to rest.
However, compared to compilers, editors and utilities, an operating system is a completely different beast. As I sat staring at the screen, contemplating the fact that a machine previously sentenced to MS-DOS was now quietly multi-tasking, I could scarcely believe what was in front of me. Could something I had ftp'ed for free from some distant spot in Finland really be capable of playing a role in our development plans?
At our morning coffee I looked across the table at Dr. Etzell and quietly commented that I might have run across something that could have a profound impact on the Cancer Center.
Linux has had that profound impact because of the doors which it has opened for us. Our medical director, a true visionary, knew that the face of medicine would be radically changing. True to his prediction, the medical community in the United States is facing extreme pressure to deliver high-quality, cost-effective care. If our Cancer Center was to survive, let alone thrive, in this environment, it would be critical to innovate and adapt quickly. The reason that Linux has had such a profound impact is that it has provided us with the tools necessary for this evolutionary process.
It probably goes without saying that a decentralized system of peer-to-peer networked workstations was not the same vision that corporate data processing possessed for us. Based exclusively on centralized mainframe processing, the corporate wheels were turning toward network solutions, but a network whose main purpose was to deliver datastreams from mainframes to diskless workstations; expensive terminals connected by expensive wiring. Our vision was of independent workstations capable of functioning without network connections if need be, but also capable of utilizing networking to enjoy extreme synergism in their daily tasks.
If this was to be our gospel, then it was up to Linux to preach it for us. The dichotomy in vision ensured that there would be no funds available for our work; Linux and free software provided the tools to implement our strategy in the absence of official support. With Xenix and no money for upgrades, there was no potential for the X Window System and, most importantly, no money for the TCP/IP networking which was central to our development strategy.
This is not to say that Linux was a magical solution that revolutionized our method of patient care overnight. The first dollar prompt was actually from a bootdisk/rootdisk combination (I am dating myself here) that was version 0.95a (one of those really weird version numbers) in early 1992. At that point in its development Linux was far from the workstation contender that it now is. I developed application software furiously on the Xenix boxes while I nursed a stable Linux environment into existence. The bulk of our application code was in Perl/TeX/C, which I knew would be available on Linux in only a matter of time.
All the development on Linux was done on a 20MHz 80386sx (Gateway 2000) with a 120MB disk drive. This machine was tied back-to-back through serial ports with one of the Xenix machines, whose 300MB hard drive provided a safe haven for the development sources. It is a true testimony to Linus and the other developers that we never experienced a filesystem crash or lost data during the development process. In fact, the original 20MHz machine is still in service, still using the original Minix root filesystem written to its hard drive with 0.96a.
It wasn't until about 0.96c that Linux actually began to shoulder operational responsibilities. This was mainly due to the fact that 0.96c was the first release whose serial drivers would reliably withstand a UUCICO session. Our concept of 'networking' at that point was file relay and remote execution via UUX, making reliable serial communications essential.
During the 0.95-0.96 stages our development efforts were focused on putting together what would be called a distribution in today's parlance. Reality dictated that a reliable means of replicating and updating Linux would be required if multiple machines were to be committed to service. During this time I wrote the first version of StopAlop [1]. StopAlop allowed us to package Linux into a series of modules which could be installed, verified and updated independently, reliably, and with version control. Our installations centered on a 'base' module which was everything needed for a standalone UUCP-capable Unix workstation. Additional modules provided our application code, emacs, a development environment, text processing (TeX), and ultimately networking and X11.
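The source doesn't describe StopAlop's internals, but the module scheme it outlines (package, install, verify against a manifest) can be sketched in a few lines of shell. This is purely a hypothetical illustration of manifest-based verification; all file and module names here are invented.

```shell
#!/bin/sh
# Hypothetical sketch of StopAlop-style module verification (names invented).
# A module is a file tree plus a checksum manifest; verifying an installation
# simply replays the manifest against the installed files.
MODDIR=$(mktemp -d)

# "Package" a tiny module: one file plus its md5 manifest
echo "hello from the base module" > "$MODDIR/profile"
( cd "$MODDIR" && md5sum profile > base.manifest )

# "Verify" the installed module by checking every file in the manifest
if ( cd "$MODDIR" && md5sum -c base.manifest >/dev/null 2>&1 ); then
    STATUS=verified
else
    STATUS=failed
fi
echo "base module: $STATUS"

rm -rf "$MODDIR"
```

Keeping the manifest per module is what makes independent, versioned updates possible: each module can be re-verified or replaced without touching the rest of the installation.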
When reliable kernels arrived we were prepared with a well-tested, reliable system environment. Our application software was already maturing and in operation under Xenix. As we began deploying Linux, our medical director began negotiations which ultimately provided the hardware resources necessary to formulate our peer-to-peer distributed computing environment. No small part of his efforts were focused on negotiating for corporate survival of our fledgling project.
His efforts were enhanced by the profound impact that Linux began to have on our operations. No longer encumbered by the restriction of two licensed copies of an operating system, we were able to immediately deploy four additional workstations. Our application software hosted on these machines provided patient information, patient tracking, drug history databases, pharmacy support and automated patient charting to patient care environments which were largely non-automated.
Our most powerful argument to detractors (who were vocal) was that the systems being put into place were functioning well, were tuned to our application needs, were substantially increasing productivity, and had cost nothing over and above the hardware costs. The most astounding point made was that the workstations accomplishing these feats had been rescued from the fate of being diskless DOS network clients...
From this point we have progressed, passing some notable milestones. Six months after the first Linux machine began tracking patients, a major user-interface move was made by changing from using multiple virtual consoles to X. Serial connections were established between the Prime computer, which hosted laboratory information, and the Linux workstations. Receptionists and staff members who sometimes had three different computers and/or terminals to contend with were now interacting with different information sources via multiple xterms on the same display.
The most significant milestone occurred the day the concentrators were plugged in, connecting all the workstations via 10BaseT Ethernet. With high-speed TCP/IP connections between the workstations, our goal and vision of a peer-to-peer distributed information support environment was complete. The gospel according to Linux was named Perceptions, a name chosen to personify the design goal of our adventure: to provide a unified resource which would enable staff members at the Roger Maris Cancer Center to have a clear information picture of the patients they were taking care of. In December of 1993 the 'big red switch' was pulled on the last Xenix machine, and our Cancer Center and its operations became totally dependent on Linux.
The story of Linux and Perceptions is the truest example of why I feel that 'free' software can play an important role in the commercial marketplace. The speed at which we were able to accomplish our goal is a direct result of our ability to innovate and respond to environmental needs. The impediments to our accomplishments were reduced to only what we were individually capable of developing and supporting. We called the shots, we made the decisions, and if something didn't work, we fixed it. It was these experiences that led us to form our motto: "I would rather spend 10 hours reading someone else's source code than 10 minutes listening to Muzak waiting for technical support which isn't."
The real basis of our success is, in part, the large, diffuse network of Linux Activists who made what we did possible. It is always easy to make good design decisions when you have good data to make those decisions with. I read literally thousands of Usenet articles to make sure that I knew everything there was to know about the stability of kernels, what software worked and what didn't, and most importantly, how to fix what didn't work. On the basis of the experiences of hundreds or thousands of other people I made good design decisions. Good design decisions lead to successful implementations, which was really the bottom line of our survival. As Dr. Etzell so aptly put it, "It's always difficult to argue with something that works."
The story of Linux and Perceptions is far from over. Only the basics of our total design plan have been implemented. One of the fundamental design tenets of our information system is parallel database concurrency across multiple hosts. Our systems are designed to be capable of withstanding (and have withstood) severing of network connections without end-users knowing the event happened. Additional work with locking and updates across a wide-area network is a current research interest. Another area of intense interest and activity is digital document storage and retrieval. We are secure in meeting future challenges in health care with the knowledge that we have a stable, open operating environment on which to formulate our information strategies.
So the story of Linux at the Roger Maris Cancer Center is really the story and testimony of the success of Linus Torvalds and the Linux Activist movement. I would hope that everyone in the movement shares the same sense of accomplishment that I do. It is with extreme pleasure that I am able to tell visitors that we take better care of cancer patients because of an experiment in protected mode programming conducted by a (then) 23-year-old computer science student from Finland.
1. StopAlop stands for "Stop Alopecia" (a medical term for hair loss) or "Stop A lot of problems".
Greg Wettstein (email@example.com) is a pharmacist who chose the profession because it required only two quarters of math. He is now a PhD in quantum chemistry and theoretical drug designer who won't leave the Upper Midwest because he likes team roping and raising cattle more than wavefunctions and the Hansch equation, and who spends 'spare time' at his best friend's construction company driving payloaders and reel trucks (into holes).
He can be reached as: G.W. Wettstein, Ph.D., Oncology Research Division Computing Facility, Roger Maris Cancer Center, Fargo, ND 58122.