Linux System Administration in the New Era

by Tom Adelstein

The success of Mozilla's Firefox and OpenOffice.org's productivity suite has breathed life into people's aspirations for Desktop Linux. As a result, the vast majority of articles published today focus there and ignore the strides made on the Linux server. Unlike the Linux server of the past, today's version literally supports rocket science, and its gains far exceed those of the desktop.

As a former editor-in-chief of a Linux news site, I had the unusual privilege of finding out from our own logs and from colleagues the approximate number of unique visitors dropping by the major US Linux news wires each month. Back when I started using Linux, Jon Hall of Linux International estimated that the operating system had two million deployments globally. At the time we thought the number a little high, but we accepted it.

Today, we see twice that number, or approximately four million, unique IP addresses reading articles on Linux news sites. That does not include sites outside the US, Slashdot or Digg. So, a significant number of people read articles about Linux, and they represent only a fraction of all its users.

Unfortunately, we failed to determine the demographics of Linux readers. We simply had difficulty finding out what kinds of readers came to our site. We didn't know whether our visitors represented CIOs, CTOs, developers, Linux users or just the curious.

We did find out that pure Linux articles received more than triple the page views of articles about Open Source companies and their technologies. We also made a living off another kind of story: anything that appeared to threaten that monopoly in Redmond garnered ten times the page views of the most-read Linux articles.

Getting to the point, when we sorted out the types of Linux articles that went through our queue, Desktop Linux accounted for a whopping 90 percent of the stories. Either people stopped writing about the server or Linus has it in hiding. The Linux server's press agent let it get overshadowed by the desktop.

What about the server?

I refer to the current Linux infrastructure play as the "New Era". Linux has matured rapidly and far surpassed the expectations of the smartest analysts I know. Basically, the Linux server kicks butt.

The advancements also come quickly. Some long-time Linux system administrators I know have difficulty keeping up with all the advancements and innovations today. Sometimes, they argue with me about why I would do things the way I do them. Inevitably, they go ahead and try something new with an accompanying "wow, that's neat!"

When I began using Linux as a system integrator, we had only a few places to operate. Those included serving web pages with Apache, managing DNS, relaying email as an MTA, interoperating with NT 4.0 using Samba and developing applications with Richard Stallman's wonderful compilers and tools.

We had a mature 2.0.35 kernel, and from where Linux started, that seemed remarkable. But we did not have a journaling file system, we had lousy multiprocessor capabilities and we had little to no desktop. We lacked deployment tools, a real web browser and a reasonable productivity suite, and our hardware compatibility stunk. That's not to diminish the remarkable efforts of the people who gave a big part of their lives to Linux. It's just where we stood in comparison to AIX, Solaris, HP-UX and some others back then.

So, as I discuss the applications and tools freely available for Linux now, please understand where I started. The old days of pride around the 2.0.35 kernel look continents away from here.

What's new?

Linux has a dominant position in enterprise computing. Many mainstream applications used on Solaris, for example, have made their way to Red Hat and Novell Linux. Aside from the scientific tools you see on the Space Shuttle and the 256-node clusters that run sonar arrays on nuclear submarines, Linux runs the largest web sites in the world. The problems that plagued distributed directory services have gone away, and those services now run on large blade server farms. These represent a tiny fraction of the uses of Linux.

Linux not only works for enterprise computing, it also gives smaller users a decided advantage in the marketplace. Linux levels the playing field for small to medium-sized businesses and lets them compete with the big boys. Everything from ERP systems to customer service apps runs on Linux, putting those applications within reach of the little guy. That's what helps propel its adoption, which analysts put at 40 million deployments.

To exemplify my point about the little guy: just recently, I configured a Debian server with Xen 3.0.1, getting it production-ready in two hours. The majority of that time involved compiling code. I doubt I could have afforded the software if I had used proprietary goods. And I got to use some advanced computing applications.

So why did I need Xen virtual machines?

I needed to deploy several applications, including a secure LDAP directory with mail, a secondary DNS server, several virtual web sites, a content management system and a database-driven federated identity management system. The virtual servers helped me put those into production without having to buy expensive hardware.

In the old era, I would have dedicated a separate server to each of those applications, considering the number of users involved. In the new era, we can take commodity hardware, add gigabytes of RAM and additional disks, and put all of that capacity to work on a single machine without creating more server sprawl. Xen made it affordable for me to get into business.
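
For readers who have not tried Xen, a guest (domU) amounts to a short Python-syntax configuration file on the host. Here is a minimal sketch in the style of Xen 3.0; the kernel path, image files and guest name are invented for illustration:

    # /etc/xen/webserver.cfg -- minimal Xen 3.0 paravirtualized guest
    # (paths, file names and the guest name below are illustrative)
    kernel = "/boot/vmlinuz-2.6-xenU"       # guest kernel stored on the host
    memory = 256                            # RAM for the guest, in MB
    name   = "webserver"                    # domain name shown by 'xm list'
    vif    = [ '' ]                         # one network interface, defaults
    disk   = [ 'file:/srv/xen/webserver.img,sda1,w',
               'file:/srv/xen/webserver-swap.img,sda2,w' ]
    root   = "/dev/sda1 ro"

A quick "xm create -c /etc/xen/webserver.cfg" boots the guest with its console attached, and "xm list" shows it running alongside the other domains.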

Last month, a friend of mine, Falko Timme, wrote a howto about setting up a load-balanced, high-availability Apache cluster using free software. He used Debian Sarge, Ultra Monkey's Heartbeat and ldirectord; Ultra Monkey packages software primarily from the Linux Virtual Server and Heartbeat projects. Falko set up a five-node cluster, and to keep from having to match hardware, he used Xen on different kinds of server hardware. In March, Falko wrote another tutorial on building a five-node load-balanced MySQL cluster using the same technology. He did all of this with commodity hardware.
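
The heart of such a setup is a small ldirectord configuration telling Linux Virtual Server which real servers sit behind the virtual IP and how to health-check them. A minimal sketch along the lines of Falko's howto, with invented addresses:

    # /etc/ha.d/ldirectord.cf -- illustrative addresses only
    checktimeout=10
    checkinterval=2
    autoreload=no
    quiescent=yes

    # the virtual IP the cluster answers on
    virtual=192.168.0.105:80
            real=192.168.0.110:80 gate
            real=192.168.0.120:80 gate
            fallback=127.0.0.1:80 gate
            service=http
            request="ldirector.html"
            receive="Test Page"
            scheduler=rr
            protocol=tcp
            checktype=negotiate

ldirectord keeps fetching the small test page from each real server and pulls a node out of rotation when the expected response stops coming back, while Heartbeat fails the virtual IP over between the two load-balancer nodes.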

Falko also writes howtos on howtoforge.com about technology like MyDNS, a name server that uses a MySQL database as its backend instead of the flat files BIND or djbdns use. MyDNS simply reads DNS records from a database and does not require a restart when DNS records change or when you create, edit or delete zones. That provides a major advantage to organizations that deal with massive numbers of domains. It runs on Linux.
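
To see why, consider what publishing a zone looks like when the name server reads straight from MySQL. A sketch against MyDNS's stock schema, in which the soa table holds zones and the rr table holds records (the domain, addresses and the zone id of 1 are invented for illustration):

    -- create the zone in MyDNS's soa table
    INSERT INTO soa (origin, ns, mbox, serial, refresh, retry, expire, minimum, ttl)
    VALUES ('example.com.', 'ns1.example.com.', 'hostmaster.example.com.',
            2006041501, 28800, 7200, 604800, 86400, 86400);

    -- add records to the rr table, assuming the new zone received id 1
    INSERT INTO rr (zone, name, type, data, aux, ttl)
    VALUES (1, 'www', 'A', '192.0.2.10', 0, 86400),
           (1, 'example.com.', 'MX', 'mail.example.com.', 10, 86400);

The records answer as soon as the rows commit; there is no daemon to restart and no zone file to regenerate.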

Just two years ago, we faced a number of problems in the Linux community deploying large numbers of Linux systems. We felt like paupers attempting to take on even medium-sized projects like the City of Munich deployment. If you wanted deployment tools, you had to buy expensive closed-source ones. Today, all that has changed.

From a project started at VA Linux a few years ago, Brian Finley and his team have produced a robust tool for automating Linux installs, software distribution and production deployment. The tool, known as SystemImager, allows deployments of ISP and database server farms, high-performance clusters, computer labs and corporate desktop environments. SystemInstaller, a related project, can install a system with any Linux distribution. It works with SystemImager and SystemConfigurator, an installation and management application framework. Together, the tools build clusters. Oh, did I mention it's free software?
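
From what I recall of the SystemImager documentation, the basic workflow runs along these lines; treat the exact commands and options as a sketch, check the man pages, and note the hostnames and image name are invented:

    # On the "golden client" whose installation you want to clone:
    si_prepareclient --server imageserver.example.com

    # On the image server, capture the client's filesystems over rsync:
    si_getimage --golden-client goldclient.example.com --image webnode_v1

    # New machines then boot from the network or autoinstall media
    # and install themselves from the webnode_v1 image.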

Linux also shines in the area of high-performance, high-availability computing. For example, the NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center deploys HPC Linux clusters designed to increase throughput for applications ranging from studying weather and climate variability to simulating astrophysical phenomena. Linux supplements an NCCS architecture designed to scale to as many as 40 trillion floating-point operations per second (40 TFLOPS) in its full configuration.

According to Forbes, Linux runs more of the world's top supercomputers than any other operating system. In fact, at this writing, Linux runs 60% of the top 500 supercomputers on the planet. According to department heads at the Lawrence Livermore National Laboratory in Livermore, Calif., Linux runs ten of their machines, which are all on the Top 500 list, including Blue Gene/L, the world's most powerful supercomputer, and Thunder, which ranks fifth.

And that's just the start of the conversation about the Linux server. It manages water wells in Jordan, provides logistics and supply-chain applications for governments and businesses, and more. From a small server running the ext2 file system with supposedly zero scalability, the Linux server has come a long way. And while I use Linux for my desktop, the server intrigues me most.

Interested?

Ask yourself whether you would like to work with any of the above projects or technologies. Demand exists for the skill sets involved. Do you consider yourself trained and ready to get started? Do you have the system administration skills to function in the new era in the above environments? If not, what should we do?

The game has changed, and if we want to move forward, we will need some familiarity with the new advances and innovative technologies emerging from the Linux camp. Since most of us Linux guys learn this stuff on our own, perhaps the time has arrived for some mid-career changeover.

I don't see any of the folks at the Open Source Development Labs slowing down, so time's a-wastin', to use a term we probably invented in Texas. OK, back on topic.

Moving forward in the new era requires a choice. If you choose to move forward, remember that free software only requires a download. If it's free, you can use it, and you do not have to ask anybody's permission. Great documentation exists to get you started. So, as many people say, enjoy!
