Linux in Government: How Linux Reins in Server Sprawl
People write a lot about utility computing these days. The interest seems high. VMware gave a seminar in Dallas this past week and had 850 attendees. That followed a well-attended seminar by IBM's business development group on "On-Demand Business". Yet even with the high visibility in the media over the past year, many IT managers seem lost when I discuss utility computing with them.
I realize buzzwords come and go, and people find it easy to dismiss "utility computing" as another fad. Even after I note its undeniable benefits, people's eyes glaze over when I try to discuss the topic. I think many of my colleagues avoid the subject because some vendors have said they want to sell IT as an independent service, similar to water or telephone service.
I personally find that objectionable. One can see the benefit to the vendor, but not to IT departments. Within the context of cost containment and efficient use of resources, utility computing doesn't mean installing a meter.
When I think of utility computing, I think of frugality: I want to get the most out of what I already have. In business, we often say, "if it ain't broke, don't fix it." In other words, don't rip and replace the technologies that work. Instead, acquire tools that pull resources together and allow us to manage and consolidate them, become more productive and eliminate duplication of effort. Linux has addressed this area more than any other operating system.
In typical data centers, you find one application tied to one or more physical servers. Most applications require different amounts of computing power depending on use, and in the past we always sized hardware for peak usage. This habit has resulted in what analysts call server sprawl. You may hit peak usage only one day a year; the rest of the time, usage drops. Building for the peak works great for electric companies, but not for computing.
Ultimately, dedicated servers create the "silo" effect we discussed in last week's article. Silos do not make efficient use of hardware resources: server utilization rates often run around 10-15% across an organization. Obviously, the ROI on such environments becomes unacceptable, especially to stakeholders.
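To see why peak-based sizing produces numbers like these, consider a minimal back-of-envelope sketch in Python; the demand figures are hypothetical, chosen only to illustrate the arithmetic:

    # Back-of-envelope utilization for a dedicated, peak-sized server.
    # The demand figures are hypothetical illustrations, not measurements.

    peak_demand = 8.0          # CPU cores needed on the busiest day of the year
    average_demand = 1.0       # CPU cores needed on a typical day
    provisioned = peak_demand  # hardware sized for the peak, as described above

    utilization = average_demand / provisioned
    print(f"Average utilization: {utilization:.0%}")  # 12%, squarely in the 10-15% range

A box sized for its one busy day spends the rest of the year mostly idle, which is exactly the sprawl the analysts describe.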
Blame the situation we have today on process automation. A decade ago, capturing and managing transactions and eliminating processes that did not add value brought enterprise resource planning (ERP) systems to prominence. As we collected transactional data, the number of ways to store it grew proportionately, giving rise to products such as network-attached storage (NAS) and storage area networks (SAN).
Ultimately, we used technology to create efficiencies, and those technologies became our next inefficiencies. Some business theorists used to say that the solution to the problem becomes the next problem. That has happened within the enterprise.
Numerous studies discuss server utilization rates. Companies such as IBM and HP tell us that Intel server utilization runs in the frighteningly low range of 10-15%. We can easily see how the application-silo syndrome produces these low rates and high storage costs. We also can find numerous case studies that demonstrate how to raise utilization, consolidate hardware and integrate processes across numerous silos.
Linux virtualization has become the primary technology in use by major solution providers today. Linux and virtualization technology, including VMware, allow for:

- a consolidation ratio of four to five workloads per CPU or higher (see the sketch after this list)
- decreased capital and operational costs
- improvements in server management
- more robust infrastructures
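As a rough illustration of what a four-to-one consolidation ratio can mean in practice, here is a hypothetical sketch; the fleet size, ratio and host configuration are assumptions for illustration, not vendor figures:

    # Hypothetical consolidation estimate: how many virtualization hosts could
    # absorb a fleet of lightly used, dedicated servers. All figures are assumptions.

    dedicated_servers = 100    # existing one-application-per-box servers
    workloads_per_cpu = 4      # conservative end of the 4-5 workloads-per-CPU ratio above
    cpus_per_host = 4          # e.g., a 4-way host in the xSeries 440 class

    workloads_per_host = workloads_per_cpu * cpus_per_host
    hosts_needed = -(-dedicated_servers // workloads_per_host)  # ceiling division

    print(f"{dedicated_servers} dedicated servers -> {hosts_needed} virtualization hosts")

Under those assumptions, 100 lightly used servers collapse onto about seven hosts, which is where the capital and operational savings come from.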
In earlier times, we solved the problem of dedicated resources that needed to grow and shrink on mainframes using VM/370. Linux on the IBM S/390 and zSeries mainframes rekindled the concept. Then, about three years ago, IBM and VMware got together and co-marketed a solution using IBM's xSeries 440 and VMware ESX Server.
Note: You can find a downloadable Redbook on the subject (note the date) here.
Little did we know that IBM and VMware were starting an industry. According to Dan Kuznetsky of IDC, "The switch to commodity-based servers has resulted in more companies pursuing a virtualization strategy." Referring to overall virtualization software revenue, he said, "It's growing three times faster than the revenue growth for operating system software."