From the Glass House to the Glass Cloud
We were goofing around in the Linux Journal IRC room the other day, and I commented that I needed to find a topic for discussion. One of the local denizens suggested I talk about release cycles. A timely topic, actually, given Susan Linton's coverage of Mandriva's recent release issues and the back and forth from our friends at openSUSE, not to mention Redmond finally putting a stake in at least one version of Windows XP. But as I tried to frame the article, I found myself going in circles. And sometimes, when you find yourself going in circles, you discover the light bulb has come on.
I am coming up on 20 years in Information Technology, and closer to 30 years of being exposed to it (my dad was big in IT as well), and over those years, with a few exceptions, the business of IT, like all aspects of business, has gone around in a circle. Here in 2010 we are talking about the current big deal: cloud computing. To make cloud computing a reality, you need things like virtualization, and software as a service, and ... (insert sound of needle being scratched across a record).
We have gone in a great big circle. For those of you who were only just being born in the 1980s, let's step back a bit. A company called Xerox, which is still around, set up a research facility in California. One of its goals was to make data entry easier for the legions of secretaries who were being trained on the newfangled, commercially available terminal systems that were beginning to replace the typewriter in more and more companies, not just the shops that could afford the big iron that had been around since the mid to late 60s. A number of items came out of that decade that we look back on fondly: the portable computer, made popular by Osborne. Of course, they were the size of a suitcase, but hey, you could do your work, provided you had a magnifying glass to read the screen with. We also saw the mouse come out of Xerox PARC, along with some gentlemen who, in the 1990s, would change the way we thought about IT. It was the dawn of the personal computer, and it would become a huge, profitable business, employing millions of people, many of us just trying to keep the systems running.
The goal of the personal computer, and of Windows, the Mac OS, and Linux, was to break the stranglehold that IT departments had on business, and to make it easier for companies to be nimble, flexible, and responsive. After all, as the greybeards will tell you, programming on punch cards is not a simple or easy process. And from the mid-1990s through the mid-2000s, the personal computer reigned as the solution to all of the problems that were perceived to be wrong with the glass house model of the era before. Since the middle of this decade, though, something strange has been occurring, spurred on by the near-global economic collapse. The idea of the glass house has returned, only now it is a glass cloud.
If you think virtualization is new, you might want to go talk with the folks at IBM, who invented it back in the dark ages. Mainframe systems are the grandfathers of virtualization. There are some differences, but not enough to quibble over. What companies like VMware and technologies like KVM are doing today is making it possible for modern machines to run efficiently, and that is exactly what IBM did with its big iron. The fact that a modern server has more computational capacity than a room full of old mainframes is just the application of Moore's Law. Software as a service, or hosted applications? Virtual desktops? All new ways of describing what was done with green screens and green bar paper in the glass house days of IT. And as the complexity and cost of maintaining individual desktops continues to increase, almost exponentially, is it any wonder we are trying to ... well ... consolidate?
I am not saying that this is a good trend or a bad trend, but it is the natural culmination of the cycle. You can point out that with high-speed bandwidth we can do things today that we could not do 20 or 30 years ago, and you would be correct. But the point is that the era of the personal computer, from a business perspective, is coming to a close. It will be interesting to watch what the next big innovation is that starts the cycle again, and to see whether Moore's Law also applies to the IT business cycle.
Shameless plug: I will be in Boston in August for the 2010 LinuxCon (August 10-12). I will be wearing my call sign and a red Linux Journal polo, so feel free to say hello!