From the Glass House to the Glass Cloud
We were goofing around in the Linux Journal IRC room the other day and I commented that I needed to find a topic for discussion. One of the local denizens suggested I talk about release cycles. A timely topic, actually, given Susan Linton’s coverage of Mandriva’s recent release issues and the back and forth from our friends at openSUSE, not to mention Redmond finally putting a stake in at least one version of Windows XP. But as I tried to frame the article, I found myself going in circles. And sometimes, when you find yourself going in circles, you discover the light bulb has come on.
I am coming up on 20 years in Information Technology, and closer to 30 years of being exposed to it (my dad was big in IT as well), and over those years, with a few exceptions, the business of IT, like all aspects of business, has gone around in a circle. Here in 2010 we are talking about the current big thing: cloud computing. To make cloud computing a reality, you need things like virtualization, and software as a service, and… insert sound of needle being scratched across a record.
We have gone in a great big circle. For those of you who were only just being born in the 1980s, let’s step back a bit. A company called Xerox, which is still around, set up a research facility in California. One of its goals was to make data entry easier for the legions of secretaries who were being trained on the newfangled, commercially available terminal systems that were beginning to replace the typewriter in more and more companies, not just the shops that could afford the big iron that had been around since the mid to late 60s. A number of items came out of that decade that we look back on fondly: the laptop, made popular by Osborne Computer. Of course, they were the size of a suitcase, but hey, you could do your work if you had a magnifying glass to read the screen with. We also saw the mouse come out of Xerox PARC, along with some gentlemen who, in the 1990s, would change the way we thought about IT. It was the dawn of the personal computer, and it would be a huge, profitable business, employing millions of people, many of us just trying to keep the systems running.
The goal of the personal computer, and Windows, and the MacOS, and Linux, was to break the stranglehold that IT departments had on business, as well as to make it easier for companies to be more nimble, flexible, and responsive. After all, as the greybeards will tell you, programming on punch cards is not a simple or easy process. And from the middle 1990s through the middle 2000s, the personal computer reigned as the solution to all of the problems that were perceived to be wrong with the glass house model of the era before. Since the middle of this decade, though, something strange has been occurring, spurred on by the almost global economic collapse. The idea of the glass house has returned, only now it is a glass cloud.
If you think virtualization is new, you might want to go and talk with the folks at IBM, who invented it back in the dark ages. Mainframe systems are the grandfathers of virtualization. There are some differences, but not enough to quibble over. What companies like VMware and technologies like KVM are doing today is making it possible for today’s machines to run efficiently. And that is what IBM did with its big iron. The fact that a modern server has more computational capacity today than a room full of old mainframes is just the application of Moore’s Law. Software as a service or hosted applications? Virtual desktops? All new ways of describing what was done with green screens and green bar paper in the glass house days of IT. And as the complexity and costs of maintaining individual desktops continue to increase, almost exponentially, is it any wonder we are trying to … well … consolidate?
I am not saying that this is a good trend or a bad trend. But it is the natural culmination of the cycle. You can point out that with high speed bandwidth we can do things today that we could not do 20 or 30 years ago, and you would be correct. But the point is that the era of the personal computer, from a business perspective, is coming to a close. And it will be interesting to watch what the next big innovation is that starts the cycle again, and to see if Moore’s Law also applies to the IT business cycle.
Shameless plug: I will be in Boston in August for the 2010 LinuxCon (August 10-12). I will be wearing my call sign, and a red Linux Journal polo so feel free to say hello!