From the Glass House to the Glass Cloud
We were goofing around in the Linux Journal IRC room the other day, and I commented that I needed to find a topic for discussion. One of the local denizens suggested I talk about release cycles. A timely topic, actually, given Susan Linton's coverage of Mandriva's recent release issues and the back and forth from our friends at openSUSE, not to mention Redmond finally putting a stake in at least one version of Windows XP. But as I tried to frame the article, I found myself going in circles. And sometimes, when you find yourself going in circles, you discover the light bulb has come on.
I am coming up on 20 years in Information Technology, and closer to 30 years of being exposed to it (my dad was big in IT as well). Over those years, with a few exceptions, the business of IT, like all aspects of business, has gone around in a circle. Here in 2010, we are talking about the current big thing: cloud computing. To make cloud computing a reality, you need things like virtualization, and software as a service, and... (insert sound of needle being scraped across a record).
We have gone in a great big circle. For those of you who were only just being born in the 1980s, let's step back a bit. A company called Xerox, which is still around, set up a research facility in California. One of its goals was to make data entry easier for the legions of secretaries being trained on the newfangled, commercially available terminal systems that were beginning to replace the typewriter in more and more companies, not just the shops that could afford the big iron that had been around since the mid-to-late 60s. A number of items came out of that decade that we look back on fondly: the portable computer, made popular by Osborne Computer Corporation. Of course, it was the size of a suitcase, but hey, you could do your work, provided you had a magnifying glass to read the screen. We also saw the mouse come out of Xerox PARC, along with some gentlemen who, in the 1990s, would change the way we thought about IT. It was the dawn of the personal computer, and it would become a huge, profitable business, employing millions of people, many of us just trying to keep the systems running.
The goal of the personal computer, and of Windows, Mac OS, and Linux, was to break the stranglehold that IT departments had on business, as well as to make it easier for companies to be more nimble and flexible and to respond faster. After all, as the greybeards will tell you, programming on punch cards is not a simple or easy process. From the mid-1990s through the mid-2000s, the personal computer reigned as the solution to all of the problems that were perceived to be wrong with the glass-house model of the era before. Since the middle of this decade, though, something strange has been occurring, spurred on by the near-global economic collapse. The idea of the glass house has returned, only now it is a glass cloud.
If you think virtualization is new, you might want to go talk with the folks at IBM, who invented it back in the dark ages. Mainframe systems are the grandfathers of virtualization. There are some differences, but not enough to quibble over. What companies like VMware and technologies like KVM are doing today is making it possible for modern machines to run efficiently, and that is exactly what IBM did with its big iron. The fact that a modern server has more computational capacity than a room full of old mainframes is just the application of Moore's Law. Software as a service or hosted applications? Virtual desktops? All new ways of describing what was done with green screens and green-bar paper in the glass-house days of IT. And as the complexity and costs of maintaining individual desktops continue to increase, almost exponentially, is it any wonder we are trying to... well... consolidate?
I am not saying this is a good trend or a bad trend, but it is the natural culmination of the cycle. You can point out that with high-speed bandwidth we can do things today that we could not do 20 or 30 years ago, and you would be correct. But the point is that the era of the personal computer, from a business perspective, is coming to a close. It will be interesting to see what the next big innovation is that starts the cycle again, and whether Moore's Law also applies to the IT business cycle.
Shameless plug: I will be in Boston in August for the 2010 LinuxCon (August 10-12). I will be wearing my call sign, and a red Linux Journal polo so feel free to say hello!