Observation: Cloud computing is nothing new
Cloud computing is not only the latest buzzword; it may well be the model of computing that powers the 21st century. However, it’s easy to forget that personal computing, in which each user has a standalone system that can operate without a network, is itself a relatively new approach.
The first practical computers were enormous behemoths composed of clicking relays and vacuum tubes. Much of the early development of these multi-ton monsters had been spurred by the Allied code-breaking effort during World War II. For the first thirty years of general-purpose computing, computer time was the exclusive privilege of large institutions and governments.
One of the first breakthroughs in bringing down the cost of computer access was the concept of a time-sharing system. In such a system, multiple operators can access the resources of the computer through remote terminals. Here, first in the form of Teletype terminals and later video terminals, we see the emergence of a network topology in which computing horsepower is located in a central computer, away from the user.
It was the era of the mainframe and the dumb terminal. Typically, these dumb terminals lacked storage or computation capability; they were simply a display with a keyboard. By the 1970s, an operator (usually wearing flared trousers, if the textbooks I’ve seen are accurate) would sit in front of an amber- or green-screened terminal, thankful that he no longer needed to wait in line to hand in a box of carefully arranged punch-cards.
Fast forward to the late 70s, and a new paradigm was beginning to gain favour. If you’ve seen the film Pirates of Silicon Valley, a dramatisation of the early years of Apple Computer, you may remember a scene in which the young Steve Wozniak is compelled to show his prototype personal computer to his employer, Hewlett-Packard. In the scene I’m talking about, Steve fears that his bosses will take his idea from him. The exchange goes something like this:
Steve, it is Steve isn’t it?
Steve, you say that this... gadget... of yours is for ordinary people. What on earth would ordinary people want with computers?
The idea being mooted was that of a personal computer: a self-contained computer that requires only an electrical power supply in order to operate. Standalone computers that did not need to be connected to a larger machine in order to run went on to become the popular face of computing for the remainder of the 20th century.
Ever since its establishment, the personal computer has suffered a minor, organised assault from companies that rebranded terminals as thin clients. These companies, such as Oracle and Sun, met with only limited success over the course of the 1990s. However, sometimes a good technological idea comes along but suffers because it arrives at the wrong time. Consider, for example, Apple’s first attempt at a handheld computer, the ARM-powered, touch-screen-equipped Newton. People accuse Apple of simply repackaging existing ideas in the form of the iPad, but the company was a pioneer in handheld computing 15 years ago.
The latest incarnation of the overall idea of separating storage and processing power from the user's point of access is called cloud computing. Cloud computing will probably be successful to some degree because it benefits from the most powerful but mundane natural force there is: evolution. The computing environment has changed, and people have decided that they want what cloud computing has to offer. What’s more, they’re willing to give up some of the benefits of true personal computers to get it. It will take a while, but people are already starting to recognise the advantages of cloud-style solutions such as Google Mail and Google Docs.
So, take my advice: in a few years’ time, when a young, hip kid tells you about the new idea in computing, to have self-contained computers with local storage and processing power, try to look surprised.
UK-based freelance writer Michael Reed writes about technology, retro computing, geek culture and gender politics.