Observation: Cloud computing is nothing new
Cloud computing is not only the latest buzz term, it might well be the model of computing that powers the 21st century. However, it’s easy to forget that personal computing, in which each user has a standalone system that can operate without a network, is itself a relatively new approach.
The first practical computers were enormous behemoths composed of clicking relays and vacuum tubes. Much of the early development of these multi-ton monsters was spurred by the Allied code-breaking effort during World War II. For the first thirty years of the history of general-purpose computers, computer time was the exclusive privilege of large institutions and governments.
One of the first breakthroughs in bringing down the cost of computer access was the concept of a time-sharing system. In such a system, multiple operators can access the resources of the computer through the use of remote terminals. Here, in the form of early Teletype terminals, and later, video terminals, we see the emergence of a network topology in which computing horsepower is located in a central computer, away from the user.
It was the era of the mainframe and the dumb terminal. Typically, these dumb terminals lacked storage or computation capability, as they were simply a display with a keyboard. By the 1970s, an operator (usually wearing flared trousers, if the textbooks I’ve seen are accurate) would sit in front of an amber- or green-screened terminal, thankful that he no longer needed to wait in line to hand in a box of carefully arranged punch-cards.
Fast forward to the late 70s, and a new paradigm was beginning to gain favour. If you’ve seen the film Pirates of Silicon Valley, a dramatisation of the early years of Apple Computer, you may remember a scene in which the young Steve Wozniak is compelled to show his prototype personal computer to his employer, Hewlett-Packard. In the scene that I’m talking about, Steve fears that his bosses will take his idea from him. The exchange goes something like this:
“Steve, it is Steve, isn’t it?”
“Steve, you say that this... gadget... of yours is for ordinary people. What on earth would ordinary people want with computers?”
The idea that was being mooted was that of a personal computer, that is, a self-contained computer that requires only an electrical power supply in order to operate. Standalone computers that did not need to be connected to a larger computer in order to run went on to become the popular face of computing for the remainder of the 20th century.
Ever since its establishment, the personal computer has endured a minor, organised assault from companies that rebranded terminals as thin clients. These companies, such as Oracle and Sun, met with only limited success over the course of the 1990s. Sometimes, however, a good technological idea suffers because it arrives at the wrong time. Consider Apple’s first attempt at a handheld computer, the ARM-powered, touch-screen-equipped Newton. People accuse Apple of simply repackaging existing ideas in the form of the iPad, but the company was a pioneer in handheld computing 15 years ago.
The latest incarnation of the overall idea, that of separating the storage and processing power from the user's point of access, is called cloud computing. Cloud computing will probably be successful to some degree because it benefits from the most powerful but mundane natural force there is: evolution. The computing environment has changed, and people have decided that they want what cloud computing has to offer. What’s more, they’re willing to give up some of the benefits of true personal computers to get it. It will take a while, but people are already starting to recognise the advantages of cloud-style solutions such as Google Mail and Google Docs.
So, take my advice: in a few years’ time, when a young, hip kid tells you about the new idea in computing, to have self-contained computers with local storage and processing power, try to look surprised.
UK based freelance writer Michael Reed writes about technology, retro computing, geek culture and gender politics.