Observation: Cloud computing is nothing new
Cloud computing is not only the latest buzz term, it might well be the model of computing that powers the 21st century. However, it’s easy to forget that personal computing, in which each user has a standalone system that can operate without a network, is itself a relatively new approach.
The first practical computers were enormous behemoths composed of clicking relays and vacuum tubes. Much of the early development of these multi-ton monsters was spurred by the Allied code-breaking effort during World War II. For the first thirty years of the history of general-purpose computers, computer time was the exclusive privilege of large institutions and governments.
One of the first breakthroughs in bringing down the cost of computer access was the concept of a time-sharing system. In such a system, multiple operators can access the resources of the computer through the use of remote terminals. Here, in the form of early Teletype terminals, and later, video terminals, we see the emergence of a network topology in which computing horsepower is located in a central computer, away from the user.
It was the era of the mainframe and the dumb terminal. Typically, these dumb terminals lacked storage and computation capability, as they were simply a display with a keyboard. By the 1970s, an operator (usually wearing flared trousers, if the textbooks I’ve seen are accurate) would sit in front of an amber- or green-screened terminal, thankful that he no longer needed to wait in line to hand in a box of carefully arranged punch cards.
Fast forward to the late ’70s, and a new paradigm was beginning to gain favour. If you’ve seen the film Pirates of Silicon Valley, a dramatisation of the early years of Apple Computer, you may remember a scene in which the young Steve Wozniak is compelled to show his prototype personal computer to his employer, Hewlett-Packard. In that scene, Steve fears that his bosses will take his idea from him. The exchange goes something like this:
“Steve, it is Steve, isn’t it?”
“Steve, you say that this... gadget... of yours is for ordinary people. What on earth would ordinary people want with computers?”
The idea being mooted was that of a personal computer: a self-contained computer that requires only an electrical power supply in order to operate. Standalone computers that did not need to be connected to a larger machine went on to become the popular face of computing for the remainder of the 20th century.
Ever since its establishment, the personal computer has suffered a minor, organised assault by companies that took to calling terminals “thin clients”. These companies, such as Oracle and Sun, met with only limited success over the course of the 1990s. However, sometimes a good technological idea comes along but suffers because it arrived at the wrong time. Consider, for example, Apple’s first attempt at a handheld computer: the ARM-powered, touch-screen-equipped Newton. People accuse Apple of simply repackaging existing ideas in the form of the iPad, but the company was a pioneer in handheld computing 15 years ago.
The latest incarnation of the overall idea of separating storage and processing power from the user’s point of access is called cloud computing. Cloud computing will probably be successful to some degree because it benefits from the most powerful but mundane natural force there is: evolution. The computing environment has changed, and people have decided that they want what cloud computing has to offer. What’s more, they’re willing to give up some of the benefits of true personal computers to get it. It will take a while, but people are already starting to recognise the advantages of cloud-style solutions such as Google Mail and Google Docs.
So, take my advice: in a few years’ time, when a young, hip kid tells you about the new idea in computing, namely self-contained computers with local storage and processing power, try to look surprised.
UK-based freelance writer Michael Reed writes about technology, retro computing, geek culture and gender politics.