The Internet has revolutionized the way people use their computers. Applications no longer need to be installed on local machines, and Web browsers have made local hardware requirements far less of a concern. The changes we now enjoy didn't happen overnight. Join us as we take a brief look at cloud computing history.
What is Cloud Computing?
Cloud computing refers to the provisioning of hardware and software services remotely via high-speed network and Internet connections. Together with virtualization, the rendering of hardware services in the form of a software virtual machine image, cloud computing has transformed our dependence on local hardware resources into a remarkably efficient, secure, and productive reliance on remote servers and software applications.
Although cloud computing is now commonplace, there was a time when most computing was done at the desktop, requiring large hardware and software expenditures and intensive administration and support. Now, many computers are little more than thin clients that run applications and store data on machines located in the cloud. How did this happen? Let's take a look at cloud computing history.
It's hard to imagine that what we now consider bleeding-edge technology actually began decades ago, long before many of today's technology workers were born. Still, it's true. John McCarthy, who became known as the father of artificial intelligence, predicted that computing would one day become a distributed resource much like a public utility, except two-way. Douglas Parkhill explored the same idea at length in a book on computing as a utility published in the mid-1960s. True to these predictions, computer users now plug into a grid, much like the power and water networks, to access information rather than relying on localized computing power.
Although the abstract concept of cloud computing has been evolving for decades, the term itself is rather new, dating back to 1997, when it was borrowed from the telecommunications industry to refer to hosted software and hardware services managed off the customer's premises.
Of all the technologies that had to converge to make cloud computing possible, high-speed networking was perhaps the most important. Without the ability to transfer huge amounts of data between clients and hosts, cloud computing would always be too slow to be practical. Widespread deployment of broadband Internet access and an upgraded Internet backbone have made it possible to run computer applications through a Web browser almost as fast as they would run if installed directly on the client computer.
Other technologies, including advances in processor architecture and server virtualization, have also been vital to the development of cloud computing.
Through cloud computing, customers become responsible for getting "plugged in" to the cloud, while service providers become responsible for making sure that software and hardware services are available when they are needed.
Although many people still think of Amazon as a large bookseller, Amazon is a cloud computing company and a key player in cloud computing history. An early adopter of cloud technology in its own corporate network, the company branched out to lease capacity on that network through the cloud. The idea met with extremely high demand from customers who needed a scalable service that charged them only for the computing power they actually used.
Amazon Web Services (AWS), perhaps the first viable commercial cloud-based computing service, became available in 2006, providing the impetus for the rapid acceleration in the development and adoption of cloud computing that we now witness. With AWS, users can purchase computing hardware services, create scalable virtual servers, and then provision them with software services, making it one of the largest and most flexible cloud computing platforms available.
Google et al.
Academic institutions joined with Google and IBM in the late 2000s to research cloud computing. Through this collaboration, private clouds were developed, and eventually publicly accessible cloud services (beyond Web mail) were spawned. Before the end of the decade, market researchers such as Gartner were predicting an increased IT trend toward cloud computing in the coming years. So far, those predictions have come true.
Google continues to be a pioneer in cloud computing and is one of the top distributors of cloud services through its application marketplace. Other players, such as Salesforce with its AppExchange, are serving growing numbers of private, corporate, and government computing needs.
The Future of Cloud Computing
Some computer experts have witnessed the industry shift from centralized to distributed to localized computing and back again. These observers speculate that the pendulum will swing back toward localized computer software and hardware resources as latency and security issues become increasingly problematic. Right now, however, there is little indication of any industry reversal of the cloud computing trend. In fact, it appears to be intensifying.
Google recently unveiled its Chrome operating system, the first computer OS designed to be entirely dependent on the cloud. With minimal software and hardware overhead, Chrome-based machines boot very quickly, have little or no internal storage, and use cloud-based applications for everything they do. If this type of architecture catches on, cloud computing will likely be here to stay.