Cloud Computing – The Most Overhyped Buzzword of 2010
Where Cloud Computing is Headed
At what point did industry experts decide that it was better to use a remote, online computer to process our basic daily tasks? Why did this become the trend when computers are constantly getting more advanced and more powerful? It is a mystery why so many experts claim advantages for cloud computing when the inherent disadvantages are so crippling that anyone who relies on a fully cloud-dependent system is at an immediate disadvantage. The draw of cloud computing comes from a fantasy about cost savings and shared resources, but this ignores how powerful and inexpensive modern computers are. On top of that, it is important to remember that this revolution has happened once before. To be sure, it was on a much less publicized scale, but it happened nonetheless.
Where Cloud Computing Was Born
In the late 1980s, around the time computers first started their decline in price, many larger companies understood the power of using a PC to assist daily functions but were unwilling to pay the price for hundreds of individual computers. The cost would have been astronomical, and computers were very costly to maintain as well, only adding to the overall price. Enter the mainframe computing scheme. The infancy of cloud computing was born 20 years ago with the advent of thin clients, or dumb terminals, and the mainframe to which they were all connected. The idea was to buy one expensive mainframe and have it do the number crunching and data manipulation, then outfit the users with dumb terminals that did nothing more than power on and display the output of the mainframe. At the time, this saved money and made a company’s infrastructure easy to manage. However, as computers became more powerful and less expensive, the idea of a mainframe system lost its luster.
Modern Cloud Computing
As the price of computers declined steadily and most companies started to outfit users with newer, more capable PCs, mainframe setups became less popular. After the turn of the century, the cost of a computer was such that it was cheaper to replace a PC than to actually fix it. This shift has been steady and continues today, even with the idea that cloud computing brings some imagined advantages. While the older model of cloud computing relied entirely on local resources, cloud computing now uses the web and remote systems all over the world. The plan is the same: use shared computing resources to accomplish a task by offloading it to an online cloud of systems. However, this is where the advantages become incredibly weak.
Where the Hype About Cloud Computing Fails
The primary dependency of cloud computing is an always-on Internet connection. Your connection is the lifeline to the cloud, and it allows you to access the shared resources online. Therein lies the problem: how can your entire computing experience depend on something as fickle as a cable modem? Internet connections go down, ISPs have problems, and even your computer’s hardware can prevent you from being online. Once that happens, if you are dependent on cloud computing, you are instantly unable to do even the smallest task.

Also, what if you are a user limited by bandwidth constraints? Certain cellular modems are saddled with a 5GB limit per month, which, given cloud computing’s requirements, severely limits its usefulness. Bear in mind that mobile data devices are becoming more popular by the day.

Lastly, regarding cost, it is imperative that users remember how inexpensive computers are in modern times. Never before in history has anyone been able to get such an amazing amount of power for so little investment. That one fact alone nullifies the supposed cost advantages of cloud computing and shows how flawed the trend truly is. With cheap computers, unreliable Internet connections, and bandwidth limitations, it is a wonder that cloud computing is even defended by its proponents in this day and age.
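To put the bandwidth argument in concrete terms, here is a quick back-of-the-envelope calculation. The daily traffic figure is purely an assumed illustration (not a measured value); the point is only how fast a fixed cap evaporates when every task goes over the wire:

```python
# Rough sketch: how long a 5 GB monthly data cap lasts for a
# cloud-dependent user. DAILY_TRAFFIC_MB is an assumed figure
# for illustration, not a measurement.

CAP_GB = 5.0              # typical cellular modem cap circa 2010
DAILY_TRAFFIC_MB = 250.0  # assumed daily traffic when all work is remote

days_until_cap = (CAP_GB * 1024) / DAILY_TRAFFIC_MB
print(f"Cap exhausted after {days_until_cap:.1f} days")  # ~20.5 days
```

Even at a modest assumed rate, the cap runs out well before the month does, which is exactly the usefulness problem described above.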