History of Virtualization
Virtualization is, in essence, a very simple concept. Think of it as replacing the oven in your kitchen with a centralized oven shared by all the residents of an apartment complex, or using the laundromat instead of your own washing machine. The idea is improved utilization of a resource, with cost savings for the users as a byproduct. In computing, virtualization was first developed in the 1960s, when IBM logically partitioned large mainframe computers into separate virtual machines to make better use of the hardware. These partitions allowed mainframes to “multitask”: run multiple applications and processes at the same time. Since mainframes were expensive resources at the time, partitioning was a way to fully leverage the investment.
However, with the advent of inexpensive x86 desktops, servers, and thin clients paving the way for distributed computing during the 1980s and 1990s, the need for this kind of partitioning gradually disappeared. Then, with the broad adoption of Windows and the emergence of Linux as server operating systems, virtualization was effectively abandoned.
The challenges that cropped up with these x86 deployments included low infrastructure utilization, rising physical infrastructure and IT management costs, insufficient failover and disaster protection, and high-maintenance enterprise and end-user desktops. In 1999, VMware introduced virtualization for the x86 platform to address these challenges.