Written by allychevalier • Edited by Rebecca Scudder • Updated 2/3/2010
Desktop computers are the bread and butter of the computer industry, and have been for decades. Virtually all advances in computing can be traced back to technology developed for better desktops, be it better graphic cards for computer games or more memory to store all your files.
What Made Desktop Computers Possible
The integrated circuit and the microprocessor are the two key pieces of technology that allowed computers to stop being the size of a room and shrink to a manageable size, that is, something you could fit on a desk. This led them to be called "microcomputers," a term since dropped as they have largely become the norm. These advances also made computers far cheaper, and thus marketable to the average consumer.
As a consequence of these developments, the first desktop system for consumers, the Altair, came out in 1974. It wasn't something you'd see in stores: it was mail order only, and you had to assemble it yourself. It appealed primarily to computer enthusiasts at the time. Still, the proverbial foot was in the door. Demand for these desktop computers expanded, and innovations quickly followed, such as the addition of keyboards and those bulky CRT monitors (remember those days before we had LCD screens?).
The Rise of Desktops
Apple was founded in 1976, and in 1977 began selling the Apple II, which brought along innovations such as color graphics to make computing more user friendly. Later advances, such as the graphical user interface and the computer mouse, made computers all the more intuitive for consumers, furthering the usability and thus the mainstream popularity of the desktop.
Early disk drive development, especially the floppy disk, made it possible to easily transfer data, and with it, software. Games, word processors, and the other primitive software enabled by the hardware of the day whetted consumer appetite for ever-more complex programs. A cycle of development formed, in which the demand for more complex software fueled the demand for hardware that could handle it.
The individual components that make up a desktop have been packaged in two ways: completely modular, allowing for easy customization and upgrading on the part of the consumer, as in many Windows PCs; and completely integrated ("single unit"), allowing for greater convenience and a lower price, as in Apple's desktops. Which of these two paradigms will dominate is still an open question.
With the rise of laptops, many argue that the future of desktops is called into question. As the technology required for today's ever-more-complex programs gets smaller, laptops increasingly seem like a good alternative. After all, if a device that is smaller and more portable can do the same thing as something that is bulky and immobile, the average consumer is probably going to choose the former.
However, it is likely that there will always be a place for desktops, because sophisticated new software will always demand more speed and more memory than you can fit into even a cutting-edge laptop. Computer modeling is an excellent example. And because a desktop imposes no strict limit on the space available for hardware, or on specific form factors, it is much easier to customize and upgrade, making it simply more convenient for many enthusiasts, and often the cheaper alternative.
Computer technology is constantly advancing at a breakneck pace, and doesn't seem like it will be slowing down anytime soon. Desktops, as the sum of current technology, will reflect these advances for years to come.