4. The Video Card
Unlike the floppy disk, the video card shows far fewer signs of dying out. Still, with modern consoles rivaling the PC in accessibility and ease of use, many people today would rather buy an Xbox than upgrade their PC into a gaming rig with a capable video card.
The first computers did not have processing power to spare for video and games, so a fairly obvious idea took hold: why not offload graphics to a dedicated processor? That idea became the video card, a computer-within-a-computer devoted solely to rendering graphics and video.
For such an obvious idea, it took manufacturers like IBM and Commodore (with its Amiga line) quite a while to produce hardware that could handle 2D and 3D images, video, and games on a single card. In the late 90s, such cards finally went mainstream, driven by the rivalry between two camps – Nvidia and ATI (now part of AMD). These two companies are still neck-and-neck today, each fiercely trying to create something the other won't be able to replicate so quickly.
The important legacy of the video card is that it pushed videos and games into the future. Without it, gaming would not be the monster industry it is today, raking in billions of dollars a year. Consoles and PCs alike build on technology that Nvidia and ATI pioneered, so the video card is a direct contributor to the console gaming revolution as well as the PC gaming revolution. From text adventures, we suddenly leapt to fully 3D games like Quake, played with a mouse and keyboard. That leap also gave rise to the FPS genre as we know it today, which was eventually ported to consoles.
Continue reading this series to find out our picks for the top three hardware innovations of all time.