The Future of GPUs: The (C)GPU

written by: M.S. Smith•edited by: J. F. Amprimoz•updated: 5/26/2011

The last year has seen GPU prices reach incredible lows, and the future promises more of the same. But the future of GPUs will also be the battleground of a general-computing prize-fight between some major industry players.

  • slide 1 of 5

    Graphics for Less

    There is change in the air.

    The prices of graphics cards have been falling like a stone. Performance, on the other hand, has been increasing continually. GPU-compute initiatives like OpenCL, CUDA, and DirectX 11's Compute Shader have been gaining momentum. The fastest cards available today have more than one GPU on board. But where are GPUs heading? The decrease in prices and the increase in power seem to indicate that no matter where GPUs end up, it will be good for consumers.

  • slide 2 of 5

    Accepting the Low End

    Perhaps the most under-reported trend in the computer graphics industry is the gap between the power GPUs offer and the power actually required of them. This trend replicates what has recently occurred in the CPU industry. But while the CPU industry seems keenly aware that most applications do not require the fastest processors that can currently be produced, the GPU industry still seems wedded to the idea that only the fastest GPUs are worthwhile.

    Or at least that's the popular image. Actions, however, indicate that ATI at least is well aware that the fastest GPUs are not necessarily the best. This became apparent with the Radeon 4800 series. These cards and their GPUs, while not as quick as the fastest available from Nvidia, are close enough. Their chips are also far smaller than Nvidia's newest series of GPUs, so they are undoubtedly less expensive to manufacture. This likely explains why Nvidia has kept its GTX 200 series mostly limited to higher-end cards while ATI's Radeon 4000 series covers nearly every segment of the market. Nvidia has not been unresponsive, however - its older G92 products cover the lower end of the market and do provide some competition.

  • slide 3 of 5

    A Small Price in the Big Picture

    The end result is that there are perfectly adequate graphics cards available for under $100. The Radeon 4770 is a good example, as is the older 9600GT. These cards should be capable of running the majority of games on shelves today and are certainly more than capable of maxing out the details on games like The Sims 3 and World of Warcraft. They also aren't nearly as large as the fastest cards on the market, nor as demanding, which means they can be inserted into relatively mundane PCs without issue.

    These cards haven't received the lion's share of coverage, but they are undoubtedly where the market is heading. While it was nice that ATI and Nvidia recently released the speedy and well-priced Radeon 4890 and GTX 275, the coverage such cards receive in the popular media well exceeds their market share. Cheap, sub-$100 graphics cards are where the market is, and unless a sudden wave of popular games demands high-powered graphics hardware, that is unlikely to change. While ATI and Nvidia have traditionally fought with high-powered cards aimed at enthusiasts, the future will see the low-price mainstream market become the focus of both companies.

  • slide 4 of 5

    Harnessing and Brewing an Install Base

    The prevalence of adequate low-cost GPUs couldn't come at a better time. Both ATI and Nvidia have made it clear that they want to expand the role of their graphics cards from displaying graphics to more general-purpose computing. There are certainly reasons why this would be welcomed, as graphics cards are incredibly good at certain tasks. For example, Nvidia has been flaunting a program called Badaboom which can transcode video at speeds many times in excess of what a CPU can accomplish. There is a double benefit here as well - the GPU not only performs the task faster, it also leaves the CPU almost entirely free, which means that transcoding video does not slow down normal computing tasks.

    Such impressive performance is useful, but it would be pointless if there weren't enough video cards around to run such applications. Nvidia, however, is clearly not aiming Badaboom only at high-end cards. Its own press about the program usually features mainstream cards like the 9800GTX and even the older 8800GT rather than the fastest, most expensive new cards. Considering the number of low-cost GPUs available, the attempt to transform GPUs into GPGPUs could not come at a better time. However, while it seems certain that the use of GPUs for more general computing tasks will become prominent in the next five years, it is uncertain which avenue of advancement will win out. Nvidia has CUDA, ATI has Stream, and Intel wishes to break into the market with an x86-compatible graphics card, codenamed Larrabee. OpenCL and the DirectX 11 compute shader also show promise.
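    The competing frameworks named above differ in syntax and hardware support, but they share the same data-parallel model: a small "kernel" function is applied independently to every element of a large data set, letting thousands of GPU cores work at once. A minimal sketch of that model in plain Python (the kernel and function names here are illustrative, not any vendor's actual API):

    ```python
    def kernel(pixel):
        # Per-element work, e.g. one step of a brightness filter in a
        # video transcode. On a GPU, each element gets its own thread.
        return min(255, pixel * 2)

    def gpu_style_map(kernel, data):
        # A real GPGPU runtime would launch one thread per element;
        # here the kernel is simply applied to each element in turn.
        return [kernel(x) for x in data]

    frame = [10, 100, 200]  # a toy "frame" of pixel values
    print(gpu_style_map(kernel, frame))
    ```

    The performance win comes from the fact that no kernel invocation depends on another, so the same work scales across however many cores the GPU provides.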

  • slide 5 of 5

    The Tortoise and the Hare

    While the trends towards cheaper GPUs and the rise of GPGPUs seem predictable and even certain, the speed of adoption is another matter altogether. Setting standards in the PC industry can be difficult, and there is little indication that the various players currently in or attempting to enter the market for GPUs have any intention to play nice. It is also not entirely clear if Nvidia, which still has a larger discrete graphics card market share than ATI, wants to allow the graphics card market to trend towards cheap, adequate GPUs.

    It will ultimately be cooperation which determines how quickly the future arrives. If a standard for GPGPU use were agreed upon tomorrow, applications that exploit the potential of GPGPUs would become prevalent in short order. But there is an inherent disagreement between Intel and ATI/Nvidia which makes the adoption of a standard unlikely: anything except the adoption of x86 would be bad for Intel, and the adoption of x86 would be bad for ATI and Nvidia. Someone will have to blink eventually, but this is one staring match which could last for a long, long time.

The Future of...

A look at what the future may hold as memory, motherboards, and GPUs continue to advance.
  1. The Future of Motherboards
  2. The Future of Memory
  3. The Future of Graphics Cards
  4. The Future of GPUs: The (C)GPU