Intel's Westmere: The Tides Of A Graphics War
To some surprise, Intel recently announced that its first 32nm processors will also integrate graphics onto the CPU itself. This change, while not unprecedented, is an important one, and is already causing a battle between Intel and Nvidia.
Clash Of The Titans
It is no secret that Intel and Nvidia have been involved in a bit of a quarrel for the last year or two. Intel, the seemingly invincible giant of the CPU scene, has been slowly broadening its goals to include graphics solutions as part of its overall strategy. This came to a head when Intel announced Larrabee, a discrete graphics product in development that could allow Intel to compete with companies like Nvidia and AMD. Nvidia, which (despite AMD's wonderful new product line) is the king of GPU sales, has not taken these encroachments kindly. Tempers have flared, feelings have been hurt - and now things look to be taking a turn for the worse.
What's Wrong With Westmere (For Nvidia)
Traditionally, the relationship between Nvidia and Intel has been an open one. Intel has allowed Nvidia to design and sell chipsets for its processors, and Nvidia has gladly done so, as it offers Nvidia the chance to promote its graphics solutions and its brand, in addition to simply selling chipsets. The advantage of Nvidia's chipsets currently lies in their superior integrated graphics. The modern chipsets, featuring 9300 and 9400 series integrated GPUs, are excellent all-around products, providing a much more robust graphics solution - invaluable if you use your PC for media or want to crank up Vista Aero's options without worry. They leave Intel's IGP, the GMA 4500, dead in the water.
Westmere, however, threatens to throw a wrench into the long-standing relationship. I won't go into the details of Westmere, as we already have an article which covers the architecture of the chip. However, if you haven't heard anything about it, let me summarize: Westmere is an Intel processor which places the integrated graphics (and a few other things) onto the processor itself. With Westmere, Intel integrated graphics won't merely be part of the motherboard. They will be physically attached to the processor itself.
In terms of performance, this is likely to have some advantages. But performance is probably not the only reason Intel wants to make this move. By integrating graphics onto Westmere, Intel takes a jab at its chipset competitor, Nvidia. It is also attempting to undermine the market share of a company that many people believe will soon be a direct competitor of Intel's. Intel clearly wants to force Nvidia out of the integrated market, and integrating graphics on Westmere is a clear part of that plan.
Nvidia and Intel have been trading many words lately. Nvidia remains the champion of GPUs (with AMD ready to take the throne at a moment's notice) and Intel's graphics solutions remain weak, but Intel is clearly planning to combat Nvidia at some point in the future - and cutting Nvidia out of the integrated graphics market with a new 32nm CPU appears to be one step in that plan.
What Does This Mean For You?
The largest question surrounding Westmere is how an integrated graphics solution placed on the CPU itself will interact with discrete graphics solutions. At this point, we can only speculate. But the fact that Intel is pursuing a lawsuit arguing that Nvidia does not have the legal right to build chipsets for Nehalem processors, including the Core i7 and the upcoming Westmere, is telling.
It may be that Intel feels that once graphics are integrated onto the processor itself, the company should no longer have to support integrated graphics solutions placed on motherboards. If Intel can successfully argue that Nvidia does not have the legal right to produce chipsets for Intel's newest products (and thus wins its case), then Intel no longer needs to worry about compatibility with third-party chipsets. Intel, and only Intel, would be providing chipsets for its processors, which would result in the eventual disappearance of motherboard-integrated GPUs from Intel-based machines. An upcoming article will discuss Intel's attempt to knock Nvidia out of the chipset game in further detail.
For the consumer, this brings both advantages and disadvantages. On the plus side, integrating a graphics solution onto the CPU is another step towards what appears to be one of Intel's long-term goals: building "systems on a chip" - processors which include most of the functionality that is traditionally the responsibility of the motherboard. Doing this could result in lower power requirements and physically smaller hardware, which would in turn reduce costs and make it easier to build small, light, and thin mobile computers that are also reasonably powerful.
On the downside, this would allow Intel to shut out any company that might provide competition in the integrated graphics and chipset market. This is not a good result for consumers, because Intel's integrated graphics have always been the least appetizing of any integrated solution available. Intel has always seemed unwilling to push any significant improvements in its integrated graphics products, and eliminating Intel's only competition in that arena would only lessen Intel's incentive to develop better products.
If you're not an enthusiast, this would affect you directly, as most pre-built computers without discrete graphics would, by default, be left with Intel integrated graphics. If you are an enthusiast, the change would have less of an immediate effect. However, the fact that all Intel-based computers would ship with Intel integrated graphics as standard would establish a certain baseline level of graphics performance. Considering how poorly Intel's graphics solutions perform, such a baseline could very well hinder efforts to create better operating systems with more 3D elements. It would also do nothing good for the PC gaming industry. It might help out AMD, though, since AMD would be the only company to offer a CPU that runs on a platform with decent integrated graphics.
News To Watch For
The outcome of the lawsuit Intel has filed against Nvidia will go a long way toward determining how the future plays out. If Intel can deny Nvidia the chance to make chipsets for its new processors, then Intel will be clear to proceed with its plan to make integrated graphics part of the processor, not the motherboard. If, however, Intel loses or decides not to follow up on the lawsuit, then Intel will have to compromise with Nvidia. In that situation, the market probably would not change significantly, although Intel might try to press harder on Nvidia by refusing to cooperate with Nvidia's attempts to make chipsets for new Intel products.
It will also be interesting, in the long term, to see how AMD weighs in on this debate. Unlike Nvidia or Intel, AMD spans the gap between CPUs and GPUs, as it offers both products. Its plans for CPU-integrated graphics, codenamed Fusion, have been delayed to 2011. Obviously, AMD does not make chipsets compatible with Intel products. But any change in the dynamic between Intel and Nvidia could prompt a response from AMD - perhaps even pushing AMD and Nvidia to circle their wagons together.