War for Visual Computing: Why AMD Could Have the Best Chance
The Trouble with NVIDIA and Intel
In the world of the PC, you'd have to be both blind and deaf to miss last week's riveting drama, known instantly as the "NVIDIA and Intel Battle". In nearly back-to-back swings, Intel and NVIDIA executives landed metaphorical punches to the gut of one another in an attempt to show analysts, stockholders and, we assume they still matter, PC users why their companies are going to be the technology leaders for the future of computing. There were days of bickering between NVIDIA and Intel, and though we'll hold our opinions on the particulars of this battle for another day, there was one surprising omission from the debate: AMD.
Yes, the company is having financial difficulties that many have used as bait for a potential takeover or buyout, but I am going to spend this time discussing why I think both Intel and NVIDIA are overlooking a still-competitive opponent, which could turn out to be a drastic mistake.
NVIDIA vs. Intel: A Brief Synopsis
Though I did say I would be discussing the NVIDIA and Intel battle royale in a separate piece, I need to give you some context for my coming opinions on the entity that is AMD. It is important to see what both Intel and NVIDIA are actively saying and doing in order to get a better feel for why AMD's current philosophies matter more than you might initially admit. The relationship between Intel and NVIDIA has always been rocky, as evidenced by the fact that NVIDIA's chipsets for Intel processors were consistently late to the party (nForce4 Intel Edition) and were hampered by bus licensing costs that raised motherboard prices. Yes, NVIDIA's 680i series of chipsets should be considered a success during a time when Intel's chipset team faltered and left the 975X chipset as the sole competition for MUCH too long a period. But as Intel's new P35, X38 and even X48 chipsets have shown, the company is back in the saddle and making very high quality core logic.
Part of Intel's revived chipset division
What is really turning this once-timid partnership into a blood-hungry fight is that in recent months Intel has made no attempt to hide the fact that it intends to enter the discrete graphics market. The Larrabee architecture, which we have detailed in many articles in the past, is going to differ greatly from current GPU architectures, but Intel still intends to become a market leader; considering that this has been NVIDIA's turf for the last 2-3 years (even with moderate competition from ATI/AMD), we can see why the green giant would be hopping mad. The discrete graphics market, in both consumer-level graphics and workstation/development graphics, is the meat and potatoes of NVIDIA's revenue, with the chipset, mobile and other fringe products simply icing on the proverbial cake.
Even though Intel's entry into the discrete graphics market will likely come as late as Q4 2009 or early 2010, both companies are hard at work blasting one another. NVIDIA claims that the GPU is going to make waves in other forms of computing than just graphics: a market dubbed "visual computing" that encompasses everything from 3D rendering to video encoding/decoding, scientific and medical simulations, finance, imaging and more. When NVIDIA first introduced the CUDA architecture in early 2007, it was meant for just this purpose: to bring the power of the highly parallel GPU architecture to applications that can utilize its speed. Intel has spent its time pushing ray tracing as the future of rendering (you can see articles from Intel's Daniel Pohl here at PC Perspective as well as interviews with top industry minds on the topic) while also saying that both discrete and integrated graphics have a future, pointing to Larrabee for add-in cards and the Nehalem CPU with an integrated GPU on die as examples.
The battle between them becomes one of relevance: can NVIDIA, a company with no presence in the mainstream processor market, continue to find a way to stay relevant in a world of all-inclusive platforms created by Intel and, as we'll see, by AMD as well?