An interesting little article over at The Inq takes some time to analyze the opinions of research firm “GC Research”, which has decided to comment on the growing trend of GPU computing.  Let’s look at some excerpts:

Boffins from market research outfit, GC Research, reckon the age of disruptive parallel computing has arrived and has the potential to change the competitive landscape of the industry in a big way. Not that this should come as a surprise; computer scientists and chip firms have been talking about it for several years.

GC reckons that even AMD’s purchase of ATI a few years back goes to prove how vital it was, and still is, for Daamit to push towards a more cooperative form of computing, not only for the benefit of consumer multimedia applications, but also for affordable scientific computing.

Indeed, while at the time many analysts scolded AMD for a purchase it really couldn’t afford, we have always known that there was good reason for the conglomerate to exist as it currently does.  Even more so, though the numbers and performance of current CPUs don’t show it, AMD might have the best chance in this long-term battle:

Littler chipper, currently somewhat behind rival Intel in X86 performance, is already making strides towards that very goal, with the rollout of its Tigris notebook platform and Fusion marketing campaign. Both efforts put the firm firmly on track to hit its target of having a 32nm-based Fusion platform – integrating the CPU and GPU onto a single piece of silicon – by 2011.

Ah, yes, that wonderful term: Fusion.  Never gets old, does it?  Actually, yeah, after as many delays as we have seen, it kind of does.  But the point still stands: AMD is the only CURRENT tech company that really does both sides of the computing coin very well.  There is a little thing called Larrabee coming from Intel sometime in 2010, and that could really shake things up IF (a big if) it is successful in creating a new programming paradigm.  If it is simply left to emulate current GPU technology (like DX Compute) forever, it will have a rough time.
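To put that “new programming paradigm” talk in concrete terms, here is a minimal, purely illustrative sketch of what GPGPU code looks like today, written in CUDA (OpenCL and DX Compute express the same idea with different syntax).  Nothing below comes from AMD, Intel or the article; it is just a hypothetical SAXPY kernel showing how work is spread across thousands of lightweight GPU threads instead of a handful of x86 cores.

```cuda
#include <stdio.h>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one element of y = a*x + y (SAXPY).
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                      // guard the final, partially filled block
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;          // one million elements
    size_t bytes = n * sizeof(float);

    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy the inputs to the GPU.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);   // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

Notice that the per-element loop has vanished from the kernel entirely; the hardware scheduler decides how those million elements map onto the GPU’s cores, which is precisely the shift in thinking the analysts are talking about.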

[Image: An example of Havok Cloth]
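Havok Cloth itself is proprietary, so purely as a hypothetical illustration (not Havok’s actual code), here is the kind of per-particle update step a GPU cloth solver performs, again sketched in CUDA.  Every cloth particle can be advanced by its own thread, which is exactly the sort of embarrassingly parallel consumer multimedia workload being described here.

```cuda
#include <cuda_runtime.h>

// Hypothetical sketch only: a Verlet position update for cloth particles.
// Spring/constraint relaxation would run in separate passes, but even this
// step shows the one-thread-per-particle pattern that makes cloth a natural
// GPGPU workload.  Launched like the SAXPY example above: one thread each.
struct Particle {
    float3 pos;       // current position
    float3 prev_pos;  // position from the previous time step
};

__global__ void integrate_cloth(Particle *p, int num_particles, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= num_particles)
        return;

    const float3 gravity = make_float3(0.0f, -9.81f, 0.0f);

    // Verlet integration: x_new = 2x - x_old + a*dt^2
    float3 cur = p[i].pos;
    float3 old = p[i].prev_pos;
    float3 next;
    next.x = 2.0f * cur.x - old.x + gravity.x * dt * dt;
    next.y = 2.0f * cur.y - old.y + gravity.y * dt * dt;
    next.z = 2.0f * cur.z - old.z + gravity.z * dt * dt;

    p[i].prev_pos = cur;
    p[i].pos      = next;
}
```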

As for Chipzilla, it too has been chipping away at Larrabee, the high-end discrete graphics engine it hopes will serve as a GPGPU engine. The Intel master plan is to have lower-end desktop and notebook offerings – including a multi-chip packaged CPU on 32nm and a GPU on 45nm – by the first half of 2010 and move the graphics engine to the same process nodes as the CPU by 2011.

This means Intel may well still come out a winner in GPGPUs over the longer term. But GC thinks replacing profitable X86 cycles with cheaper and less lucrative GPGPU cycles may hurt the chip giant’s revenue growth a smidgen, especially in the consumer and scientific computing segments.

Either way, we have known (see here, here and here) that GPGPU computing, and APIs like OpenCL that expose it, were going to be very important in the future for ALL consumers, not just high-end gamers, and it’s nice to see the mainstream analysts picking up on it…