Nvidia aren’t known for their demure and shy antics, so I wasn’t surprised at a press briefing a few weeks ago when they launched an attack on certain quarters – namely Intel, with its upcoming Larrabee GPU – that have identified ray-tracing as the future of graphics.

They spent a great deal of time assuring the assembled members of the IT press that ray-tracing was a waste of time – every game since before the turn of the millennium (indeed, since the demise of voxels) has been built from polygons, and developers wouldn’t want to rework their techniques and pipelines around a new, somewhat experimental technology.
So, why have Nvidia gone and bought a ray-tracing company?
It’s certainly a bit of a strange move for a company that’s previously denounced the technique as pretty worthless for Nvidia’s main market: games. Then again, they did buy Ageia and its PhysX technology, which has barely made a ripple in the virtual oceans of games like Crysis and Oblivion, so they do have a history of odd investments.
But, for all this prophesying, their position has recently taken a bit of an about-face. The company’s CTO, David Kirk, claimed that ray-tracing was suddenly part of their plans – and that they could integrate it with traditional rendering techniques to make games and graphical applications look even better. Even so, he still doesn’t sound entirely convinced, emphasising that ray-tracing is only ‘part of the future’ and admitting that, at the moment, ‘ray tracing is currently significantly slower than rasterization’.
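For readers unfamiliar with why the two approaches differ so much in cost: rasterization projects polygons onto the screen, while ray-tracing fires a ray per pixel and tests it against the scene geometry. Here’s a minimal, purely illustrative sketch of that core ray-casting step – the sphere, camera and numbers below are my own assumptions, not anything from Nvidia or Intel.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest intersection, or None if the ray misses."""
    # Vector from the ray origin to the sphere centre
    oc = tuple(o - c for o, c in zip(origin, center))
    # Coefficients of the quadratic |origin + t*direction - center|^2 = radius^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray fired straight down the z-axis at a unit sphere five units away
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # → 4.0
```

A real renderer repeats this test for every pixel against every object (plus shadow and reflection rays), which is exactly why ray-tracing remains so much more expensive than rasterization – and why a hybrid of the two is the compromise Kirk is hinting at.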
So, what’s caused such a dramatic turnaround?
We’ve theorised before that Nvidia, for all their bluster, seem to be making various loud, angry noises out of fear. Sure, very little is known about Larrabee, but it’s another competitor in Nvidia’s main marketplace – one where, aside from ATI, they’ve faced little recent competition.
This just seems like more evidence to support that line of thought – if Intel are going to incorporate their own in-house ray-tracing and physics technology into their new GPU, then Nvidia need to compete. What better way, then, than by buying up companies already specialised in these things?
Or is there another method to Nvidia’s (apparent) madness?