Sunday, February 17, 2008

Make a VPU Socket Already! Get It Over With!

My lands. If the floating-point unit had taken this long to go mainstream, I'd still be using a Core 2 Duo with a math co-processor.

Not to go all Halfhill or anything, but an on-die vector co-processor, a VPU, has looked like an immediate need for at least the past two or three years. Both Nvidia and AMD are bringing GPUs closer to the CPU, and AMD's multi-core platform at once looked poised to fold VPU/FPU/integer math/memory controller/CISC/RISC/misc./bacon & swiss together, taking many types of tasks and integrating them under one die's roof.

And now that Nvidia has wisely acquired AGEIA and their PhysX platform, it seems a general-purpose vector processing platform is getting closer. A standalone PhysX card never took off in the consumer marketplace, and buying a second Radeon or GeForce just for physics processing (as both AMD and Nvidia were touting at electronics expos) never caught on either. But a generalized, open-platform physics API that takes advantage of unused GPU cycles would definitely catch on. Spin your GPU fan faster and get real-time smoke effects... sign me up.

Nvidia has been extremely forward-thinking with their Linux drivers, and I hope they continue to be trend setters with the PhysX API. The PhysX engine and SDK were made freely available for Windows and Linux systems prior to Nvidia's acquisition, but hardware acceleration currently only works under Windows. Since Nvidia is porting PhysX to their own CUDA programming interface, it seems entirely probable that the Linux API would plug into Nvidia's binary-only driver. And why not release the PhysX API under the GPL? They could port it to CUDA (whose specification is already open, available and widely used) and then reduce their future development effort by letting a wide swath of interested engineers maintain the codebase as needed.
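For the curious, here's a rough sketch of what that kind of work looks like in CUDA. Everything in it (the integrateParticles kernel, the buffer names, the launch dimensions) is my own toy example, not anything from the PhysX SDK; it just shows the shape of the model: one GPU thread per particle, doing embarrassingly parallel math.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy particle state: positions and velocities as flat float3 arrays.
// Purely an illustration of the CUDA programming model, not the PhysX API.
__global__ void integrateParticles(float3* pos, float3* vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Simple forward-Euler step under gravity -- the kind of embarrassingly
    // parallel work a physics engine could hand to idle GPU cycles.
    vel[i].y += -9.81f * dt;
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main()
{
    const int n = 1 << 16;          // 65,536 particles
    const float dt = 1.0f / 60.0f;  // one 60 Hz frame

    // Allocate and zero-initialize device-side particle buffers.
    float3 *d_pos, *d_vel;
    cudaMalloc(&d_pos, n * sizeof(float3));
    cudaMalloc(&d_vel, n * sizeof(float3));
    cudaMemset(d_pos, 0, n * sizeof(float3));
    cudaMemset(d_vel, 0, n * sizeof(float3));

    // Launch one thread per particle.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    integrateParticles<<<blocks, threads>>>(d_pos, d_vel, dt, n);
    cudaDeviceSynchronize();

    printf("Stepped %d particles by %.4f s on the GPU\n", n, dt);

    cudaFree(d_pos);
    cudaFree(d_vel);
    return 0;
}
```

A real physics engine obviously does far more than forward-Euler gravity (broad-phase collision, constraint solving, and so on), but mapping thousands of independent per-body updates onto thousands of GPU threads is exactly why physics is such a natural fit for those spare GPU cycles.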

Widely available drivers, development kits and APIs will help drive hardware sales in an era where Vista-with-DirectX-10 adoption isn't exactly stellar. I won't invest in being able to run Crysis in DX10 at a 22" LCD's native resolution, but I will invest to get more particle effects or more dynamic geoms. At that point you're adding to the whole gameplay proposition instead of polishing up aesthetics for continuously diminishing returns.
