New Ageia PhysX processor on PCIe pictured

Saturday 08th September 2007, 12:12:00 PM, written by Rys

X-bit labs, via Newhua, have pictures of the next generation Ageia PhysX processor on PCI Express.

The company has had record quarterly sales of the existing physics accelerator, something X-bit puts down to good exposure at the recent Games Convention.

The pictures show a large die (which looks to be ~250mm² at a glance) on an x8 PCI Express PCB, with a cooler you might recognise from old GeForce 7-series boards. That cooler was a noisy nuisance, so here's hoping any production models use something a bit less offensive to the ear.

There's no real word on performance levels, price or a release date, so we'll have to wait for Ageia to give up the goods there. Lastly, we wonder what the split connector on the board edge is for; chaining multiple PhysX processors is the likely reason.

Thanks to X-bit Labs for the tip.

Latest Thread Comments (24 total)
Posted by Bouncing Zabaglione Bros. on Sunday, 09-Sep-07 11:53:24 UTC
Quoting AlStrong
mm.... One thing that is of interest is Epic's (somewhat) recent incorporation of PhysX into UE3.0, though I don't know if that is only for their own games or if licensees get that as well. Given how popular UE3.0 seems to be, the latter case may be enough to present competition with Havok. But then again, maybe Epic used PhysX because it was cheaper and not necessarily better in any way compared to Havok's software. :|
Looks like it's going to be a UT3 mod (http://uk.theinquirer.net/?article=41883), so again very limited support that won't be standard. Given that the vast majority of the online community that Epic will try to build with UT3 won't have PhysX boards, I don't see it having much impact. I don't see why any gamer would spend money on Ageia ahead of a faster CPU with more cores, or a second GPU, or more memory, etc., and these will all have a bigger impact on the whole gameplaying experience.

Posted by AlexV on Sunday, 09-Sep-07 11:57:13 UTC
Quoting AlStrong
mm....

One thing that is of interest is Epic's (somewhat) recent incorporation of PhysX into UE3.0, though I don't know if that is only for their own games or if licensees get that as well. Given how popular UE3.0 seems to be, the latter case may be enough to present competition with Havok. But then again, maybe Epic used PhysX because it was cheaper and not necessarily better in any way compared to Havok's software. :|
My understanding is that their Novodex(sp?) stuff is actually quite solid (that's the name of their SDK AFAIR), aside from being free, and it actually does a number of things better than Havok does. The trouble for Ageia, as someone pointed out above, is that whilst their SDK is solid, it's also free, and a developer using it isn't exactly forced to code for their card, which is their primary source of income.
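To make that concrete (hypothetical names only, not Ageia's actual NovodeX/PhysX API): the developer writes against one physics interface, and the same code silently falls back to the CPU when no accelerator is found, so nothing in the SDK forces the end user to buy the card.

```cpp
// Illustrative sketch only: the interface names below are made up, but they
// show the pattern described above, where one set of physics calls runs on
// the host CPU by default and is only offloaded if an accelerator is present.
#include <cstdio>
#include <memory>

struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual void simulate(float dt) = 0;
};

struct SoftwareBackend : PhysicsBackend {
    void simulate(float dt) override { std::printf("CPU step %.3fs\n", dt); }
};

struct AcceleratorBackend : PhysicsBackend {
    void simulate(float dt) override { std::printf("PPU step %.3fs\n", dt); }
};

// Stand-in for a driver query; always false in this sketch.
bool acceleratorPresent() { return false; }

std::unique_ptr<PhysicsBackend> createBackend() {
    if (acceleratorPresent())
        return std::make_unique<AcceleratorBackend>();
    return std::make_unique<SoftwareBackend>();   // silent fallback: no card required
}

int main() {
    auto physics = createBackend();
    for (int frame = 0; frame < 3; ++frame)
        physics->simulate(1.0f / 60.0f);
}
```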

I'm still not totally sold on the GPU doing physics...I haven't yet seen a single current or upcoming implementation of this approach. It should do it quite well, but the question remains whether or not there's going to be GPU muscle available to spare for physics? Multi-core CPUs are likely to be king-of-the-hill, if only for the fact that a CPU upgrade is a no-brainer and everybody is likely to have one, thus giving the largest possible installed-base.
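As a rough sketch of why extra cores map so directly onto physics (illustrative only, not code from any shipping engine): the per-body integration work is independent, so it can simply be sliced across however many hardware threads the CPU exposes.

```cpp
// Minimal sketch of spreading a physics integration step across CPU cores:
// each thread integrates an independent slice of the body array. Real engines
// also need collision handling, which is harder to parallelise, but the
// integration stage scales with core count in exactly this way.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

struct Body { float y = 100.0f, vy = 0.0f; };   // height and vertical velocity

void integrateSlice(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        bodies[i].vy -= 9.81f * dt;      // apply gravity
        bodies[i].y  += bodies[i].vy * dt;
    }
}

int main() {
    std::vector<Body> bodies(100000);
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const float dt = 1.0f / 60.0f;

    std::vector<std::thread> workers;
    const size_t chunk = bodies.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const size_t begin = c * chunk;
        const size_t end = (c == cores - 1) ? bodies.size() : begin + chunk;
        workers.emplace_back(integrateSlice, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join();

    std::printf("first body now at y = %.3f using %u threads\n", bodies[0].y, cores);
}
```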

Posted by Novum on Sunday, 09-Sep-07 16:20:09 UTC
The PhysX SDK is only free for noncommercial use.

Posted by Andrew Lauritzen on Sunday, 09-Sep-07 21:40:41 UTC
Quoting Morgoth the Dark Enemy
I'm still not totally sold on the GPU doing physics...I haven't yet seen a single current or upcoming implementation of this approach.
HavokFX has had production-quality stuff for years now. Although they only use it for "effect physics", there's no real reason why it couldn't be used for everything (they'd need to make it a bit more full-featured, but that wouldn't be hard). ATI/AMD has also been demoing full broad+narrow phase collision detection on the GPU for several years.
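For anyone unfamiliar with the terms: the broad phase cheaply rejects object pairs that cannot possibly be touching, and the narrow phase then runs exact tests on the survivors. A minimal single-axis sweep-and-prune broad phase on the CPU (purely illustrative, nothing to do with HavokFX or AMD's actual GPU implementation) might look like this:

```cpp
// Single-axis sweep-and-prune broad phase: sort boxes on x, then only boxes
// whose x-intervals overlap are also tested on y and z. Surviving pairs would
// be handed to a narrow phase for exact collision tests.
#include <algorithm>
#include <cstdio>
#include <utility>
#include <vector>

struct AABB { float minX, maxX, minY, maxY, minZ, maxZ; };

std::vector<std::pair<size_t, size_t>> broadPhase(const std::vector<AABB>& boxes) {
    std::vector<size_t> order(boxes.size());
    for (size_t i = 0; i < order.size(); ++i) order[i] = i;
    std::sort(order.begin(), order.end(),
              [&](size_t a, size_t b) { return boxes[a].minX < boxes[b].minX; });

    std::vector<std::pair<size_t, size_t>> pairs;
    for (size_t i = 0; i < order.size(); ++i) {
        for (size_t j = i + 1; j < order.size(); ++j) {
            const AABB& a = boxes[order[i]];
            const AABB& b = boxes[order[j]];
            if (b.minX > a.maxX) break;                  // sorted on x: no later box can overlap
            if (a.minY <= b.maxY && b.minY <= a.maxY &&
                a.minZ <= b.maxZ && b.minZ <= a.maxZ)    // still overlapping on y and z
                pairs.emplace_back(order[i], order[j]);
        }
    }
    return pairs;
}

int main() {
    std::vector<AABB> boxes = {
        {0, 1, 0, 1, 0, 1},
        {0.5f, 1.5f, 0.5f, 1.5f, 0.5f, 1.5f},   // overlaps the first box
        {5, 6, 5, 6, 5, 6},                      // far away from both
    };
    for (const auto& p : broadPhase(boxes))
        std::printf("potential collision: box %zu vs box %zu\n", p.first, p.second);
}
```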

The upcoming game "Hellgate: London" will use a lot of GPU physics, both in terms of HavokFX and also real-time fluid sim (a la NVIDIA smoke demo) and more. It'll be out in the next few months IIRC.

Quoting Morgoth the Dark Enemy
It should do it quite well, but the question remains whether or not there's going to be GPU muscle available to spare for physics?
If you need more power, add another GPU (or get a faster one, or both) :) I've never understood this argument about "how can the GPU do physics if it's doing graphics?" since conceptually it's no different from the CPU doing bits of both, and indeed the GPU is already much *more* parallel.

Quoting Morgoth the Dark Enemy
Multi-core CPUs are likely to be king-of-the-hill, if only for the fact that a CPU upgrade is a no-brainer and everybody is likely to have one, thus giving the largest possible installed-base.
This I can certainly agree with... with quad-core CPUs being

Posted by Tim Murray on Sunday, 09-Sep-07 22:00:18 UTC
Hypothesis:

SLI/Crossfire among heterogeneous chipsets will result in serious adoption of GPU physics, since you can then either get physics or faster graphics.

Posted by Techzen on Monday, 10-Sep-07 06:34:49 UTC
Very interesting conversation. I've been following these cards since they first came out and have been watching the current models fall in price. As a gamer I was tempted to get one a few times but never found a convincing enough argument to make the jump to purchasing one.

For me the flaws were the lack of PCI-E (soon to be a non-issue); the many reports of massive heat issues even when the user wasn't running a game that used the card, or any game at all; the high price of the card when I could put that money towards another video card or other system upgrades with more overall system gain; and the general lack of games using the engine to begin with.

My interest was again sparked when I came across the following while looking for some info about Havok (http://www.havok.com/content/view/187/77/):
Does Havok FX Support AGEIA?

Havok FX supports all hardware that can execute standard OpenGL and Direct3D code at the Shader Model 3.0 level. If the AGEIA card and drivers adopt and support Shader Model 3.0 industry standard, Havok FX support will be possible.

For me this is the card's greatest chance of becoming mainstream. If they can make the hardware support other engines, then it has a chance in the gamer community. Havok support would mean HL2, FEAR, BioShock and a long list of other great games could use this card.

Dell is using the cards and has even added a version to some laptops from what I have heard, so I am not sure where this company is going with this. But I hope it could turn into another useful, affordable link in the chain to making the gaming experience a better one. One can only wait and see.

Thanks again for the great info.

Posted by pax on Saturday, 15-Sep-07 11:41:32 UTC
Well, now that Havok has been bought by Intel, I guess Ageia potentially just got a shot in the arm. I mean, will AMD and Nvidia wanna pay royalties to Intel on Havok tech?

Posted by Bouncing Zabaglione Bros. on Saturday, 15-Sep-07 11:59:55 UTC
Quoting pax
Well, now that Havok has been bought by Intel, I guess Ageia potentially just got a shot in the arm. I mean, will AMD and Nvidia wanna pay royalties to Intel on Havok tech?
Havok sell to game developers, so why would AMD or Nvidia care one way or another, as neither of them make games?

Posted by pax on Saturday, 15-Sep-07 12:19:42 UTC
I thought both Nvidia and ATI were to license Havok to do physics on the GPU. Plus there's the possibility that Havok would be optimized for Intel CPUs and GPUs vs the competition.

