Nvidia to Add PhysX to GeForce 8 Cards

During its fourth-quarter financial results conference call, Nvidia shed a little more light on its acquisition of Ageia and what it plans to do with the firm's PhysX technology. CEO Jen-Hsun Huang revealed that Nvidia's strategy is to port the PhysX engine to CUDA (Compute Unified Device Architecture), the C-like application programming interface Nvidia developed to let programmers write general-purpose applications that run on its GPUs. All of Nvidia's existing GeForce 8 graphics processors already support CUDA, and Huang confirmed that those cards will be able to run PhysX.

"We're working toward the physics-engine-to-CUDA port as we speak. And we intend to throw a lot of resources at it. You know, I wouldn't be surprised if it helps our GPU sales even in advance of [the port's completion]. The reason is, [it's] just gonna be a software download. Every single GPU that is CUDA-enabled will be able to run the physics engine when it comes. . . . Every one of our GeForce 8-series GPUs runs CUDA."

45 Comments

In the short term, this allows me to use my X38 mobo with a pair of 8800s so the second x16 PCI Express slot doesn't go to waste... since it isn't going to do SLI anyway, might as well make the other card do physics.

I wonder, when the card is doing physics, will it output anything at all? It might show really funky stuff on the screen...

Reminds me of this: http://www.nvidia.com/object/tesla_gpu_processor.html ... it's basically an 8800 GTX with no video outputs and twice as much RAM... Tesla + CUDA + PhysX = ?
Didn't Nvidia also have something called 'Quantum Effects technology' that uses the G80 to do physics?

Also, in general, CUDA should make the supported graphics cards more flexible; distributed computing projects should be able to use Nvidia GPUs. Up to now, projects like Folding@home could only use ATI's GPUs.

Also, unlike the PhysX cards, the GPUs are general-purpose and not specialized for physics...

I'm thinking this will help Nvidia push its new Hybrid SLI technology as well. Since the onboard GPU for Hybrid SLI is supposed to be something like an 8200-series GPU, it will be able to support CUDA too. I'm not sure how intensive the physics are, though, and I'm also not sure how the performance/power of the 8200 compares to a standard PhysX card (it may be quite a bit more powerful for all we know). So this may give a new reason to look into Hybrid SLI. Guess we just have to wait and see what Nvidia says about it.

As far as I am aware, CUDA is a separate part of the 8-series chip, so essentially it's free physics.
I am still waiting for the software to use the hardware MP4 encoding supposedly built into my 8-series card, though.

Unfortunately, that is not the case. CUDA allows programmers to write programs that actually run on the GPU itself. I think the point here is that an 8-series card can now be used as a physics card rather than a video card. So most likely you will need two 8-series cards, or three if you want SLI plus physics.

http://techreport.com/discussions.x/14147

Read the second quotation.

(WICKO said @ #20.1)
Unfortunately, that is not the case. CUDA allows programmers to write programs that actually run on the GPU itself. I think the point here is that an 8-series card can now be used as a physics card rather than a video card. So most likely you will need two 8-series cards, or three if you want SLI plus physics.

http://techreport.com/discussions.x/14147

Read the second quotation.

I think you missed the word "Potentially" when you read that! :suspicious:

(ZombieFly said @ #20.2)

I think you missed the word "Potentially" when you read that! :suspicious:

What? Even if it is a possibility, it still doesn't change the fact that CUDA is *not* a second chip which will process physics. I think you missed my point ;)

In fact, even the mention of it being a possibility (and how they said they think it will encourage people to buy a second GPU) implies that a video card can only do one or the other.
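
For what it's worth, CUDA does let an application choose which GPU its kernels run on, which is how a second 8-series card could be dedicated to compute work while the first keeps rendering. A minimal sketch of that device selection follows; the idea of reserving "device 1" for physics is an assumption for illustration, not anything Nvidia has described.

    // Hypothetical sketch: enumerate CUDA devices and, if a second GPU is
    // present, direct subsequent kernel launches to it, leaving device 0
    // free for rendering. Illustration only, not Nvidia's scheme.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main()
    {
        int count = 0;
        cudaGetDeviceCount(&count);  // how many CUDA-capable GPUs are installed?

        int physicsDevice = (count > 1) ? 1 : 0;  // prefer a second card if available
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, physicsDevice);
        cudaSetDevice(physicsDevice);  // later kernel launches go to this GPU

        printf("%d CUDA device(s); using \"%s\" for physics work\n", count, prop.name);
        return 0;
    }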

It's gonna be a nice addition to current cards. I'm really beginning to think ATI is going to need a miracle to catch up with Nvidia's price/performance ratio, let alone overall speed.

I personally don't see this being a huge bonus for current card owners. By the time Nvidia has this out and games are making use of the new feature, I'm sure the 8800s will be at a point where you'll probably be looking to disable it to maintain a decent framerate anyway.

(Smigit said @ #18)
I personally don't see this being a huge bonus for current card owners. By the time Nvidia has this out and games are making use of the new feature, I'm sure the 8800s will be at a point where you'll probably be looking to disable it to maintain a decent framerate anyway.

This feature has been out for, what, two years now in the Ageia hardware add-on, but now it is coming to the GF8 cards via software from Nvidia. I doubt anyone would disable it, since in games made for the PhysX add-on hardware it would take the PhysX work off the CPU for the most part.

Sure, but how many current games really take much advantage of Ageia at the moment? I can think of maybe a handful, but not too many. Any future games are likely to strain the 8800 cards as it is, given that they haven't really had a performance upgrade in the past 18 months (the GTX is still more or less the top-tier product, besides the slightly faster Ultra). With dual-core and now quad-core CPUs becoming more mainstream, I think the video card is currently the main bottleneck for gaming (especially as you scale up to higher resolutions), in which case I'd rather have the CPU handling the physics if I had to pick between the CPU and the GPU.

I realise it takes a solution that until now could only be done in dedicated hardware and puts it on a software platform more people can make use of (to which end it's good), but I feel there aren't many current games that take advantage of it, and for future titles the 8800s will likely struggle enough without the extra workload. For a card lineup that's 18 months old, I'm unsure how useful it will ultimately be; the real benefits will be seen on future cards (to which end I'd love to see them incorporate it at the hardware level).

This sounds good to me; it saves you £150 on one of those PhysX cards, I guess... We'll just have to wait and see performance-wise, but I expect the hit shouldn't be too bad, if there is one at all.

That is good news.
The main thing here will be the performance/eye-candy hit current cards take when doing physics...

My 8800 GTS 320MB has already returned the investment, but this could do a little more for free.

Bad move! hehe....

(From http://www.next-gen.biz/index.php?option=c...&Itemid=61)

Edge: You’ve more or less already placed your cards on the table about this, but what do you make of discrete physics cards, like Ageia’s PhysX?

Gabe Newell: I think that’s a horrible idea. At the same time that the distinction between the GPU and CPU is going away the PPU guys want to come in and define a new set of abstractions, where we have memory and data that’s really far away from the CPU and GPU... How do I tell when something breaks, or gets pushed by a monster? All these decisions I have on my CPU have to sit around until they are resolved on the PPU and GPU, and you end up with a physics decelerator. This is the reason you want a homogenous architecture.

(IntelliMoo said @ #13)
Bad move! hehe....

(From http://www.next-gen.biz/index.php?option=c...&Itemid=61)

Edge: You’ve more or less already placed your cards on the table about this, but what do you make of discrete physics cards, like Ageia’s PhysX?

Gabe Newell: I think that’s a horrible idea. At the same time that the distinction between the GPU and CPU is going away the PPU guys want to come in and define a new set of abstractions, where we have memory and data that’s really far away from the CPU and GPU... How do I tell when something breaks, or gets pushed by a monster? All these decisions I have on my CPU have to sit around until they are resolved on the PPU and GPU, and you end up with a physics decelerator. This is the reason you want a homogenous architecture.

Gabe isn't the be-all and end-all of games. Plus, this is about using the GPU to perform physics, removing the need for a separate card, so it doesn't really apply anyway.

(IntelliMoo said @ #13)
Gabe Newell: I think that’s a horrible idea. At the same time that the distinction between the GPU and CPU is going away the PPU guys want to come in and define a new set of abstractions, where we have memory and data that’s really far away from the CPU and GPU... How do I tell when something breaks, or gets pushed by a monster? All these decisions I have on my CPU have to sit around until they are resolved on the PPU and GPU, and you end up with a physics decelerator. This is the reason you want a homogenous architecture.
Well, with Nvidia putting this all onto the video card, I guess it's going in the direction of being more homogeneous, as Gabe put it.

It will probably be a dedicated chip in the future; for now they will enable physics in games with a software update...

You don't have to run physics in games if you don't want to, or you could use a lower resolution and enable physics...

Be happy you're getting something for free that you can try out... if not, you haven't lost anything.

Wasn't it better to have a separate chip doing all the physics work rather than putting that load onto the GPU? Seems a bit like a step backwards to me, but I could be wrong, I guess.

(Skyfrog said @ #10)
Wasn't it better to have a separate chip doing all the physics work rather than putting that load onto the GPU? Seems a bit like a step backwards to me, but I could be wrong, I guess.

SLI.

Yeah, but they never implemented it at the GPU level until the 8 series. I'm wondering what kind of performance hit this will have, especially on the 8800 GT, which is what I have.

Sooo, does this mean PhysX-enabled games will now let you turn up the physics when this comes? And all games that support it will be able to give you added physics eye candy?

If so, then that's great. Something good came out of them buying Ageia. =)

My 8800 GT smiled a lot. The only problem is with games like Crysis, where these cards can't even run it at max smoothly.

(Citrusleak said @ #2.6)
My 8800 GT smiled a lot. The only problem is with games like Crysis, where these cards can't even run it at max smoothly.

An 8800 GT 512 Golden Sample (G92) runs it at max just fine.

Love this card.

That sounds cool. I'm guessing this will only be useful on an 8800-series GPU, though. My 8600 GTS struggles enough as it is just doing the graphics; doing physics as well would bring it to a crawl!