John Carmack: Not a Big Fan of PhysX



I am just surprised MS hasn't built physics into DirectX yet; it would be a logical step, seeing how widely supported DirectX is now.

Because then you would have various physics developers and hardware manufacturers crying monopoly and taking Microsoft to court. I think Microsoft should just let DirectX be what it is - a programming interface - and then let developers and hardware manufacturers take advantage of that interface; that should leave most people happy. Besides, adding physics to your game isn't the most demanding of tasks these days, and most developers tend to rely on Havok for physics. It seems to be the industry standard in fast, easy and reliable physics.


Then add physics to OpenGL, or make a completely new open-source physics API and call it something along the lines of 'OpenPhysics'.


So basically a convoluted way of prancing around the point while not actually saying anything I didn't know. It does physics, animations and human mimicking stunningly well - that is the point I made, and it's a point that is hard to argue with.

My point was that it only really does very limited physics, i.e. ragdolls. You still need something else for everything else.

I am just surprised MS hasn't built physics into DirectX yet; it would be a logical step, seeing how widely supported DirectX is now.

They are most likely working on it, but you don't make a proper physics engine overnight. If it's built into DX, it needs to be solid - and better than PhysX or Havok.

And due to the new arrangement MS has with the graphics card makers, they also have to work more closely with them on new developments to DX, so the hardware can support it almost at release. Meanwhile, the manufacturers need to follow the DX standard and not keep adding manufacturer-specific functions, or we get compatibility hell in Windows games.


DirectCompute in DX11 allows developers to run things like physics on the GPU; perhaps newer versions of PhysX and Havok will take advantage of that.


Well, the thing is, as everyone knows, physics has been in video games for as long as video games have been around, so it's really nothing new - granted, it was basic use of physics, nothing like what we have today.


You also have to think about the time, cost and use of this possible 'OpenPhysics'. Creating a physics engine is no easy task; it takes a long time to create and perfect such an engine. Just look at Havok - they have been constantly improving their engine for many years now, and much could still be done to it. Now, who would benefit from spending a ton of their time creating an engine worthy of competing in the current market and offering it for free?
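To give a sense of what "no easy task" means: even the trivial core of such an engine - integrating motion and resolving contacts - takes some care, and it is the easy part compared to constraint solvers, broadphase culling and friction. A minimal, purely illustrative Python sketch (not any real engine's code):

```python
# Minimal physics-engine core: semi-implicit Euler integration plus naive
# O(n^2) sphere-sphere elastic collision response. Illustrative only --
# production engines (Havok, PhysX) add broadphase culling, constraint
# solvers, friction, sleeping, and far more robust contact handling.

GRAVITY = -9.81

def step(bodies, dt):
    """bodies: list of dicts with 'pos', 'vel' (x, y) tuples, 'radius', 'mass'."""
    for b in bodies:
        vx, vy = b['vel']
        vy += GRAVITY * dt                        # integrate velocity first
        b['vel'] = (vx, vy)
        px, py = b['pos']
        b['pos'] = (px + vx * dt, py + vy * dt)   # then position

    # naive pairwise collision: impulse exchange along the contact normal
    for i in range(len(bodies)):
        for j in range(i + 1, len(bodies)):
            a, c = bodies[i], bodies[j]
            dx = c['pos'][0] - a['pos'][0]
            dy = c['pos'][1] - a['pos'][1]
            dist = (dx * dx + dy * dy) ** 0.5
            if 0 < dist < a['radius'] + c['radius']:
                nx, ny = dx / dist, dy / dist
                rel = ((c['vel'][0] - a['vel'][0]) * nx +
                       (c['vel'][1] - a['vel'][1]) * ny)
                if rel < 0:                       # only if bodies are closing
                    imp = -2 * rel / (1 / a['mass'] + 1 / c['mass'])
                    a['vel'] = (a['vel'][0] - imp * nx / a['mass'],
                                a['vel'][1] - imp * ny / a['mass'])
                    c['vel'] = (c['vel'][0] + imp * nx / c['mass'],
                                c['vel'][1] + imp * ny / c['mass'])
    return bodies
```

Getting even this toy loop stable and accurate across edge cases is where the years of engineering go.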

DirectCompute in DX11 allows developers to run things like physics on the GPU; perhaps newer versions of PhysX and Havok will take advantage of that.

PhysX is being run through CUDA, which taps into the GPU, so it IS GPGPU processing - the same idea that DX11's DirectCompute exposes. Although the process could probably be optimized.

My point was that it only really does very limited physics, i.e. ragdolls. You still need something else for everything else.

Doing complete human physics isn't exactly limited, but you are right - you need another engine 'for everything else'. Most games just need the human physics to excel, because static death animations are dire. A Havok (objects) plus Euphoria (humans) physics mesh would be awesome in, say, a shooter - or pretty much any game, for that matter :)


DirectCompute in DX11 allows developers to run things like physics on the GPU; perhaps newer versions of PhysX and Havok will take advantage of that.

I posted an article about DirectX Compute; so far only NVIDIA seems to be on board.

You also have to think about the time, cost and use of this possible 'OpenPhysics'. Creating a physics engine is no easy task; it takes a long time to create and perfect such an engine. Just look at Havok - they have been constantly improving their engine for many years now, and much could still be done to it. Now, who would benefit from spending a ton of their time creating an engine worthy of competing in the current market and offering it for free?

I know that, and nothing happens overnight, but a completely open physics engine would be a thing to aim for, instead of always having to pay a licensing fee to a company just to use something in a game.


I know that, and nothing happens overnight, but a completely open physics engine would be a thing to aim for, instead of always having to pay a licensing fee to a company just to use something in a game.

But I ask again: who would have a real interest in releasing such a thing and spending the required time? There is little to no gain in doing this, unless they are looking to get picked up by another company - which just starts this evil circle all over again.


But I ask again: who would have a real interest in releasing such a thing and spending the required time? There is little to no gain in doing this, unless they are looking to get picked up by another company - which just starts this evil circle all over again.

You never know, and there are already people out there working on an open API for physics, so who knows what'll happen in the future.


I think something like that is aimed at amateur developers. I highly doubt it would make its way into the big leagues and surpass the top brands, or whatever engine becomes the industry standard when DirectX 11 / GPGPU hits. Again, only time will tell :)


Well, you've got to start somewhere - Havok didn't get to where it is today without someone starting it, so an open physics API is possible.


No, but Havok wasn't exactly entering a bogged-down market either. Most developers have picked their bases, and a major overhaul is coming up - I doubt any amateur newcomers would be able to release a product for free with high enough quality to compete.


I agree with John - right from the start, PhysX was nothing but a stupid gimmick, and as such, Ageia going belly-up and being sold off was inevitable.

The only reason NVIDIA bought it is that they needed something to set against ATI's DX 10.1, which they had constantly failed to implement in their own cards, and thus were looking pretty bad.

PhysX is mostly a lot of hot air and marketing propaganda; there's little actual substance to it - which is why NVIDIA could implement it so easily via a driver update, even on older cards that were never designed to run it.


I agree with John - right from the start, PhysX was nothing but a stupid gimmick, and as such, Ageia going belly-up and being sold off was inevitable.

The only reason NVIDIA bought it is that they needed something to set against ATI's DX 10.1, which they had constantly failed to implement in their own cards, and thus were looking pretty bad.

PhysX is mostly a lot of hot air and marketing propaganda; there's little actual substance to it - which is why NVIDIA could implement it so easily via a driver update, even on older cards that were never designed to run it.

PhysX as a separate card was always going to be a failure. Ageia knew it, and every user should have known it - it was never going to take off. It was a company created for the sole purpose of being bought out, and of having PhysX implemented on graphics cards, either as a PhysX chip or on the GPU itself.

PhysX as a product, however, there's nothing wrong with - it's a very good physics simulator: fast, efficient and accurate.

As for why NVIDIA could implement it so easily: the PhysX chip was essentially a GPU, and NVIDIA already had their CUDA interface going.


PhysX is stupid - the GPU is usually being used more than the CPU in games anyway, so why tax it more with physics? My 9600GT can't handle PhysX in any game; you either have to have a top-end card or a dedicated PhysX card (which needs a motherboard with two PCI-E slots).


PhysX was nothing more than a marketing gimmick. It should be something all relatively modern cards should be doing. But instead of giving us a nifty new driver feature or something, they just tried to sell us another gadget to slap into our already overcrowded PCs. Give me a break! There have been at least a dozen games over the past few years that had some kind of software physics built in, so why would anyone need another overpriced card just for that? You mean to tell me the big boys, like Nvidia and ATI, can get our graphics cards to render just about everything (except feature films), but can't produce realistic "PhysX"? For crying out loud...


PhysX is stupid - the GPU is usually being used more than the CPU in games anyway, so why tax it more with physics? My 9600GT can't handle PhysX in any game; you either have to have a top-end card or a dedicated PhysX card (which needs a motherboard with two PCI-E slots).

One of the proposed solutions is that as you upgrade your system and get better video cards, you might have extra cards lying around; these can be thrown into your new system to do dedicated physics.

Alternatively, people with SLI can spread the physics load across their cards (or at least that's how it appears to me in the NVIDIA control panel).

The whole PhysX-card thing was pretty stupid. One year at QuakeCon someone asked them why we couldn't just do this with video cards, and their response was something to the effect that video cards just couldn't do that kind of processing. It's laughable, because it all comes down to the optimized instruction sets that video cards (and the PhysX cards) have, which essentially do high-performance vector math. Graphics transformations and lighting are basically fancy (linear algebra) vector math - the exact same thing used when simulating physics. They didn't stand a chance.
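That observation can be made concrete: a physics integration step is just independent vector operations over an array of particles - the same shape of work as transforming an array of vertices. The pure-Python sketch below is illustrative only (real GPU code would be CUDA or DirectCompute kernels); the comprehensions mark exactly the loops that would become one GPU thread per particle:

```python
# Graphics T&L and physics integration are the same kind of math:
# independent vector operations over large arrays. A GPU runs one thread
# per element; each list comprehension below is such a parallel loop.
# Pure-Python sketch to show the structure, not real GPU code.

GRAVITY = (0.0, -9.81, 0.0)

def integrate(positions, velocities, dt):
    """Semi-implicit Euler for N particles; each entry is an (x, y, z) tuple."""
    # Each particle's update reads only its own data -> trivially parallel.
    velocities = [tuple(v[i] + GRAVITY[i] * dt for i in range(3))
                  for v in velocities]
    positions = [tuple(p[i] + v[i] * dt for i in range(3))
                 for p, v in zip(positions, velocities)]
    return positions, velocities
```

The per-particle independence is the whole story: there was never anything in this math that a programmable GPU couldn't do.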


PhysX is stupid - the GPU is usually being used more than the CPU in games anyway, so why tax it more with physics? My 9600GT can't handle PhysX in any game; you either have to have a top-end card or a dedicated PhysX card (which needs a motherboard with two PCI-E slots).

Seeing how the last big game to push the GPU was Crysis, and no other game today will do that again as the big companies are turning into console fans, games on PC will mostly be limited by console graphics. At the same time, NVIDIA and ATI are coming with a new generation of DX11 cards that will focus a lot on GPGPU, giving us big power for that. So the GPU will not be used as much in future games, possibly leaving a lot of room for GPGPU tasks like AI and physics (ATI has a demo with pathfinding AI done on the GPU).

Also, NVIDIA should really have used a different name, or a tag on the name. Some people are just generally slagging PhysX, when in reality it is a good physics API; it is the hardware-accelerated PhysX gimmicks that we can question. There are many games using PhysX for physics on the CPU and looking good - Trine, for example. Maybe NVIDIA should have called the GPU-accelerated part something else, like PhysXG, so it doesn't get mixed up.

Looking at topics like this ("John Carmack: Not a Big Fan of PhysX") and the other link ("AMD says PhysX will die") tells me that NVIDIA should have used another name for it. The general PhysX API for physics is good. As I'm not a developer I can't say whether it is better or worse than Havok; they seem fairly similar, but with different tools and focus.


Just to clear up what I meant above about having two different names:

PhysX - the general PhysX physics API, which works on every computer and console, running mainly on the CPU.

PhysXG (or whatever name, as long as it's different) - the PhysX functions that require a GPU.

I just don't like the confusion, and people generally saying PhysX is crap when in reality it is (almost always, at least) the GPU-accelerated extra goodies that they don't like.


Seeing how the last big game to push the GPU was Crysis, and no other game today will do that again as the big companies are turning into console fans, games on PC will mostly be limited by console graphics. At the same time, NVIDIA and ATI are coming with a new generation of DX11 cards that will focus a lot on GPGPU, giving us big power for that. So the GPU will not be used as much in future games, possibly leaving a lot of room for GPGPU tasks like AI and physics (ATI has a demo with pathfinding AI done on the GPU).

Do you have a link to this? As a dev, I'm curious :p


I have not tested it myself, but I saw the video. Also, I'm a user, so I do not know if it is good, or whether it even offers any advantages over a CPU-based version. But at least it shows a possibility. And with Larrabee and future offerings from AMD/NVIDIA, it seems general use of the GPU will be much wider than it is today.

It is available here

http://developer.amd.com/samples/demos/pages/froblins.aspx

They describe it as:

Artificial Intelligence with Dynamic Path-finding on the GPU

The Froblins demo employs state-of-the-art, massively parallel artificial intelligence computation for dynamic path finding and local avoidance on the GPU. The froblins busily move from goal to goal while avoiding treacherous regions of the terrain. The characters spend time working at gold mines, foraging wild mushrooms, and napping at their camp sites. The user can explore every corner of this virtual world by flying around the environment using a variety of input paradigms. The user may also influence the behavior of the froblins by placing new goals in the environment and even adding new obstacles such as dangerous poison fields and summoning frightening ghost froblins! As new goals and obstacles are placed in the environment, the froblins adapt by dynamically changing their paths.
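To give a feel for what grid-based "dynamic path finding" can mean in practice, here is a small CPU sketch of a goal-distance cost field built by wavefront expansion. This illustrates the general technique only - it is not AMD's actual GPU implementation, and all names in it are made up for the example:

```python
# Sketch of grid-based dynamic path finding: compute a cost field outward
# from a goal (wavefront/brushfire expansion), then each agent simply walks
# downhill on the field. Each ring of the wavefront updates independent
# cells, which is what makes this style of computation map well to a GPU.
# CPU illustration only -- not AMD's Froblins implementation.
from collections import deque

def cost_field(grid, goal):
    """grid: 2D list, 0 = walkable, 1 = obstacle; goal: (row, col).
    Returns steps-to-goal for every walkable cell (None if unreachable)."""
    rows, cols = len(grid), len(grid[0])
    cost = [[None] * cols for _ in range(rows)]
    cost[goal[0]][goal[1]] = 0
    frontier = deque([goal])
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and cost[nr][nc] is None):
                cost[nr][nc] = cost[r][c] + 1
                frontier.append((nr, nc))
    return cost

# Dropping a new obstacle or goal just means recomputing the field -- that
# is the "dynamic" part: agents adapt because the field they follow changes.
```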


Seeing how the last big game to push the GPU was Crysis, and no other game today will do that again as the big companies are turning into console fans, games on PC will mostly be limited by console graphics.

That is only the case where a developer focuses on console first and then ports that code to the PC. Lots of companies do the reverse: they develop on the PC, that code is then fairly easily ported to the 360 with dumbed-down features, and then you have the cluster**** of the PS3, which is a far more intensive porting process.

Look at Batman: Arkham Asylum, or Sacred 2. The PC versions have much better graphics, PhysX support, and all sorts of stuff the console versions can't do due to limited horsepower.

It's typically quite easy to tell if a game was developed first for PC, because you will have actual graphics options to control things like shading, lighting, etc., whereas a console port will have nothing but standard resolution and brightness settings.


Normally all PC versions have higher-resolution textures and higher display resolutions, making for better overall graphics. But that doesn't mean developers are pushing the PC's limits. Batman uses Unreal Engine, and Epic now seems to focus fully on consoles after the GoW and UT3 flops on PC. That doesn't stop a game from looking really good, though - like Batman: Arkham Asylum.

But what game will push G300/RV870?


NVIDIA seems to be aggressively pushing PhysX. It's already in big titles like Mirror's Edge, Sacred 2, and Batman: Arkham Asylum (see the full list here). I don't see PhysX going away anytime soon. For now, we can only ponder how DX11 features will be used for physics and AI - and if that turns out really good, it might push PhysX over the edge of the cliff.

