Leo Natan (February 3, 2010):
Read the label at the bottom right. :p
C++vid (February 3, 2010):
Quote: "ATI have had tessellation since 2001 (called TruForm), in software, just like Nvidia now with Fermi. http://blogs.amd.com/play/tag/tessellation/"
That's all well and good. All I'm saying is that the folks at OpenGL were the first to actually add it to their API. In fact, I'd be willing to bet that the same people who developed tessellation for Microsoft's DirectX are the ones who first implemented it back in ~2007 for OpenGL (before the whole Microsoft FUD campaign against OpenGL around the release of Windows Vista).
Buio (February 3, 2010):
Quote: "I know what PhysX is, and I had a GPU that supported GPU-accelerated PhysX, an 8600 GT."
Yes, but in the message I quoted you were trashing PhysX, when it was about GPU-accelerated PhysX. PhysX runs fine in many games using only CPU-calculated physics. So the API in general is fine. Well, unless you think that it sucks too, but then I have to disagree.
Leo Natan (February 3, 2010):
So true... :rofl:
Subject Delta (February 3, 2010):
Quote: "Yes, but in the message I quoted you were trashing PhysX, when it was about GPU-accelerated PhysX. PhysX runs fine in many games using only CPU-calculated physics. So the API in general is fine. [...]"
Well, GPU-accelerated PhysX sucks then. In general I think PhysX is a waste of time, though; I have never noticed any real increase in visual quality from using it, but then I may not have used it in as many titles as you have.
Vice (February 4, 2010):
Quote: "Well, GPU-accelerated PhysX sucks then. In general I think PhysX is a waste of time, though; I have never noticed any real increase in visual quality from using it. [...]"
The PhysX API is actually quite good, but the way it's presented to the user makes it seem less than it is. For example, if you play a first-person shooter that uses the PhysX API, everything in the game that uses physics will go through that API. Imagine Half-Life 2 and the incredible physics that game employed: that is included in any game that makes use of the PhysX API, and it still runs in software on the CPU. BUT some games that use the API also include "extra" effects for when you have the GPU power to calculate them, like huge amounts of debris from destroyed objects, or very realistic cloth instead of just an artist's rendering. The point is that the API is more important than just its GPU-accelerated parts. If you turn PhysX "off" in UT3, you're still using the API for every physics calculation in the game; just some of the extreme applications of the API in that game are deactivated (and nothing is run on the GPU anymore). The API itself has been ported to the PS3, Xbox 360, Wii, iPhone, and the Android platform as a fast general-use physics API; it's only on PCs that NVIDIA offers their GPUs to process it. I personally run my machine with GPU-accelerated PhysX deactivated.
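The split Vice describes (core gameplay physics always simulated in software, GPU-only "extra" effects layered on top) can be sketched roughly like this. This is a made-up toy model in Python, not the real PhysX SDK; every class and method name here is hypothetical:

```python
# Hypothetical sketch of the pattern described above: the same physics API
# always simulates core gameplay objects on the CPU, while optional "extra"
# effects (debris, cloth) are only enabled when GPU acceleration is present.

class PhysicsScene:
    def __init__(self, gpu_available=False):
        self.gpu_available = gpu_available
        self.core_objects = []    # gameplay physics: always simulated
        self.extra_effects = []   # GPU-only eye candy

    def add_object(self, name):
        # Core objects always go through the API, CPU path.
        self.core_objects.append(name)

    def add_extra_effect(self, name):
        # Without GPU acceleration the extras are simply skipped,
        # so gameplay is identical either way.
        if self.gpu_available:
            self.extra_effects.append(name)

    def simulate(self):
        simulated = list(self.core_objects)   # CPU path, always taken
        if self.gpu_available:
            simulated += self.extra_effects   # optional GPU path
        return simulated

cpu_scene = PhysicsScene(gpu_available=False)
cpu_scene.add_object("crate")
cpu_scene.add_extra_effect("cloth banner")
print(cpu_scene.simulate())  # ['crate'], gameplay physics still runs on CPU
```

This mirrors Vice's UT3 point: turning GPU PhysX "off" only drops the extras, not the core simulation.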
x-byte (February 4, 2010):
Quote: "What you mean is Nvidia doesn't have a fully dedicated unit for tessellation; it still does the computing in hardware. And Nvidia's own numbers show a big advantage for their solution compared to AMD's. How well it will perform in real tests we don't know yet. But if they can keep those numbers as high as promised even under full GPU load, it will be impressive. Everything they have said so far is just paper facts. They need to deliver soon."
I'm just saying that not having dedicated hardware support might be a bad thing. Software is normally a lot slower than a hardware-based solution because it costs more processing power. But we'll see what Nvidia has come up with; it will be interesting. I really don't care about tests done by the same company that makes the hardware.
Quote: "That's all well and good. All I'm saying is that the folks at OpenGL were the first to actually add it to their API. [...]"
So what if OpenGL had it first? There is a reason other than its tech that DirectX is the more popular development platform.
Buio (February 4, 2010):
Regarding PhysX, it's free to use the SDK, and there have been some nice games using it without GPU-accelerated physics. Whether it's a better or worse physics API than e.g. Havok or Bullet I don't know; I guess a game programmer would have to answer that.
C++vid (February 4, 2010):
Quote: "I'm just saying that not having dedicated hardware support might be a bad thing. [...] There is a reason other than its tech that DirectX is the more popular development platform."
Yup. And I just mentioned that reason =).
Johnson (February 25, 2010):
http://www.itp.net/579398-gtx-400-series-to-launch-on-march-26th
neocongirl (February 25, 2010):
Wait, doesn't Nvidia re-badge GPUs? Anyway, I heard they are having horrible yields, so I can't wait to see them fail. Still glad you price mapped and were anti-consumer, Nvidia?! ^_^ Karma always comes around.