The changing face of 3D Graphics



I originally posted this on the Stalker: Oblivion Lost forums, but I thought it might fit here too. I do not intend this to be an NVIDIA vs ATI discussion, so please don't start making declarations of n00bness or act befitting an ATI fanboy or NVIDIOT. I also apologise to those who are fanboys or NVIDIOTS and assure you that those comments were not meant in a negative way.

This new era of "graphics card wars" has not been beneficial to the consumer. When two companies rival each other, it normally leads to better products at much lower prices.

Instead, we have hardware companies siding with software developers (although in all fairness, I have heard that NVIDIA's developer support is second to none).

We have Valve Software telling us NVIDIA is crap, based on benchmarks they themselves performed for Half-Life 2. (Note that they refused to use the ForceWare generation of drivers, and I have yet to see those drivers used by Valve to benchmark their product.)

Then we have Unreal tech and id Software siding with NVIDIA. There are also recent claims made by NVIDIA that rival cards do not render a scene correctly (UT2003 being cited as the main example).

What we have is two high profile companies and their allies throwing words at each other.

In such times, product quality normally goes up and consumer prices go down. In graphics hardware, this is not happening. The hardware is being pushed harder than its natural progression, leading to massively increased production costs and thus higher in-store prices. Then the product is outdated much too quickly (the 9800 XT/9600 XT are meant to be outdated as soon as March, with the release of a new ATI card).

To make it further difficult, the major flagship title for DirectX 9 graphics (Half-Life 2) was delayed. After pumping information into the public telling them what hardware to purchase for their software, they strung everyone along and delayed their product (read the fine print: they delayed before they were hacked; completely unrelated events). People who did race out and purchase the recommended cards ended up with a high-priced card and no game to play on it. The truth is that here in Australia, there is no way I am going to fork out $450 (AUD) just to play a $100 (AUD) game. By the time Valve do release their game, the face of graphics hardware will have changed greatly. ATI will have their new R423 out, and Valve Software will be telling us to buy this NEW card instead.

Then, to further complicate matters, rumour has it that DirectX 10 will surface early next year. Graphics card makers will instantly jump in with DX10 shader support, while any games not yet released will also shift to the new shaders (which means more delays).

So I have humbled myself this way: I don't care for anti-aliasing (jaggies are my friends) or anisotropic filtering. As long as I can get at least 30 frames per second at 640x480 (absolute minimum; I've only ever been forced that low by Deus Ex 2), then I am satisfied.

I apologise for the length of this rant; it's just been brewing in me since these ATI-vs-NVIDIA threads became the cool thing.

(Oh yeah, ATI's 32-bit precision is a waste of architecture. Pixar's own renderer, which aims for photorealistic quality, is only 24-bit precision, and it looks nicer than any game I have yet seen.)


so basically, you are complaining that video card technology is developing too fast for you?

fine. stick with a GF2 for 2 more years and then upgrade to whatever is the 'budget card' at that time.

1st you are bitching that you are unhappy with the R400+HL2 deal and that valve will recommend the R423, or that games will use the DX10 API when it's out, but then you are saying that you want to play at 640x480 without any image quality.

so are you that clueless as to blindly do what valve is saying? if you have an ati 9800 or an nv 9600, are you that insecure that new video cards will be faster?

the video card wars are absolutely and without question a benefit to us.

if it wasn't for that, we'd still be stuck at TNT2 performance and quality.

it's up to you to study the facts, not the hype, and then make an educated purchase based on those facts, not hype.

if you can't do that, well, you'll get screwed and will be buying a new $500 video card every 6 months.

Edited by MxxCon

Like, what about that NVIDIA demo of advanced REALTIME graphics, with the very pretty and very realistic Dawn/Dusk ladies/elves/whatever?

Or, the year before, ATI's demo of Racheal, the then insanely realistic talking head!

Neither would be possible without these neat advances that you find unnecessary...

Because without anti-aliasing, TruForm, 32-bit precision, etc., we couldn't get anything that looked close to that in real time.

And we'd need more than 640x480 resolution to do it in. :huh:

(PS: If you think Pixar is the best the 3D rendering/dev community has to offer... ugh... RenderMan is very good... but still... other stuff is better...)

(PPS: To MxxCon... what's wrong with buying a $500 video card every 6 months? :D )


lol, i buy a new vid card maybe every 1 to 1.5 years. not much of a big deal. i buy mainstream vid cards and dream for the best (like me having the 9600 Pro and dreaming of a 9800 XT). not a problem for me at all.


I get what he's saying though. When you try to be cheap (my parents are...), you have to upgrade A LOT. I've gone through 3 different video cards in the last year and a half.


> so basically, you are complaining that video card technology is developing too fast for you? [...] if you can't do that, well, you'll get screwed and will be buying a new $500 video card every 6 months.

Well, now that you ask, I'm actually very happily running a GeForce 4 Ti 4800.

Hmmm, my post does seem to be a bit of a bitch-fest....

I only meant to observe the way the face of the graphics industry is changing, and perhaps even why the console is rapidly becoming the only cost-effective way to be a gamer.

sorry, just musing aloud


> Like, what about that NVIDIA demo of advanced REALTIME graphics, with the very pretty and very realistic Dawn/Dusk ladies/elves/whatever? [...] other stuff is better...

I'm not arguing against the new stuff. Just saying that the companies are under greater pressure to surge ahead faster than the development cycle should allow. Mayhaps that's where NVIDIA are currently falling behind, their R&D cycle being greatly reduced because of greatly increased competition. You do have to admit it's been a while since there has been such fierce competition in the graphics hardware industry.

On a side note, with a couple of questionable mods I had the Dawn demo running on my GeForce 4 MX 440, albeit very slowly. That same method doesn't work on my Ti 4800, unfortunately. And don't forget that Dawn can be modded to work on ATI cards (some say it runs better on ATI cards, but that's probably because it runs in OpenGL).

I know there are renderers a lot better than Pixar's, but they produce very nice quality. They were used for the Lord of the Rings: The Two Towers movie (I don't know if they were used for Return of the King or not). And I have yet to see any game that looks as realistic as Lord of the Rings (the movie; the games sucked).


> I get what he's saying though. When you try to be cheap (my parents are...) you have to upgrade A LOT. I've gone through 3 different video cards in the last year and a half.

cheap? 3 cards? lol, something's wrong with you. I bought a GeForce 3 when they first came out... still have it, what, 3 years later? still works like a charm... I can play most games at 1024x768 or 1152x864, no problem.


If you've been following the graphics scene at all for the past year, you would know that the architecture NVIDIA used in their FX line of products was not developed around the DX9 standard. They chose to develop their own code because, at the time, ATI was not in the picture. An NVIDIA card was in about 80%-90% of all gaming computers, and because of this they thought Microsoft would cave in to their demands. However, ATI emerged out of nowhere and released the 9700 Pro, which proved to be nearly twice as fast as the GeForce 4 Tis. Since the core in the 9700 was developed around MS's DX9 standard, MS decided not to go with NVIDIA's code. NVIDIA did not show up at any of the meetings and were thus not in the loop on what was happening. By the time they figured it out, it was already too late, and fixing it would have required going back to the drawing board.

Rather than do that and waste countless amounts of time and money, they just decided to release what they had and use their PR machine to continue doing what it did best. This is when the optimizations came. They knew their cards' performance would be rather lacking, so they made optimizations to their drivers by decreasing image quality, removing aspects of scenery, etc. They were basically forced to do this to keep sales going while they wrote a compiler that would ultimately convert DX9 code into their own. This compiler was introduced with the ForceWare drivers. However, there are still optimizations, and the image quality isn't on par with that of ATI's Radeons, as was shown with Futuremark's last patch to 3DMark03.

Yes, there are other aspects involved as well. The NV30 (NVIDIA's FX core) only utilizes either FP16 or FP32; it cannot do FP24 like ATI's R300 can. In FP32, its performance drops a significant amount; the design just isn't capable of producing high fps in FP32 mode. This is partly the reason why Valve had to write a special backend specifically for the FX cards in order to get decent fps in HL2. ATI's cards run it perfectly fine without any special code. It is the same for all current and future DX9 games.
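To give a rough sense of what the FP16/FP24/FP32 gap means in numbers, here's a minimal sketch (mine, not from this thread): the relative rounding error of a binary float format is about 2 to the power of minus the mantissa width, and the commonly cited mantissa widths are 10 bits for FP16, 16 bits for ATI's FP24, and 23 bits for FP32.

```python
# Rough precision comparison of the shader float formats discussed above.
# Assumed mantissa widths: FP16 = 10 bits, FP24 (R300) = 16 bits, FP32 = 23 bits.

def shader_epsilon(mantissa_bits: int) -> float:
    """Smallest relative step (machine epsilon) for the given mantissa width."""
    return 2.0 ** -mantissa_bits

formats = {"FP16": 10, "FP24": 16, "FP32": 23}
for name, bits in formats.items():
    # Each extra mantissa bit halves the relative rounding error per operation.
    print(f"{name}: ~{shader_epsilon(bits):.2e} relative error")
```

This prints roughly 9.77e-04 for FP16, 1.53e-05 for FP24, and 1.19e-07 for FP32, which is why FP16 shows visible banding in long shader chains while FP24 was deemed "good enough" for the DX9 spec.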

Did you ever happen to think that maybe HL2 isn't the only game out there that some people might want to play? I upgraded not because of HL2 but because it was high time I did; HL2 only played a role. I would also rather the game be delayed to work out bugs and be finished than get a piece of crap. If you did upgrade for that one game, then it's your own fault. If you were smart, you would have waited until the game came out.

Oh, and NVIDIA has the 32-bit precision, not ATI.


> I get what he's saying though. When you try to be cheap (my parents are...) you have to upgrade A LOT. I've gone through 3 different video cards in the last year and a half.

well, stop being cheap and wait; get a good card for once, not budget cards. That's what I did: saved up and got myself a 9800 Pro.


This topic is now closed to further replies.