Issue with ATI 9600XT



It seems to me that ATI cards are also more expensive compared to Nvidia offerings. I was looking at a 9600 Pro vs. an Nvidia FX 5600. The Pro was $150, while I got an FX 5600 for $95 shipped to my door. The performance isn't that much different, as indicated by an article on Tom's Hardware or AnandTech, I forget which.

Link to comment
Share on other sites

Umm, the Pro is about 30 percent faster, and the XT is about one and a half times as fast. Don't base everything on just one article; look around the net and you'll see that in a good number of benchmarks it creams it.

And to the stable-and-reliable guys, I have the perfect solution: a rock. Look, it doesn't break. But can you play games on it? :no:

Edited by nuka_t

Well, I used an nVidia card for the last 4 years, and I have never been happier with my 9600. Please let's stop all this flaming and just get along, lol. It's like saying my FORD is better than your CHEVY. Who cares, as long as the owner likes it? One Love.


Yeah, flamage is lame.

The point was that NVIDIA does have AF.

They are both graphics card companies, they have very similar features, their cards fit into our computers, we as consumers make a conscious decision about our purchase, and we are mostly happy with the hardware. The chances of a person going out and buying a new graphics card based on an ATI vs. NVIDIA thread are slim to none. NVIDIA makes cards; so does ATI. Both are going to be around for a very long time. Learn to live with it.

/end of peace negotiations


Check the attached screenie; NVIDIA does have AF. And no, it's not unique to my card; even my ancient TNT2 had AF. And it's not a "hacked" driver or anything either, it's the official Forceware 53.03 set of drivers.

If you didn't buy into NVIDIA because someone told you they didn't have AF, then you were lied to. Research before you post such things, duder.

For some reason I had it in my mind that nvidia used their own method of anisotropic filtering under a different name, but one which rendered the same results. My mind must have been wandering. In any case, my previous statements still hold true: the GeForce 4/FX are inferior to the Radeon in terms of the combination of speed, AA and aniso, image quality, and DX9 support.

Every day I see a thread about problems with ATI cards on this site.

Did it ever occur to you that since the introduction of the 9700, nvidia's share of the video cards in the average household has decreased significantly? More homes have ATI-built cards now, which ultimately gives a false appearance of ATI having more problems. There are countless combinations of hardware and software in any given computer. Obviously there will be some conflicts, most of which are simple user errors or problems caused by incompatibilities between leftover nvidia drivers and ATI drivers. Nvidia has just as many problems. Take your head out of whatever hole you have it stuck in and open your eyes to the real world.

Edited by ANova

For some reason I had it in my mind that nvidia used their own method of anisotropic filtering under a different name, but one which rendered the same results. My mind must have been wandering. In any case, my previous statements still hold true: the GeForce 4/FX are inferior to the Radeon in terms of the combination of speed, AA and aniso, image quality, and DX9 support.

Did it ever occur to you that since the introduction of the 9700, nvidia's share of the video cards in the average household has decreased significantly? More homes have ATI-built cards now, which ultimately gives a false appearance of ATI having more problems. There are countless combinations of hardware and software in any given computer. Obviously there will be some conflicts, most of which are simple user errors or problems caused by incompatibilities between leftover nvidia drivers and ATI drivers. Nvidia has just as many problems. Take your head out of whatever hole you have it stuck in and open your eyes to the real world.

You are partially right... NVIDIA and ATI use different methods for anisotropic filtering and antialiasing, and you are right in saying that ATI has a much faster method for these effects. As can be seen from my above screenie, I have them off anyway (and would even if I had a Radeon). I prefer frame rate over these any day, and with those settings, NVIDIA has the same level of performance as ATI. ATI's only advantage is their anisotropic filtering and antialiasing engine.

As for DX9 support, despite the occasional game that uses DX9 shaders, we have not yet seen a decent test base for comparison of DX9 shader quality/support.

Yes, Radeon has stepped up into the market and once again made a name for itself. Competition is almost always a good thing, as in theory the consumer wins out.

But then it comes down to a matter of taste and which developers you wish to follow.

Which is why I follow id (Doom III), Epic (Unreal tech), Crytek (Far Cry), and GSC (Stalker), as all these developers back NVIDIA (or, if you really believe they were paid off, then they develop their games alongside NVIDIA with NVIDIA in mind).

Some may follow Valve (Half Life 2) to ATI... umm, what are some other games that are part of ATI's "Get In The Game" program? :blink:

As for quality, it all comes down to taste and game engine. Ask Epic, and ATI doesn't render UT2003 correctly.

Ask Valve software and NVIDIA doesn't have the grunt to power their graphics engine.

It's all a case of he said/she said.


As for DX9 support, despite the occasional game that uses DX9 shaders, we have not yet seen a decent test base for comparison of DX9 shader quality/support.

No, we've seen what happens when DX9 is introduced. Futuremark's 3DMark03, Valve's HL2, and Eidos' Tomb Raider: AOD all gave the same results when benchmarked for the first time on an FX. As is common for nvidia, they used heavy driver optimizations to 'fix' these problems.
Which is why I follow id (Doom III), Epic (Unreal tech), Crytek (Far Cry), and GSC (Stalker), as all these developers back NVIDIA (or, if you really believe they were paid off, then they develop their games alongside NVIDIA with NVIDIA in mind).

You might want to learn more about how businesses work. It's all about the money for most. Nvidia has the money from their past successes. They also have an excellent PR machine. "The way it's meant to be played" is just a gimmick they pay companies to put in their games to trick the consumer into thinking that such-and-such game will run better on nvidia's hardware. UT2k3 usually wins by a marginal 3-8 fps more on nvidia hardware than it does on ATI hardware. And guess what, it's not a DX9 game. Everyone knows nvidia's cards are good DX8-and-lower performers. This may come as a shock to you, but nvidia demanded a certain patch be withdrawn from Tomb Raider: AOD a while back, a game that has the "The way it's meant to be played" logo, btw. And do you know why? Because that patch gave the game a benchmarking utility, and that benchmark consistently showed ATI's hardware trouncing nvidia's, since the game is DX9 based. Valve decided to support ATI because they didn't like what nvidia offered. What's so wrong with that?

Ask Epic, and ATI doesn't render UT2003 correctly.

Are you kidding me? That is almost unnoticeable, if it even is true. And considering I've never heard of it other than in that one article, I'd say it was just an anomaly or a problem with the driver set they were using at the time. Sounds like nvidia is grasping at straws.


As is common for nvidia they used heavy driver optimizations to 'fix' these problems.

You might want to learn more about how businesses work. It's all about the money for most. Nvidia has the money from their past successes. They also have an excellent PR machine. "The way it's meant to be played" is just a gimmick they pay companies to put in their games to trick the consumer into thinking that such-and-such game will run better on nvidia's hardware. UT2k3 usually wins by a marginal 3-8 fps more on nvidia hardware than it does on ATI hardware. And guess what, it's not a DX9 game. Everyone knows nvidia's cards are good DX8-and-lower performers. This may come as a shock to you, but nvidia demanded a certain patch be withdrawn from Tomb Raider: AOD a while back, a game that has the "The way it's meant to be played" logo, btw. And do you know why? Because that patch gave the game a benchmarking utility, and that benchmark consistently showed ATI's hardware trouncing nvidia's, since the game is DX9 based. Valve decided to support ATI because they didn't like what nvidia offered. What's so wrong with that?

...

Are you kidding me? That is almost unnoticeable, if it even is true. And considering I've never heard of it other than in that one article, I'd say it was just an anomaly or a problem with the driver set they were using at the time. Sounds like nvidia is grasping at straws.

Dude, read the link that was in that post before you instantly criticize the statement without even understanding it.

ATI doesn't show the full detail textures in several instances within the Unreal engine.

Angel of Darkness is a game with countless glitches that even patching has yet to correct. I played it on a Radeon and half of the enemy meshes in the game had inverted normals... and it failed to detect that I was using an AMD and attempted to use Hyper-Threading, which severely affected vertices within the skinning functions.
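Inverted normals are worth unpacking: in a standard Lambertian lighting model the diffuse term is max(0, N·L), so flipping a mesh's normals makes front-facing surfaces shade as if they faced away from the light. A minimal sketch in Python (illustrative values only, not the game's actual shader code):

```python
def lambert(normal, light_dir):
    """Diffuse intensity = max(0, N.L) for unit vectors N and L."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

n = (0.0, 0.0, 1.0)          # surface normal facing the light
flipped = (0.0, 0.0, -1.0)   # same surface with an inverted normal
light = (0.0, 0.0, 1.0)      # light direction

print(lambert(n, light))        # 1.0 -- fully lit
print(lambert(flipped, light))  # 0.0 -- clamps to zero, renders black
```

This is why meshes with inverted normals look black or inside-out: every lit pixel clamps to the zero branch.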

Since you missed the links, here is another one, very impartial.

THIS HERE, so you can see it easier

Another one, as it's harder to miss 2 links instead of one

And I ask you not to insult my intelligence in future posts. I have a clear understanding of how the games industry works, and the companies who attended NVIDIA's editors' day had no marketing or financial gain from it. I'll put it this way: id Software has no contracts with NVIDIA (besides standard code-security ones), whilst it is worth looking into the depth of the contract between Valve and ATI.

And you're calling "The Way It's Meant To Be Played" a gimmick......ditto for ATI's "Get In The Game".

NVIDIA is not grasping at straws; they are not suffering financial hardship, nor has their share of the market dropped any. In fact, if you care to swing by NVIDIA.com, you can see that they scored a contract with the US Government (NASA, to be precise). Have a look:

http://www.nvidia.com/docs/IO/10781/quote.jpg

Are you going to tell me that the US Government is part of NVIDIA's "grasping at straws"? I don't think NASA was going to gamble their uber-expensive equipment just because a company would pay them off; it stands to reason they chose the most reliable and best-performing of the cards available to them. (Yes, I know that NASA lost contact with the Rover, but that was due to a communications failure, which has been blamed on freak storms down here in Australia.)

Oh yeah, here's a link. Read it before commenting: This is the link

Stop looking for a debate, and just play some games on the card of your preference and allow others to do the same.

Edited by Chode

NVIDIA is not grasping at straws; they are not suffering financial hardship, nor has their share of the market dropped any. In fact, if you care to swing by NVIDIA.com, you can see that they scored a contract with the US Government (NASA, to be precise). Have a look:

http://www.nvidia.com/docs/IO/10781/quote.jpg

Are you going to tell me that the US Government is part of NVIDIA's "grasping at straws"? I don't think NASA was going to gamble their uber-expensive equipment just because a company would pay them off; it stands to reason they chose the most reliable and best-performing of the cards available to them. (Yes, I know that NASA lost contact with the Rover, but that was due to a communications failure, which has been blamed on freak storms down here in Australia.)

Oh yeah, here's a link. Read it before commenting: This is the link

Stop looking for a debate, and just play some games on the card of your preference and allow others to do the same.

Ahh, so that is why the rover stopped communicating with NASA for a few days :p :rofl:

j/k doesn't matter which card people use, as long as you can play the games :yes:

Oh and glad to see the 9600XT problems were solved ;)


ATI doesn't show the full detail textures in several instances within the Unreal engine.
Again, very minor stuff. No one even bothered to notice until nvidia pointed it out. It's certainly nothing along the lines of removing fog entirely, like nvidia did, along with everything else. I've played UT2K3 on my Radeon 9500 Pro and never once noticed any slowdown in frame rates or reductions in image quality/missing textures.
Angel of Darkness is a game with countless glitches that even patching is yet to correct. I played it on a Radeon and half of the enemies game meshes had inverted normals...and it didn't even detect that I was using an AMD and attempted to use hyperthreading, which highly affected vertices within skinning functions.

It doesn't matter if the game is buggy or not. What matters is how it runs on the hardware in question. Shouldn't the game run the same on both types of video cards if they are indeed both equal? But no, you see an entirely different situation.

Since you missed the links, here is another one, very impartial.

THIS HERE, so you can see it easier

Another one, as it's harder to miss 2 links instead of one

Did you even read those articles? I did. And they clearly point out that nvidia admitted their drivers had optimizations and that ATI's cards are faster. It's also quite clear that while the GeForce FX is designed to run in both FP16 and FP32 (nvidia's reasoning being that their cards were designed with the future in mind), it simply does not have the power to run anything at a decent rate in FP32. So what's the use of being compatible with future instructions and languages if you cannot play the game at a decent frame rate? The article from FiringSquad explains all this, and yet you seem to ignore it.
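The FP16-vs-FP32 trade-off is easy to demonstrate outside of shaders: half precision carries only a 10-bit mantissa, so small contributions to a value near 1.0 simply round away, which is the image-quality cost of forcing a partial-precision path. A quick illustration using Python's struct module (which can round through IEEE 754 half floats); this is plain CPU arithmetic, not shader code:

```python
import struct

def round_fp16(x: float) -> float:
    """Round a double to the nearest IEEE 754 half-precision (FP16) value."""
    return struct.unpack('e', struct.pack('e', x))[0]

def round_fp32(x: float) -> float:
    """Round a double to the nearest single-precision (FP32) value."""
    return struct.unpack('f', struct.pack('f', x))[0]

# FP16's spacing (machine epsilon) near 1.0 is about 0.00098, so a
# contribution of 1e-4 survives FP32 rounding but vanishes in FP16.
print(round_fp32(1.0 + 1e-4))  # ~1.0001
print(round_fp16(1.0 + 1e-4))  # 1.0 -- the small term rounds away
```

Per-pixel, losses like this accumulate across shader instructions, which is why reviewers of the era saw banding on FP16 paths.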

And I ask you not to insult my intelligence in future posts. I have a clear understanding of how the games industry works, and the companies who did attend NVIDIA's editors day had no marketting or financial gain from it. I'll put it this way, Id software have no contracts with NVIDIA (besides standard code security ones) whilst it is worth a look into the depth of the contract between VALVE and ATI.
Actually, most of the companies that attended did have marketing to gain from it. When asked why he attended, Harvey Smith from ION Storm gave as one of his reasons: "good coverage for our game." Yes, id Software doesn't have a contract with nvidia, yet, but there's a good reason: Doom 3's engine utilizes OpenGL, not DirectX, and the GeForces are good OpenGL cards. I wouldn't be surprised at all if D3 also happens to sport the "The way it's meant to be played" logo by the time it's released. And I can tell you the development companies have nothing to gain from putting that slogan in their game; therefore they are getting something in return, whether or not that something is in the form of money. Oh, and before there was a contract between ATI and Valve, nvidia tried to pay off Valve all the same. Valve decided to go with ATI, and for good reason.
And you're calling "The Way It's Meant To Be Played" a gimmick......ditto for ATI's "Get In The Game".

I have not seen any games that have the "get in the game" slogan. Would you be so kind as to show me? It's perfectly fine to have a slogan for your company but once you start putting that slogan in products made by other companies it becomes an entirely different matter.

NVIDIA is not grasping at straws, they are not suffering financial hardships nor has their share in the market dropped any. In fact, if you care to swing by NVIDIA.com you can see that in fact they scored a contract with the US Government (NASA, to be precise). Have a look

http://www.nvidia.com/docs/IO/10781/quote.jpg

Are you going to tell me that the US Government is part of NVIDIA's "grasping at straws"? I don't think NASA was going to gamble their uber-expensive equipment just because a company would pay them off, it stands to reason they chose the most reliable and best performer of the cards available to them. (yes, I know that NASA lost the Rover, but that is due to a communications failure, been blamed on freak storms down here in Australia)

I never said nvidia was hurting financially; where did you get that assumption? I think it's perfectly fine that NASA decided to go with nvidia. They're using the cards to render realistic scenes that mirror situations the Rover is bound to encounter, and that's fine, since they have time to render those scenes. It's a bit different with games. I think you're mistaking my take on this situation for bashing nvidia as a company. That just isn't the case. My agenda is to make it clear to everyone that their FX line of cards is severely lacking in many areas compared to the competition, and as such I feel it's fanboyish to continue to support them just because these cards have the nvidia name printed on them.

Have a nice day.


Again, very minor stuff. No one even bothered to notice until nvidia pointed it out. It's certainly nothing along the lines of removing fog entirely, like nvidia did, along with everything else. I've played UT2K3 on my Radeon 9500 Pro and never once noticed any slowdown in frame rates or reductions in image quality/missing textures.
Can you show evidence (or cite examples) where an NVIDIA card does not display fog?

I am not questioning the speed that the Radeon portrays the scene at, but if it is not showing the scene the "way it's meant to be played" (aka, as the developers programmed the game to appear) then that is a very major issue.

It doesn't matter if the game is buggy or not. What matters is how it runs on the hardware in question. Shouldn't the game run the same on both types of video cards if they are indeed both equal? But no, you see an entirely different situation.

It matters very much whether the game has bugs; as I said, it attempted to use incorrect code paths to match the hardware. Trying to treat my Athlon as a Pentium is one thing, but "minor bugs" can cause countless graphical glitches which have very adverse effects on frame rates. Where you make your mistake is that ATI and NVIDIA cards are not equal; they each have their own graphics engines, which render scenes in very different manners.

Did you even read those articles? I did. And they clearly point out that nvidia admitted their drivers had optimizations and that ATI's cards are faster.
During the whole event, NVIDIA was careful to mention ATI as little as possible (they dislike bad-mouthing their competition). I fail to see where they said, quote/unquote, "ATI's cards are faster." And you say "optimizations" as though they are a bad thing. Optimization = good. The dictionary tells us that optimization is "the procedure or procedures used to make a system or design as effective or functional as possible, especially the mathematical techniques involved." In other words, the same result in less time with less resource consumption. Any programmer could tell you that unoptimized code is a bad thing, so why suddenly do people think that optimized drivers are a bad thing?
And I can tell you the development companys have nothing to gain from putting that slogan intheir game.

Quite the contrary. Heard of S.T.A.L.K.E.R.: Oblivion Lost? Until recently it was a game that no one knew of, just an upcoming product from a Ukrainian development house. NVIDIA has been assisting them with their graphics engine (specifically X-Ray) to ensure optimal performance. As a result of NVIDIA's assistance, they now have a product in development which has been described as "the best looking game of all time" (PC PowerPlay #96).

NVIDIA has such a strong base and a strong reputation among developers because of the number of engineers NVIDIA will lend to companies to assist them with their products' use of graphics hardware. NVIDIA was also assisting Valve Software for a while, before Valve partnered with ATI (bunch of ingrates).

I have not seen any games that have the "Get In The Game" slogan. Would you be so kind as to show me some? It's perfectly fine to have a slogan for your company, but once you start putting that slogan in products made by other companies, it becomes an entirely different matter.
Firstly, NVIDIA doesn't put that logo in games, the developers do. As I said above, NVIDIA likes to play a hand in the development of products to help ensure optimum quality, and also assist in advertising for the product. This is just the way that the developers acknowledge NVIDIA.

This is the official website for "Get In The Game". It currently lists five items, four of them with dates:

The first has no date, of course, but is just a listing of their Radeon XT series of cards, the first under the "Get In The Game" promotion.

The second is October 2003, Half Life 2 (I non-monetarily bet you that Half Life 2 will contain an ATI logo somewhere during startup; PM me if you're willing to take that bet up). Well, we won't see that for months yet...

The third is November 2003, which is just stating that they sponsored the Halo Worldwide Tournament.

The fourth is December 2003, which is also just stating that they sponsored Cyber X Games.

The fifth is January 2004, Lord of the Rings: War of the Ring, mainly talking about some competition that Vivendi and ATI are running. (I don't know if this game is out yet; if it is, and anyone's played it, is there an ATI logo during startup?)

Seems the one most worth mentioning is Half Life 2, but we may be waiting until as late as November 2004 before the largest piece of ATI's "Get In The Game" promotion is seen.

I think you're mistaking my take on this situation for bashing nvidia as a company. That just isn't the case. My agenda is to make it clear to everyone that their FX line of cards is severely lacking in many areas compared to the competition, and as such I feel it's fanboyish to continue to support them just because these cards have the nvidia name printed on them.

OK, so you are not dissing them as a company. Besides the antialiasing and anisotropic filtering engines, how else are they lacking? And don't even mention the "official" Half Life 2 benchmarks; I wouldn't trust Gabe Newell if the future of the gaming industry depended on it. Please don't just state "this sux, and they can't <insert feature here> as well as a Radeon," but actually back it up with reliable links (Guru3D or Tom's Hardware is a good place to start).

Have a nice day.

Thank you, you too :) I am quite enjoying having a civil debate with someone who isn't just throwing fanboy material at me, and you are doing a good job of keeping me on my toes.

I will note that ATI may have beaten NVIDIA to the market with DirectX 9 cards, and that as a result NVIDIA has had a hell of a time catching up to them. What I question is whether, given the failures/delays of the next generation of games, even ATI's release of a DirectX 9 card was premature. Indeed, it is a very interesting time to be involved in the games industry :D


Can you show evidence (or cite examples) where an NVIDIA card does not display fog?

I am not questioning the speed that the Radeon portrays the scene at, but if it is not showing the scene the "way it's meant to be played" (aka, as the developers programmed the game to appear) then that is a very major issue.

It did: in a beta driver, a driver that was never supposed to be released, where fog wasn't implemented yet.


Hey my friend Anova, what's up? :)

Apparently we have some disagreement again about drivers. Well here we go again...

Apparently we have run into the minority yet again... some freaks who just laid down some dough on nVidia hardware and are desperately trying to justify a horrible decision (betcha these guys got some 5600 Ultra garbage and paid the price of a 9600 XT). They are a loud minority, but still a minority.

I have posted this before, but I will yet again:

There are 7 ATi cards for every 3 nVidia at Neowin

Because most Neowinians do a little research before spending 200 quid

I don't know about you, but I don't listen to just any neowinian mouthing off in these forums. I have learned that some of them (NOT most) are not too well educated about the current status of graphics hardware, even though it is covered all throughout the internet. So I occasionally do a little community service and educate the uneducated when they mouth off about things they don't know...

1. AnandTech recently posted this on their very popular and respected hardware website:

Link, source

You've been living too perfect of a life if you've never used the phrase "it's been a long day," and for NVIDIA it has most definitely been a very long day. Just over two weeks ago the graphics industry was shook by some very hard hitting comments from Gabe Newell of Valve, primarily relating to the poor performance of NVIDIA cards under Half Life 2. All of the sudden ATI had finally done what they had worked feverishly for years to do, they were finally, seemingly overnight, crowned the king of graphics and more importantly - drivers. There were no comments on Half Life 2 day about ATI having poor drivers, compatibility problems or anything even remotely resembling discussions about ATI from the Radeon 8500 days.

Half Life 2 day was quickly followed up with all sorts of accusations against NVIDIA and their driver team; more and more articles were published with new discoveries, shedding light on other areas where ATI trounced NVIDIA. Everything seemed to all make sense now; even 3DMark was given the credibility of being the "I told you so" benchmark that predicted Half Life 2 performance several months in advance of September 12, 2003. At the end of the day and by the end of the week, NVIDIA had experienced the longest day they've had in recent history.

Some of the more powerful accusations went far beyond NVIDIA skimping on image quality to improve performance; these accusations included things like NVIDIA not really being capable of running DirectX 9 titles at their full potential, and one of the more interesting ones - that NVIDIA only optimizes for benchmarks that sites like AnandTech use. Part of the explanation behind the Half Life 2 fiasco was that even if NVIDIA improves performance through later driver revisions, the performance improvements are only there because the game is used as a benchmark - and not as an attempt to improve the overall quality of their customers' gaming experience. If that were true, then NVIDIA's "the way it's meant to be played" slogan would have to go under some serious rethinking; "the way it's meant to be benchmarked" comes to mind.

I could sit here all night and post similar articles for you, but I think this one is clear enough.

Good night.

Edited by adamp2p

Apparently we have run into the minority yet again... some freaks who just laid down some dough on nVidia hardware and are desperately trying to justify a horrible decision (betcha these guys got some 5600 Ultra garbage and paid the price of a 9600 XT). They are a loud minority, but still a minority.

Three points,

1. I would ask that you not insult me again, and that you attempt to show some manners. If you must know, I actually own a GeForce 4 Ti 4800.

2. Learn something about statistics. A poll on a single forum (especially one as large as this, which is divided into countless sections) reveals nothing about the entire gaming world. It is an abstract value, not something to be cited as truth. Also, the majority are not always right (the majority voted for George Bush).

3. The true division in hardware lies closer to 35/35/30:

35% Own NVIDIA

35% Own ATI

30% Use onboard/alternate brand/don't bother with 3D acceleration (aka, don't play 3D games)

I don't know about you, but I don't listen to just any neowinian mouthing off in these forums. I have learned that some of them (NOT most) are not too well educated about the current status of graphics hardware, even though it is covered all throughout the internet. So I occasionally do a little community service and educate the uneducated when they mouth off about things they don't know...

It's time for you to come down off your high horse. You can count yourself among those "not too well educated about the current status of graphics hardware".

I work in the 3D graphics industry (I studied at the Academy of Interactive Entertainment, Canberra, if you want to look it up), and I am willing to say that I know more about 3D graphics than you do (some people's knowledge doesn't come from flaming others).

Besides, didn't your first thread to this effect get locked? Please do not bring flamage to this thread, nor a lack of knowledge.


OK, I researched, and here are the exact figures for the market share (consider this an amendment to my above post).

This came from here. It is dated 10/29/2003, but should be a good reference guide.

I shall do this in random quotation format. Bold refers to major points.

Desktop Graphics Direction

"ATI Technologies is gaining momentum with its leadership position in performance and mobile segments"

"Intel Corporation continues to occupy the market of the cost-effective integrated solutions"

"NVIDIA Corporation experiences some declines in its overall share, but is still the largest desktop graphics chip company in the industry"

"13.75% increase in terms of graphics units sold, including standalone, integrated, mobile, professional, etc."

"biggest supplier of graphics products in the third quarter of the year was Intel with 35% of the whole market"

"NVIDIA Corporation and ATI Technologies boast with nearly equal shares of the graphics market - 25% and 22% respectively."

"Other providers of graphics products, such as VIA Technologies, Silicon Integrated Systems and Matrox Graphics... 9%, 8% and 1% parts of the market in that order."

"sales of Silicon Motion decreased dramatically during the quarter"

Standalone GPUs Trend: NVIDIA Leads, ATI Catches Up

"NVIDIA Corporation...declining 1% to 53%"

"ATI Technologies...gained 3% to reach 40%"

"companies like SiS and Matrox Graphics were 3% per each with SiS leading in terms of unit sales"

Desktop Standalone Graphics Chips - ATI Gaining Momentum, NVIDIA Fighting Strong

"NVIDIA maintained its Number One position in desktop standalone graphics chip"

"NVIDIA...decline of 2% in terms of share, down to 62%."

"ATI Technologies...up 4% to 32%"

"SiS and Matrox stayed flat with 3% each."

Desktop Integrated Graphics - SiS Collapses, Intel Skyrockets

"Intel Corporation...gained an extra 3% to 67%"

"SiS lost 2% to 13%."

"NVIDIA and VIA stayed flat with 17% and 3%"

New Breed DirectX 9.0 Chips on Track

"ATI...sales of its RADEON 9200-series products with about 80% of the DirectX 8.1 parts"

"NVIDIA leads the DirectX 9.0 value segment with 72% of the market"

"ATI has only 27%, even a bit lower than earlier"

NOTE: I'm not sure, but I think this article predates the release of the Radeon XT series, so disregard the last two statements

"NVIDIA tripled sales of Performance Direct X 9.0 GPUs"

"NVIDIA...32% of the market."

"ATI...68%. In overall, it means that ATI Technologies still sells loads more high-performance DirectX 9.0 parts than the rival."

"It worth to point out that the third quarter was the first quarter for NVIDIA?s high-performance GeForce FX 5900 revenue shipments. Given that the company did not manage to outperform ATI?s powerful RADEON-series in terms of sales, we may conclude that enthusiasts still prefer ATI?s VPUs to NVIDIA products. Therefore, we may anticipate strengthening of ATI?s positions in mainstream and value segments as well eventually."

ATI and Intel Compete in Mobile

"Intel?s share rose to 45%"

"ATI gained 5% to 39% share"

"VIA Technologies...declined 26% to 11%"

What does this all mean?

Desktop Graphics market share in order from highest to lowest: Intel (35), NVIDIA (25), ATI (22)

Standalone GPUs market share in order from highest to lowest: NVIDIA (53), ATI (40)

Desktop Standalone Graphics market share in order from highest to lowest: NVIDIA (62), ATI (32)

Desktop Integrated Graphics market share in order from highest to lowest: Intel (67), NVIDIA (17), SiS (13), VIA (3)

I didn't include the DirectX 8.1/9 figures, due to the market having significantly changed.

Mobile solution market share in order from highest to lowest: Intel (45), ATI (39), VIA (11)

In all fairness, NVIDIA has had a major head start in the market over ATI, but that notwithstanding, it can be seen that they hold similar market shares.

This post is directed at those who say that ATI dominates the market, and those who say it is all about money.

My point is that they are both companies doing great business with great market shares, so let people choose whatever graphics card they want without insulting them or trying to intimidate them into changing their hardware... or at least insult those running integrated Intel graphics more :p

Edited by Chode
Link to comment
Share on other sites

Can you show evidence (or cite examples) where an NVIDIA card does not display fog?

I don't remember what sites stated the fog issue as it was months ago but I can tell you that multiple sources backed up the claim.

I am not questioning the speed that the Radeon portrays the scene at, but if it is not showing the scene the "way it's meant to be played" (aka, as the developers programmed the game to appear) then that is a very major issue.

Again I must reiterate: "I've played UT2K3 on my Radeon 9500 Pro and never once noticed any... reductions in image quality/missing textures." If indeed there are or were missing textures, the effect is so slight that it's unnoticeable unless you study every image in detail. And with a problem that slight, I'd be willing to bet it was not done intentionally by ATI but was rather caused by a driver flaw, or a flaw in the way the game handles ATI hardware. Which wouldn't be at all surprising, considering Epic is an avid supporter of nvidia.

It matters very much if the game has bugs; as I said, it attempted to use incorrect code paths to match hardware. Trying to treat my Athlon as a Pentium is one thing, but "minor bugs" can cause countless graphical glitches, which would have very adverse effects on frame rates. Where you make your mistake is in assuming that ATI and NVIDIA cards are equal; they each have their own graphics engines, which render scenes in very different manners.

You make a good point; however, do you have any evidence showing that the game fails to recognise nvidia hardware and that this affects its performance? You told me it failed to recognise your AMD but said nothing about the graphics processor. I would assume a company like Eidos would not exclude a company as big as nvidia from their games. Again, I have seen multiple articles showing the FX to fall seriously short in AOD, which supports past results with 3DMark03. Why is it you think nvidia and Futuremark were feuding with each other? It's obvious nvidia was trying to defend their product, as any company would do, regardless of whether the product in question is actually worth the consumer's hard-earned money.

The results brought forth by Valve's benchmark with HL2 only confirm what 3DMark03 and AOD were saying. Doesn't it make sense that when all DX9-based games available for benchmarking show the same results, those results are usually correct? Especially when none of the above companies are in any way affiliated with each other? More proof that something was seriously wrong came with Futuremark's 330 patch for 3DMark03. Results directly afterwards showed large declines in performance on GeForce FX cards, while scores for Radeons remained unchanged. This can only mean that heavy optimizations had been made to the drivers specifically for 3DMark.

During the whole event, NVIDIA was careful to mention ATI as little as possible (due to the fact that they dislike bad-mouthing their competition). I fail to see where they said, quote/unquote, "ATI's cards are faster".

They didn't say it directly, obviously, but it was implied a couple of times, whether by nvidia or the author of the article. One such quote is "Yes, it does take more work; NVIDIA admitted as much. The NV3X platform isn't as easy to program fast as R300 and R350 are." As well as "On the other hand, people who own GeForce FX cards will be able to take advantage of games that feature longer shader instructions. Of course, whether any cards of this generation will be fast enough to take advantage of the higher precision and longer pixel shader instructions when the games finally arrive, is another matter."

And you say "optimizations" as though they are a bad thing. Optimization=good. The dictionary tell us that Optimization is "The procedure or procedures used to make a system or design as effective or functional as possible, especially the mathematical techniques involved.". In other words, the same result in less time with less resource consumption.Any programmer could tell you that unoptimised code is a bad thing, so why suddenly do people think that optimised drivers are a bad thing?
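To put that in concrete terms (a made-up sketch, nothing to do with any actual driver code): an optimization swaps a slow computation for a faster one that produces the exact same result.

```python
def sum_to_n_slow(n):
    # naive version: performs n additions
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_to_n_fast(n):
    # optimized version: closed-form formula, constant time
    return n * (n + 1) // 2

# same answer, far less work -- that is all "optimization" means
print(sum_to_n_slow(1000))  # 500500
print(sum_to_n_fast(1000))  # 500500
```

Both functions are interchangeable from the caller's point of view; only the cost differs.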

I agree; optimizations are a good thing. However, there is a fine line between optimizing games for the purpose of playing those games and optimizing benchmarks, where the purpose is to inflate numbers and make your product look better than it is to the unsuspecting public. Benchmarks should not have optimizations, period. Their whole purpose is to provide a medium you can use to grade performance. If the results don't mimic what you can expect in real-world games, then those results are useless.

Quite the contrary. Heard of S.T.A.L.K.E.R.: Oblivion Lost? Until recently it was a game that no one knew of, just an upcoming product from a Ukrainian development house. NVIDIA have been assisting them with their graphics engine (specifically X-Ray) to ensure optimal performance. As a result of NVIDIA's assistance, they now have a product in development which has been described as "the best looking game of all time" (PC PowerPlay #96).

As I said I think optimizations for games are perfectly fine. We'll have to wait and see how well the solutions from both sides run on this game before we make any assumptions though.

NVIDIA has such a strong base and a strong reputation among developers because of the engineering help NVIDIA lends companies to assist them with their products' use of graphics hardware. NVIDIA was also assisting VALVE Software for a while, before VALVE partnered with ATI (bunch of ingrates).

Of course nvidia has a strong user base. They were previously the king of graphics, without any real competition after 3DFX went out of business. It sounds like you have an unfair bias against Valve without knowing all the facts. Valve received no help from nvidia at all except in the development of the special backend code for HL2, needed for the game to run half decently on FX cards. This was largely because of the use of HDR in HL2, which apparently cannot run well in FP16 mode. And considering the NV30 core cannot run at decent speeds in FP32, the mix between the two was necessary. It only seems fair that nvidia should offer to help, since it is their hardware that is causing the problems. ATI's happens to run the game perfectly fine without any need for special code.

Firstly, NVIDIA doesn't put that logo in games; the developers do. As I said above, NVIDIA likes to play a hand in the development of products to help ensure optimum quality, and also assists in advertising for the product. This is just the way that the developers acknowledge NVIDIA.

Oh come on, it's the same thing. Nvidia uses its power of persuasion to get the developers to put that logo in their games. This serves as advertising, with the intent of implying the game will only run well on nvidia-based hardware. Most people don't know any better. In turn the developer gets something, whether it be money, help in the form of nvidia lending a hand, or advertising for that developer's game.

This is the official website for "Get In The Game". It currently lists 4 items under it with dates:

As I said, it's ATI's slogan used for various events. There currently are no games that sport that slogan. And we won't know if there will be until HL2 is released.

Ok, you are not dissing them as a company. Besides antialiasing and anisotropic filtering engines, how else are they lacking? And don't even mention the "official" Half-Life 2 benchmarks; I wouldn't trust Gabe Newell if the future of the gaming industry depended on it. Please don't just state "this sux, and they can't <insert feature here> as good as a Radeon" but actually back it up with reliable linkage (Guru3D or Tom's Hardware is a good place to start).

Basically, the fact that ATI uses FP24 is a plus. It's a good middle point, unlike nvidia's FP32. They had good intentions with thoughts for the future, but what good is having compatibility for future games if the card won't be able to run those games at decent framerates? Any enthusiast will tell you all of ATI's cards produce better image quality as well, in most areas at least. I also like knowing my card was designed around the DX9 standards. The NV30 was designed with its own code in mind. Therefore if you attempt to run any Detonator drivers, the results will be catastrophic for DX9 games. If you use Forceware drivers, the DX9 code is being converted into code the NV30 can understand. This isn't what the DX9 team had in mind. It can also lead to imperfections in the rendering. Need I go on?
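To give a feel for why shader precision matters (a rough stdlib sketch; Python has no FP24 type, so FP16 and FP32 bracket it here): small colour increments that survive at FP32 precision can be rounded away entirely at FP16.

```python
import struct

def round_fp16(x):
    # round-trip a value through IEEE 754 half precision ('e' format)
    return struct.unpack('<e', struct.pack('<e', x))[0]

def round_fp32(x):
    # round-trip a value through single precision ('f' format)
    return struct.unpack('<f', struct.pack('<f', x))[0]

base, step = 1.0, 4e-4          # a tiny brightness increment
print(round_fp16(base + step))  # 1.0 -- the increment vanishes in FP16
print(round_fp32(base + step))  # FP32 keeps it (~1.0004)
```

This is the kind of accumulation where a too-coarse format loses detail, which is why a middle precision like FP24 was a pragmatic choice.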

Thank you too. :) I am quite enjoying having a civil debate with someone who isn't just throwing fanboy material at me, and you are doing a good job of keeping me on my toes.

I enjoy a good debate with a knowledgeable person as well. Though I question if you really are knowledgeable, considering I'm having to explain myself on this matter. :whistle:

I will make note that ATI may have beaten NVIDIA to the market with DirectX 9 cards, and that as a result NVIDIA has had a hell of a time catching up to them. What I question is whether, given the failures/delays of the next generation of games, even ATI's release of DirectX 9 cards was premature. Indeed, it is a very interesting time to be involved in the games industry. :D

Premature? I don't think it was premature. Look at AMD and its Athlon 64.

Link to comment
Share on other sites

seriously guys, who cares. X likes nvidia, Y likes ATI, both for different reasons. that's life. arguing about it is stupid

x has sex with y. xy is born. :laugh:

Link to comment
Share on other sites

seriously guys, who cares. X likes nvidia, Y likes ATI, both for different reasons. that's life. arguing about it is stupid
If you want us to stop, just say...but we are doing our best to keep it a civil debate, one that people may learn from :D
x has sex with y. xy is born.

Ummm, how about NO!!! :wacko:

Ok, our "rounds" are starting to get almost too long to follow...don't worry I read it but I'll paraphrase my responses.

The whole fog and benchmarking issues are a touchy matter. ATI loves to claim NVIDIA did it intentionally, whilst NVIDIA claims it was a driver glitch; they say that their optimisation method renders only what needs to be rendered, and that this did not work right with the benchmarking software in question. Of course, both sides of the debate have had massive driver issues in the past, and currently I see no problems with the scenes I view on my Ti4800.

NVIDIA forced Eidos to pull the patch because it adversely affected how NVIDIA hardware handled the game. I had the game, and I noticed with that patch that suddenly my in-game frame rates were less than half of what I had pre-patch. And this came with no graphical correction; in fact, the only graphics fix worth mentioning was meant to stop the shotgun being transparent... but it still was. The whole game was shonky; in fact, Paramount Pictures blames this game for how badly the second movie did at the box office (but I maintain that the movie itself sucked, regardless of the game).

For this part I will criticize only VALVE, not ATI. For the official benchmarks of HL2, you'll notice they used the v45 series of drivers for the NVIDIA cards (yes, from before the drivers did much in the way of DirectX 9, because at the time there were no DirectX 9 products). VALVE refused to use a pre-release version that NVIDIA offered them, for no good reason, saying "the driver shouldn't be used for benchmark testing". It is from here that Gabe Newell suggested to the public that NVIDIA drivers do not render fog... but his assumption comes from a leaked beta of the v50.xx series of drivers, one which had countless problems and that the public was never meant to see (a lot like the HL2 beta... how ironic).

NVIDIA did state that it takes longer to develop shaders for the FX generation of cards (I add this so I can remain impartial)

That is what I don't like about VALVE. How would you like it if someone benchmarked your card using a dated version of the Catalyst drivers? I'm just not liking the "bitchiness" that VALVE has brought into the gaming market. I mean, you can say that it all started with UT2003's "the way it's meant to be played", but Epic didn't go to the public with benchmarks unfairly favouring NVIDIA and say "The Catalyst series of drivers should not be benchmarked with". VALVE is intentionally attempting to alienate a large number of graphics users (going by my last post, 25% of the gaming market in order to favour 22% of it). I'm just not liking them from the game developer's standpoint (game development being the field I am trained in).

I agree with you on the AMD 64 being premature. lol

But I do agree with Spyder: X says what NVIDIA feeds them and Y says what ATI feeds them. Let's settle it at this: I own an NVIDIA GeForce 4 Ti 4800, and I am very happy with it. What card do you own, and are you happy with it? If so, then what's the point of arguing? No matter what either of us says, the other isn't going to fork up money just to appease the other.

Link to comment
Share on other sites

A few days ago I bought my first ATI card; as an avid nVidia user, I had never before seen an issue with a card/OS. Once in a while when playing a game, it will lose focus and minimize to the desktop. Anyone know of this issue or how to correct it?

I just corrected a similar problem on another computer. The card in question was a GeForce 4 Mx.

What was causing the problem was actually Norton's fine products (specifically their firewall and antivirus). To correct it, all I did was run LiveUpdate on both products and download all updates; then it was fine.

Link to comment
Share on other sites

The whole fog and benchmarking issues are a touchy matter. ATI loves to claim NVIDIA did it intentionally, whilst NVIDIA claims it was a driver glitch; they say that their optimisation method renders only what needs to be rendered, and that this did not work right with the benchmarking software in question. Of course, both sides of the debate have had massive driver issues in the past, and currently I see no problems with the scenes I view on my Ti4800.
Actually, ATI never even got involved in the fog issue, as far as I'm aware. That, along with many other issues like the removal of certain areas of scenery and brush, was brought forth by third parties such as Futuremark, Anandtech, HardOCP, etc. Of course nvidia didn't admit to it, at first. Only once the matter was circulated by various forms of media and public outcry began did nvidia admit to it, and in the usual obscure way. Why would you see any problems with scene rendering on your Ti4800? The issue here is not with the GeForce 4 but rather the GeForce FX.
NVIDIA forced Eidos to pull the patch because it adversely affected how NVIDIA hardware handled the game. I had the game, and I noticed with that patch that suddenly my in-game frame rates were less than half of what I had pre-patch. And this came with no graphical correction; in fact, the only graphics fix worth mentioning was meant to stop the shotgun being transparent... but it still was. The whole game was shonky; in fact, Paramount Pictures blames this game for how badly the second movie did at the box office (but I maintain that the movie itself sucked, regardless of the game).

This is the first time my attention has been brought to this matter. Got any proof showing that general performance decreased as much as you claim after the patch?

For this part I will criticize only VALVE, not ATI. For the official benchmarks of HL2, you'll notice they used the v45 series of drivers for the NVIDIA cards (yes, from before the drivers did much in the way of DirectX 9, because at the time there were no DirectX 9 products). VALVE refused to use a pre-release version that NVIDIA offered them, for no good reason, saying "the driver shouldn't be used for benchmark testing". It is from here that Gabe Newell suggested to the public that NVIDIA drivers do not render fog... but his assumption comes from a leaked beta of the v50.xx series of drivers, one which had countless problems and that the public was never meant to see (a lot like the HL2 beta... how ironic).
LOL. OK, the drivers they used were pre-Forceware, thus from before the code converter was introduced. Valve did not attempt to hide their reasoning for not using the driver release nvidia wanted them to use. The reason being that Valve did their own tests on those drivers because they suspected something was awry. And little to their surprise, they found various optimizations made specifically for the HL2 benchmark. They also found code that detected when the viewer was taking a screenshot, in order to increase image quality specifically for that screenshot. Next they did what no one else had the balls to do: they publicised their findings, knowing it would **** off loyal nvidia fans, and nvidia themselves. Which is exactly what happened, as is shown by you. But... Gabe didn't care. And I fully support his decision, as I feel what nvidia did was just plain wrong. Being the big conglomerate company nvidia is, it is common for them to use their power to get what they want. I may be overplaying this, but the evidence suggests otherwise.
That is what I don't like about VALVE. How would you like it if someone benchmarked your card using a dated version of the Catalyst drivers? I'm just not liking the "bitchiness" that VALVE has brought into the gaming market. I mean, you can say that it all started with UT2003's "the way it's meant to be played", but Epic didn't go to the public with benchmarks unfairly favouring NVIDIA and say "The Catalyst series of drivers should not be benchmarked with". VALVE is intentionally attempting to alienate a large number of graphics users (going by my last post, 25% of the gaming market in order to favour 22% of it). I'm just not liking them from the game developer's standpoint (game development being the field I am trained in).

Valve used the current set of drivers at the time, both for nvidia and ATI. Nvidia wanted them to use an unreleased set after the bad results came to light. Oddly enough, nvidia never officially released the driver set they wanted Valve to use. :rolleyes: And even if Valve had used an older version of the Cats, such as the 3.4s, the performance difference would have been marginal; with every new release of Catalyst drivers comes more stability, not really more performance. As for Valve attempting to "alienate a large number of graphics users": so what you're saying is, if a person makes bad decisions based on uneducated purchases, they have the right to bitch and complain about those who brought the problems to light rather than the company responsible for the faulty product? Ignorance is bliss, huh. I don't think so.

Link to comment
Share on other sites

Actually, ATI never even got involved in the fog issue, as far as I'm aware. That, along with many other issues like the removal of certain areas of scenery and brush, was brought forth by third parties such as Futuremark, Anandtech, HardOCP, etc. Of course nvidia didn't admit to it, at first. Only once the matter was circulated by various forms of media and public outcry began did nvidia admit to it, and in the usual obscure way.

The reason being that Valve did their own tests on those drivers because they suspected something was awry. And little to their surprise, they found various optimizations made specifically for the HL2 benchmark. They also found code that detected when the viewer was taking a screenshot, in order to increase image quality specifically for that screenshot.

Public outcry = lamers with no clue bitching on forums. Oh no, that would so hurt their sales. (sarcasm)

"he who is without sin cast the first stone". ATI also admitted to intentional driver cheating just last year.

VALVE never even saw the drivers NVIDIA was offering them. If they were comparing against anything at all, then they were looking at the leaked betas, but that's like judging what Windows Longhorn will be from the current alphas floating around. VALVE never saw the Forceware generation of drivers until it was fully released to the public.

And this crap about special code in the drivers, you are pulling out of your ass. You know nothing of coding. The screenshot system works by dumping the currently rendered scene into a file... it doesn't work by "Render" => "Screenshot button" => "Render again" => "Dump to file". In fact, to suggest a driver could have the accuracy to universally capture a single specific code run as the game runs on the fly, without any degradation in performance, and suddenly jam the detail up in order to render a better frame is just... stupid. A driver is not a code parser!
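The point above can be sketched in a few lines (a simplified analogy, not actual driver code; the framebuffer here is just a hypothetical in-memory pixel array): a screenshot copies whatever pixels are already in the framebuffer out to a file, with no second render pass.

```python
# Simplified analogy of how a screenshot works (not real driver code):
# the already-rendered framebuffer is dumped to a file as-is.

def render_scene():
    # stand-in for the GPU's normal work: a 2x2 RGB framebuffer
    return [[(255, 0, 0), (0, 255, 0)],
            [(0, 0, 255), (255, 255, 255)]]

def screenshot(framebuffer, path):
    # dump the existing pixels to a PPM file -- nothing is re-rendered
    h, w = len(framebuffer), len(framebuffer[0])
    with open(path, "w") as f:
        f.write(f"P3\n{w} {h}\n255\n")
        for row in framebuffer:
            for r, g, b in row:
                f.write(f"{r} {g} {b}\n")

fb = render_scene()          # the scene is rendered once...
screenshot(fb, "shot.ppm")   # ...and the screenshot just copies it
```

Whatever is in the buffer is what lands in the file; there is no "render again, but prettier" step.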

As for Valve attempting to "alienate a large number of graphics users". So what your saying is if a person is to make bad decisions based on uneducated purchases they have the right to bitch and complain about those who brought the problems to light rather then the company responsible for the faulty product?

Rolls-Royce is a car company; BMW is a car company. Based on your logic, mechanics should not work on Rolls-Royces because the owner made an "uneducated purchase".

You are grasping at straws now, my friend.

Edited by Chode
Link to comment
Share on other sites

This topic is now closed to further replies.