TechSpot: History of the GPU #3 - The Nvidia vs. ATI era begins

With the turn of the century, the graphics industry bore witness to further consolidation. In the consumer graphics market, ATI announced the acquisition of ArtX Inc. in February 2000 for around $400 million in stock. ArtX was developing the GPU code-named Project Dolphin (eventually named “Flipper”) for the Nintendo GameCube, which added significantly to ATI’s bottom line.

Also in February, 3dfx announced a 20 percent workforce cut, then promptly moved to acquire Gigapixel for $186 million and gained the company’s tile-based rendering IP. Meanwhile, S3 and Nvidia settled their outstanding patent suits and signed a seven-year cross-license agreement.

But where 3dfx was once a byword for raw performance, its strength around this time lay in the image quality of its full-screen antialiasing. The Voodoo 5 introduced T-buffer technology as an alternative to hardware transform and lighting; it essentially took several rendered frames and aggregated them into a single image. This produced a slightly blurred picture that, when run in frame sequence, smoothed out the motion of the animation.
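Conceptually, that accumulation trick boils down to averaging several slightly jittered renders of the same scene. Below is a minimal sketch of the idea in Python; the render_scene callback and the jitter offsets are illustrative assumptions, not 3dfx's actual hardware pipeline.

    # Minimal sketch of accumulation-style full-scene antialiasing:
    # render the scene several times with sub-pixel jitter, then average
    # the results into a single output frame.
    def accumulate_aa(render_scene, width, height,
                      jitters=((0.25, 0.25), (-0.25, 0.25),
                               (0.25, -0.25), (-0.25, -0.25))):
        accum = [[0.0] * width for _ in range(height)]
        for dx, dy in jitters:
            # one sub-pixel-offset render, assumed to be a height x width grid of intensities
            frame = render_scene(dx, dy)
            for y in range(height):
                for x in range(width):
                    accum[y][x] += frame[y][x]
        samples = len(jitters)
        # averaged, antialiased frame
        return [[value / samples for value in row] for row in accum]

The same buffer-blending idea extends to motion blur: aggregate renders taken at slightly different moments in time, and the result, run in sequence, gives the smoothed motion described above.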

Prior to the Voodoo 5’s arrival, ATI had announced the Radeon DDR as “the most powerful graphics processor ever designed for desktop PCs.” Previews of the card had already gone public on April 25, and only 24 hours later Nvidia countered with the announcement of the GeForce 2 GTS (GigaTexel Shader). The latter included Nvidia’s take on ATI’s Pixel Tapestry Architecture, the Nvidia Shading Rasterizer, which allowed effects such as specular shading, volumetric explosions, refraction, waves, vertex blending, shadow volumes, bump mapping and elevation mapping to be applied on a per-pixel basis in hardware.

Read: The History of the Modern Graphics Processor, Part 3: The Nvidia vs. ATI era begins

These articles are brought to you in partnership with TechSpot.

16 Comments

My old Asus laptop had 5870 graphics... to this day it can still play most games pretty well, and it didn't add much to the overall cost either. Nvidia, however, offers CUDA cores, which give you a lot more horsepower for Adobe apps, and I mean A LOT more. So the whole "couldn't pay me" or "don't need it" attitude seems silly if you use any of those apps (Photoshop, After Effects, Premiere). Just sayin'.

I never once even thought about putting an AMD chip in my PC. Are they even close to Intel in performance and stability? How about Linux graphics support? Intel has out-of-this-world Linux support when it comes to drivers.

That <1% of the market must be very happy with the Linux support of the weakest GFX hardware around.
AMD outruns Intel easily, and in the low-mid segments outclasses Nvidia with ease.

Shadowzz said,
That <1% of the market must be very happy with the Linux support of the weakest GFX hardware around.
AMD outruns Intel easily, and in the low-mid segments outclasses Nvidia with ease.

What AMD CPU can outrun a Core i5 Ivy Bridge?

The only graphics I need are the Intel HD 4000 or HD 2500; both have amazing performance. I game on consoles, so Intel graphics are perfect, even in Linux. Could not be happier.

The Intel on-die GPUs are really nice actually. If I forget to switch GPUs on my laptop, as long as I'm not doing anything really intensive, I won't actually notice the difference (much).

Once you have a decent GPU experience, however, you won't go back. I can promise you that. My 6770 performs really well for what I do, and although I could upgrade when I do more video work, right now it's perfectly acceptable. It handles gaming, rendering and video nicely.

TurboShrimp said,
The only graphics I need are the Intel HD 4000 or HD 2500; both have amazing performance. I game on consoles, so Intel graphics are perfect, even in Linux. Could not be happier.

For low-res, low quality and no AA, sure. However, it is pretty stable with most games.

Intel integrated graphics can indeed run games... but that doesn't mean you should game on them; it's bad for the chip, and they're not really meant for that.

TurboShrimp said,
The only graphics I need are the Intel HD 4000 or HD 2500; both have amazing performance. I game on consoles, so Intel graphics are perfect, even in Linux. Could not be happier.
O_o You can't use HD 4000/HD 2500 and 'amazing performance' in the same sentence. They are great for light (and I emphasize the word light) gaming, Blu-ray playback, and other low-intensity activities. But if you want to game with any sort of detail and resolution, you'll want a dedicated graphics card. Besides, AMD and Nvidia offer much better integrated GPUs.

Arceles said,
Intel integrated graphics can indeed run games... but that doesn't mean you should game on them; it's bad for the chip, and they're not really meant for that.

Do you have a source that says playing games on Intel graphics is bad for the chip?

tsupersonic said,
O_o You can't use HD 4000/HD 2500 and 'amazing performance' in the same sentence. They are great for light (and I emphasize the word light) gaming, Blu-ray playback, and other low-intensity activities. But if you want to game with any sort of detail and resolution, you'll want a dedicated graphics card. Besides, AMD and Nvidia offer much better integrated GPUs.

I stand by 'amazing' because it is, for me. AMD CPUs? No thanks, you couldn't pay me to use an AMD CPU. And like I said, I do not game at all on PC, I only game on consoles.

TurboShrimp said,

Do you have a source that says playing games on Intel graphics is bad for the chip?

It's a pretty well-known fact that working an integrated GPU hard is not great for the chip due to the excess heat it generates. Dedicated cards often have their own fan units and heat sinks.

TurboShrimp said,

I stand by 'amazing' because it is, for me. AMD CPUs? No thanks, you couldn't pay me to use an AMD CPU. And like I said, I do not game at all on PC, I only game on consoles.

Which is funny, because AMD's integrated GPUs on their hybrid (Fusion) chips far outstrip Intel's...
One has to question what "games" you're actually playing on that chip. It certainly isn't anything remotely taxing, like BF3.

ingramator said,

It's a pretty well-known fact that working an integrated GPU hard is not great for the chip due to the excess heat it generates. Dedicated cards often have their own fan units and heat sinks.

Pretty well-known fact? Well known by whom? You've just made that statement up completely out of thin air, based on a misinformed opinion of your own.
Excess heat? From what? The piddly onboard GPU that's baked into the CPU? Do you not understand that they're low-powered versions of dedicated GPUs and thus produce a lot less heat (at the cost of performance)? Do you not know that CPUs also have their own heatsinks and fans?
There is a world of difference between discrete GPUs and integrated GPUs. Unless, of course, you're now going to claim that every smartphone and tablet out there shouldn't be used for games, because those too tend to use a single SoC that contains a CPU, GPU and numerous other components, often without a dedicated fan.

TurboShrimp said,

I stand by 'amazing' because it is, for me. AMD CPUs? No thanks, you couldn't pay me to use an AMD CPU. And like I said, I do not game at all on PC, I only game on consoles.

What great bias we have here. While I also tend to stick with Intel these days, AMD's APUs are amazing for the money and the graphics performance you get. You definitely can't compare an integrated Intel GPU with similar AMD/Nvidia integrated GPUs.

For most users who don't game or do anything graphics-intensive on their computer, the HD 4500 will be good enough. But, like I said, for any type of modern gaming with details cranked up and a higher resolution, you will DEFINITELY want a dedicated GPU to get a decent gaming experience. I game on both PC and console (although very rarely), so for me, I definitely need a dedicated GPU (I play on high details at 1680x1050). Is the HD 4000 better than its predecessors? Absolutely. Is it better than its competitors' integrated GPUs? Not a chance.

Regarding heat: my netbook doesn't throw out much heat when I do light gaming on it. It gets hotter, but not uncomfortably so. Even desktops with the HD 4000 have adequate cooling to handle the heat dissipation from an HD 4000. Like Kushan said, CPUs usually have heatsinks/active cooling to help dissipate heat.