TechSpot: Full AMD and Nvidia GPU Comparison with Latest Drivers

After a busy year with numerous GPU releases, things had settled down by mid-September. And then, AMD threw us a curve ball. Their Catalyst 12.11 beta drivers delivered major performance gains in many popular games, including Battlefield 3, Borderlands 2, Civilization V, Skyrim, Sleeping Dogs and StarCraft II. While most titles ran around 10% faster depending on the settings, Battlefield 3 was 20 to 30% faster.
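To put those percentages in concrete terms, here is a minimal sketch of how a driver uplift translates into frame rates. The percentage figures are the ones reported above; the baseline frame rates are purely illustrative, not TechSpot's measurements.

```python
# Hypothetical baseline frame rates; only the percentage uplifts come from
# the driver release notes quoted in the article.
def uplift(fps, percent):
    """Frame rate after a driver delivers a given percentage gain."""
    return fps * (1 + percent / 100)

print(uplift(50, 10))   # a ~10% gain turns 50 fps into roughly 55 fps
print(uplift(50, 25))   # midpoint of BF3's 20-30% range: 50 fps -> 62.5 fps
```

In other words, the Battlefield 3 numbers amount to going from a borderline experience to comfortably smooth on the same hardware.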

Around the same time, Nvidia released a beta driver of its own (GeForce 310.33), which claimed modest gains for the GTX 680 and GTX 660 in several titles. That driver has since been replaced by the GeForce 310.61 update, which delivers further performance improvements.

With updated pricing and performance across the board, we figured it would be worth revisiting both companies' offerings to see where you should spend your hard-earned cash this holiday season and into early next year.

Read: The Best Graphics Cards: AMD and Nvidia GPU Comparison with Latest Drivers

These articles are brought to you in partnership with TechSpot.

I see a lot of people here who run their games at 1080p with 16x AA and 32x aniso, with ultra texture quality, and all this at 180 fps on a three-generation-old card.

Wow, I mean, that's incredible. My Core i5 and 6950 can barely run GW2 at 1680x1050. I actually need to turn the shadows/reflections down a bit and use only 4x AA for it to be playable (and it's barely playable, to be honest).

With my GTX 570, Sleeping Dogs runs at 26 fps at 1680x1050 on ultra, with 8 GB of RAM and an Ivy Bridge Core i5. That's unacceptable, whereas a friend with an AMD 7850, which is rated about the same, runs it at 39-42 fps at the same resolution and details. He has the same CPU as I do and 6 GB of RAM. Nvidia should start fixing this with new drivers soon, as AMD is already on a roll.

It's an AMD sponsored game, so it was designed to take advantage of their cards. Make sure you're not running the top anti-aliasing, because it performs very badly on nVidia cards.

theyarecomingforyou said,
It's an AMD sponsored game, so it was designed to take advantage of their cards. Make sure you're not running the top anti-aliasing, because it performs very badly on nVidia cards.

I throttled back the AA by one step and now I get 40-55 fps with no notable graphical difference. The same thing happens with Hitman Absolution. I haven't seen any Nvidia-powered games lately though. I still remember old games with the "Meant to be played" tag, but not these days. Thanks for the tip!

deadheadline said,
I throttled back the AA by one step and now I get 40-55 fps with no notable graphical difference.

I know, I had to do the same thing and I have a GTX 680 SLI setup!

deadheadline said,
I haven't seen any Nvidia-powered games lately though. I still remember old games with the "Meant to be played" tag, but not these days.

Well, Borderlands 2 is the most obvious recent game. But more and more games are going to AMD, like the upcoming Far Cry 3.

deadheadline said,
With my GTX 570, Sleeping Dogs runs at 26 fps at 1680x1050 on ultra, with 8 GB of RAM and an Ivy Bridge Core i5. That's unacceptable, whereas a friend with an AMD 7850, which is rated about the same, runs it at 39-42 fps at the same resolution and details. He has the same CPU as I do and 6 GB of RAM. Nvidia should start fixing this with new drivers soon, as AMD is already on a roll.

Max AA in that game enables supersampling, which isn't usable even today. Compare The Witcher 2's uber sampling, which causes about the same performance degradation. Use the mid setting, which is FXAA High.

Never going to buy AMD again. I had an HD 4890 (about three years old); it's still better than a load of cards, but it's "legacy" now, so it gets fewer updates, or none at all.

Screw AMD, go Nvidia every time; at least they have driver support.

MrAnalysis said,
Never going to buy AMD again. I had an HD 4890 (about three years old); it's still better than a load of cards, but it's "legacy" now, so it gets fewer updates, or none at all.

Screw AMD, go Nvidia every time; at least they have driver support.

It was three years ago that I last had problems with AMD drivers. I'm guessing if you're trying to force Catalyst 12.11 onto a 4890 then you're bound to have problems.

MrAnalysis said,
Never going to buy AMD again. I had an HD 4890 (about three years old); it's still better than a load of cards, but it's "legacy" now, so it gets fewer updates, or none at all.

Screw AMD, go Nvidia every time; at least they have driver support.

I agree with you. AMD released drivers for Windows 8 so that you can use your old card, but there's no CCC, or it doesn't work. I had to go back to 12.12.

alwaysonacoffebreak said,

It was three years ago that I last had problems with AMD drivers. I'm guessing if you're trying to force Catalyst 12.11 onto a 4890 then you're bound to have problems.

There are always problems with AMD, but the same can be said about Nvidia. AMD says they provide support for their older cards, but it's all lies. They only provide real support for the latest generation; anything older they tweak to work, but at the cost of lost FPS.

I have had AMD for a while now and I am finally tired of them. It is time for me to move on to Nvidia again.

"At 1680x1050, which is a typical resolution for gaming on these graphics cards"

That's certainly news to me. I typically only play on 1920x1080 and that's with a Radeon HD 4850. You guys?

KSib said,
"At 1680x1050, which is a typical resolution for gaming on these graphics cards"

That's certainly news to me. I typically only play on 1920x1080 and that's with a Radeon HD 4850. You guys?


I also play at 1920x1080 (with an EVGA GTX 460 ti).

Hmm yes, according to the Steam hardware survey, the most common display resolution is 1920x1080.

http://store.steampowered.com/hwsurvey

KSib said,
"At 1680x1050, which is a typical resolution for gaming on these graphics cards"

That's certainly news to me. I typically only play on 1920x1080 and that's with a Radeon HD 4850. You guys?

That resolution is considered typical in the article for cards under $150 (low end). For more expensive cards, 1920x1080 is used. A sub-$150 card may work fine at 1080p in some games, but not in all.

KSib said,
"At 1680x1050, which is a typical resolution for gaming on these graphics cards"

That's certainly news to me. I typically only play on 1920x1080 and that's with a Radeon HD 4850. You guys?


Yep, I play in 1680x1050. I don't really fancy shelling out for a new (16:9) monitor when this one's still working perfectly after about four years.

MightyJordan said,

Yep, I play in 1680x1050. I don't really fancy shelling out for a new (16:9) monitor when this one's still working perfectly after about four years.

I'm of the same mindset; the only difference is I'm still on 1440x900, haha. It's a hell of an old monitor but it works perfectly, and I have no intention of hooking up a new one, or my TV, while it lasts.

Skittlebrau said,
Hmm yes, according to the Steam hardware survey, the most common display resolution is 1920x1080.

http://store.steampowered.com/hwsurvey

Steam users have money! Many people who pirate don't have money for games, or a good enough GPU to run games at that resolution. Laptop users have 1366x768, and I'm sure most desktop users run their games at 1680x1050, otherwise they'd get really low frame rates unless they have a powerful GPU.

torrentthief said,

Steam users have money! Many people who pirate don't have money for games, or a good enough GPU to run games at that resolution. Laptop users have 1366x768, and I'm sure most desktop users run their games at 1680x1050, otherwise they'd get really low frame rates unless they have a powerful GPU.

Yeah, I don't agree. I know people who will pirate just because they can! They have the money, and there are always sales! I don't have a lot of money, so I'll wait for a sale.

I game at 2560x1600.

I have Crossfire 6870s, but even if I disable Crossfire, a single 6870 will run 99.9% of games maxed out at this res, with AA + AF as well.

Almost all PC games and game engines are made with consoles in mind, so they're built for ancient seven-year-old hardware and will easily run on any decent graphics card. That isn't a good thing as far as I'm concerned. Consoles are seriously holding back PC graphics, and you now get people who actually think games like Portal 2 look good because they're so used to crappy graphics that this low graphical standard has become acceptable, when PC hardware could do so much better and isn't even beginning to be stressed by stuff like this.

I'm not just talking about graphics either; the same goes for gameplay physics, which can massively benefit from the extra processing power in PCs.

KSib said,
"At 1680x1050, which is a typical resolution for gaming on these graphics cards"

That's certainly news to me. I typically only play on 1920x1080 and that's with a Radeon HD 4850. You guys?

I use that res. I'm looking to upgrade to a 1920 x 1280 here soon.

Lord Method Man said,
1680x1050? What is this 2006? No one uses that anymore. 16:10 in general is dead.

Sorry, but in order for me to have a larger monitor, I need to get a new desk. The one I have is quite old; it's made for CRTs. Just because it's not "standard" anymore doesn't mean it's "dead"; it flips on just fine. I want to get an L-shaped real wood desk and three 1920 x 1280 monitors, but if I do that, I need a new rig, since I'd want to game on them. That's going to be about $2-3k to get nice stuff. I just can't win!

Lord Method Man said,
1680x1050? What is this 2006? No one uses that anymore. 16:10 in general is dead.

16:9 is only popular because it's cheap, not because it's better. Most PC gaming enthusiasts I know use 16:10 displays; it's a much better aspect ratio for the desktop, where vertical space is at a premium.
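The vertical-space point is easy to quantify: at the same panel width, a 16:10 display gives roughly 11% more vertical pixels than a 16:9 one. A quick sketch, using the common resolutions mentioned in this thread:

```python
# Compare vertical resolution at equal width for 16:9 vs. 16:10 panels.
def vertical_pixels(width, aspect_w, aspect_h):
    """Vertical resolution for a given width and aspect ratio."""
    return width * aspect_h // aspect_w

print(vertical_pixels(1920, 16, 9))   # 16:9  at 1920 wide -> 1080 lines
print(vertical_pixels(1920, 16, 10))  # 16:10 at 1920 wide -> 1200 lines
# 1200 / 1080 is about 1.11, i.e. roughly 11% more vertical desktop space.
```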

Looks like I'll be waiting for the next generation of video cards before I replace my GTX 560 Ti. There isn't a similarly priced card that can consistently do twice as well yet.

Dr_Asik said,
Looks like I'll be waiting for the next generation of video cards before I replace my GTX 560 Ti. There isn't a similarly priced card that can consistently do twice as well yet.

Exactly how I feel (5770 here though).

MightyJordan said,

Exactly how I feel (5770 here though).

It's amazing how the 5770 soldiers on. My monitor is 1680x1050 and honestly no game released thus far gives me any trouble.

Just by glancing at the table of contents of the testing, it seems like it's more of a gaming benchmark set. And no game that I'm aware of will use over 2GB of memory on its own.

KSib said,
Just by glancing at the table of contents of the testing, it seems like it's more of a gaming benchmark set. And no game that I'm aware of will use over 2GB of memory on its own.

That's not true, as it depends on the settings used. If you're gaming at 2560x1600 or 5760x1200 with maximum anti-aliasing then you can break 2GB in the most demanding games. That said, you're more likely to come across the limits of the GPU than the VRAM.

As for the comparison, I think it's misleading to include the overclocked 7970 card but not the overclocked Nvidia cards or the 4GB models, as was pointed out.

Zeet said,
They should used the nvidia cards with more than 2GB as well for the test.

Well, in the first place it was Nvidia that set the 2GB spec for its reference cards. Manufacturers were the ones who decided that 4GB cards should exist, though those command a premium. IMO, Nvidia vs. AMD should be compared using their reference cards and not the special manufacturer editions; those non-reference cards should be compared with equal non-reference cards so as not to distort the results.

ruelle41 said,
IMO, Nvidia vs. AMD should be compared using their reference cards and not the special manufacturer editions; those non-reference cards should be compared with equal non-reference cards so as not to distort the results.

But that doesn't give an accurate picture of the GPU market. The cheapest GTX 680 you can get at the moment is actually factory overclocked by over 100MHz, and the 4GB cards also come overclocked. Yet the comparison we see here is of a reference model that few would buy. The same is true of the HD 7970.

I just don't think enough was done to ensure a fair and representative comparison.