TechSpot: Gigabyte Radeon HD 7950 3072MB Review

AMD ended 2011 on a high note, unleashing the market's fastest single-GPU graphics card and beating its adversary to a next-generation GPU yet again.

Conveniently enough, AMD also offers a potentially compelling alternative to its flagship in the Radeon HD 7950, essentially a lower-specced, lower-priced version of the HD 7970. The HD 7950 is priced at $419 for the 1536MB version, while the full 3072MB variant carries a $449 MSRP. Although it's currently possible to find a 3GB model at that price, expect to pay closer to $500 in most cases.

Gigabyte has redesigned the PCB and included an upgraded cooler that is meant to lower temperatures and improve overclocking. Considering the HD 7970's respectable performance, we expect a solid showing from the HD 7950.

Read: Gigabyte Radeon HD 7950 3072MB Review

These articles are brought to you in partnership with TechSpot.

15 Comments


I don't get it, aren't the xx50 cards usually the budget versions of the series? E.g., my 6850 was around $180 (or maybe $230) when it was released; this is almost double the price for the same entry point?

Osiris said,
I don't get it, aren't the xx50 cards usually the budget versions of the series? E.g., my 6850 was around $180 (or maybe $230) when it was released; this is almost double the price for the same entry point?

79xx is their high end
78xx is midrange/budget
77xx is mid-to-low

It's just that this time the cards are expensive because a new architecture has been introduced. Even the 7850, the successor to yours, will probably be more expensive (over $250) than the 6850 was when you bought it back then.

Anyway... I was ready to go with AMD for the first time (the 7950), but I'm truly hesitant. Partly the high cost, partly that I'm not that impressed... I'll wait for Kepler to make comparisons and buy accordingly.

PC EliTiST said,
Anyway... I was ready to go with AMD for the first time (the 7950), but I'm truly hesitant. Partly the high cost, partly that I'm not that impressed... I'll wait for Kepler to make comparisons and buy accordingly.
Hopefully AMD will drop their prices around then too in order to hit nVidia (not because I dislike nVidia, I actually like them generally, but just so prices go down).

PC EliTiST said,

79xx is their high end
78xx is midrange/budget
77xx is mid-to-low

It's just that this time the cards are expensive because a new architecture has been introduced. Even the 7850, the successor to yours, will probably be more expensive (over $250) than the 6850 was when you bought it back then.

Anyway... I was ready to go with AMD for the first time (the 7950), but I'm truly hesitant. Partly the high cost, partly that I'm not that impressed... I'll wait for Kepler to make comparisons and buy accordingly.

Ah yeah, that makes complete sense. My only excuse for my silliness would be that it was late at night, hehe. Thanks for clearing it up though.

Well, the extra memory's sole purpose is for Eyefinity... which AMD excels at. More people play on three monitors than on a 3D setup (which is Nvidia's strength). If you're not using multiple monitors, 1.5-2 GB is plenty for 1080p.

ryoohki said,
Well, the extra memory's sole purpose is for Eyefinity... which AMD excels at. More people play on three monitors than on a 3D setup (which is Nvidia's strength). If you're not using multiple monitors, 1.5-2 GB is plenty for 1080p.
Multiple monitors do not need even 1 GB of video card memory, even at 1080p; a 2560x1440 framebuffer at 32-bit color is only about 14 MB. Heck, the current MacBook Airs can drive a 27" display (2560x1440, much higher than 1080p) plus their own screen (below 1080p) off Sandy Bridge's integrated GPU (iGPU), which shares memory with the rest of the system (as little as 2 GB). Obviously, you're not playing any games on that, but it works.

Also, I have had two 24" 1080p monitors for at least the past four years, and until late last year I ran both off an older video card with just 256-512 MB of RAM while playing the latest games (at lower and lower settings as the card showed its age).
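As a back-of-the-envelope check on the memory claims above, here's a rough sketch (the triple-buffering figure is a deliberately generous, illustrative assumption) of how little VRAM the framebuffers themselves consume; game textures, not displays, are what actually fill a card's memory:

```python
# Approximate VRAM consumed by scanout framebuffers alone,
# ignoring textures and other game assets (the real VRAM consumers).
BYTES_PER_PIXEL = 4  # 32-bit color

def framebuffer_mb(width: int, height: int, buffers: int = 3, monitors: int = 1) -> float:
    """Memory in MB for the framebuffers; buffers=3 assumes triple
    buffering, a generous figure chosen for illustration."""
    return width * height * BYTES_PER_PIXEL * buffers * monitors / 1024**2

print(f"1x 1920x1080: {framebuffer_mb(1920, 1080):5.1f} MB")              # ~23.7 MB
print(f"3x 1920x1080: {framebuffer_mb(1920, 1080, monitors=3):5.1f} MB")  # ~71.2 MB
print(f"1x 2560x1440: {framebuffer_mb(2560, 1440):5.1f} MB")              # ~42.2 MB
```

Even three triple-buffered 1080p displays need well under 100 MB, which squares with the point that simply driving monitors is cheap.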

KomaWeiss said,
Though, I'll never buy another AMD card. When will they drop DVI and just go with HDMI from now on?
I hope they don't. I don't see any advantage to using HDMI with a typical computer monitor; I mean, I have a 2.1 sound system. Granted, there are adapters to go between DVI and HDMI, so it's not a huge issue.

Hopefully never. For a computer monitor there isn't a good reason to use HDMI over DVI. The only advantage HDMI offers is that it can carry sound, but that's pretty much moot unless you completely hate your ears and are using your monitor's built-in speakers.

I'm planning on picking up nVidia's next flagship card as I'm not entirely happy with AMD at the moment, but complaining that they support DVI *and* HDMI instead of just HDMI is a bizarre nitpick. From the looks of the card, it has three HDMI ports (one normal, two mini) and DVI.

Amarok said,
Hopefully never. For a computer monitor there isn't a good reason to use HDMI over DVI. The only advantage HDMI offers is that it can carry sound, but that's pretty much moot unless you completely hate your ears and are using your monitor's built-in speakers.

I'm planning on picking up nVidia's next flagship card as I'm not entirely happy with AMD at the moment, but complaining that they support DVI *and* HDMI instead of just HDMI is a bizarre nitpick. From the looks of the card, it has three HDMI ports (one normal, two mini) and DVI.

Those are two Mini DisplayPorts, not mini HDMI.

KomaWeiss said,
Though, I'll never buy another AMD card. When will they drop DVI and just go with HDMI from now on?

Yeah, great idea: drop support for the connector most monitors have.

PS: Adapters are not the same.

KomaWeiss said,
Though, I'll never buy another AMD card. When will they drop DVI and just go with HDMI from now on?

Probably in the next few years, as Intel and AMD plan to drop native VGA support from their chipsets by 2015. The low end of DVI (really just VGA over a DVI connector) is going away too.

As for not using the sound on DisplayPort and HDMI: you can configure your PC to send audio over your regular sound output rather than the combined video/audio cable, so your good speakers still get used. It's usually smart enough to prefer the dedicated audio output anyway.

I hope more vendors push DisplayPort, though I hope they pick either mini-DP or full-size rather than shipping both. I prefer mini-DP because I already have multiple mini-DP ports and dongles for them on both my laptop and my PC's video card, and full-size DP looks too similar to HDMI. If for no other reason, I think it will encourage the push of Thunderbolt into the PC world once everyone has DP-based monitors they can add to the end of a Thunderbolt chain.

tsupersonic said,
I hope they don't. I don't see any advantage to using HDMI with a typical computer monitor; I mean, I have a 2.1 sound system. Granted, there are adapters to go between DVI and HDMI, so it's not a huge issue.

HDMI offers no more bandwidth for your monitor than DVI does; the two are electrically identical. It just adds sound on top.
You can use a simple adapter to turn HDMI into DVI (same as those DVI-to-VGA adapters).

Shadowzz said,

HDMI offers no more bandwidth for your monitor than DVI does; the two are electrically identical. It just adds sound on top.
You can use a simple adapter to turn HDMI into DVI (same as those DVI-to-VGA adapters).
Thank you, sir, for restating everything I said.
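For what it's worth, the "identical bandwidth" claim checks out for single-link connections; a quick sketch, assuming the 165 MHz TMDS pixel clock limit that single-link DVI and early (pre-1.3) HDMI shared:

```python
# Quick check of the "identical bandwidth" claim for single-link connections.
# Single-link DVI and early (pre-1.3) HDMI both top out at a 165 MHz TMDS
# pixel clock; HDMI simply packs audio into the blanking intervals.
PIXEL_CLOCK_HZ = 165e6  # shared single-link maximum
BITS_PER_PIXEL = 24     # 8 bits per color channel

video_gbps = PIXEL_CLOCK_HZ * BITS_PER_PIXEL / 1e9
print(f"Single-link video bandwidth: {video_gbps:.2f} Gbit/s")  # ~3.96 Gbit/s
```

That's enough for 1920x1200 at 60 Hz; it takes dual-link DVI (or the higher clocks of later HDMI revisions) to drive 2560x1600.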

Incredible... I never thought I'd see graphics cards manufactured before 2016/17 with in excess of 2 GB of (DDR2-and-up) RAM referred to as modestly spec'd cards?! ;>