TechSpot: Graphics Card Overclocking, Is It Really Worth It?

Overclocking plays a vastly different role in the computer industry today than it did 10 years ago, a time when overclockers were considered outlaws by manufacturers. Back then, even mentioning overclocking could void your warranty, and industry leaders like Intel were working to eliminate it altogether.

In contrast, nowadays processor and graphics card manufacturers have embraced the practice, touting high 'overclockability' as a feature and in the process using it to sell enthusiast-oriented products at a premium.

Here's one scenario that raises the question of whether overclocking is worth it... You go out to buy a new graphics card and set a budget, and it seems that for another $30-60 you can always go with the next step up, which performs a little better. Or, you could save those extra dollars, go for the budget model, overclock it, and basically match the next step up's performance.

With that in mind, we have hand-picked three graphics cards representing select price ranges to see just how much extra value can be obtained through overclocking. For the $100+ range we have the Radeon HD 6750, while the GeForce GTX 560 Ti represents the $200+ market. At the top of the food chain we have the Radeon HD 6970, going for $300 and up.

Read: Graphics Card Overclocking, Is It Really Worth It?

These articles are brought to you in partnership with TechSpot.


25 Comments


I gave overclocking a try but for such a small performance increase it wasn't worth giving up stability.
If you constantly run your car at a higher RPM, don't expect it to last as long. I've heard the same about electronics.

I do, but then I run on water so the card is very cool anyway (and that's my only worry - additional heat and strain) so why not?

It hasn't been worth it since they started making the cards overclockable. Instead of selling the exact same card at different price points with different BIOSes, they now use binned parts on the higher-end boards. This means the low-end boards failed the binning process and will burn out very quickly with overclocking. The industry got wise, and now we're buying hardware that becomes faulty because it's all pre-overclocked to begin with.

I stopped overclocking my cards some years ago; for me it's not worth it. Perhaps it's because I don't play a lot anymore and most games run pretty well on my PC.

FoxieFoxie said,
GUYS, GUYS, I HAVE A NEW ARTICLE FOR YOU

Is buying a cell phone instead of a telegraph worth it?


I still prefer Morse code.

A CPU like the 2500K can easily go to 4.1GHz with the default cooler and 4.6GHz with a $40 investment. Video cards are tougher to OC, but I managed a good OC on my SLI GTX 460 setup...

majortom1981 said,
It's worth it when you cannot afford to upgrade your GPU and your GPU is already a generation or two behind.

It's less worth it at that point. It's like beating a horse harder when everyone else has a car.

threetonesun said,

It's less worth it at that point. It's like beating a horse harder when everyone else has a car.

lol, that's funny, and sadly true

Ever stop to think that the card could probably perform at that level anyway and they're charging you more for a crap card? That's what I'm thinking these days anyway. Before, you had to try and squeeze the performance out, whereas now you change an FSB or multiplier and you're done.

DKAngel said,
Ever stop to think that the card could probably perform at that level anyway and they're charging you more for a crap card? That's what I'm thinking these days anyway. Before, you had to try and squeeze the performance out, whereas now you change an FSB or multiplier and you're done.

Technically you are correct.

I think they create a chip for the top-end card, then disable silicon to gimp the card to lower performance for each lower tier.

In a way it makes sense, as it's cheaper to make one chip and disable silicon than to design a new chip for every tier of graphics card. That's also why you can sometimes crossflash a card to unlock the disabled features, like the early 6950 chips, which you could flash with 6970 firmware to unlock shaders.

DKAngel said,
Ever stop to think that the card could probably perform at that level anyway and they're charging you more for a crap card? That's what I'm thinking these days anyway. Before, you had to try and squeeze the performance out, whereas now you change an FSB or multiplier and you're done.

The slower processors (be they GPUs or CPUs) can come from bins that had issues: either they could not be made to run at the top speeds within the proposed TDP, or they had to have parts disabled (like having fewer stream processors or cores). The first kind can be made to run at the top speeds using better cooling and higher voltages; the second usually has stability issues when unlocked/overclocked.

TheLegendOfMart said,
Technically you are correct.

I think they create a chip for the top-end card, then disable silicon to gimp the card to lower performance for each lower tier.

In a way it makes sense, as it's cheaper to make one chip and disable silicon than to design a new chip for every tier of graphics card. That's also why you can sometimes crossflash a card to unlock the disabled features, like the early 6950 chips, which you could flash with 6970 firmware to unlock shaders.

In the case of the AMD tri-core CPUs, they were quad-core chips where one of the cores was faulty, so AMD disabled it and rebranded the chip as a tri-core product. By doing this they can salvage as many CPUs and GPUs as possible, reducing waste and keeping their high end within an affordable price range.

IMHO I value stability and reliability over performance so I wouldn't over clock.

TheLegendOfMart said,
Of course it's worth it: a free performance increase, assuming your case can dump the excess heat.

Higher power consumption and shorter lifespans aren't free. I'm not saying don't do it, but the hidden, or overlooked, costs are there.

I don't think OCing is worth it at all, to be honest. It was back in the day when you could get a Celeron 300 to 900 on air =]

Even if you had used phase-change cooling, 900 was impossible... I believe the record is 705.

On air, you were unable to come close to achieving 600. The vast majority found that system stability was compromised at 500 or higher.

The most common overclocked speed of the 300A was 450... roughly half of what you claim.

Final Thoughts
Is graphics card overclocking worth it? In short, yes, it can be. Certain configurations respond well to the change, and after all, it's free and mostly safe. However, game-changing results should not be expected.