TechSpot: Brink GPU & CPU performance review

Splash Damage is no newcomer to the first-person shooter genre, having worked on two Wolfenstein games and Doom 3, and developed Enemy Territory: Quake Wars in the last decade. With a portfolio of highly acclaimed titles, we were very excited to hear about the studio's latest creation.

Released just a week ago, Brink is a class-based multiplayer first-person shooter built on a modified version of idTech 4, the game engine developed and licensed by id Software. The game is published by Bethesda Softworks, the folks behind The Elder Scrolls franchise along with recent iterations of Fallout.

Brink looks to be an exciting, fast-paced first-person shooter. Continue reading to see how the game performs on a dozen current and previous-generation GPUs priced from $100 to $700.

Read: Brink GPU & CPU performance review

These articles are brought to you in partnership with TechSpot.

26 Comments


Three patches later and still no Crossfire support. The game shouldn't have been released for PC until it had both Crossfire and SLI support. Typical rush-it-to-market mentality.

I've had no major problems with it at all. I ran it on an 8800 GT, an ATI 4870 and an Nvidia 460 with no noticeable change in FPS, though I had to turn off shadows on the 8800 GT. I have a modest system, a quad core and 4GB of RAM, and I can play everything just fine.

I think it's absolutely preposterous that Brink doesn't have proper Crossfire support. What makes them think their game is ready for release without it? Are they just going to leave that segment of buyers s**t out of luck? Not to mention the reported horrid performance on AMD cards to begin with.

lol. I can't believe people are still wasting money buying video cards... A six-year-old console is by far better than any modern CPU/GPU. Yes, the 360 is way better than any of those expensive and useless video cards. Oh wait, you can run it at 1080p. Oh, who cares!!

ThePitt said,
lol. I can't believe people are still wasting money buying video cards... A six-year-old console is by far better than any modern CPU/GPU. Yes, the 360 is way better than any of those expensive and useless video cards. Oh wait, you can run it at 1080p. Oh, who cares!!

Those are some strong drugs you're taking.

ThePitt said,
lol. I can't believe people are still wasting money buying video cards... A six-year-old console is by far better than any modern CPU/GPU. Yes, the 360 is way better than any of those expensive and useless video cards. Oh wait, you can run it at 1080p. Oh, who cares!!

Sarcasm? Joke? I want to believe your comment wasn't meant to be serious.

What is wrong with Fraps? It's a far better way of testing than timedemos, which can be optimized for. No one has used demos for years; it's all about Fraps, get with the times. Also, I'm not sure what is wrong with their CPU results; I have found them quite useful.

swright said,
What is wrong with Fraps? It's a far better way of testing than timedemos, which can be optimized for. No one has used demos for years; it's all about Fraps, get with the times. Also, I'm not sure what is wrong with their CPU results; I have found them quite useful.

Everybody is using demos (just look at the latest TF2 replay update, duh).

Also, as said before, playing back a demo is way more accurate than playing against bots for a minute with Fraps in the corner of your screen. With timenetdemo it's exactly the same scene played over every time, which is perfect for benchmarks that try to compare GPU/CPU performance, just like here.

"timedemos which can be optimized for." LOLOLOLOLooll

Anthonyd said,

Everybody is using demos (just look at the latest TF2 replay update, duh).

Also, as said before, playing back a demo is way more accurate than playing against bots for a minute with Fraps in the corner of your screen. With timenetdemo it's exactly the same scene played over every time, which is perfect for benchmarks that try to compare GPU/CPU performance, just like here.

"timedemos which can be optimized for." LOLOLOLOLooll

Not sure who everybody is, but 99% of tech sites test with Fraps in the vast majority of the games they use for testing. Furthermore, I have not read a recent graphics card review that featured TF2.

Anthonyd said,

Everybody is using demos (just look at the latest TF2 replay update, duh).

Also, as said before, playing back a demo is way more accurate than playing against bots for a minute with Fraps in the corner of your screen. With timenetdemo it's exactly the same scene played over every time, which is perfect for benchmarks that try to compare GPU/CPU performance, just like here.

"timedemos which can be optimized for." LOLOLOLOLooll

Not sure who everybody is, but 99% of tech sites test with Fraps in the vast majority of the games they use for testing. Furthermore, I have not read a recent graphics card review that featured TF2.

Too bad this is a poor console port that can't record demos and play them back with timenetdemo (like ET:QW, Doom 3 and other idTech games).

This results in tests where the testers play around for a minute with Fraps instead of doing a real benchmark by playing the exact same sequence again and again.

Just look at the CPU tests; you'll see how far off the averages on those benchmarks can be.
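Stepping back from the flames: whichever capture method is used, the headline numbers come from per-frame timings. Below is a minimal sketch of how average and worst-case FPS fall out of a frametime log, assuming a Fraps-style list of cumulative timestamps in milliseconds (the format is an illustration, not Fraps' documented output):

```python
# Summarize a Fraps-style frametimes log: one cumulative timestamp
# in milliseconds per rendered frame. The input format here is an
# assumption for illustration, not Fraps' documented output.

def summarize(timestamps_ms):
    """Return (average_fps, worst_case_fps) from cumulative timestamps."""
    # Per-frame durations: differences between consecutive timestamps.
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    average_fps = len(frame_times) / total_s
    worst_ms = max(frame_times)          # the longest single frame
    worst_case_fps = 1000.0 / worst_ms   # instantaneous FPS at the hitch
    return average_fps, worst_case_fps

# Example: frames every 10 ms except one 30 ms hitch. The average looks
# healthy while the worst case reveals the stutter a player actually feels.
avg, worst = summarize([0, 10, 20, 50, 60, 70])
print(round(avg, 1), round(worst, 1))  # 71.4 33.3
```

The same arithmetic applies to a timedemo run; the demo's advantage is only that the sequence of frames being timed is identical on every playback, so run-to-run differences come from the hardware rather than the gameplay.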

Techspot is such a biased site. They include benchmarks for nVidia's dual-GPU solution, the GTX 590, but don't include AMD's 6990. This artificially skews the results to make nVidia look more competitive than they actually are.

theyarecomingforyou said,
Techspot is such a biased site. They include benchmarks for nVidia's dual-GPU solution, the GTX 590, but don't include AMD's 6990. This artificially skews the results to make nVidia look more competitive than they actually are.

I agree it was poor form to include the GTX 590 and not the HD 6990. They shouldn't have included the 590 if they weren't going to test the 6990, in my opinion.

theyarecomingforyou said,
Techspot is such a biased site. They include benchmarks for nVidia's dual-GPU solution, the GTX 590, but don't include AMD's 6990. This artificially skews the results to make nVidia look more competitive than they actually are.
Or if you read the whole thing you'd know why they didn't include the HD6990.

Singh400 said,
Or if you read the whole thing you'd know why they didn't include the HD6990.

Why does it require Crossfire support though? Doesn't that require more than one card in multiple PCIe slots?

HoochieMamma said,

Why does it require Crossfire support though? Doesn't that require more than one card in multiple PCIe slots?

Two GPUs on one board.

Wow, you probably should have read the article before commenting on it.

HoochieMamma said,

Hmmm, I thought the card would be smart enough to make the computer treat it as a single card.

I don't think the card's intelligence is in question; it's a matter of connecting the GPUs.

Dear god, these ATI problems need to get cleared up. I've tried 10.12, 11.2, 11.3, 11.4, 11.5, 11.5a RC1, 11.5a RC3, and I'm trying 10.8 right now, all with proper clean installs.

I'm running a 5850 and a Q8200 @ 3.4GHz at 1920x1080. I frequently dip down to 20fps on the drivers that have given me the best performance (11.2 and 11.5a are practically identical). I have everything turned off, including shadows, and setting the threaded renderer to 0 or 2 doesn't make a difference.
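For anyone chasing similar dips, tweaks like the shadow and threaded-renderer settings above are typically applied through an idTech-style autoexec.cfg. A minimal sketch, assuming Brink honors Doom 3-era cvar names; the exact names used here, r_shadows and r_useThreadedRenderer in particular, are assumptions and may differ in Brink's build of the engine:

```
// autoexec.cfg -- hypothetical idTech 4 style tweaks (cvar names assumed)
seta com_allowConsole "1"       // enable the in-game console
seta r_shadows "0"              // turn shadows off, as tried above
seta r_useThreadedRenderer "2"  // the "threadrenderer" value mentioned above
```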