
GTX 680 Specs + Benchmarks

30 replies to this topic

#16 Miuku.

    A damned noob

  • Joined: 10-August 03
  • Location: Finland, EU
  • OS: :: OS X :: SLES ::

Posted 16 March 2012 - 18:30

Displays with that resolution capability are designed for graphic design. Certainly not gaming.

Not really - LG and other manufacturers have 4K displays ready, and the only things we're really waiting for are a few specs to be finalized and the manufacturing costs to come down, naturally.

I'm personally salivating at the thought of getting a 4K "UltraHD" 30"+ screen on my table.


#17 Astra.Xtreme

    Electrical Engineer

  • Tech Issues Solved: 3
  • Joined: 02-January 04
  • Location: Milwaukee, WI

Posted 16 March 2012 - 18:42

Not really - LG and other manufacturers have 4K displays ready, and the only things we're really waiting for are a few specs to be finalized and the manufacturing costs to come down, naturally.

I'm personally salivating at the thought of getting a 4K "UltraHD" 30"+ screen on my table.


Yeah, cost is the killer. Do they have a decent response time? The ones I see out now are at 7-15ms, which isn't that great compared to most other monitors.

#18 LaP

    Forget about it

  • Tech Issues Solved: 1
  • Joined: 10-July 06
  • Location: Quebec City, Canada
  • OS: Windows 8.1 Pro Update 1

Posted 16 March 2012 - 18:53

High-end cards have always had an MSRP of $500. Based on supply and demand, though, they can reach $550. You'll never see a high-end card go for <$500.


Oh sure, $500 for a high-end gfx card is perfectly fine. But I just don't think these new cards bring enough to justify an upgrade from the previous generation. If you are two generations behind, maybe.

#19 BillyJack

    Neowinian

  • Joined: 20-December 05
  • Location: Florida, United States

Posted 16 March 2012 - 19:00

Displays with that resolution capability are designed for graphic design. Certainly not gaming.


Maybe at first they were designed for graphics work, but they have become common for gaming, especially among the people who spend the money. That is why games support that resolution. At first game developers had to be pushed into supporting it, but now they all do, or close to all of them. I do realize that due to the price of the monitors most gamers have just stuck with HD resolution, and the slow adoption rate means no manufacturer or developer really cares. The next things they do care about are multi-monitor setups and 3D. So even if you throw out the "resolution is not made for gaming" argument, the cards still need the power to drive multiple monitors.

#20 Jason S.

    Neowinian Senior

  • Tech Issues Solved: 4
  • Joined: 01-September 03
  • Location: Cleveland, Ohio

Posted 16 March 2012 - 19:25

Oh sure, $500 for a high-end gfx card is perfectly fine. But I just don't think these new cards bring enough to justify an upgrade from the previous generation. If you are two generations behind, maybe.

agreed

#21 +metal_dragen

    Neowinian Senior

  • Joined: 21-September 02
  • Location: Indy, USA

Posted 17 March 2012 - 13:03

I am not impressed. The Battlefield 3 benchmarks at that resolution are only 72 fps. I get that with my (4870x2) x 2. What I am looking for is a card that can push about 70 fps at 2560x1600 in Battlefield 3, which my cards cannot do.


Really? So a card with a single GPU can push the same performance as 4 GPUs from 3 generations ago? I'd call that pretty impressive. Besides, as others have said, resolution isn't everything. If it's pixelated and jagged, I don't care how high the resolution is. You want 70+ FPS @ 2560x1600? Buy two of them and SLI them.

#22 BillyJack

    Neowinian

  • Joined: 20-December 05
  • Location: Florida, United States

Posted 19 March 2012 - 17:53

Really? So a card with a single GPU can push the same performance as 4 GPUs from 3 generations ago? I'd call that pretty impressive. Besides, as others have said, resolution isn't everything. If it's pixelated and jagged, I don't care how high the resolution is. You want 70+ FPS @ 2560x1600? Buy two of them and SLI them.


Do you have a 30" monitor and play games at 2560x1600? Or have you ever experienced playing games like this on another person's computer? Games at this resolution are not pixelated or jagged. Unless you let the GPU upscale a game that does not support the resolution, the image will look just as clean as it does at 1920x1080.

Everyone's opinion is their own. Some people would rather have anti-aliasing than resolution, and others would rather have resolution than anti-aliasing. Hell, some people like playing games on consoles better than on PCs. Some people even like using multiple monitors for their games.

My cards are DirectX 10.1, I do not have all the settings set that high, and they only use about 65% of the GPU in CrossFire. Plus my fps can drop as low as 30 and go as high as 100 depending on the scene. On average they are around 70 fps in BF3.

Thirty-inch monitors have been out for a while now, and plenty of people own one and game on it. So I feel that these state-of-the-art graphics cards should be able to push a game at 2560x1600. I remember when graphics cards could barely push a game at the resolutions that were the norm at the time. Since then graphics cards have come a long way, and so have monitors; I know the games have too. My point is: why is the graphics card always behind? The number-one card never seems able to push a given game to the max, whether in settings or resolution. You would think it could handle anything the most recent games and hardware allow, at decent performance. That is why I am not impressed.
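For what it's worth, the raw pixel math shows why that resolution is so demanding (a rough sketch only; real GPU load does not scale perfectly linearly with pixel count, and none of these numbers come from the benchmarks above):

```python
# Rough pixel-count comparison: 2560x1600 vs the common 1920x1080.
# GPU fill/shading load scales roughly (not exactly) with pixel count.

def megapixels(width, height):
    """Total pixels in millions."""
    return width * height / 1e6

wqxga = megapixels(2560, 1600)   # the 30" panels discussed here
fhd = megapixels(1920, 1080)     # the "HD resolution" most gamers use

print(f"2560x1600: {wqxga:.2f} MP")
print(f"1920x1080: {fhd:.2f} MP")
print(f"ratio: {wqxga / fhd:.2f}x")  # ~1.98x the pixels to render
```

So a card has to shade nearly twice the pixels per frame at 2560x1600, before any AA or effects are added on top.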

#23 ahhell

    Neowinian Senior

  • Joined: 30-June 03
  • Location: Winnipeg - coldest place on Earth - yeah

Posted 19 March 2012 - 18:57

Displays with that resolution capability are designed for graphic design. Certainly not gaming.

Wow. Really?

:s

If that's true, how do you explain most gamers buying 27"+ or multiple displays strictly for gaming? :rolleyes:

#24 Astra.Xtreme

    Electrical Engineer

  • Tech Issues Solved: 3
  • Joined: 02-January 04
  • Location: Milwaukee, WI

Posted 19 March 2012 - 19:10

Wow. Really?

:s

If that's true, how do you explain most gamers buying 27"+ or multiple displays strictly for gaming? :rolleyes:


That's a completely unrelated matter. He was talking about monitors that run at 2560x1600. Newegg sells four monitors that support that resolution, and they run over $1k each. They don't have a great response time, so it's a pretty clear indicator that they aren't made for gaming. Hence why games play like crap at that resolution. ;)

#25 CGar

    WOOF

  • Joined: 04-February 07

Posted 19 March 2012 - 19:25

I think some people are gravely mistaken that we're on the "cusp" of super high resolution monitors.

Yes, there are 2560 x ____ resolution monitors out there, but they've been there for many years, and they have not gained any traction. The new iPad is the first real attempt at making super high resolution screens a mainstream thing, but I still think it will be 4+ years before we even see 1080p start to fade. Heck, when 1080p first came out, it took quite a few years before 720p TVs dwindled out. Plus, with telecom and cable companies fighting tooth and nail to throttle our bandwidth consumption and 1080p just now starting to become standard for TV shows, I really doubt they are going to want to start moving to even higher resolutions.

Not to mention that console makers have no intention of ending the lifecycle of our current consoles anytime soon, which will also prolong the life of 1080p.

Don't get me wrong, I am a PC enthusiast, and I want nothing more than to see the PC system get pushed as far as possible, but with profitability being a key factor these days, companies just can't afford to keep doing that.

Luckily, it seems like nVidia is doing a good job of spending a lot of their money on R&D and is expanding their portfolio and prowess.

-Tegra is still slow to gain market share but they are gaining some traction
-The Kepler chips seem to be VERY fast compared to AMD's offering, and that's only the GK104 chip. I think they're going to use that advantage to give the GK114 chip more time to bake so they don't have a Gen 1 Fermi repeat.
-The 300 series drivers/cards are bringing 3 potentially awesome new features: TXAA, Quad-monitor support, and Adaptive VSync
-The Kepler cards have pushed clocks higher and driven power usage down by a lot
-Preliminary benchmarks are showing that SLI is scaling very well

So yes, while I don't think video cards are jumping by leaps and bounds like they used to, I feel like nVidia is making great efforts to improve the quality and value of their products.

#26 Reacon

    [VGW] Woohoo!

  • Joined: 12-May 08
  • Location: Katabatic
  • OS: Win 7 & Slackware

Posted 19 March 2012 - 19:37

I bought my 5850 almost a year ago for 125 bucks. I OC'd it and it performs similar to a 6890. There is still nothing out there that I can REMOTELY justify upgrading to.


Indeed. My Q8200 (which is OC'd to 2.8 GHz) is more of a bottleneck than my OC'd 5850 in most games.

#27 +metal_dragen

    Neowinian Senior

  • Joined: 21-September 02
  • Location: Indy, USA

Posted 20 March 2012 - 12:06

Do you have a 30" monitor and play games at 2560x1600? Or have you ever experienced playing games like this on another person's computer? Games at this resolution are not pixelated or jagged. Unless you let the GPU upscale a game that does not support the resolution, the image will look just as clean as it does at 1920x1080.


I have a 27" iMac (which is the 16:9 2560x1440) and have gamed on it. Depending on the GPU you are running, you may have to scale back on the AA to get decent framerates. At that resolution, I'll grant you that it's harder to see the pixelation/jaggedness, but that's purely because of the pixel density, not because of the graphics power. That doesn't mean it's not there.
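The pixel-density point can be sanity-checked with basic geometry (a quick sketch; the sizes and resolutions are just the ones mentioned in this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'30" 2560x1600: {ppi(2560, 1600, 30):.0f} PPI')  # ~101
print(f'24" 1920x1080: {ppi(1920, 1080, 24):.0f} PPI')  # ~92
```

The higher the PPI, the smaller each pixel, so the same amount of aliasing is simply harder to see - it's still being rendered.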

Thirty-inch monitors have been out for a while now, and plenty of people own one and game on it. So I feel that these state-of-the-art graphics cards should be able to push a game at 2560x1600. I remember when graphics cards could barely push a game at the resolutions that were the norm at the time. Since then graphics cards have come a long way, and so have monitors; I know the games have too. My point is: why is the graphics card always behind? The number-one card never seems able to push a given game to the max, whether in settings or resolution. You would think it could handle anything the most recent games and hardware allow, at decent performance. That is why I am not impressed.


The problem isn't necessarily the cards. Most cards of the last generation have enough raw power to push that resolution without an issue. The problem lies in the pipeline between the GPU and the display, i.e. the card's drivers, the game code, how well optimized that code is, what features the engine supports and the game is using, etc. Raw resolution with good AA at 2560x1600 is easy for most mid-to-high-end GPUs. However, once you throw in things like tessellation, physics, transparency, HDR and other lighting effects, the resources the GPU has available for raw resolution/AA shrink because of the demands of these other technologies.

I'm not trying to disagree with you - given the choice, I'd game on a 27-30" monitor with all the settings cranked. The reality is that because of these other effects and technologies games use, it's a balance between resolution and image quality, not 100% of both.

#28 BillyJack

    Neowinian

  • Joined: 20-December 05
  • Location: Florida, United States

Posted 20 March 2012 - 17:33

I agree it depends on the game and hardware. It would be nice if BF3 were more efficient. I suppose the 30" monitors never hit mainstream. When I bought mine 2 to 3 years ago they seemed to be gaining popularity. Of course, it cost $1,100 then. At the time there were over a dozen on Newegg, and Newegg did not carry everything that was being sold. Since then the market has turned and the number of 30" monitors has decreased. I looked on Newegg the other day and there are only two now.

My monitor is the LG W3000H. It has a 5 ms response time. It would be nice if it were 2 ms, but it is good enough for gaming. Plus, it is an IPS panel. It would be awesome if it were 2 ms and LED-backlit. Oh well, it is good enough for me.

I guess if I want 60 fps in the top games I will just have to do CrossFire or SLI. Or I can learn to live with a few less fps. I really do love 2560x1600 because the map does not take up so much of the screen.

#29 Minimoose

    Neowinian Senior

  • Joined: 25-August 07

Posted 22 March 2012 - 11:23

You can't notice any more fluidity than that


You haven't tried moving your mouse reasonably fast in a first-person shooter, then. Use a 120Hz monitor with a fast first-person shooter (Quake or whatever) running at 120 fps or some higher multiple for a day, then go back to a 60Hz monitor at 60 fps; you won't want to play on the 60Hz one anymore.
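The arithmetic behind that is simple (a quick sketch; it only shows refresh intervals, not input latency or motion clarity):

```python
def frame_time_ms(refresh_hz):
    """Time each refresh stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

print(f"60 Hz:  {frame_time_ms(60):.1f} ms per frame")   # ~16.7 ms
print(f"120 Hz: {frame_time_ms(120):.1f} ms per frame")  # ~8.3 ms
```

At 120 Hz a fast-moving crosshair is redrawn twice as often, so the gaps between its discrete on-screen positions are half as large - which is exactly the difference you notice when you flick the mouse.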

#30 b_roca

    Neowinian

  • Joined: 04-December 04

Posted 22 March 2012 - 11:29

I bought my 5850 almost a year ago for 125 bucks. I OC'd it and it performs similar to a 6890. There is still nothing out there that I can REMOTELY justify upgrading to.


The same deal here. I paid £200 for my 5850 just after they were released and it's still going pretty strong at 1920x1200. My previous card was an 8800GTS 640MB, and I think the next card in line worthy of the same price/performance upgrade will be either the 7850 or 7870 - I'm thinking the latter will be the better deal at around the £240 mark, as once overclocked it's looking like a great performer.

No doubt it'll be the usual straight swap from $ to £, so something like the GTX 680 or 7970 at nearly double the price of a 7870 is not enough performance to justify the price imo.


