GTX 680 Specs + Benchmarks


Recommended Posts

In a leaked Nvidia slide, the final GeForce GTX 680 specifications are revealed and more...

In a leaked slide coming out of the Chinese website PCOnline.com.cn, we see an official Nvidia slide listing the GeForce GTX 680 specifications. The GeForce GTX 680 is based on the 28nm Kepler GK104 chip and is said to launch around March 22nd. The GTX 680 features clock speeds of 1006 MHz (base) and 1058 MHz (boost), and carries 2 GB of GDDR5 memory clocked at 6 GHz effective (1500 MHz actual) across a 256-bit wide interface. As we know, the GK104 will feature 1536 stream processors, 128 TMUs, and 32 ROPs. As with AMD's Radeon HD 7000 series cards, the GeForce GTX 680 will use a PCIe 3.0 bus interface, with support for DirectX 11.1, quad-way SLI, and display outputs of two DL-DVI ports, one HDMI port, and one DisplayPort 1.2 connector. The card will utilize two 6-pin PCIe power connectors and has a TDP of 195W.
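
As a quick sanity check on those figures, the standard back-of-the-envelope formulas applied to the numbers in the slide give about 192 GB/s of memory bandwidth and roughly 32 Gpixel/s of pixel fill. A minimal sketch (inputs are the leaked slide values, formulas are the usual ones, not anything Nvidia has published for this card):

```python
# Back-of-the-envelope check of the leaked GTX 680 figures.
# Inputs come from the slide above; formulas are the standard ones.

base_clock_mhz = 1006          # base clock from the slide
mem_clock_actual_mhz = 1500    # "actual" GDDR5 clock
mem_bus_bits = 256             # memory interface width
rops, tmus = 32, 128

# GDDR5 is quad-pumped: 1500 MHz actual -> 6000 MT/s (the "6 GHz" figure).
mem_effective_mhz = mem_clock_actual_mhz * 4
bandwidth_gb_s = mem_bus_bits / 8 * mem_effective_mhz * 1e6 / 1e9

# Fill rates scale with unit count times base clock.
pixel_fill_gpix_s = rops * base_clock_mhz / 1000
texture_fill_gtex_s = tmus * base_clock_mhz / 1000

print(f"Effective memory clock: {mem_effective_mhz / 1000:.1f} GHz")   # 6.0 GHz
print(f"Memory bandwidth:       {bandwidth_gb_s:.0f} GB/s")            # 192 GB/s
print(f"Pixel fill rate:        {pixel_fill_gpix_s:.1f} Gpixel/s")     # ~32.2
print(f"Texture fill rate:      {texture_fill_gtex_s:.1f} Gtexel/s")   # ~128.8
```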

[Image: leaked Nvidia slide listing the GeForce GTX 680 specifications]

As we saw at the Game Developers Conference, Epic showed the infamous Samaritan Demo running on a single Nvidia Kepler GPU, which could be the aforementioned GTX 680. Now, we see a slide coming out of the NGF community that shows the performance of the GTX 680 versus AMD's top two cards, the HD 7970 and HD 7950. The slide uses the HD 7970 as the baseline for performance comparison. Looking across the board, the GTX 680 outperforms the HD 7970 by roughly 20 percent in the seven games/benchmarks utilized. The one benchmark that stands out the most is the performance increase in Battlefield 3 (4xAA), where the GTX 680 shows over a 40 percent performance increase, which could be related to the use of Nvidia's FXAA technology.
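
As an aside on where a single "roughly 20 percent" figure comes from: each bar on a slide like this is a per-title ratio against the HD 7970 baseline, and a geometric mean is one common way to roll those ratios into one number. A minimal sketch with made-up placeholder ratios (illustrative only, not values read off the leaked slide):

```python
from math import prod

# Hypothetical per-title ratios of GTX 680 over HD 7970 (HD 7970 = 1.00).
# These are placeholders for illustration, NOT numbers from the slide.
gtx680_vs_7970 = {
    "Game A": 1.15,
    "Game B": 1.20,
    "Game C": 1.10,
    "Game D (4xAA)": 1.40,   # stand-in for a Battlefield 3-style outlier
}

# Geometric mean of the ratios summarizes the overall advantage.
geo_mean = prod(gtx680_vs_7970.values()) ** (1 / len(gtx680_vs_7970))
print(f"Geometric mean advantage: {(geo_mean - 1) * 100:.0f}%")  # ~21% for these placeholders
```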

[Image: slide comparing GTX 680 performance against the HD 7970 and HD 7950]

Back in early March, we saw the first images of the GK104 card thanks to a leak from the Chinese website Chiphell. Today, Chiphell has provided us with more leaked images, and this time it is a near-final version of the graphics card. From these images, we see the card's quad-way SLI support, along with its unique stacked power connector setup.

[Images: leaked photos of the near-final GeForce GTX 680 card]

TomsHardware

For skeptics who refuse to believe randomly-sourced bar graphs of the GeForce GTX 680 that are starved of pictures, here is the first set of benchmarks run by a third party (neither NVIDIA nor one of its AIC partners). This [p]reviewer from HKEPC has pictures to back his benchmarks. The GeForce GTX 680 was pitted against a Radeon HD 7970 and a previous-generation GeForce GTX 580. The test bed consisted of an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at stock frequency), an ASUS Rampage IV Extreme motherboard, 8 GB (4x 2 GB) of GeIL EVO 2 DDR3-2200 MHz quad-channel memory, a Corsair AX1200W PSU, and Windows 7 x64.

Benchmarks included 3DMark 11 (performance preset), Battlefield 3, Batman: Arkham City, Call of Duty: Modern Warfare 3, Lost Planet 2, and Unigine Heaven (version not mentioned, could be 1). All tests were run at a constant resolution of 1920x1080, with 8x MSAA on some tests (mentioned in the graphs).

[Images: HKEPC benchmark graphs]

TechPowerUp

Link to comment
Share on other sites

I am not impressed. The Battlefield 3 benchmarks at that resolution show only 72 fps; I get that with my (4870X2) x 2 setup. What I am looking for is a card that can push about 70 fps at 2560x1600 in Battlefield 3, which my cards cannot do.

Link to comment
Share on other sites

I am not impressed. The Battlefield 3 benchmarks at that resolution show only 72 fps; I get that with my (4870X2) x 2 setup. What I am looking for is a card that can push about 70 fps at 2560x1600 in Battlefield 3, which my cards cannot do.

You can't notice any more fluidity than that.

  • Like 2
Link to comment
Share on other sites

I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4k displays soon and these cards can barely do 2k without choking.
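
For a sense of what that jump means in raw pixels (simple arithmetic, not taken from the post): 2560x1600 carries roughly twice the pixels of 1080p, and 4K UHD carries four times as many.

```python
# Rough pixel-count comparison behind the "2k"/"4k" worry above.
# Resolutions only; this ignores AA, shader cost, memory bandwidth, etc.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
    "4K UHD (3840x2160)": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:20s} {px / 1e6:5.2f} MPix  ({px / base:.2f}x the pixels of 1080p)")
```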

Link to comment
Share on other sites

You can't notice any more fluidity than that.

You are right, but I have not seen benchmarks for 2560x1600 in Battlefield 3. Currently my cards cannot push the game at that resolution. Since this card is the latest and greatest, I would expect it to do better than my cards. However, at the 1920x1200 resolution the new king of the hill is pushing the same fps as my current, several-generations-behind cards. That is why I am disappointed. I would like to replace my cards with the newest card so that I can play Battlefield 3 at 2560x1600 at around 70 fps.

I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4k displays soon and these cards can barely do 2k without choking.

I agree: if this card is still struggling with the latest games at 2560x1600, then why bother making screens with larger resolutions? Although graphics cards have advanced a lot over the past ten years, they are not advancing fast enough, or at least not that well.

Maybe rather than making graphics cards that burn up so much energy, they should focus on changing the paradigm of physics engines, textures, and how games are coded. There is no way we will ever get photo-realistic games if something does not change drastically.

Link to comment
Share on other sites

I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4k displays soon and these cards can barely do 2k without choking.

Resolution ain't everything, my friend. What about AA, AF, tessellation, DirectX 11, and all the other technology that goes into video games? Subsurface scattering, bump mapping, ambient occlusion, motion blur...

You also have to realize that the drivers for these cards are probably pretty rough. The first driver release for the AMD 7970 was rough; it took a second round to get better performance.

I think it's pretty funny that the tester has an extreme-cooled CPU running at stock speed :p

Also, I will be buying two of these cards :woot:

Link to comment
Share on other sites

I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4k displays soon and these cards can barely do 2k without choking.

Yep, I'm not impressed either, especially at the asking price if it's $50 more than an already overpriced 7970. The price of the 7950 is ridiculous too. What will be interesting to see, once real professional benchmarks come in, is the overclocking ability of this card. The memory clock and core clock are already high for a card of this power. With a stock fan there might not be that much more room left for overclocking if you want to keep the jet engine out of your home (most stock coolers sound like a jet engine when you OC too much).

Gonna keep my OCed 6950 with a custom cooler. I can get it to 900/1400 (fan at 60%) without any voltage tweak and it runs current games fine. I only wish I had bought the 2 GB version; I fear that 1 GB might not be enough for upcoming games.

Link to comment
Share on other sites

Yep, I'm not impressed either, especially at the asking price if it's $50 more than an already overpriced 7970. The price of the 7950 is ridiculous too. What will be interesting to see, once real professional benchmarks come in, is the overclocking ability of this card. The memory clock and core clock are already high for a card of this power. With a stock fan there might not be that much more room left for overclocking if you want to keep the jet engine out of your home (most stock coolers sound like a jet engine when you OC too much).

High-end cards have always had an MSRP of $500. Based on supply and demand, though, they can reach $550. You'll never see a high-end card go for <$500.

Link to comment
Share on other sites

Resolution ain't everything, my friend. What about AA,

IMO resolution > AA

I prefer high resolution with low AA to lower resolution with higher AA.

Link to comment
Share on other sites

I bought my 5850 almost a year ago for 125 bucks. I OC'ed it and it performs similarly to a 6890. There is still nothing out there that I can REMOTELY justify upgrading to.

Link to comment
Share on other sites

IMO resolution > AA

I prefer high resolution with low AA to lower resolution with higher AA.

I agree - but you could probably dumb down the graphics to such a low level that 4k and 2k (as they put it) run smoothly. But what fun is that? They want cards that can handle 4k for some reason, but to do that, you need to dumb down all the other effects, and that's no fun.

Link to comment
Share on other sites

I agree: if this card is still struggling with the latest games at 2560x1600, then why bother making screens with larger resolutions? Although graphics cards have advanced a lot over the past ten years, they are not advancing fast enough, or at least not that well.

Displays with that resolution capability are designed for graphic design. Certainly not gaming.

Link to comment
Share on other sites

Displays with that resolution capability are designed for graphic design. Certainly not gaming.

Not really - LG and other manufacturers already have 4k displays ready, and the only thing we're really waiting for is a few specs to be finalized and the manufacturing costs to come down, naturally.

I'm personally salivating at the thought of getting a 4k "UltraHD" 30"+ screen on my table.

Link to comment
Share on other sites

Not really - LG and other manufacturers already have 4k displays ready, and the only thing we're really waiting for is a few specs to be finalized and the manufacturing costs to come down, naturally.

I'm personally salivating at the thought of getting a 4k "UltraHD" 30"+ screen on my table.

Yeah, cost is the killer. Do they have a decent response time? The ones I see out now are at 7-15 ms, which isn't that great compared to most other monitors.
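
For scale, comparing those response times against frame times is simple arithmetic (a rough sketch, not a measurement of any particular panel): at 60 fps a frame lasts about 16.7 ms, so a 7-15 ms pixel response can eat most of a frame.

```python
# Frame time at a given frame rate, for comparison against panel response times.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 72, 100, 120):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# A 7-15 ms pixel response therefore spans a large fraction of a frame
# (or more than one frame at high refresh rates), which is why it matters for gaming.
```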

Link to comment
Share on other sites

High-end cards have always had an MSRP of $500. Based on supply and demand, though, they can reach $550. You'll never see a high-end card go for <$500.

Oh sure, $500 for a high-end gfx card is perfectly fine. But I just don't think these new cards bring enough to justify an upgrade from the previous generation. If you are two generations behind, maybe.

Link to comment
Share on other sites

Displays with that resolution capability are designed for graphic design. Certainly not gaming.

Maybe at first they were designed for graphics work, but they have become common for gaming, especially among the people who spend the money. That is why games support that resolution. At first game developers had to start supporting it, but now they all do, or at least close to all of them. I do realize that, due to the price of the monitors, most gamers have just stuck with HD resolution, and the slow adoption rate means no manufacturer or developer really cares. The next things they care about are multi-monitor setups and 3D. So even if you throw out the "not made for gaming" aspect of the resolution, the cards still need the power to support multiple monitors.

Link to comment
Share on other sites

Oh sure, $500 for a high-end gfx card is perfectly fine. But I just don't think these new cards bring enough to justify an upgrade from the previous generation. If you are two generations behind, maybe.

agreed

Link to comment
Share on other sites

I am not impressed. The Battlefield 3 benchmarks at that resolution show only 72 fps; I get that with my (4870X2) x 2 setup. What I am looking for is a card that can push about 70 fps at 2560x1600 in Battlefield 3, which my cards cannot do.

Really? So a card with a single GPU can push the same performance as 4 GPUs from 3 generations ago? I'd call that pretty impressive. Besides, as others have said, resolution isn't everything. If it's pixelated and jagged, I don't care how high the resolution is. You want 70+ FPS @ 2560x1600? Buy two of them and SLI them.

  • Like 1
Link to comment
Share on other sites

Really? So a card with a single GPU can push the same performance as 4 GPUs from 3 generations ago? I'd call that pretty impressive. Besides, as others have said, resolution isn't everything. If it's pixelated and jagged, I don't care how high the resolution is. You want 70+ FPS @ 2560x1600? Buy two of them and SLI them.

Do you have a 30" monitor and play games at 2560x1600? Or have you ever experienced playing games like this on another person's computer? Games at this resolution are not pixelated or jagged. Unless you were letting the GPU upscale a game that did not support the resolution, the image will look just as sharp as it does at 1920x1080.

Everyone's opinion is their own. Some people would rather have anti-aliasing than resolution, and others would rather have resolution than anti-aliasing. Hell, some people like playing games on consoles better than on PCs. Some people even like using multiple monitors for their games.

My cards are DirectX 10.1, I do not have all the settings that high, and they only use about 65% of the GPU in CrossFire. Plus my fps can sometimes drop as low as 30 fps and go as high as 100 fps depending on the scene. On average they are around 70 fps in BF3.

Thirty-inch monitors have been out for a while now, and plenty of people own them and game on one. So I feel that these state-of-the-art, top-of-the-line graphics cards should be able to push a game at 2560x1600. I remember when graphics cards could barely push a game at the resolutions that were the norm at the time. Since then graphics cards have come a long way, and so have monitors. I know the games have come a long way too. My point is: why is the top graphics card always behind? The number-one card seems like it can never push a certain game to the max, or a certain resolution, or something. You would think it would be able to handle anything the most recent games and hardware allow, with decent performance. That is why I am not impressed.

Link to comment
Share on other sites

Displays with that resolution capability are designed for graphic design. Certainly not gaming.

Wow. Really?

:s

If that's true, how do you explain most gamers buying 27"+ or multiple displays strictly for gaming? :rolleyes:

Link to comment
Share on other sites

Wow. Really?

:s

If that's true, how do you explain most gamers buying 27"+ or multiple displays strictly for gaming? :rolleyes:

That's a completely unrelated matter. He was talking about monitors that run at 2560x1600. Newegg sells four monitors that support that resolution, and they run over $1k each. They don't have a great response time, which is a pretty clear indicator that they aren't made for gaming. Hence why games play like crap at that resolution. ;)

Link to comment
Share on other sites

I think some people are gravely mistaken that we're on the "cusp" of super high resolution monitors.

Yes, there are 2560 x ____ resolution monitors out there, but they've been there for many years, and they have not gained any traction. The new iPad is the first real attempt at making super high resolution screens a mainstream thing, but I still think it will be 4+ years before we even see 1080p start to fade. Heck, when 1080p first came out, it took quite a few years before 720p TVs dwindled away. Plus, with telecom and cable companies fighting tooth and nail to throttle our bandwidth consumption and 1080p just now starting to become standard for TV shows, I really doubt they are going to want to start moving to even higher resolutions.

Not to mention that console makers have no intention of ending the lifecycle of our current consoles anytime soon, which will also prolong the life of 1080p.

Don't get me wrong, I am a PC enthusiast, and I want nothing more than to see the PC system get pushed as far as possible, but with profitability being a key factor these days, companies just can't afford to keep doing that.

Luckily, it seems like nVidia is doing a good job of spending a lot of their money on R&D and is expanding their portfolio and prowess.

- Tegra is still slow to gain market share, but it is gaining some traction

- The Kepler chips seem to be VERY fast compared to AMD's offering, and that's only the GK104 chip. I think they're going to use that advantage to give the GK114 chip more time to bake so they don't have a Gen 1 Fermi repeat.

- The 300 series drivers/cards are bringing three potentially awesome new features: TXAA, quad-monitor support, and Adaptive VSync

- The Kepler cards have pushed clocks higher and driven power usage down by a lot

- Preliminary benchmarks are showing that SLI is scaling very well

So yes, while I don't think video cards are jumping by leaps and bounds like they used to, I feel like nVidia is making great efforts to improve the quality/value of their products.

Link to comment
Share on other sites

This topic is now closed to further replies.