Question

Posted

[quote]
[b][font="Verdana"][size="2"][color="#3c3b3b"]In a leaked Nvidia slide, the final GeForce GTX 680 specifications are revealed and more...[/color][/size][/font][/b]

In a leaked slide coming out of the Chinese website [url="http://diy.pconline.com.cn/graphics/news/1203/2702126.html?ad=6431"]PCOnline.com.cn[/url], we see an official Nvidia slide listing the GeForce GTX 680 specifications. The GeForce GTX 680 is based on the 28nm Kepler GK104 chip and is said to launch around March 22nd. The GTX 680 features clock speeds of 1006 MHz (base) and 1058 MHz (boost). The 2 GB of GDDR5 memory is clocked at 6 GHz effective (1500 MHz actual) and runs across a 256-bit wide interface. [url="http://www.tomshardware.com/news/Nvidia-Kepler-GK104-GeForce-GTX-670-680,14691.html"]As we know[/url], the GK104 will feature 1536 Stream Processors, 128 TMUs, and 32 ROPs. As with the AMD Radeon 7000 series cards, the GeForce GTX 680 will use a PCIe 3.0 bus interface with support for DX11.1 and quad-way SLI, and display outputs of two DL-DVI, one HDMI, and one DisplayPort 1.2. The card will utilize two 6-pin PCIe power connectors and has a TDP of 195W.
[img]http://media.bestofmicro.com/Nvidia-Kepler-GK104-GeForce-GTX680,S-G-330208-13.jpg[/img]
As we saw at the [url="http://www.tomshardware.com/news/Kepler-Samaritan-GeForce-GK104-gpu,14927.html"]Game Developers Conference[/url], Epic showed the infamous Samaritan Demo running on a single Nvidia Kepler GPU, which could be the aforementioned GTX 680. Now, we see a slide coming out of the [url="http://forum.ngfcommunity.com/index.php?threads/the-nvidia-kepler-thread-gtx-6xx.1447/page-8#post-89173"]NGF community[/url] that shows the performance of the GTX 680 versus AMD's top two cards, the [url="http://www.tomshardware.com/reviews/radeon-hd-7970-benchmark-tahiti-gcn,3104.html"]HD 7970[/url] and the [url="http://www.tomshardware.com/reviews/radeon-hd-7950-overclock-crossfire-benchmark,3123.html"]HD 7950[/url]. The slide uses the HD 7970 as the baseline for performance comparison. Looking across the board, the GTX 680 outperforms the HD 7970 by roughly 20 percent in the seven games/benchmarks utilized. The one benchmark that stands out the most is the performance increase with Battlefield 3 (4xAA). The GTX 680 shows over a 40 percent performance increase, which could be related to the use of [url="http://www.tomshardware.com/news/nvidia-epic-samaritan-kepler-fermi,14966.html"]Nvidia's FXAA technology[/url].
[img]http://media.bestofmicro.com/Nvidia-Kepler-GK104-GeForce-GTX680,S-E-330206-13.png[/img]
[url="http://www.tomshardware.com/news/nvidia-gpu-kepler-graphics-card-geforce,14921.html"]Back in early March[/url], we saw the first images of the GK104 card thanks to a leak from the Chinese website Chiphell. Today, Chiphell has provided us with more leaked images, and this time it is a near-final version of the graphics card. From these images, we can see the card's quad-way SLI support, along with its unique stacked power connector setup.
[img]http://media.bestofmicro.com/Nvidia-Kepler-GK104-GeForce-GTX680,S-F-330207-13.jpg[/img]
[img]http://media.bestofmicro.com/Nvidia-Kepler-GK104-GeForce-GTX680,T-E-330242-13.jpg[/img]

[url="http://www.tomshardware.com/news/Nvidia-Kepler-GeForce-GTX680-gpu,15012.html"][size=2]TomsHardware[/size][/url][/quote]
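The slide's memory figures are internally consistent: GDDR5 transfers data four times per actual clock, so the quoted 1500 MHz works out to the 6 GHz effective rate, and over a 256-bit bus that implies 192 GB/s of bandwidth. A quick sketch of that arithmetic, using only the leaked (and still unconfirmed) numbers above:

```python
# Memory math from the leaked GTX 680 slide (leaked figures, not confirmed).
actual_clock_mhz = 1500
effective_rate_mtps = actual_clock_mhz * 4            # GDDR5 is quad-pumped
bus_width_bits = 256
bandwidth_gbs = bus_width_bits * effective_rate_mtps / 8 / 1000  # bits -> bytes, MT/s -> GT/s

print(effective_rate_mtps, bandwidth_gbs)  # 6000 192.0
```

For comparison, the HD 7970's 384-bit bus at 5.5 Gbps gives 264 GB/s, so on paper the GTX 680 actually has less raw bandwidth than the card it is beating in these leaked benchmarks.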



[quote]For skeptics who refuse to believe randomly-sourced bar-graphs of the GeForce GTX 680 that are starved of pictures, here is the first set of benchmarks run by a third-party (neither NVIDIA nor one of its AIC partners). This [p]reviewer from HKEPC has pictures to back his benchmarks. The GeForce GTX 680 was pitted against a Radeon HD 7970, and a previous-generation GeForce GTX 580. The test-bed consisted of an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at stock frequency), ASUS Rampage IV Extreme motherboard, 8 GB (4x 2 GB) GeIL EVO 2 DDR3-2200 MHz quad-channel memory, Corsair AX1200W PSU, and Windows 7 x64.

Benchmarks included 3DMark 11 (performance preset), Battlefield 3, Batman: Arkham City, Call of Duty: Modern Warfare 3, Lost Planet 2, and Unigine Heaven (version not mentioned, could be 1). All tests were run at a constant resolution of 1920x1080, with 8x MSAA on some tests (mentioned in the graphs).

[url="http://www.techpowerup.com/img/12-03-16/179a.jpg"][img]http://tpucdn.com/img/12-03-16/179a_thm.jpg[/img][/url][url="http://www.techpowerup.com/img/12-03-16/179b.jpg"][img]http://tpucdn.com/img/12-03-16/179b_thm.jpg[/img][/url][url="http://www.techpowerup.com/img/12-03-16/179c.jpg"][img]http://tpucdn.com/img/12-03-16/179c_thm.jpg[/img][/url][url="http://www.techpowerup.com/img/12-03-16/179d.jpg"][img]http://tpucdn.com/img/12-03-16/179d_thm.jpg[/img][/url][url="http://www.techpowerup.com/img/12-03-16/179e.jpg"][img]http://tpucdn.com/img/12-03-16/179e_thm.jpg[/img][/url][url="http://www.techpowerup.com/img/12-03-16/179f.jpg"][img]http://tpucdn.com/img/12-03-16/179f_thm.jpg[/img][/url][url="http://www.techpowerup.com/img/12-03-16/179g.jpg"][img]http://tpucdn.com/img/12-03-16/179g_thm.jpg[/img][/url]

[url="http://www.techpowerup.com/162498/GTX-680-Generally-Faster-Than-HD-7970-New-Benchmarks.html"][size=2]TechPowerUp[/size][/url][/quote]


30 answers to this question


Posted

I say it looks solid for, what, 50 USD more than the 7970? (Y)


Posted

The GTX 680 will be 1000 MHz / 6000 MHz stock?

Wonder how much room for OC this card will have.


Posted

I am not impressed. The Battlefield 3 benchmarks at that resolution are only 72 fps. I get that with my (4870x2) x 2. What I am looking for is a card that can push about 70 fps at 2560 x 1600 in Battlefield 3 which my cards cannot do.


Posted

[quote name='BillyJack' timestamp='1331919523' post='594735684']
I am not impressed. The Battlefield 3 benchmarks at that resolution are only 72 fps. I get that with my (4870x2) x 2. What I am looking for is a card that can push about 70 fps at 2560 x 1600 in Battlefield 3 which my cards cannot do.
[/quote]

You can't notice any more fluidity than that.

Posted

I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4K displays soon and these cards can barely do 2K without choking.


Posted

[quote name='Muhammad Farrukh' timestamp='1331919673' post='594735692']
You can't notice any more fluidity than that.
[/quote]

You are right, but I have not seen benchmarks for 2560x1600 in Battlefield 3. Currently my cards cannot push the game at that resolution. Since this card is the latest and greatest, I would expect it to do better than my cards. However, at 1920x1200 the new king of the hill is pushing the same fps as my current cards, which are several generations behind. That is why I am disappointed. I would like to replace my cards with the newest card so that I can play Battlefield 3 at 2560x1600 at around 70 fps.

[quote name='MiukuMac' timestamp='1331919862' post='594735698']
I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4K displays soon and these cards can barely do 2K without choking.
[/quote]

I agree, if this card is still struggling with the latest games at 2560x1600 resolution, then why bother making screens with larger resolutions? Although graphics cards have advanced a lot over the past ten years, they are not advancing fast enough, or at least not that well.

Maybe rather than making graphics cards that burn up so much energy, they should focus on changing the paradigm of physics engines, textures, and how games are coded. There is no way we will ever get photorealistic games if something does not change drastically.
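For scale, the 2560x1600 versus 1920x1200 gap discussed above is mostly raw pixel count: the 30" resolution has roughly 78 percent more pixels to shade every frame. A quick sketch:

```python
# Pixel counts behind the resolution argument above.
pixels_2560 = 2560 * 1600    # typical 30" panel
pixels_1920 = 1920 * 1200    # typical 24" panel
ratio = pixels_2560 / pixels_1920

print(pixels_2560, pixels_1920, round(ratio, 2))  # 4096000 2304000 1.78
```

So as a rough rule of thumb, a card that manages ~72 fps at 1920x1200 would need close to 1.8x the fill/shading throughput to hold that frame rate at 2560x1600, all else being equal.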


Posted

[quote name='MiukuMac' timestamp='1331919862' post='594735698']
I'm not really impressed by the advances in GPUs currently - we're looking at a transition to 4K displays soon and these cards can barely do 2K without choking.
[/quote]
resolution ain't everything, my friend. what about AA, AF, tessellation, DirectX 11, and all the other technology that goes into video games? subsurface scattering, bump mapping, ambient occlusion, motion blur...

you also have to realize that the drivers for these cards are probably pretty rough. The 1st driver release for the AMD 7970 was rough. Took a 2nd round to get better performance.

i think it's pretty funny that the tester has an extreme-cooled CPU running at stock speed :p

also, i will be buying (2) of these cards :woot:


Posted

[quote name='MiukuMac' timestamp='1331919862' post='594735698']
I'm not really impressed by the advances in GPU's currently - we're looking at a transition to 4k displays soon and these cards can barely do 2k without choking.
[/quote]

Yep, I'm not impressed either. Especially at the asking price, if it's $50 more than an already overpriced 7970. The price of the 7950 is ridiculous too. What will be interesting to see, once real professional benchmarks come in, is the overclocking ability of this card. The memory clock and core clock are already high for a card of this power. With the stock fan there might not be much room left for overclocking if you want to keep the jet engine out of your home (most stock coolers sound like a jet engine when you OC too much).

Gonna keep my OC'd 6950 with a custom cooler. I can get it to 900/1400 (fan at 60%) without any voltage tweak and it runs current games fine. I only wish I had bought the 2 GB version; I fear that 1 GB might not be enough for upcoming games.


Posted

Some reports suggest that the 680 will be priced at 500 USD.


Posted

[quote name='LaP' timestamp='1331920553' post='594735726']
Yep, I'm not impressed either. Especially at the asking price, if it's $50 more than an already overpriced 7970. The price of the 7950 is ridiculous too. What will be interesting to see, once real professional benchmarks come in, is the overclocking ability of this card. The memory clock and core clock are already high for a card of this power. With the stock fan there might not be much room left for overclocking if you want to keep the jet engine out of your home (most stock coolers sound like a jet engine when you OC too much).
[/quote]
High-end cards have always had an MSRP of $500. Based on supply and demand, though, they can reach $550. You'll never see a high-end card go for <$500.


Posted

[quote name='Jdawg683' timestamp='1331920527' post='594735722']
resolution aint everything, my friend. what about AA,
[/quote]

Imo resolution > AA

I prefer high resolution with low AA to lower resolution with higher AA.


Posted

I bought my 5850 almost a year ago for 125 bucks. I OC'd it and it performs similarly to a 6890. There is still nothing out there that I can REMOTELY justify upgrading to.

Posted

[quote name='LaP' timestamp='1331921324' post='594735756']
Imo resolution > AA

I prefer high resolution with low AA to lower resolution with higher AA.
[/quote]

i agree - but you could probably dumb down the graphics to such a low level that 4k and 2k (as they put it) run smoothly. but what fun is that? they want cards that can handle 4k for some reason, but to do that, you need to dumb down all the other effects, and that's no fun.


Posted

[quote name='BillyJack' timestamp='1331920489' post='594735712']
I agree, if this card is still struggling with the latest games at 2560x1600 resolution, then why bother making screens with larger resolutions? Although graphics cards have advanced a lot over the past ten years, they are not advancing fast enough, or at least not that well.
[/quote]

Displays with that resolution capability are designed for graphic design. Certainly not gaming.


Posted

[quote name='Astra.Xtreme' timestamp='1331922468' post='594735788']
Displays with that resolution capability are designed for graphic design. Certainly not gaming.
[/quote]
Not really - LG and other manufacturers already have 4K displays ready, and the only things we're really waiting for are a few specs to be finalized and manufacturing costs to come down, naturally.

I'm personally salivating at the thought of getting a 4k "UltraHD" 30"+ screen on my table.


Posted

[quote name='MiukuMac' timestamp='1331922646' post='594735796']
Not really - LG and other manufacturers already have 4K displays ready, and the only things we're really waiting for are a few specs to be finalized and manufacturing costs to come down, naturally.

I'm personally salivating at the thought of getting a 4k "UltraHD" 30"+ screen on my table.
[/quote]

Yeah, cost is the killer. Do they have a decent response time? The ones I see out now are at 7-15ms, which isn't that great compared to most other monitors.


Posted

[quote name='Jdawg683' timestamp='1331920955' post='594735744']
High-end cards have always had an MSRP of $500. Based on supply and demand, though, they can reach $550. You'll never see a high-end card go for <$500.
[/quote]

Oh sure, $500 for a high-end gfx card is perfectly fine. But I just don't think these new cards bring enough to justify an upgrade from the previous generation. If you are two generations behind, maybe.


Posted

[quote name='Astra.Xtreme' timestamp='1331922468' post='594735788']
Displays with that resolution capability are designed for graphic design. Certainly not gaming.
[/quote]

Maybe at first they were designed for graphics work, but they have become common for gaming, especially among the people who spend the money. That is why games support that resolution. At first game developers had to be pushed to support it, but now they all do, or at least close to all of them. I do realize that, due to the price of the monitors, most gamers have just stuck with HD resolution, and the slow adoption rate means no manufacturer or developer really cares. The next things they care about are multi-monitor setups and 3D. So even if you throw out the "resolution is not made for gaming" aspect, the cards still need the power to support multiple monitors.


Posted

[quote name='LaP' timestamp='1331924037' post='594735862']
Oh sure, $500 for a high-end gfx card is perfectly fine. But I just don't think these new cards bring enough to justify an upgrade from the previous generation. If you are two generations behind, maybe.
[/quote]
agreed


Posted

[quote name='BillyJack' timestamp='1331919523' post='594735684']
I am not impressed. The Battlefield 3 benchmarks at that resolution are only 72 fps. I get that with my (4870x2) x 2. What I am looking for is a card that can push about 70 fps at 2560 x 1600 in Battlefield 3 which my cards cannot do.
[/quote]

Really? So a card with a single GPU can push the same performance as 4 GPUs from 3 generations ago? I'd call that pretty impressive. Besides, as others have said, resolution isn't everything. If it's pixelated and jagged, I don't care how high the resolution is. You want 70+ FPS @ 2560x1600? Buy two of them and SLI them.
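The "buy two and SLI them" suggestion can be sanity-checked with a back-of-the-envelope estimate. Every input here is an assumption, not a benchmark result: the ~72 fps figure quoted earlier in the thread, the ~78% pixel increase going to 2560x1600, and a hypothetical ~80% gain from the second card:

```python
# Back-of-the-envelope estimate; every input is an assumption, not a measurement.
single_fps_1920 = 72.0                       # figure quoted in this thread
pixel_ratio = (2560 * 1600) / (1920 * 1200)  # ~1.78x the pixels to render
sli_gain = 1.8                               # two cards at ~80% extra (a guess)

est_single_fps_2560 = single_fps_1920 / pixel_ratio   # roughly 40 fps
est_sli_fps_2560 = est_single_fps_2560 * sli_gain     # roughly 73 fps
```

If those assumptions hold, a pair would land right around the ~70 fps at 2560x1600 that BillyJack is asking for, which is the point of the reply above.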

Posted

[quote name='metal_dragen' timestamp='1331989414' post='594737336']
Really? So a card with a single GPU can push the same performance as 4 GPUs from 3 generations ago? I'd call that pretty impressive. Besides, as others have said, resolution isn't everything. If it's pixelated and jagged, I don't care how high the resolution is. You want 70+ FPS @ 2560x1600? Buy two of them and SLI them.
[/quote]

Do you have a 30" monitor and play games at 2560x1600 resolution? Or have you ever experienced playing games like this on another person's computer? Games at this resolution are not pixelated or jagged. Unless you are letting the GPU upscale a game that does not support this resolution, the image will look just as sharp as 1920x1080.

Everyone's opinion is their own. Some people would rather have anti-aliasing than resolution, and others would rather have resolution than anti-aliasing. Hell, some people like playing games on consoles better than on PCs. Some people even like using multiple monitors for their games.

My cards are DirectX 10.1, I do not have all the settings set that high, and they only use about 65% of the GPU in CrossFire. Plus my fps can sometimes drop as low as 30 fps and go as high as 100 fps depending on the scene. On average they are around 70 fps in BF3.

Thirty-inch monitors have been out for a while now, and plenty of people own them and game on one. So I feel that these state-of-the-art, top leading graphics cards should be able to push a game at a resolution of 2560x1600. I remember when graphics cards could barely push a game at the resolutions that were the norm at the time. Since then graphics cards have come a long way, and so have monitors. I know that the games have come a long way too. My point is: why is the graphics card always behind? The number one card never seems able to push a certain game to the max, or to a certain resolution, or something. You would think it could handle anything the most recent games and hardware allow, at decent performance. That is why I am not impressed.


Posted

[quote name='Astra.Xtreme' timestamp='1331922468' post='594735788']
Displays with that resolution capability are designed for graphic design. Certainly not gaming.
[/quote]
Wow. Really?

:s

If that's true, how do you explain most gamers buying 27"+ or multiple displays strictly for gaming? :rolleyes:


Posted

[quote name='ahhell' timestamp='1332183453' post='594742210']
Wow. Really?

:s

If that's true, how do you explain most gamers buying 27"+ or multiple displays strictly for gaming? :rolleyes:
[/quote]

That's a completely unrelated matter. He was talking about monitors that run at 2560x1600 resolution. Newegg sells 4 monitors that support that, and they run over $1k each. They don't have a great response time, so it's a pretty clear indicator that they aren't made for gaming. Hence why games play like crap at that resolution. ;)


Posted

I think some people are gravely mistaken that we're on the "cusp" of super high resolution monitors.

Yes, there are 2560 x ____ resolution monitors out there, but they've been there for many years, and they have not gained any traction. The new iPad is the first real attempt at making super high resolution screens a mainstream thing, but I still think it will be 4+ years before we even see 1080p start to fade. Heck, when 1080p first came out, it took quite a few years before 720p TVs dwindled out. Plus, with telecom and cable companies fighting tooth and nail to throttle our bandwidth consumption and 1080p just now starting to become standard for TV shows, I really doubt they are going to want to start moving to even higher resolutions.

Not to mention that console makers have no intention of ending the lifecycle of our current consoles anytime soon, which will also prolong the life of 1080p.

Don't get me wrong, I am a PC enthusiast, and I want nothing more than to see the PC system get pushed as far as possible, but with profitability being a key factor these days, companies just can't afford to keep doing that.

Luckily, it seems like nVidia is doing a good job of spending a lot of their money on R&D and is expanding their portfolio and prowess.

-Tegra is still slow to gain market share, but they are gaining some traction
-The Kepler chips seem to be VERY fast compared to AMD's offerings, and that's only the GK104 chip. I think they're going to use that advantage to give the GK114 chip more time to bake so they don't have a Gen 1 Fermi repeat.
-The 300 series drivers/cards are bringing 3 [i]potentially[/i] awesome new features: TXAA, quad-monitor support, and Adaptive VSync
-The Kepler cards have pushed clocks higher and driven power usage down by a lot
-Preliminary benchmarks are showing that SLI is scaling very well

So yes, while I don't think video cards are jumping by leaps and bounds like they used to, I feel like nVidia is making great efforts to improve the quality and value of their products.


Posted

[quote name='rajputwarrior' timestamp='1331922070' post='594735768']
I bought my 5850 almost a year ago for 125 bucks. I OC'd it and it performs similarly to a 6890. There is still nothing out there that I can REMOTELY justify upgrading to.
[/quote]

Indeed. My Q8200 (which is OC'd to 2.8 GHz) is more of a bottleneck than my OC'd 5850 in most games.

