GTX 970 Coil Whine - Please share your experience



The performance gap isn't that huge for the top cards, but performance/TDP is in NVIDIA's favor by a wide margin. Being small form factor is huge for me. I'm running a G1 970 with a 4790K and a 500 W 80+ Gold PSU in silence. I actually had power issues with a 770.

What? They draw almost the same wattage in certain games, and idle is pretty much the same compared to the competition. That suits laptops; it's not strictly necessary for desktops.


What? They draw almost the same wattage in certain games, and idle draw is negligible. That suits laptops; it's not strictly necessary for desktops.

Look, he already has the 970, and if he's not affected by its shortcomings or dissatisfied enough with NVIDIA to change it, then let him keep it. He also values performance/watt, where Maxwell has an edge. In a few short months Volcanic Islands will most likely surpass Maxwell in both performance and performance/watt, but those cards aren't here yet.

 

Besides, this topic isn't about AMD vs NVIDIA, although one could be created for the recent problems, speculation about Volcanic Islands, and unlocked Maxwell.


What? They draw almost the same wattage in certain games, and idle is pretty much the same compared to the competition. That suits laptops; it's not strictly necessary for desktops.

 

Are we talking about the same thing?

 

970 vs 770 vs 290X TDP

 

145 W vs 230 W vs 300 W. Not that close. Idle is irrelevant, since you buy a gaming card to game, on ultra. :)
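For a rough sense of what that TDP spread means in electricity terms, here's a back-of-the-envelope sketch. The wattages are the TDPs quoted above; the gaming hours and electricity price are assumptions for illustration, and TDP is only a proxy for actual draw under load:

```python
# Rough yearly energy cost from TDP (TDP is a thermal rating, not
# measured power draw, so treat these as ballpark figures only).
HOURS_PER_DAY = 3      # assumed daily gaming time
PRICE_PER_KWH = 0.15   # assumed electricity price, $/kWh

def yearly_cost(tdp_watts):
    """Estimated yearly cost of running a card at its TDP for the
    assumed hours per day."""
    kwh = tdp_watts / 1000 * HOURS_PER_DAY * 365
    return kwh * PRICE_PER_KWH

for name, tdp in [("GTX 970", 145), ("GTX 770", 230), ("R9 290X", 300)]:
    print(f"{name}: ${yearly_cost(tdp):.2f}/year")
```

At these assumptions the 145 W vs 300 W gap works out to roughly $25 a year, so the TDP argument is more about heat, noise, and PSU headroom than about the power bill.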

Look, he already has the 970, and if he's not affected by its shortcomings or dissatisfied enough with NVIDIA to change it, then let him keep it. He also values performance/watt, where Maxwell has an edge. In a few short months Volcanic Islands will most likely surpass Maxwell in both performance and performance/watt, but those cards aren't here yet.

 

Besides, this topic isn't about AMD vs NVIDIA, although one could be created for the recent problems, speculation about Volcanic Islands, and unlocked Maxwell.

 

I do think AMD may have a significant advantage next gen with FreeSync. G-Sync adds too much of a luxury tax and has limited support so far.


It isn't the fastest card in its price range anymore, now that the 290X can be had at a lower price.

They trade blows: the GTX is slightly faster on average at 1080p and below, and the 290X is slightly faster on average at 1440p and above. The 290X is also a much more power-hungry and loud card, so even at the current ~$20 price difference I hardly see anyone returning their GTX 970 to the store for performance reasons. Anyway, it's beside the point. The point is that no one got cheated by NVIDIA over this. If anyone can claim with a straight face that he bought the GTX 970 for its X amount of sprolgelrs, despite never having heard of a florberl before, and is still getting great performance, then go ahead and be outraged, but it's all quite silly.

 

However, I think the '3.5 GB performance issue' itself is rather overblown. The card still delivers the same performance we saw in initial reviews. We knew from the beginning that the 970 doesn't perform as well at higher resolutions as the R9 290(X). There were 4K benchmarks done at release; the main difference is that now we know exactly why.

 

It is a bit silly that none of the major reviewers picked up on the fact that the card prefers not to allocate over 3.5 GB of VRAM whenever possible, though; perhaps this will result in more thorough benchmarking procedures in the future.

The performance gap between the GTX 980 and 970 is consistent from 1080p to 4K (see http://www.anandtech.com/bench/product/1355?vs=1351 for instance). If what you said were true, we should see the GTX 980 do unexpectedly better than the 970 at 4K, since it is unaffected by the memory controller issue. Anandtech tried and couldn't come up with a gaming benchmark that would demonstrate a performance problem due to that issue; they might still be trying, or have finally given up. If they find anything, it'll probably be some extremely specific scenarios.

The only way people have noticed any "issue" was through synthetic tests done in CUDA; gaming benchmark data revealed nothing unusual, and that's because the issue isn't really an issue as far as gaming performance is concerned. I don't see why testing procedures would change as a result of this discovery. What's there to catch? If you bought the card for its compute capabilities, that's another story, but then GeForces aren't really compute cards anyway.
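The cliff those synthetic CUDA tests exposed can be sketched with a toy model. The figures below are assumptions based on widely reported numbers at the time (a ~3.5 GB segment at roughly 196 GB/s and a ~0.5 GB segment at roughly 28 GB/s), not measurements, and real access patterns are far less uniform than this:

```python
# Naive model of the GTX 970's segmented VRAM: bandwidth figures are
# approximations of publicly reported numbers, used for illustration.
FAST_GB, FAST_BW = 3.5, 196.0   # fast segment: size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0    # slow segment: size (GB), bandwidth (GB/s)

def effective_bandwidth(alloc_gb):
    """Effective bandwidth if reads are spread evenly over an
    allocation of alloc_gb, with each segment served at its own rate."""
    fast = min(alloc_gb, FAST_GB)          # portion in the fast segment
    slow = max(alloc_gb - FAST_GB, 0.0)    # portion spilling into the slow one
    time = fast / FAST_BW + slow / SLOW_BW # time to stream the whole allocation
    return alloc_gb / time

print(f"{effective_bandwidth(3.0):.0f} GB/s")  # ~196 GB/s, all in the fast segment
print(f"{effective_bandwidth(4.0):.0f} GB/s")  # ~112 GB/s once it spills over
```

Under these assumptions a uniform sweep over the full 4 GB drops to roughly half the fast-segment bandwidth, which is exactly the kind of discontinuity a synthetic chunk-by-chunk probe catches and a game that stays under 3.5 GB, or whose driver keeps hot data in the fast segment, never shows.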


the point is no one got cheated by NVIDIA over this

Let's just agree to disagree.

The performance gap between the GTX 980 and 970 is consistent from 1080p to 4K (see http://www.anandtech.com/bench/product/1355?vs=1351 for instance). If what you said were true, we should see the GTX 980 do unexpectedly better than the 970 at 4K, since it is unaffected by the memory controller issue. Anandtech tried and couldn't come up with a gaming benchmark that would demonstrate a performance problem due to that issue; they might still be trying, or have finally given up. If they find anything, it'll probably be some extremely specific scenarios.

The only way people have noticed any "issue" was through synthetic tests done in CUDA; gaming benchmark data revealed nothing unusual, and that's because the issue isn't really an issue as far as gaming performance is concerned. I don't see why testing procedures would change as a result of this discovery. What's there to catch? If you bought the card for its compute capabilities, that's another story, but then GeForces aren't really compute cards anyway.

I don't see memory usage in those benchmarks, so I can't say whether they're relevant. Anandtech tried and couldn't come up with a gaming benchmark that would demonstrate a performance problem under a time constraint. And by gaming benchmarks, I hope you're talking about something other than the NVIDIA-supplied numbers.


I don't see memory usage in those benchmarks, so I can't say whether they're relevant. Anandtech tried and couldn't come up with a gaming benchmark that would demonstrate a performance problem under a time constraint. And by gaming benchmarks, I hope you're talking about something other than the NVIDIA-supplied numbers.

I was answering ViperAFK's suggestion that the reason the R9 290X does better than the GTX 970 at 4K is the memory controller thing. If that were effectively the case, then the same benchmarks should show the 980 bizarrely outperforming the 970 at 4K for the same reason, and they don't.

By gaming benchmarks I meant Anandtech's benchmarks. Anyway, if and when we get benchmark data showing that this can be an actual problem in gaming (though by that point the media hype and comment threads will probably have died down and no one will care anymore), I'll reassess. But so far this is just a non-issue.


I was answering ViperAFK's suggestion that the reason the R9 290X does better than the GTX 970 at 4K is the memory controller thing. If that were effectively the case, then the same benchmarks should show the 980 bizarrely outperforming the 970 at 4K for the same reason, and they don't.

By gaming benchmarks I meant Anandtech's benchmarks. Anyway, if and when we get benchmark data showing that this can be an actual problem in gaming (though by that point the media hype and comment threads will probably have died down and no one will care anymore), I'll reassess. But so far this is just a non-issue.

I never said that the reason the 290X does better at 4K is just the memory controller thing; I agree with pretty much everything you've said ;)


I never said that the reason the 290X does better at 4K is just the memory controller thing; I agree with pretty much everything you've said ;)

Sorry if I misunderstood you, that's how I interpreted:

We knew from the beginning that the 970 doesn't perform as well at higher resolutions as the R9 290(X). There were 4K benchmarks done at release; the main difference is that now we know exactly why.

 

By "we know exactly why" I thought you were referring to the memory segmentation on the 970, what did you mean if not?

 

By the way, PC Perspective had this to say after days of analysing the issue:

 

I spent nearly the entirety of two days testing the GeForce GTX 970 and trying to replicate some of the consumer complaints centered around the memory issue we discussed all week. I would say my results are more open ended than I expected. In both BF4 and in CoD: Advanced Warfare I was able to find performance settings that indicated the GTX 970 was more apt to stutter than the GTX 980. In both cases, the in-game settings were exceptionally high, going in the sub-25 FPS range and those just aren't realistic. A PC gamer isn't going to run at those frame rates on purpose and thus I can't quite convince myself to get upset about it.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Looking-GTX-970-Memory-Performance/COD-Advanced-Warfare-and-Clos

 

This mirrors what Guru3d had to say:

Utilizing graphics memory above 3.5 GB can result in performance issues, as the card needs to manage some really weird stuff in memory; it's nearly load-balancing. But the fact remains it seems to be handling that well [...]



This topic is now closed to further replies.