
Question

Posted

Hey guys. This will be quite a long topic, so bear with me. TDP will be excluded at first, but I will get to it later on, don't worry.

 

This whole thing is purely my opinion and nothing else.

 

Most of the benchmarks are done within Windows, but I will also include some benchmarks from Linux, since an OS cannot show the true performance of a CPU if it is not optimized for it.

 

Whenever possible I will be comparing the 4770K against the 9590.

 

First off, let's start with CPU ZLib.

 

 

This integer benchmark measures combined CPU and memory subsystem performance through the public zlib compression library. The CPU ZLib test uses only basic x86 instructions but is nonetheless a good indicator of general system performance.
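For a sense of what a test like this measures, here is a toy sketch of my own (not the actual benchmark): it just times zlib compression of an in-memory buffer and reports throughput.

```python
# Toy zlib-style compression throughput test (illustration only,
# not the benchmark used in the screenshots above).
import time
import zlib

def zlib_throughput(data: bytes, level: int = 6, runs: int = 5) -> float:
    """Return best-of-N compression throughput in MB/s."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        zlib.compress(data, level)
        best = min(best, time.perf_counter() - start)
    return len(data) / best / 1e6

if __name__ == "__main__":
    # ~10 MB of semi-compressible data
    payload = bytes(range(256)) * 40_000
    print(f"{zlib_throughput(payload):.1f} MB/s")
```

The real test also stresses the memory subsystem, which a tiny buffer like this won't; it's only meant to show the shape of the measurement.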

 

FX-9590-41.jpg

 

As we can see, the 9590 can easily keep up with any CPU from that time. (We should also consider that the Piledriver arch is already 2 years old.)

 

Next up - CPU Hash

 

FX-9590-40.jpg

 

And once again we can see that the 9590 can easily hold its own.

 

We need to consider that these two benchmarks take full advantage of every core/thread they can.

 

Next up, the FPU benchmark.

 

FX-9590-42.jpg

 

While better than the 8350, it still lacks the power to compete with Intel. If you keep up with reviews, you should already know that the FPU in Bulldozer and its successors is actually weak.

 

Next up, Cinebench. (I should note that I am a little biased on this bench, since Maxon only uses Intel compilers, which are said to cripple AMD hardware to some extent.)

 

FX-9590-43.jpg

 

 

But it still uses every thread it can, so I'm guessing the 5 GHz clock is helping the 9590 here, and it actually can keep up. Not bad, I'd say.

 

Next up, Civ V (a very CPU-bound game).

 

FX-9590-45.jpg

 

 

Here you can easily see the weak point of the AMD FX series. If a game is CPU-bound rather than GPU-bound, it will lose even to a 2600K.

 

Now to the very weak point of the BD arch and its newer siblings: single-threaded performance.

 

FX-9590-44.jpg

 

 

You can easily see how it falls behind every Intel CPU there is, even the i3s. This includes Prime; it's the same story in that benchmark, it just can't keep up there. Although, once again, I should mention my bias against Prime: it's x87 code, already dead, but for some reason people still keep taking it as a performance indicator.

 

Next up MediaCoder x64

 

FX-9590-53.jpg

 

When it comes to media transcoding, it's easy to see that the new arch has paid off. The more "cores", the better off you will be.

 

The same goes for x264 media and TrueCrypt (I guess that's dead now, huh?).

 

My point is, whenever your program has enough thread/core support, the FX can easily hold its own.
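As a toy illustration of that cores/threads point (my own sketch, nothing to do with the benchmarks above): Python's hashlib releases the GIL while hashing large buffers, so even plain threads can spread SHA-256 work across cores, and a chip with more cores gets more out of the parallel path.

```python
# Hash the same chunks serially vs. with a thread pool.
# Chunk sizes and counts are made up for illustration.
import hashlib
import time
from concurrent.futures import ThreadPoolExecutor

def sha256_hex(chunk: bytes) -> str:
    # hashlib releases the GIL for large buffers, so threads
    # genuinely run in parallel here
    return hashlib.sha256(chunk).hexdigest()

def hash_all(chunks, workers=1):
    """Hash every chunk; returns (digests, elapsed seconds)."""
    start = time.perf_counter()
    if workers > 1:
        with ThreadPoolExecutor(max_workers=workers) as pool:
            digests = list(pool.map(sha256_hex, chunks))
    else:
        digests = [sha256_hex(c) for c in chunks]
    return digests, time.perf_counter() - start

if __name__ == "__main__":
    chunks = [bytes([i % 256]) * 2_000_000 for i in range(16)]
    _, serial = hash_all(chunks)
    _, threaded = hash_all(chunks, workers=8)
    print(f"1 thread: {serial:.2f}s, 8 threads: {threaded:.2f}s")
```

The speedup you actually see depends on core count, which is exactly the effect the multi-threaded benchmarks reward.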

 

Now, let's get to gaming. First off, DiRT 3.

 

FX-9590-65.jpg

 

 

As we can see (we know that DiRT 3 is AMD-optimized), it has no problem pulling ahead of most Intel CPUs. But what about games that are not optimized this way? We will find out now.

 

This is what happens:

 

FX-9590-67.jpg

 

It just falls flat. That's it.

 

Anyway, back to newer games (that use newer engines): the FX can easily keep up, thanks to devs who actually put multi-threaded support in.

 

FX-9590-75.jpg

 

 

As we can see here, it is 1 FPS short... well, that shouldn't really matter.

 

Now let's see some figures for power consumption, and after that I will give my own opinion.

 

 

With a TDP of about 220W, the FX-9590 is a story of its own; my power consumption figures continue in a later post.


50 answers to this question


Posted

I love my AMD (1090T), but the TDP is hideous; as soon as I power on my computer it goes to 300W, then a stable 285W (idle); if I play a GPU-demanding game it goes way above that. My i7, on the other hand, handles power quite well.

 

Having said that, I find that CPU-demanding apps like rendering work better on my AMD than on my Intel (by a small margin); the fact that it has more cores and more speed (the 1090T is OC'd) helps, of course. Single-threaded apps perform great on both CPUs (I have lots of RAM and an SSD in both computers, and in reality the differences are, most of the time, quite small).

 

In the end, since they are so similar, I chose AMD for my desktop because of price: the mainboard + CPU + RAM saved me some bucks compared with a similar Intel system. For my laptop I chose Intel, because nowadays it's damn hard to find a good AMD CPU in an OEM laptop...


Posted

It's nice to see AMD competing once again.

At least they can say,

 

close-enough.png


Posted

I would suggest replacing those 100W light bulbs with some LED ones. At least here they're ~5 euros each, and they easily save that ten times over in a few years :)

 

But those are some impressive benches from the FX; I did not expect it to climb so high among the i7s.

The development is surely interesting; they are not too far behind Intel and are rumored to have solved Piledriver's bottleneck. Vastly increased single-threaded performance should in turn also mean greatly improved multi-threaded performance. If that holds up, next year I can see them passing Intel.

 

I have an FX-4100 myself, fairly old and starting to show its age. Hopefully the next gen stays AM3+, but I'm afraid they'll go FM2.


Posted

I would suggest replacing those 100W light bulbs with some LED ones. At least here they're ~5 euros each, and they easily save that ten times over in a few years :)

 

Totally agree with this; I have a bunch of daylight LEDs which certainly give a less 'dingy' feel to my flat :)

 

Anyway, back to the point: what is the 'best' bang-for-buck AMD processor at the moment? I'm about ready for an upgrade and I really don't know where they stand right now. I currently have an Athlon II X4 2.8GHz @ 3.2GHz.

 

I imagine sufficient time has passed to make buying a mid-tier AMD processor a worthy upgrade; I just don't know where to put my money :) Looking for the best performance for the price.


Posted

I can imagine that games will be more optimised for AMD, since both the Xbox One and the PS4 use AMD APUs.


Posted



But those are some impressive benches from the FX; I did not expect it to climb so high among the i7s.

The development is surely interesting; they are not too far behind Intel and are rumored to have solved Piledriver's bottleneck. Vastly increased single-threaded performance should in turn also mean greatly improved multi-threaded performance. If that holds up, next year I can see them passing Intel.

They haven't. Even though they doubled the decoder fabric, single-threaded performance only went up by a single-digit percentage. This suggests they have some other major bottleneck elsewhere in the design; maybe it's fetch, branch prediction, or the dedicated pipelines. They will never catch Intel anyway because of the node advantage: AMD is at 28nm while Intel is already at 22nm. Even if AMD somehow got out of its contractual obligations with GF and went with TSMC, TSMC is still years behind Intel.


Posted

They haven't. Even though they doubled the decoder fabric, single-threaded performance only went up by a single-digit percentage. This suggests they have some other major bottleneck elsewhere in the design; maybe it's fetch, branch prediction, or the dedicated pipelines. They will never catch Intel anyway because of the node advantage: AMD is at 28nm while Intel is already at 22nm. Even if AMD somehow got out of its contractual obligations with GF and went with TSMC, TSMC is still years behind Intel.

I was not talking about this chip, but about their next gen.


Posted

I was not talking about this chip, but about their next gen.

 

I was talking about Steamroller. What's the rumor about Excavator?


Posted

 

Anyway, back to the point, what is the 'best' bang for buck AMD processor at the moment?

 

Probably the FX-8320.

 

 

I love my AMD (1090T), but the TDP is hideous; as soon as I power on my computer it goes to 300W, then a stable 285W (idle);

 

This actually isn't possible. You are saying your 1090T draws at idle the same wattage a 9590 draws under load? Yeah, no. Maybe at the wall socket, but the CPU itself, no.


Posted

 

In the end, since they are so similar, I chose AMD for my desktop because of price: the mainboard + CPU + RAM saved me some bucks compared with a similar Intel system. For my laptop I chose Intel, because nowadays it's damn hard to find a good AMD CPU in an OEM laptop...

Intel pays enormous amounts of money to OEMs for not using AMD CPUs, so it's not surprising that you can barely find a good AMD processor in a laptop.


Posted


 

 

FX-9590-73.jpg

 

At idle, it has no problem. Sure, the 4770K is about 20W lower, but we need to remember the Intel CPU is on 22nm while AMD is stuck on 32nm thanks to GloFo. Under load, oh well, that is another story. If you want a space heater (like me) you should get one :p If not, stick to an Intel CPU.

 

Now let's get to the point I mentioned earlier about TDP.

 

Sure, the 9590's 220W TDP is big, but that doesn't mean it will draw that much all the time. Under load it will, sure, and even more than 220. But IMHO this chip is meant for AMD enthusiasts anyway.

 

220W TDP, let's get to that. I personally have around 9-10 light bulbs in my house, all of them 100W, and most of the time 4-5 are on. I don't see a big problem. Also, most of our newer-gen GPUs draw way more than that; why shouldn't we complain about those? Sure, AMD is the only one with a TDP that high, but Intel just came out with a 140W part... before that, people were furious about AMD releasing a 125W part, but now no one blinks an eye. (Leaving performance aside.)

 

While we might not see a new CPU from AMD for two years (they are said to be working on a new arch with Jim Keller), I think Piledriver can keep up with Intel. I saw some of the benchmarks of DC (Devil's Canyon) and I am impressed, but... it's still the same as always: not the big improvement that enthusiasts are after.

 

I have always admitted, and always will, that Intel has its own strong points, but so does the FX. The more multi-threaded you go, the better off you'll be with an FX, even with the bigger TDP.

 

 

Final words: I will be happy to discuss any of this as long as you can keep an open mind. If you are stuck on your opinion, please don't even bother.
 

I like competition, but it is hard to make the argument today that AMD makes the better CPU for power users. The 9590 can hold its own, but I would much prefer the 4770K, simply because the TDP is a lot lower overall. Not sure why you would have 100W bulbs in your house - you should be using CFLs, and even LED bulbs are becoming reasonable. A 100W-equivalent CFL bulb consumes about 25W - bottom line, CFLs are very efficient and last longer than incandescent bulbs. I guess this depends on your location, but the US has started to phase out incandescent bulbs. In your case, rather than having 9-10 lights on at 100W apiece (900-1000W), you'd be looking at 9-10 lights on at 25W (225-250W). LED lights are even more efficient, and typically consume about 6-8W each. If you are a power user with a 9590 or 4770, you probably (doesn't mean ALL) have a nice power-hungry GPU. Anything that keeps power usage down while retaining performance is a win-win in my book.
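The bulb arithmetic above is easy to sanity-check; here's a quick sketch (the electricity price and hours per day are my assumptions, not figures from the thread):

```python
# Yearly electricity cost for a constant load, to compare
# 10x100W incandescents against 10x25W CFL equivalents.
def annual_cost(watts: float, hours_per_day: float,
                price_per_kwh: float = 0.20) -> float:
    """Cost per year: kW * hours/day * 365 days * price per kWh."""
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

incandescent = 10 * 100  # ten 100W bulbs -> 1000W total
cfl = 10 * 25            # ten 25W CFL equivalents -> 250W total
for label, load in (("incandescent", incandescent), ("CFL", cfl)):
    print(f"{label}: {annual_cost(load, 5.0):.2f}/year at 5 h/day")
```

At any plausible price per kWh the CFL total is a quarter of the incandescent one, which is the 900-1000W vs. 225-250W gap stated above.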

 

I miss the good ol' days when AMD was kicking Intel's butt; then Intel finally responded and has been fairly dominant ever since. I will always choose the better product, and for me right now that is Intel. I like Intel's desktop CPU lineup, and their mobile (ultrabook) CPUs are great. AMD also has some nice products - those APUs are nice, and the performance-per-price value is great. I want nothing more than for AMD to succeed, to keep consumers winning with better CPUs and other products.


Posted

Hey there.

 

Thanks for your feedback. First off, we do have LEDs here as well, but since we live in an old apartment (built in the 1960s) it's quite hard to switch them all over. I have started doing it, though ;) It was only an example anyway.

 

Sure, like I said, I have nothing against admitting that TDP-wise (perf-per-watt) Intel has better products, but this topic was just to show that the people who keep screaming that AMD is so far behind and sucks that much just aren't right. Even with little cash, problems with foundries, etc., they can still keep up. Like I said, PD is already 2 years old, on a bigger node (22 vs 32nm), but it still has some life in it.

 

All in all, we all profit if we have two players on the market rather than just one.


Posted

Wait, these are Ivy Bridge chips; that is last generation.


Posted

Wait, these are Ivy Bridge chips; that is last generation.

 

Sure, there aren't any Haswell-to-9590 comparisons out there. :(

 

I didn't want to include APUs either, not even Kaveri. While the iGPU is far better than Intel's (excluding Iris Pro, if only because the CPUs with it cost an arm and a leg), the CPU part itself is pure garbage.


Posted

This actually isn't possible. You are saying your 1090T draws at idle the same wattage a 9590 draws under load? Yeah, no. Maybe at the wall socket, but the CPU itself, no.

 

You are right, this was the wall-socket total; it is still much more than an i7, though. At the end of the year that costs more money just for electricity.


Posted

You are right, this was the wall-socket total; it is still much more than an i7, though. At the end of the year that costs more money just for electricity.

 

Well, sure it's more than Intel, given the TDP of AMD CPUs alone. But the wall socket just gives you an idea of the overall system's idle/load consumption. At idle the CPUs are pretty much the same. Also, we should factor in that pushing your CPU to full load isn't as easy as a benchmark makes it look; I doubt most of us run Prime all the time... games etc. use around 5-15% of the CPU (if coded right).


Posted

Wait, these are Ivy Bridge chips; that is last generation.

He included the 4770K, and that is a Haswell chip.


Posted

I realize this is not really feasible, but what I'd like to see is the number of warranty complaints. A CPU never breaks, right? Apparently it does. Or perhaps it is easier to somehow damage their designs with static, mishandling, or whatever else. Currently my count is 7 to 2, despite the sales being 1 to 10. And that's not a pretty picture.


Posted

I realize this is not really feasible, but what I'd like to see is the number of warranty complaints. A CPU never breaks, right? Apparently it does. Or perhaps it is easier to somehow damage their designs with static, mishandling, or whatever else. Currently my count is 7 to 2, despite the sales being 1 to 10. And that's not a pretty picture.

 

It's mostly wrong cooling, broken pins (in AMD's case), static, and whatnot. You can't really blame AMD for that, but rather the customer. You wouldn't blame MS when Windows gets a virus, would you now?


Posted

It's mostly wrong cooling, broken pins (in AMD's case), static, and whatnot. You can't really blame AMD for that, but rather the customer. You wouldn't blame MS when Windows gets a virus, would you now?

 

 

 

AMD does not warrant that your AMD processor will be free from design defects or errors known as "errata".

 

I do not intend to flat-out blame AMD, not without having all the facts on the table. I just find it curious and... discouraging. I admit to being a fanboi (of Intel; and fanboyism does not require a valid reason), but then I also have test reports, and expense reports to fill in to justify doing what I do, so I can't just go and blame AMD because I don't like them and enjoy putting dents in their rep.

 

They aren't dying; they just fail under certain pretty stupid conditions - seconds after coming back from sleep mode, a stress test on one specific core, overheating at idle (while not throttling, with load temperatures just fine), to name some. And, I agree, to anyone who even barely knows his trade, any of these would first point to a whole holy myriad of other causes - mainboard, bad contact, firmware, unclean power, bad RAM, corrupted OS, drivers, error between the chair and the desk - anything but the CPU. The first time I saw it, I said exactly that, arrogantly and repeatedly - please stop filling my ears with your urinary substance (you know the actual words used, liberally spiced with Russian, sadly so well-rooted in post-Soviet everyday talk).

 

If only replacing them with the same model (and not even a different batch number, which rules out errata) didn't solve every specific case. It can still be attributed to chance and coincidence, but then that's one for the records already.

 

Wrong cooling is an argument I won't accept at all, chiefly because of the TDP debate. Their laughing-stock stock cooler? It's just that bad, seriously, and AMD hasn't cared to fix it for years, which is beyond sad at this point. The mounting mechanism? If one isn't careful, it flexes the board. And you can't blame people for not being careful enough; one just has to take precautions with this sort of thing and design things in an increasingly foolproof way, even if the universe keeps finding bigger fools. It isn't exactly neuroscience, either - every other aftermarket brand has its own idea about the best design of backplates and screws.

 

Broken pins? All too easy as such, but in these cases very unlikely. See above - they don't die. They POST, they load, they work all their watts off unless some specific event triggers. A PGA contact problem? Perhaps, but once again - not if the replacement works. I've dropped the "all of them were exchanged under warranty" clause from my arguments, because I figure it is perhaps very difficult to assess the exact nature of the damage even for AMD themselves; it's just that insanely complex. So if the exact problem I report appears on the supplier's test benches as well (and it has thus far), it gets shipped off and, apparently, AMD pays for it, regardless of whose fault it is.

 

You need not take my word for all this, but you'd do best to accept the possibility that there may be (but not definitely are) bad chips leaving the factory, because that is what happens with every other thing in existence, and AMD would not be an exception. Whether it's anybody's fault, coincidence, or just bad luck that Intel tops them in this regard in my charts is impossible to truly know.

 

Oh, and the Windows example was perhaps not the best choice, at least not in the way you had intended. If an operating system (none in particular) was engineered in a way that allowed a virus to easily compromise it, I would totally blame the vendor for such an oversight. And I frequently do, seeing the sad state and general discord of computer security today. But I don't, once again, blame AMD. I just discourage using them, for the reasons outlined at length above.


Posted

Well, you can be a fanboi, but at least, unlike most of them/us, you can stick to reason and we can actually have a decent discussion. I am also an AMD fanboi, but see? We have no problem sticking to reason.

 

Anyway. The thing is, I didn't mean that AMD doesn't have bad batches coming out; every hardware company does. What makes it worse for AMD is that they don't own the fab those chips come out of, so they most likely don't check every chip like Intel does. As far as I've heard, AMD gives the design to GloFo/TSMC and they deal with the rest (don't take my word on this; it could just as well be flat-out wrong).

 

Actually, I don't think broken pins are that unlikely, since the first time I built an AMD system I almost broke half of them thanks to not seeing the little triangle on the socket itself. It all depends which pins you break - that determines what the CPU is capable of later on.

 

I actually like discussing this stuff with you. At least we're not the ones who go "OMGZZZ AMD SUCKKZZZ" or the other way around ;)


Posted

I am also an AMD fanboi

 

Me too.


Posted

Probably the FX-8320.

 

 

This actually isn't possible. You are saying your 1090T draws at idle the same wattage a 9590 draws under load? Yeah, no. Maybe at the wall socket, but the CPU itself, no.

 

CrossFire/SLI setups can make it go that high (talking about the wall reading here).


Posted

It doesn't really matter whose fanboi you are; what matters is that you can accept reality and facts as they are.

 

I was on another overclocking forum the other day and got into an argument with someone who claimed that a lower-TDP CPU will clock better; when I threw him the fact that the 125W-TDP 8150 clocks better than the ??W 4770K, the result was a full-on b*tchfit.


CrossFire/SLI setups can make it go that high (talking about the wall reading here).

 

 

That's exactly the thing: from the wall. That just shows you how much the entire system is using, not the CPU itself. Then we need to factor in the losses from the PSU, the GPU(s), HDDs, etc.
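To put rough numbers on that (the 85% PSU efficiency and the component draws here are assumptions for illustration, not measurements from anyone's system):

```python
# Back out a rough CPU draw estimate from a wall-meter reading.
def dc_load_from_wall(wall_watts: float, psu_efficiency: float = 0.85) -> float:
    """Estimate the DC load the PSU delivers given what the wall meter shows."""
    return wall_watts * psu_efficiency

wall = 300.0        # hypothetical wall reading under load
other_draw = 180.0  # assumed GPU + drives + motherboard share
cpu_estimate = dc_load_from_wall(wall) - other_draw
print(f"~{cpu_estimate:.0f} W left for the CPU out of {wall:.0f} W at the wall")
```

The point of the sketch is just that the wall figure overstates the CPU twice over: once for PSU losses and once for everything else in the box.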


Posted

Sure, there aren't any Haswell-to-9590 comparisons out there. :(

 

I didn't want to include APUs either, not even Kaveri. While the iGPU is far better than Intel's (excluding Iris Pro, if only because the CPUs with it cost an arm and a leg), the CPU part itself is pure garbage.

Wait, so Kaveri is pure garbage but Richland, Trinity, and Llano are on there?

 

I'm not really sure what you're trying to say there.  Heh.

 

Kaveri is fantastic; it's just better with HSA code, TrueAudio, and Mantle, so it is swept under the rug by people who aren't interested in those. I can't wait to see 2015's APUs... but I'm sure I can wait till 2016 to buy more kit.
