8 cores for Battlefield 4!



Meh. I develop and do research, I game and emulate, and in every single aspect I have been more than happy with my AMD processor, provided that it's coupled with decent components like high-speed RAM and an excellent GPU. Going Intel would cost me 300 US dollars more for the performance I currently get in my applications.

It's not that AMD processors are bad - in fact they are very competitive at lower price ranges. I'm speaking about the top-end products, particularly when you factor in overclocking. AMD offered the best performance during the Athlon 64 era and was reasonably priced, but that lead slipped away, and AMD's strategy of favouring core count and clock speed over power and per-clock efficiency has left it uncompetitive for gaming, which has a major influence on the perception of the company.

 

Hopefully AMD's role in the upcoming consoles and its GPU expertise will lead to the company rebounding, particularly as Intel has been largely stagnating. The market desperately needs competition and at the moment ARM is more competitive than AMD.


Hopefully AMD's role in the upcoming consoles and its GPU expertise will lead to the company rebounding, particularly as Intel has been largely stagnating. The market desperately needs competition and at the moment ARM is more competitive than AMD.

You are comparing an ARM... against an x86 CPU? I'm going to put it to you like this: I was recently running a MATLAB simulation from www.openslam.org on my desktop computer. It ran fast and allowed me to see what was going on with the uncertainty ellipsoids.

 

Because this is related to my research, I copied the simulation files onto Dropbox and from there onto my laptop for further discussion with my supervisor. When I showed it to him he wasn't the slightest bit surprised... but I really was, because while the simulation ran fine, it ran far slower than on my desktop computer, as in slow motion.

 

That laptop lets me play Crysis 3 on medium, open around 15 tabs in Firefox without slowing down... you name it. This was the first time I could actually see a difference between my laptop and my desktop; I had never seen a substantial difference like the one shown by the simulation.

 

Well, to put it lightly, if my laptop runs it like that, I don't want to tell you what would happen on even the fastest current mobile ARM (non-PowerPC) processor. The fact that a browser may feel faster literally means nothing.

 

(P.S. It's very widely known that ARM has yet to be a match for even a slow x86 processor. Different architectures serve different purposes, but x86 is one of the most balanced ones, hence its advantage over ARM.)


You are comparing an ARM... against an x86 CPU?

Yes, in terms of the competition it poses to Intel. Intel is desperately trying to get into the mobile business where it is being outpaced by ARM at every turn. Obviously I wasn't suggesting that ARM is competitive with Intel from a performance perspective, which should have been patently obvious.


No, what matters is real world performance. I follow the releases of both companies but unfortunately AMD just hasn't been competitive where it matters (gaming, productivity, etc). Only in heavily threaded apps does AMD show any advantage but that is negated by the much higher power usage. It would be great if AMD were competitive with Intel as I'd happily switch back but that simply isn't the case.

 

PS - I've been using AMD processors since the mid-90s, back in the K5 days. However, since the Core 2 Duo days I've been with Intel because they offer better performance.

 

Oh, don't start with the TDP again. Most of us run 100W light bulbs in our houses, but all of a sudden a 50W difference in a CPU is such a big deal. What a load of crap. Most of the GPUs we run are 150W+, and the same goes for PSUs, most of which are 700W+ already. But uhhh-ohhh, the CPU is 120W. Cry me a river. It comes to about a $10 difference a year.
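For what it's worth, the ballpark arithmetic behind that ~$10-a-year figure checks out; here's a minimal sketch, assuming roughly four hours of full load per day and an electricity price of about $0.12/kWh (both figures are my assumptions, not from this thread):

```python
# Back-of-the-envelope cost of a 50 W difference in CPU power draw.
# Usage pattern and electricity price are assumed - adjust to taste.
extra_watts = 50            # extra draw of the hungrier CPU under load
hours_per_day = 4           # assumed hours of full load (gaming) per day
price_per_kwh = 0.12        # assumed electricity price in USD per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"{extra_kwh_per_year:.0f} kWh/year, about ${extra_cost_per_year:.2f}/year")
# -> 73 kWh/year, about $8.76/year, in the ballpark of the $10 claim
```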

 

Also, since BF4's new engine takes full advantage of AMD's 8-core tech, the "gaming" situation is going to change. And which productivity programs don't take advantage? Go read some more Cinebench articles, and while you're at it look up the YouTube video where they admit they only use Intel's compilers, which have been shown to hamper performance on AMD chips, and we still have no proof that they fixed that.

 

Funnily enough, on Linux a $150 AMD CPU can beat a $300 Intel CPU in multi-threaded workloads built with open-source compilers. But that doesn't matter, right?

 

Then you go as far as talking about overclocking. Wow. Hypocrite much? While most AMD CPUs/APUs come with an unlocked multiplier, you need the special K-edition Intel CPUs for that, which cost more than the regular ones. You're getting milked, admit it.


Yes, in terms of the competition it poses to Intel. Intel is desperately trying to get into the mobile business where it is being outpaced by ARM at every turn. Obviously I wasn't suggesting that ARM is competitive with Intel from a performance perspective, which should have been patently obvious.

 

We'll see. ARM has the power advantage, Intel has the performance advantage. ARM is slowly gaining more capability and processing power; Intel is slowly lowering its power footprint.

 

We'll see who hits the true sweet spot first. Intel does not, and will not, have a full SoC anytime soon. Their LTE modems are separate packages and use a larger fabrication process. Qualcomm has a major advantage there.

 

As for BF4 and AMD, this is a great move. Mantle is how games should have been built. I hope, for everyone's sake, that since AMD's Mantle is "open", Nvidia can adopt it. There is no competitive reason why Nvidia cannot gain from this announcement either: embrace the same low-level API across two of the most powerful graphics card companies out there.


Oh, don't start with the TDP again. Most of us run 100W light bulbs in our houses, but all of a sudden a 50W difference in a CPU is such a big deal. What a load of crap. Most of the GPUs we run are 150W+, and the same goes for PSUs, most of which are 700W+ already. But uhhh-ohhh, the CPU is 120W. Cry me a river.

No, most people don't run 100W light-bulbs anymore. In fact they're increasingly difficult to get hold of these days. Most of my house is on low-energy lightbulbs, particularly now with LED spotlights - same with most of the people I know. As for the power difference, AMD's FX9590 consumes 140W more than Intel's 4770K. That's a major factor when the Intel processor outperforms it in most games and applications and costs virtually the same. I mean, you're talking about Intel processors having up to 50% better performance in games - that's not a minor difference.

 

A small difference in power isn't the end of the world if you're getting the performance to warrant it but that's just not the case with current AMD processors.


I'm in no mood to argue with you. You just disregard all my points with assumptions about what "you don't use". You're saying the major factor is TDP and disregarding everything else. The TDP makes about a $10 difference a year in your pocket, that's a fact. Intel CPUs only have the advantage thanks to engines not putting all 8 cores into play, which is changing quickly thanks to the consoles and the Jaguar cores in both of them. Go fanboy somewhere else, like Tom's or AnandTech, since they're just like you: they have no clue what they're saying but they still think Intel is best at everything because of the low TDP.

 

http://semiaccurate.com/2013/07/25/intel-keeps-up-the-unethical-sdp-scam-with-new-4-5w-parts/

 

Have a read. That's not just what they do with the low-SDP parts; it's the same with high-end CPUs. Also, learn what ACP means when it comes to CPUs.

 

 

 

David Kanter from Real World Tech told the INQ: "It's absolutely true that AMD's TDP is a more conservative measure than Intel's TDP and the two cannot be compared."

 

But I guess, like every other Intel fanboy, you'll find a way to disregard that as well, right? Funny thing is, sure, I've been an AMD fanboy for a long time, but I'm actually running an Intel rig.

 

 

AMD's TDP shows the worst-case power draw a particular chip can experience when it's operating at max voltage.

A chip can easily draw a lot of power, but usually only for very short periods of time (like several microseconds). If enough power isn't provided, bits and bobs get lost along the way and calculation errors start cropping up, which is really bad news. So one would need to be able to supply that much power to the CPU at any given moment, even though CPUs can't draw max current for extended periods (even, say, 1/1000th of a second), making it all very difficult. Over 1/1000th of a second the CPU could draw between 75 and 150 watts, but average power usage might be 110W.
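To make that peak-versus-average point concrete, here's a minimal sketch with made-up instantaneous draw samples over one such window (the numbers are illustrative only, chosen to mirror the 75-150W range mentioned above):

```python
# Illustrative instantaneous power samples (watts) across a 1 ms window.
# The values are invented to mirror the 75-150 W range quoted above.
samples = [78, 95, 150, 142, 110, 88, 131, 120, 99, 87]

peak = max(samples)                    # the supply must cover this momentary spike
average = sum(samples) / len(samples)  # what the chip draws on average

print(f"peak draw: {peak} W, average draw: {average:.0f} W")
# Power delivery has to be sized for the ~150 W peak even though
# the average over the window is only ~110 W.
```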



aaaaaaand here we go again.

 

The FX-9370/9590 weren't even going to be released to the public; those CPUs were originally aimed at enthusiasts to see what they could do with them. The reason they had such a high TDP is that they came heavily factory overclocked. These CPUs were also in limited quantity and OEM only. We've been over this in far too many topics on this forum already.

 

The FX-83XX series was the last line of consumer CPUs from AMD, running with a TDP of 140W.

 

Intel's new Haswell line of server CPUs runs at over 150W; imagine all those servers stealing the world's energy!

Or take the new Haswell-E chips, which will be at 130-140W, the same as AMD's current chips. So what's your opinion on that?

 

Intel's CPUs currently dominate in games mostly because games are better optimised for them, and game devs have been too lazy to optimise for more than 4 cores, which is all about to change.

 

I really can't be arsed to go into detail on this matter anymore; it's been brought up enough times on here.


Games optimized for more than eight cores?

Jeez, I don't really want to spend 330 USD on an i7 CPU with eight virtual cores. :(


Games optimized for more than eight cores?

Jeez, I don't really want to spend 330 USD on an i7 CPU with eight virtual cores. :(

Three letters. AMD


I tried the BF4 beta the other night and was very impressed with the performance and responsiveness of the gameplay. Often with newly released games there are issues with SLI support but it was running at 2560x1600 at 60fps at high settings. Shame it won't be released on Steam.


Three letters. AMD

No thanks. AMD has that whole cheap smell to them.

 

I want games to run on PS4-equivalent hardware at 720p till PS5 comes out. Doesn't seem like much to ask.

 

EDIT: 660 GTX and 3GB VRAM? What?


Games optimized for more than eight cores?

Jeez, I don't really want to spend 330 USD on an i7 CPU with eight virtual cores. :(

 

Hyperthreaded/virtual cores don't offer the same performance advantage in a multithreaded app as a proper 8-core CPU does, so you may want to look elsewhere.


No thanks. AMD has that whole cheap smell to them.

 

I want games to run on PS4-equivalent hardware at 720p till PS5 comes out. Doesn't seem like much to ask.

 

EDIT: 660 GTX and 3GB VRAM? What?

 

heh


On that note: BF4 runs most excellently on my "old" X6 1090 with a GF780. The CPU certainly isn't holding the graphics card back at all in this game.


There's really no point in discussing or arguing about BF4 performance until the final version is out.

I get great performance on my i5 setup, some people get terrible performance on an i7; it's a beta and it's not optimized properly yet. Let's just pray it is once it comes out, and wait.

