"The Future Is Fusion"... is AMD right?



#1 Andre S.

Andre S.

    Asik

  • Tech Issues Solved: 11
  • Joined: 26-October 05

Posted 22 May 2013 - 02:03

I've recently built a Media Center/Gaming PC to go with my new HDTV, and decided to give AMD's "Trinity" platform a chance with its A10 5700. At first, I didn't think this PC could do much better than run old games at low resolutions, but Trinity has blown me away. Dark Souls? 30-40fps at 1080p - better than on PS3. Trine 2? 45fps at 1080p, high settings. Recent, GPU-bound titles running at full HD, no significant compromises in graphical quality, and enjoyable framerates.

That an integrated GPU can offer what only discrete solutions could since the dawn of hardware-accelerated graphics is a revolution.

First let's look at the current state of things. For the most powerful all-around and specifically gaming machine, things are still as they have been for the past 8 years or so: the latest Intel CPU paired with a power-hungry graphics card with a monstrous cooler, either AMD Radeon or NVIDIA GeForce. Llano didn't change this, Trinity doesn't change this, and Richland won't change this either.

What AMD's APUs bring to the table, however, is the possibility of dropping the discrete card entirely, and for a fraction of the price, power and thermal envelope, getting still pretty darn good performance in most games, even at high resolutions. It's a significant tradeoff, sure, but one that can make sense for real gaming, something unthinkable just a year ago. For HTPCs, laptops, and basically every form factor where price, power draw, noise and heat are critical concerns, APUs are bound to make discrete graphics cards irrelevant very quickly.

As for desktop PCs where these considerations matter less, the future of discrete cards is not as clear as it was either. Of course, in their current iterations (Trinity/Richland), APUs cannot compete with any recent $100+ graphics card, a relatively small investment for vastly superior performance. Yet the two next-generation consoles, PS4 and Xbox One, are confirmed and rumored respectively to be built on AMD APUs, sporting 8GB of unified memory and performance in the 2 TFLOPS range. As absurd as this may sound, the next generation of consoles runs on integrated graphics, based on technology very similar to the $130 Trinity running the HTPC in my living room. This makes my brain want to explode.

There are many advantages to this solution. Unified memory means more flexibility in how it is shared by each application, less overhead in GPU-CPU communication, and a vastly simplified programming model for developers. If AMD can make this work at a state-of-the-art performance level on next-generation consoles, can't we hope for this to happen on desktop PCs in the foreseeable future?
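To see where the GPU-CPU overhead goes, here's a toy sketch in plain C (no real GPU API; the "device" buffer, the doubling kernel, and the copy counts are stand-ins of my own). With a discrete card, data is staged into device memory and copied back over the bus; with unified memory, both processors work on the same buffer in place:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Discrete-GPU model: data must be staged into a separate device
 * buffer, processed, then copied back over the bus.
 * Returns the number of bus copies performed (or -1 on failure). */
static int process_discrete(const float *host, float *out, size_t n) {
    float *device = malloc(n * sizeof(float));
    if (!device) return -1;
    memcpy(device, host, n * sizeof(float));   /* host -> device copy */
    for (size_t i = 0; i < n; i++)
        device[i] *= 2.0f;                     /* stand-in "GPU" kernel */
    memcpy(out, device, n * sizeof(float));    /* device -> host copy */
    free(device);
    return 2;
}

/* Unified-memory model: CPU and GPU share one address space, so the
 * kernel runs on the same buffer in place. Returns zero bus copies. */
static int process_unified(float *shared, size_t n) {
    for (size_t i = 0; i < n; i++)
        shared[i] *= 2.0f;                     /* same kernel, no staging */
    return 0;
}
```

Same result, two fewer trips over the bus per kernel launch; that, plus not duplicating the buffer at all, is essentially the overhead argument.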

AMD's slogan has been "The Future is Fusion" for nearly 5 years now, and I'm starting to believe it. Intel is quickly catching up in this territory as well, with Haswell promising performance similar to the GeForce GT 650M. It's an interesting turn to see Intel playing catch-up with AMD, when AMD has largely been perceived as the underdog in the past few years. Holding the CPU performance crown may be nice, but it never gave anyone playable framerates, and I'd venture to say that AMD CPUs are still fast enough for most uses in most circumstances. At least, that's what Intel apparently thinks, with Haswell focusing essentially on power consumption (vs ARM) and graphics performance (vs AMD).

Oh yeah, and John Carmack seems to agree:

The current generation Fusion parts are really a separate CPU and separate GPU connected on the die by better or worse interconnects, but their vision is integrating them much more tightly such that they share cache hierarchies, address space, and page tables. I think it's almost a foregone conclusion that it's going to be the dominant architecture in the marketplace because there are these strong forces about how we are getting more shrinks on the dies, [and] we can stick more things on there. It's going to just pay to integrate that, and you aren't going to be able to put as many transistors towards that if it's a dedicated chip. You'll pick up a lot from this tight integration with cost benefits.




#2 Mindovermaster

Mindovermaster

    Neowinian Senior

  • Tech Issues Solved: 10
  • Joined: 25-January 07
  • Location: /USA/Wisconsin/
  • OS: Mint Debian LMDE
  • Phone: HTC ONE V

Posted 22 May 2013 - 02:16

Isn't that the new AMD APU they are going to put in the Xbox One?

#3 OP Andre S.

Andre S.

    Asik

  • Tech Issues Solved: 11
  • Joined: 26-October 05

Posted 22 May 2013 - 02:24

Isn't that the new AMD APU they are going to put in the Xbox One?

Reports are that the APUs in next-gen consoles are custom designs, based (perhaps?) on Kaveri, Richland's successor. In any case it's much more advanced than what's currently on the market for PCs.

#4 +Phouchg

Phouchg

    Resident Misanthrope

  • Tech Issues Solved: 9
  • Joined: 28-March 11
  • Location: Neowin Detainment Camp

Posted 22 May 2013 - 16:04

This being said (and true), I don't understand why Intel didn't pack Iris Pro into more of their upcoming CPUs. It's just like it was with Ivy Bridge: the low end, where the IGP is most likely to be used, got the measly HD 2500, but the high end got the HD 4000, mostly *in vain*, because those people often opted for a superpowered discrete GPU anyway.

Does Intel think that people will opt for the more expensive parts knowing the difference? Some would. But most *don't* know the difference; they simply expect the new series to be better!

#5 threetonesun

threetonesun

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 26-February 02

Posted 22 May 2013 - 16:18

This being said (and true), I don't understand why Intel didn't pack Iris Pro into more of their upcoming CPUs. It's just like it was with Ivy Bridge: the low end, where the IGP is most likely to be used, got the measly HD 2500, but the high end got the HD 4000, mostly *in vain*, because those people often opted for a superpowered discrete GPU anyway.

Does Intel think that people will opt for the more expensive parts knowing the difference? Some would. But most *don't* know the difference; they simply expect the new series to be better!


Well, the i3-3225 has the HD 4000 on it.

Anyway, yes, I've always thought AMD was ahead of the game in this field; it's no surprise Sony and MS went with them for their new consoles.

#6 tim_s

tim_s

    Default

  • Joined: 07-January 13
  • OS: OSX (Macbook Pro i7), Windows 7 (Gaming), Gentoo
  • Phone: iPhone 5s

Posted 22 May 2013 - 20:27

I think the main drawback in this discussion for me would be "choice", being tied to a set solution. Like many others, I sway in the winds of change as the powerhouses of the technology world fight it out. Right now I have an Intel machine with an NVIDIA graphics card. Could I live with an AMD CPU plus an integrated future GPU? Sure! Do I want to have choice taken away from me? No. Does everyone think like me? No.

I think it is a good way to get a machine up and running at little cost, but for some this is just an extra cost, not a set future, unless people love everything to be uniform.

#7 Arceles

Arceles

    Time Craymel

  • Tech Issues Solved: 1
  • Joined: 28-November 09
  • Location: 4th dimension.
  • OS: Win 7 Ultimate / Win 8.1 Pro (With Start Menu Start8, otherwise is UNUSABLE) / Android 4.1.2 Jelly Bean
  • Phone: XT890 Motorola RAZRi (x86 processor)

Posted 22 May 2013 - 20:35

Now I can really say to the hardcore Intel fanboys... that both the Xbox One and the PS4 run on APUs made by AMD.