That an integrated GPU can now deliver what, since the dawn of hardware-accelerated graphics, only a discrete solution could is a revolution.
First, let's look at the current state of things. For the most powerful all-around machine, and for gaming in particular, things are as they have been for the past 8 years or so: the latest Intel CPU paired with a power-hungry graphics card sporting a monstrous cooler, either an AMD Radeon or an NVIDIA GeForce. Llano didn't change this, Trinity doesn't change this, and Richland won't change this either.
What AMD's APUs bring to the table, however, is the possibility of dropping the discrete card entirely and, for a fraction of the price, power and thermal envelope, still getting pretty darn good performance in most games, even at high resolutions. It's a significant tradeoff, sure, but one that can make sense for real gaming, something unthinkable just a year ago. For HTPCs, laptops, and basically every form factor where price, power draw, noise and heat are critical concerns, APUs are bound to make discrete graphics cards irrelevant very quickly.
As for desktop PCs, where these considerations matter less, the future of discrete cards is not as clear as it once was either. Of course, in their current iterations (Trinity/Richland), APUs cannot compete with any recent $100+ graphics card, a relatively small investment for vastly superior performance. Yet the two next-generation consoles, the PS4 and the Xbox One, are confirmed and rumored respectively to be built around AMD APUs, sporting 8GB of unified memory and performance in the 2 TFLOPS range. As absurd as this may sound, the next generation of consoles runs on integrated graphics, based on technology very similar to the $130 Trinity running the HTPC in my living room. This makes my brain want to explode.
There are many advantages to this design. Unified memory means more flexibility in how memory is divided up by each application, less overhead in CPU-GPU communication, and a vastly simplified programming model for developers. If AMD can make this work at a state-of-the-art performance level on next-generation consoles, can't we hope for the same on desktop PCs in the foreseeable future?
AMD's slogan has been "The Future is Fusion" for nearly 5 years now, and I'm starting to believe it. Intel is quickly catching up in this territory as well, with Haswell promising performance similar to the GeForce GT 650M. It's an interesting reversal to see Intel playing catch-up to AMD, when AMD has largely been perceived as the underdog these past few years. Holding the CPU performance crown may be nice, but it never gave anyone playable framerates, and I'd venture to say that AMD CPUs are still fast enough for most uses in most circumstances. At least, that's apparently what Intel is thinking, with Haswell focusing essentially on power consumption (vs ARM) and graphics performance (vs AMD).
Oh yeah, and John Carmack seems to agree:
The current generation Fusion parts are really a separate CPU and separate GPU connected on the die by better or worse interconnects, but their vision is integrating them much more tightly such that they share cache hierarchies, address space, and page tables. I think it's almost a foregone conclusion that it's going to be the dominant architecture in the marketplace because there are these strong forces about how we are getting more shrinks on the dies, [and] we can stick more things on there. It's going to just pay to integrate that, and you aren't going to be able to put as many transistors towards that if it's a dedicated chip. You'll pick up a lot from this tight integration with cost benefits.