AMD Developing Hybrid CPU/GPU

In addition to the snazzy new website launched by AMD/ATI today, the chipmaker announced that they are developing a hybrid CPU/GPU system dubbed 'Fusion' for release in late 2008/early 2009:

"AMD intends to design Fusion processors to provide step-function increases in performance-per-watt relative to today's CPU-only architectures, and to provide the best customer experience in a world increasingly reliant upon 3D graphics, digital media and high-performance computing. With Fusion processors, AMD will continue to promote an open platform and encourage companies throughout the ecosystem to create innovative new co-processing solutions aimed at further optimizing specific workloads. AMD-powered Fusion platforms will continue to fully support high-end discrete graphics, physics accelerators, and other PCI Express-based solutions to meet the ever-increasing needs of the most demanding enthusiast end-users."

AMD also announced that 'integrated platforms with ATI chipsets' will be coming sometime in 2007.

View: DailyTech


Wow what a sweet deal...for AMD.

Now when you want to upgrade your vid card, you have to do your CPU too.

EVERYBODY wins!! Oh wait... only AMD wins.

Best thing they could develop for a laptop user who wants less heat, more power and faster speeds.

Let's hope they come out with something cool.

I actually can't see this taking off too well for PCs; for laptops and other small devices, yeah. The main reason is that it limits the ability to configure your own computer to your specific needs (meaning you pay for bits you don't want or need).

Sounds like the same thing the Cyrix MediaGX processor was: a CPU/GPU all on one chip. At least that's what I remember of that chip; I only messed with a few of them.

Wow, even I saw that coming after the takeover.
They will try and they will fail, unless DDR4 or who-knows-what next type of RAM is blazing fast. Otherwise graphics needs dedicated RAM (GDDR3, etc.).

Quote - badazzEVO8 said @ #9.2
and your spelling has 'flunked' you back further than pre-k

That comment made you a pre-K dropout. It was not a spelling error, but a grammar error. Oh, and you are not supposed to start a sentence with "and."

Quote - miniM3 said @ #9.5
lol, I got a red card from the grammar police.
I'll return to my pre-k now!

You, all of Neowin, and I need to go back to pre-pre-K.

After reading the article, where this tech would especially get traction is on Origami devices or NT-embedded devices. In the future, once they solve the heat and die-size issues, it would probably be a good solution for pocket video players. Otherwise, it appears this is a low-end solution for cheap desktops and laptops.

I believe Joey L's suggestion was a topic once at Anand's many years ago, and if I remember correctly, it was pretty much agreed that a mobo-based Voodoo5 would barely be able to compete with a dedicated S3 ViRGE due to the speed of the system bus. Of course, back then the bus ran at 66 MHz, but still... even at 1066, I don't believe it would ever be as fast as one of today's (or tomorrow's) dedicated cards. Edit: just to be clear, I am primarily talking about GPU interaction with memory here.
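The bandwidth gap the commenters are pointing at can be sketched with back-of-the-envelope arithmetic: theoretical peak bandwidth is just transfer rate times bus width. The figures below are illustrative, era-typical assumptions (a 1066 MT/s front-side bus with a 64-bit data path versus GDDR3 at 1600 MT/s effective on a 256-bit bus), not specs for any announced AMD part.

```python
# Rough peak-bandwidth comparison: shared front-side bus vs. dedicated
# graphics memory. All numbers are illustrative assumptions.

def peak_bandwidth_gbps(transfers_per_sec, bus_width_bits):
    """Theoretical peak = transfer rate x bus width (converted to bytes)."""
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

# 1066 MT/s front-side bus, 64-bit data path (shared with the CPU)
fsb = peak_bandwidth_gbps(1066e6, 64)

# 1600 MT/s effective GDDR3 on a 256-bit bus (dedicated to the GPU)
gddr3 = peak_bandwidth_gbps(1600e6, 256)

print(f"FSB:   {fsb:.1f} GB/s")      # ~8.5 GB/s
print(f"GDDR3: {gddr3:.1f} GB/s")    # ~51.2 GB/s
```

Even with generous assumptions for the shared bus, the dedicated graphics memory comes out roughly six times faster, which is the crux of the "graphics needs dedicated RAM" argument above.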

My main problem with dual-purpose chips is that they would be wasted on business-only machines; I would hate to see an unused GPU just because it sits in a business machine. But then again, with Vista you require a top-notch graphics setup just to wobble windows around.

I think the GPU and CPU should be separate but both fit into motherboard sockets; then there's an upgrade path, or even separate gaming and business markets for differently spec'd GPUs, rather than an all-in-one gamer chip for everyone.

That's not a bad idea.

The CPU and GPU could flank the memory on both sides... you could buy your own GPU and a third-party heatsink, much like what is done today with CPUs.

You could even upgrade the GPU and memory à la carte!

Quote - TRC said @ #3.2
You'd still have to have separate video memory too. System memory just isn't fast enough. I like that idea though.

Yeah, but they are able to put memory directly on the die nowadays. Not 1 GB worth, but they are making some nice breakthroughs with nano-memory.

Guys, all the things you are discussing are probably being discussed by AMD engineers at this moment. Come on, seriously, these guys aren't some idiots who did an I.T. course; there are many scientists and engineers there who probably have PhDs in nanotechnology, etc.

I'd hardly call the site snazzy; more like butt ugly.