AMD Fusion will ship this year?

AMD has been working on a processor it calls Fusion. We first reported on it back in February, when word started traveling that AMD was working on a project code-named Llano. The new processor puts a CPU and a GPU on the same die, meaning it can handle both your computational processing and your graphics processing in one piece of silicon. To compete with AMD, Intel released its Core i3 and Core i5 processors, which combine a CPU and a GPU in one package, but as separate dies.

Bit-Tech had a chance to talk to someone within AMD about the new Fusion processor. "I don’t think there’s a simple answer to that," said AMD spokesman Bob Grim. "If you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor. What our competitors did was an MCM solution – taking two chips and gluing them together."

Bit-Tech also asked if we would see any speed advantages from combining a GPU and CPU on a single die, and Grim said: "We hope so. We’ve just got the silicon in and we’re going through the paces right now – the engineers are taking a look at it. But it should have power and performance advantages. Going to the 32nm [manufacturing process] is also going to help. I believe Dirk (Meyer, CEO) has gone on record saying we’re going to ship some to customers this year, so hopefully we’ll be able to deliver on that promise. I’m confident that we will be - the silicon looks good. I don’t see any reason why we wouldn’t [hit Meyer’s target] based on where Fusion is now." Though he didn't directly answer the question, he did indicate that the processor may ship to customers this year.

The Fusion processor was scheduled to be released in 2011, but if what AMD says is true, we may see the first child of the AMD/ATI merger later this year.


32 Comments


I'm being honest: back in the day of the K6-2, AMD were literally the processor 'don' and had CPUs that were far better than Intel's for a fraction of the price. Back then, processors all used to run hot.

What's changed? Intels don't run hot now (unless you overclock), but AMDs still do, even on minimal usage, and I've found for years that AMD blatantly lie about the speeds, like the old 'AMD 3000' that runs at 1.8GHz or something?

I prefer intel chips to be honest.

WICKO said,

You're right, AMD should just give up. Intel is obviously better. Competition sucks anyway! /s

I didn't say nor imply that. AMD had the lead for a while, then Intel came back and blew them out of the water. Competition is a wonderful thing, and I'd LOVE to see AMD come back into the heavyweight class again. But right now, they're in the lightweight class with seemingly no signs of becoming seriously competitive again.

n_K, so why doesn't my Q8200 run at 8.2GHz? Product numbers don't mean much and were always relative, except now Intel can be said to be going that route with their multiple lines of Core 2 processors. At the end of the day, no one cares.

n_K said,
I've found for years that AMD blatantly lie about the speeds, like the old 'AMD 3000' that runs at 1.8GHz or something?

If you completely misconstrue the meaning of a number, it isn't AMD's fault. If I were under the incorrect assumption that BMW model numbers referred to their horsepower, would it be OK for me to call them liars?

WICKO said,
You're right, AMD should just give up. Intel is obviously better. Competition sucks anyway! /s

Where did the original poster insinuate anything like that? There is no use boasting about having a 'true quad core' or 'true dual core' if the net result doesn't yield a performance and/or cost advantage over Intel. Engineering beauty may be all very nice, but when push comes to shove I don't know many people who make a decision based on the engineering beauty of a CPU – if engineering beauty were the benchmark of how great something is, we'd all be sitting around staring at MIPS or SPARC64 chips from Fujitsu.

excalpius said,
Oh and Mr. Grim, Intel's "MCM solution" has been kicking your ass for years now...
I agree completely. So what if their solution is less pleasing to an engineer? It still mops the floor with the competition.
n_K said,
I'm being honest; from way back in the day of the K6-2...
I had a K6-2. It was terrible; literally, it was the worst processor I have ever owned. It wasn't until the Athlon (K7) that AMD started winning. Then, as you mention, Intel came out with the Core 2 Duo family and took back the crown.

n_K said,
I've found for years that AMD blatantly lie about the speeds, like the old 'AMD 3000' that runs at 1.8GHz or something?
Those numbers (the 3000) represented a rough equivalence to the Pentium 4's clockspeed. Intel was in a clockspeed war (the GHz number) prior to releasing the Core Duo lineup, and they were losing big to AMD because AMD's "slower"-clocked chips were much faster. However, the average consumer equates speed with clockspeed, so AMD changed their processor names to reflect the comparable Intel clockspeed. The numbers tend to be meaningless nowadays.

Currently, I am in the Intel camp until AMD can manage to beat Intel. Now, I do have a tri-core AMD chip in the home desktop computer I put together myself, but that's because the board is pretty old and AMD (smartly) allowed it to work with the newer chips (from about a year and a half ago). My work machine has a Core 2 Duo, and the laptop I am buying will have a Core i5 processor in it.


if you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor

He obviously left out the tri-core processors they made, which were just faulty quad-core processors.


In all seriousness, though, I'm looking forward to the Fusion processors. AMD have managed to beat NVIDIA when it comes to graphics cards, so let's see if they can beat Intel once again on the processor side. It was all good for them back in the single-core days.

Indeed. It appears as though the only thing keeping AMD in business these days is the success of ATI's recent line. Chalk one up for diversification! 8)

excalpius said,
Indeed. It appears as though the only thing keeping AMD in business these days is the success of ATI's recent line. Chalk one up for diversification! 8)

I sometimes wonder why AMD doesn't offer ATI chipsets for Intel. I know it sounds stupid but in the end if an Intel CPU is chosen for a device, wouldn't it be great to make even a small profit off each device shipped by offering a chipset for it? I'm sure Apple would love the option of being able to choose from more than just Nvidia and Intel by way of chipsets.

excalpius said,
Indeed. It appears as though the only thing keeping AMD in business these days is the success of ATI's recent line. Chalk one up for diversification! 8)

That is not entirely true.

AMD offers many CPU choices with a great price-to-performance ratio against similarly performing Intel processors, and they offer a slightly cheaper platform with much better-performing onboard graphics. AMD doesn't have to beat Intel at the high end, because the reality is most CPUs sold are in the mid-to-low-end range, and that's where the majority of both companies' profit margins come from.

rawr_boy81 said,
I sometimes wonder why AMD doesn't offer ATI chipsets for Intel. I know it sounds stupid but in the end if an Intel CPU is chosen for a device, wouldn't it be great to make even a small profit off each device shipped by offering a chipset for it? I'm sure Apple would love the option of being able to choose from more than just Nvidia and Intel by way of chipsets.
I'd bet that Intel would shoot them down. I love their chips, but Intel is a shady company that is all about lock-in that makes Apple and Microsoft, on their worst days, look like the most open companies. Similar to them blocking nVidia from making Core i3, i5, and i7 chipsets, I imagine AMD would be blocked for the exact same reason (Intel simply wants to be the only player).

Other than that, I wish they would too. I also wish nVidia was still allowed to make them as well. It's ALWAYS good to have more than one player in town, and I wish companies would recognize this even as they justifiably seek profits. What if Intel's chipsets end up being terrible, but nVidia makes an awesome one? I will buy the best chipset and Intel's chips. Intel wins through licensing, and they win directly. However, if the chipset turns out to be terrible, then that would force me into AMD's open arms as it stands now.

pickypg said,
I'd bet that Intel would shoot them down. I love their chips, but Intel is a shady company that is all about lock-in that makes Apple and Microsoft, on their worst days, look like the most open companies. Similar to them blocking nVidia from making Core i3, i5, and i7 chipsets, I imagine AMD would be blocked for the exact same reason (Intel simply wants to be the only player).

Other than that, I wish they would too. I also wish nVidia was still allowed to make them as well. It's ALWAYS good to have more than one player in town, and I wish companies would recognize this even as they justifiably seek profits. What if Intel's chipsets end up being terrible, but nVidia makes an awesome one? I will buy the best chipset and Intel's chips. Intel wins through licensing, and they win directly. However, if the chipset turns out to be terrible, then that would force me into AMD's open arms as it stands now.

In Intel's case, if they did allow AMD to licence the technology to create a chipset for Intel platforms, then Intel could easily get the FTC and EU off their back: "Look, we are open and we're not blocking competitors; we even licensed our technology to AMD/ATI to allow them to make chipsets for our CPUs!" So it wouldn't be too far-fetched if they went down that route.

I believe Intel is going to make the GPU on-die with the next-generation Sandy Bridge arch, late this year.

I don't see what the point of taking up chip space to integrate graphics is.

I mean, will the graphics on these chips even compete with an ultra-low-end dedicated video card?

And will squishing them together not gimp out the CPU side of things?

What are the tangible advantages of this technology?

Overall it looks to me like a move from both CPU makers to cut out Nvidia as much as possible, especially with fairly recent news on that subject that I don't recall Neowin covering.

treemonster said,
I don't see what the point of taking up chip space to integrate graphics is.

I mean, will the graphics on these chips even compete with an ultra-low-end dedicated video card?

And will squishing them together not gimp out the CPU side of things?

What are the tangible advantages of this technology?

Overall it looks to me like a move from both CPU makers to cut out Nvidia as much as possible, especially with fairly recent news on that subject that I don't recall Neowin covering.


I wouldn't say that. The CPU-integrated GPUs being talked about are just as bad as the normal chipset-integrated graphics. There really won't be any competition between an Intel GPU and an nVIDIA one, at least not any time soon.

treemonster said,
I don't see what the point of taking up chip space to integrate graphics is.

I mean, will the graphics on these chips even compete with an ultra-low-end dedicated video card?

And will squishing them together not gimp out the CPU side of things?

What are the tangible advantages of this technology?

Overall it looks to me like a move from both CPU makers to cut out Nvidia as much as possible, especially with fairly recent news on that subject that I don't recall Neowin covering.

I've had a look over at Notebookcheck, and the raw numbers point to the latest built-in being only marginally slower than the Nvidia 9400M, but that doesn't give the full picture of performance given that in terms of features (DirectX, OpenCL, etc.) it is severely lacking. Personally, I'd sooner see Intel focus on lowering the power consumption of their existing chips than wedging a GPU into a device which I'll never utilise (discrete GPU FTW!).

rawr_boy81 said,
Personally I'd sooner see Intel focus on lowering the power consumption of their existing chips than wedging a GPU into a device which I'll never utilise (discrete GPU FTW!).
Actually, that's the entire reason to put an integrated GPU into the CPU: it noticeably lowers the overall power consumption of the system. Now, like you said, it's a bad GPU, but the principle stays the same (as Intel's integrated GPU that was not in the CPU is/was also terrible).

pickypg said,
Actually, that's the entire reason to put an integrated GPU into the CPU: it noticeably lowers the overall power consumption of the system. Now, like you said, it's a bad GPU, but the principle stays the same (as Intel's integrated GPU that was not in the CPU is/was also terrible).

What would happen if they bought a GPU company? I was thinking that maybe Intel could buy out Matrox, who make some pretty good GPUs focused on multi-monitor displays and so forth. For many years I ran a Matrox G550, and it was rock solid with Windows.

I see AMD Fusion as a viable alternative for corporations that want something more powerful than an Intel GMA in their computers without having a dedicated GPU.

Although many conclude so, AMD never lost the war against Intel. On the contrary, they are at a standstill. AMD's best move lately was merging with ATI, and since then they have stayed on top there while their CPU industry status toppled. As for Intel, their IGPs were amazing back in the ol' days and their CPUs were horrible in comparison to their AMD counterparts, but now they've flipped the platform: good CPUs but lacking graphics technology (specifically their latest GMA 4500MHD).

IOW, both companies have their own ball and chain; fortunately for us consumers, and for the sake of a competitive market, each still has another leg to walk on.

As much as I support AMD and will continue to use them, Fusion – which is essentially a system-on-a-chip idea – is a niche. They really need to bring forward a new architecture for their processors so they can take a stab at being performance king again. Fusion won't take off much, as it won't offer much in terms of performance gains and only marginal space/power savings for OEMs, which isn't exactly where they should be aiming, IMO.

Fusion takes an approach similar in principle to the Cell processor's, but in a much simpler form, whereby not just graphics can be offloaded from the main cores but also, perhaps, GPGPU work, physics, or other calculations. It's not like it hasn't been done in the past, and for what it's worth it's more powerful on paper than in reality, which is why AMD also need to revise the K10 architecture – the same as Intel did with the Core architecture – and build up performance from the ground up before working on offloading it elsewhere.

I think you could cook an egg on such a chip; I guess they'll ship it with a heat pump. Nice – in the winter you can heat your room with it.
