Rumor: All three next-gen game consoles to use AMD GPUs inside

Nintendo has already revealed that its next generation console, the Wii U, will have an AMD graphics processor inside. But will all three next generation consoles have a GPU made by AMD? PC hardware website HardOCP seems to believe so. In a new article, the site claims that some of its "operatives" attended E3 last June, and according to those unnamed sources it's likely that both Sony's successor to the PlayStation 3 and Microsoft's follow-up to the Xbox 360 will have some kind of graphics chip created by AMD.

If true, that also means that AMD's big graphics chip rival, Nvidia, could be watching the next generation game console war from the sidelines. Sony uses an Nvidia-developed GPU in the PS3, but Microsoft and Nintendo have AMD chips inside the Xbox 360 and Wii, respectively.

Nintendo has also revealed that the Wii U, which is due for release sometime in 2012, will have a custom-made multi-core PowerPC processor from IBM inside. HardOCP states that Microsoft's successor to the Xbox 360 looks like it will also have an IBM-created chip inside, although it adds, "it is slightly, ever so slightly possible, that this could change." The story also claims that Microsoft could delay the launch of its next generation console due to the current sales success of its Kinect motion control camera for the Xbox 360.

As for Sony's successor to the PlayStation 3, HardOCP states, " ... we hear it is still unsettled between some kind of Bulldozer (would most likely be an APU) variant and a newer updated 32nm IBM cell processor." The story adds that Sony will make a final decision on this front "soon".


Really, AMD graphics plus Intel performance would be killer. I wonder why console makers haven't got in touch with Intel for a specially tailored processor... why?

thartist said,
Really, AMD graphics plus Intel performance would be killer. I wonder why console makers haven't got in touch with Intel for a specially tailored processor... why?
price

thartist said,
Really, AMD graphics plus Intel performance would be killer. I wonder why console makers haven't got in touch with Intel for a specially tailored processor... why?

And yes, you'll find it very tough putting Intel and AMD on the same integrated motherboard... You can only imagine what sort of blame game would go on if an issue were found.

Why are they still stuck on using generations-old PowerPC? Why isn't anybody considering Intel, the world's fastest chip maker? That's one reason why even next generation consoles won't be able to beat a year-old PC in terms of computing power.

gzAsher said,
Why are they still stuck on using generations-old PowerPC? Why isn't anybody considering Intel, the world's fastest chip maker? That's one reason why even next generation consoles won't be able to beat a year-old PC in terms of computing power.

Trying to compare Intel *PC*-based chips (or AMD PC-based chips, for example) to the specialized processors used in mobile, server, and game console markets is comparing apples to oranges.

Intel makes great PC chips (and server chips... less its Itanic chips) but has not been able to touch ARM in the mobile market with Atom, and has not been able to compete in the graphics arena with its own GPUs. Different types of chips serve different types of computing, and Intel is not master of them all.

etempest said,

Trying to compare Intel *PC*-based chips (or AMD PC-based chips, for example) to the specialized processors used in mobile, server, and game console markets is comparing apples to oranges.

Intel makes great PC chips (and server chips... less its Itanic chips) but has not been able to touch ARM in the mobile market with Atom, and has not been able to compete in the graphics arena with its own GPUs. Different types of chips serve different types of computing, and Intel is not master of them all.

Agreed, portable devices are a different breed altogether, and ARM is better suited for them since it provides always-on, low-power chips with enough computing power to run multiple portable apps simultaneously. However, consoles and a high-end multimedia PC are alike in more than just one way. They are all based around the x86 architecture and, compared to mobile devices, provide much more computing horsepower. (ARM is just catching up with the consoles: http://www.neowin.net/news/arm...laystation-3-with-18-months.
That's another 18 months to match six-year-old hardware performance. To be on par with today's Sandy Bridge, it'll probably take a decade.)

So there's an idea of how far behind today's consoles are compared to PCs. In fact, the consoles of today are restricting innovation in the gaming industry's visual department (read: Crysis 2). Developers have to take six-year-old console hardware into consideration when they design games, where the PC can push the boundaries much farther than what we see today. So it's up to the next-gen consoles to bring the next wave of advancement in computer graphics with some cutting-edge hardware. Hardware that, very sadly, I see nowhere mentioned in this article.

The story also claims that Microsoft could delay the launch of its next generation console due to the current sales success of its Kinect motion control camera for the Xbox 360.

Damnit, EVERYBODY STOP BUYING THE KINECT!

KingCrimson said,

Yeah it's a stupid gimmicky device that serves no purpose.


I'm sure MS would want to release a new console before Sony? Who knows!

iKenndac said,

Damnit, EVERYBODY STOP BUYING THE KINECT!

Why? It's the logical next step in controls. I would be very interested in a second-generation Kinect product. The first one is okay, but refined it would be amazing.

Next step would be to project something visually, a hologram or some form of visible light based interaction.
After that would be tactile feedback or a direct brain connection.
Final step would be total unreality, (a holodeck)

Hmm, I doubt I'll live to see the last one on that list, but I think the next step is reachable within 20 years, and the tactile feedback... well, we'll see.

Heh, I guess Sony didn't learn their lesson about the Cell CPU from the PS3. Maybe with the PS5 they will open their eyes, ditch it, and go with a real CPU.

And let's look at all the non-CISC CPUs out there in anything non-PC-related (and yes; I'm not just talking Cell, but ARM and even Intel's own XScale and Atom). If you want to put them into something that will run PC (non-native) software, you are going to have to deal with the bugbear of *all* non-CISC architectures - emulation. Microsoft has proven experience at translating CISC calls to non-CISC calls via software - however, even though there have been non-CISC CPUs that have brought plenty of horsepower, not to mention firepower, to bear on the problem (DEC's Alpha was *the* undefeated/undisputed champ at running non-native applications on NT for a reason), it still can't match native-architecture speed for the price. (Alpha didn't lose out because it couldn't match 486DX or Pentium in terms of sheer speed - it could, did, and even kicked both their butts, more often than not; the fact that it did so running non-native applications was a decided boggler in and of itself. It lost because 486DX and even Pentium were cheap enough to offset Alpha's performance advantages.)

PGHammer said,
And let's look at all the non-CISC CPUs out there in anything non-PC-related (and yes; I'm not just talking Cell, but ARM and even Intel's own XScale and Atom). If you want to put them into something that will run PC (non-native) software, you are going to have to deal with the bugbear of *all* non-CISC architectures - emulation. Microsoft has proven experience at translating CISC calls to non-CISC calls via software - however, even though there have been non-CISC CPUs that have brought plenty of horsepower, not to mention firepower, to bear on the problem (DEC's Alpha was *the* undefeated/undisputed champ at running non-native applications on NT for a reason), it still can't match native-architecture speed for the price. (Alpha didn't lose out because it couldn't match 486DX or Pentium in terms of sheer speed - it could, did, and even kicked both their butts, more often than not; the fact that it did so running non-native applications was a decided boggler in and of itself. It lost because 486DX and even Pentium were cheap enough to offset Alpha's performance advantages.)

Windows 8 will confirm some of what you are talking about, redefine some of what you are talking about, and also surprise you by breaking some of your assumptions.

The trick with 'cheap' translation is not having to ramp up non-CISC processing/price, but to add in a new translation concept.

The Xbox 360 already does rather well at translation, beyond even what the Alpha and FX!32 were capable of, which for the time was rather smart in doing real-time recompiling instead of just generic translation.

There are so many good arguments for EVERY type of CPU architecture, but it comes down to how well the OS handles the architecture, and this is where numbers start to lie. If you look at something simple like the Snapdragon used in WP7, it looks like a middle-of-the-road CPU/GPU combination. However, when running WinCE, the original hardware 'performance' numbers Qualcomm published (using a Linux kernel/drivers) are 3x below what WP7/WinCE can produce, due to the way WinCE works and its integration and optimization of drivers, and especially when you look at the Adreno GPU, where the advantages of DirectX over OpenGL ES translate into a lot more performance than Qualcomm even expected.

The story of the PS3 and the Cell is another example of a poor OS architecture and development platform choking the potential of the CPU. The Cell in the PS3 should run circles around an Xbox 360 in theory, yet it barely has enough 'advantage' over the Xbox 360 CPU to compensate for the shortcomings of the RSX GPU, whose work has to be shoved through the CPU to keep up with the Xbox 360 in speed and graphics quality.

This is where the HAL layer comes in; it is an architectural point that works really well, as NT itself doesn't have to deal with CPU architectural differences, and the HAL can be highly optimized to translate what NT expects onto any CPU/architecture.

The NT kernel layering and object-based kernel model also help, as they adapt to general shifts in how the hardware works beyond what the HAL offers as a base point, and a few small changes won't break anything because of the object model, so grand-scale shifts in how NT itself works can be made easily without the layers or subsystems knowing any different.

Windows 8 is going to be fun, and will demonstrate why NT is a better OS model and architecture design, something many of the younger generation never had the curiosity to find out. When I was studying OS theory, the idea of NT was such a 'holy cow' moment at the time, in that it threw away so much crap from the OS models of the past, especially the UNIX model, and the team was also determined to get some of the more 'advanced' OS concepts of the time working in NT, which gives it its extensibility (HAL, hybrid kernel, kernel layers, client/server, object-based IPC, OS subsystems).

Nobody noticed the updated Cell processor? Now that's a processor I really would like to see in PCs! THAT is more or less the future, the PowerPC architecture, not ARM... for PCs.

Come to think of it, I would purchase a Windows PC with a Cell processor right away... my... perhaps I'm a fanboy.

Arceles said,
Nobody noticed the updated Cell processor? Now that's a processor I really would like to see in PCs! THAT is more or less the future, the PowerPC architecture, not ARM... for PCs.

Come to think of it, I would purchase a Windows PC with a Cell processor right away... my... perhaps I'm a fanboy.

You can't really compare PowerPC and ARM. ARM has always been about low power consumption; this is why they practically own the mobile processor market and why even the likes of Intel struggle to break into it. PowerPC is, of course, more powerful, but it's nowhere near as efficient as ARM's chips.

All three of the current generation consoles use PowerPC chips inside them, and I fully expect the next generation to continue with this.

Arceles said,
Nobody noticed the updated Cell processor? Now that's a processor I really would like to see in PCs! THAT is more or less the future, the PowerPC architecture, not ARM... for PCs.

Come to think of it, I would purchase a Windows PC with a Cell processor right away... my... perhaps I'm a fanboy.

Ummm, no thanks. Please keep your dead-to-market Cell CPUs off of my desktop.

Arceles said,
Nobody noticed the updated Cell processor? Now that's a processor I really would like to see in PCs! THAT is more or less the future, the PowerPC architecture, not ARM... for PCs.

Come to think of it, I would purchase a Windows PC with a Cell processor right away... my... perhaps I'm a fanboy.


If you actually understood what the Cell processor was, then you'd understand it would be absolutely useless in the PC market, as it has been in the console market as well. The Cell processor is great in one particular area: crunching numbers. At general purpose work it suffers greatly.

Arceles said,
Nobody noticed the updated Cell processor? Now that's a processor I really would like to see in PCs! THAT is more or less the future, the PowerPC architecture, not ARM... for PCs.

Come to think of it, I would purchase a Windows PC with a Cell processor right away... my... perhaps I'm a fanboy.

Perhaps they should make this product just for you. (You would be the only person silly enough to buy it.)

Tony. said,
If you actually understood what the Cell processor was, then you'd understand it would be absolutely useless in the PC market, as it has been in the console market as well. The Cell processor is great in one particular area: crunching numbers. At general purpose work it suffers greatly.

bitcoin?

I don't see any need for MS to change the CPU side of things: stick with PPC and just tweak it a bit while you totally upgrade the GPU. Sticking with AMD and moving up to a DX11, or maybe even DX12 by then, GPU would be the best way to keep backwards compatibility. It's a CPU change that screws things up.

nVidia is better IMO, but AMD is cheaper, and I am sure they will go that route to maximize profits and keep costs down. Oh well, maybe this will mean ports between consoles have fewer issues, but that will also depend on the CPU used as well.

NoLiMiT06 said,
nVidia is better IMO, but AMD is cheaper, and I am sure they will go that route to maximize profits and keep costs down. Oh well, maybe this will mean ports between consoles have fewer issues, but that will also depend on the CPU used as well.

Sorry, but the whole 'nVidia is better' thing is an opinion. AMD/ATi have some great GPUs on the market. Just because nVidia is usually more expensive doesn't mean it's better.

AMD has great performance/price ratios, and over the past couple of years they've usually been on top.

See, the actual main problem is that AMD and nVidia haven't gone head-to-head in recent years with new hardware. In the past, they usually released new products around the same time to compete with each other (a yearly thing), but lately every release has been one on top of the other: six months it's AMD, the next it's nVidia, then it's AMD, and so on.

Tony. said,

See, the actual main problem is that AMD and nVidia haven't gone head-to-head in recent years with new hardware. In the past, they usually released new products around the same time to compete with each other (a yearly thing), but lately every release has been one on top of the other: six months it's AMD, the next it's nVidia, then it's AMD, and so on.

I remember when new video card releases used to be a huge deal, and it helped a great deal that they had distinctive names. Now it seems like new cards are coming out constantly and the names don't stand out at all, it's just GeForce (bunch of random numbers) or Radeon (bunch of random numbers). No new features anymore either, each new card is just a little faster. I don't even pay attention to them anymore, it's become boring. It's the same with CPUs these days.

Tony. said,

Sorry, but the whole 'nVidia is better' thing is an opinion. AMD/ATi have some great GPU's on the market. Just because nVidia is usually more expensive doesn't always mean it's better because of that.

AMD has great Performance/price ratios, and over the past couple of years, they've usually been on top.

See the actual main problem is that AMD and nVidia haven't gone head-to-head in recent years with new hardware. In the past, they usually release new products around the same time to compete with each other (a yearly thing), but every release has been one on top of the other. 6 months it's AMD, the next it's nVidia, then it's AMD and so on.

I think it really depends on the focus of the GPU. Nvidia never made a custom GPU for anyone; the one in the PS3 is the same as one they make for the PC, and the same goes for the Xbox. The reason the current ATI parts are a much better fit, and better in performance/price, is simply that they spent their transistor budget on gaming, while Nvidia has been pushing GPGPU with its GPUs.
On a game console we don't want GPGPU power, we want pure graphics. That is why Nvidia loses out.

TRC said,

I remember when new video card releases used to be a huge deal, and it helped a great deal that they had distinctive names. Now it seems like new cards are coming out constantly and the names don't stand out at all, it's just GeForce (bunch of random numbers) or Radeon (bunch of random numbers). No new features anymore either, each new card is just a little faster. I don't even pay attention to them anymore, it's become boring. It's the same with CPUs these days.

No new features anymore? Well, AMD (since I buy them) got Eyefinity, DirectX 11, and HD3D with the 5000 series, then OpenGL 4.0 for the 5000 series and 4.1 for the 6000, and in the latest Catalyst they got AMD Steady Video technology for 6000 cards.
Of course, now there are a lot of cards with just a little improvement, especially from the 5000 to the 6000 series, but the 7000 promises to be better if they can get the new Southern Islands chips out.
Not so bad, but it's not like it's useful for everyone to upgrade from the 5000 to the 6000 just for one feature, or from the 4000 to the 6000 just for DirectX 11 if you don't even play video games.

Tony. said,

Sorry, but the whole 'nVidia is better' thing is an opinion.

You don't say... And what do you think the "IMO" in his post means?