Xbox One and PS4 to have different game memory allocations

The hardware and performance battle between Microsoft's Xbox One and Sony's PlayStation 4 continues, as a new report claims Sony's next-generation game console allows game developers to use only up to 4.5 GB of its 8 GB of RAM. The other 3.5 GB is reserved for the PS4's operating system.

However, the report from Eurogamer, based on unnamed sources, also claims that Sony might let games access up to 1 GB of additional RAM from the PS4's memory pool under special circumstances. The article makes it clear that not all PS4 games will be allowed to access this extra RAM.

The PS4's game memory allocation compares closely to that of Microsoft's Xbox One, which sets aside 5 GB of its own 8 GB of RAM for game developers. The Eurogamer article claims that the Xbox One's RAM allocation is fixed and can't be changed, unlike the PS4's.

As we have reported before, the 8 GB of RAM inside the PS4 is of the GDDR5 variety, clocking at an effective 5500 MHz for 176 GB/s of bandwidth. This compares to 8 GB of the older DDR3 RAM in the Xbox One, with an effective clock speed of 2133 MHz for just 68.3 GB/s of bandwidth, along with 32 MB of eSRAM that is supposed to offer 102 GB/s of embedded memory bandwidth. A recent rumor claims Microsoft might be able to increase the speed of that embedded memory to 192 GB/s for the final retail version of the console.
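For those wondering where these figures come from, peak theoretical bandwidth is simply the effective data rate multiplied by the bus width. A quick sketch of the arithmetic (assuming the widely reported 256-bit memory buses on both consoles, which the Eurogamer report itself does not state):

```cpp
#include <cstdio>

// Peak theoretical bandwidth = effective data rate (MT/s) x bus width (bits) / 8,
// giving MB/s; divide by 1000 for GB/s. The 256-bit bus width is an assumption
// based on widely circulated spec sheets, not on the Eurogamer report.
int main() {
    const double busBits = 256.0;

    double ps4GDDR5  = 5500.0 * busBits / 8.0 / 1000.0; // GDDR5 at 5500 MT/s
    double xboneDDR3 = 2133.0 * busBits / 8.0 / 1000.0; // DDR3 at 2133 MT/s

    std::printf("PS4 GDDR5:     %.1f GB/s\n", ps4GDDR5);  // 176.0 GB/s
    std::printf("Xbox One DDR3: %.1f GB/s\n", xboneDDR3); // 68.3 GB/s
    return 0;
}
```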

Source: Eurogamer

53 Comments

Minuscule difference in the amount of RAM allocated, and it isn't even the same kind of RAM, so the two can't be compared directly. That's also assuming games will actually use the full 4.5-5.5 GB of RAM, which they most likely won't near launch (that's my guess, anyway).

Just to be very clear...

You can't directly compare GDDR to DDR on raw numbers such as MHz. Another thing to consider is that GDDR has higher latency than DDR.

GDDR5 is not simply better than DDR3 in this case. You should reflect this in your writing, and Google can back up any research you need.

"As we have reported before, the 8 GB of RAM inside the PS4 is of the GDDR5 variety, clocking at 5500 MHz for 170.6 GB/s of bandwidth. This compares to 8 GB of the older DDR3 RAM for the Xbox One, with a clock speed of 2133 MHz for just 68.3 GB/s of bandwidth, along with 32 MB of eSRAM that is supposed to have 102 GB/s of embedded memory bandwidth."

I can tell by your response you have great knowledge of the technology.

In no way did I say DDR3 is better than GDDR5 in all aspects, but I did say the writing needs to explain this better. You're a moron if you can't use Google, just like I said in my original post. Keep in mind GDDR5 is not as good at managing small data chunks and will handle those more slowly, while being faster at larger data chunks such as, hmmm, I don't know... graphics?

Classic case of "Taking the good with the bad"

So I did that for you, since you and the writer are both lazy. I am just quoting a great explanation of just ONE example, because you CAN use Google too!

•DDR3 runs at a higher voltage than GDDR5 (typically 1.25-1.65V versus ~1V)
•DDR3 uses a 64-bit memory controller per channel (so, a 128-bit bus for dual channel, 256-bit for quad channel), whereas GDDR5 is paired with controllers of a nominal 32-bit (16-bit each for input and output). But whereas the CPU's memory controller is 64-bit per channel, a GPU can utilise any number of 32-bit I/Os (at the cost of die size) depending upon application (2 for a 64-bit bus, 4 for 128-bit, 6 for 192-bit, 8 for 256-bit, 12 for 384-bit, etc.). The GDDR5 setup also allows for doubled or asymmetric memory configurations. Normally (using this generation of cards as an example) GDDR5 memory uses 2 Gbit memory chips for each 32-bit I/O (i.e. for a 256-bit bus/2 GB card: 8 x 32-bit I/Os, each connected by a circuit to a 2 Gbit IC = 8 x 2 Gbit = 16 Gbit = 2 GB), but GDDR5 can also operate in what is known as clamshell mode, where the 32-bit I/O, instead of being connected to one IC, is split between two (one on each side of the PCB), allowing for a doubling of memory capacity. Mixing the arrangement of 32-bit memory controllers, memory IC density, and memory circuit splitting allows for asymmetric configurations (192-bit, 2 GB VRAM, for example)
•Physically, a GDDR5 controller/IC doubles the I/O of DDR3 - with DDR, I/O handles an input (written to memory) or an output (read from memory), but not both on the same cycle. GDDR handles input and output on the same cycle.

The memory is also fundamentally set up specifically for the application it uses:
System memory (DDR3) benefits from low latency (tight timings) at the expense of bandwidth; GDDR5's case is the opposite. Timings for GDDR5 would seem unbelievably slow in relation to DDR3, but the speed of VRAM is blazing fast in comparison with desktop RAM - this has resulted from the relative workloads that a CPU and GPU undertake. Latency isn't much of an issue with GPUs, since their parallel nature allows them to move on to other calculations when latency cycles cause a stall in the current workload/thread. The performance of a graphics card, for instance, is greatly affected (as a percentage) by altering the internal bandwidth, yet altering the external bandwidth (the PCI-Express bus, say lowering from x16 to x8 or x4 lanes) has a minimal effect. This is because there is a great deal of I/O (textures, for example) that gets swapped in and out of VRAM continuously - the nature of a GPU is many parallel computations, whereas a CPU computes in a basically linear way.

Source: http://www.techspot.com/commun...ry-and-gddr5-memory.186408/
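To expand on the latency point with a rough worked example of my own (Little's law applied to memory, not something from the quoted post): sustaining a given bandwidth requires roughly bandwidth x latency bytes in flight at all times, and a GPU's thousands of parallel threads supply that concurrency naturally, while a CPU core cannot.

```cpp
#include <cstdio>

// Little's law: bytes in flight = bandwidth x latency.
// Illustrative numbers only (assumed latency, PS4-class bandwidth),
// showing why latency-tolerant GPUs pair well with GDDR5.
int main() {
    const double bandwidthGBs = 176.0; // GDDR5 peak bandwidth
    const double latencyNs    = 300.0; // assumed round-trip memory latency
    const double requestBytes = 64.0;  // one cache-line-sized request

    // GB/s x ns conveniently cancels to plain bytes.
    double inflightBytes = bandwidthGBs * latencyNs;     // 52,800 bytes
    double requests      = inflightBytes / requestBytes; // ~825 requests

    std::printf("Bytes in flight to saturate the bus: %.0f\n", inflightBytes);
    std::printf("Outstanding 64-byte requests needed: %.0f\n", requests);
    // ~825 concurrent requests: easy for a GPU running thousands of threads,
    // hopeless for a CPU core with a handful of line-fill buffers.
    return 0;
}
```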


The PS4 RAM allocation is strange, but it makes sense given that the PS4 has a hard VRAM availability lock due to the OS.

As for the RAM speeds, it is time that people bringing up this topic step back and realize that raw speed does not equal faster or higher quality graphics.

There is a vast difference in how each OS handles GPU assets. With NT's GPU scheduling model and the new features in DX11.2, the OS only needs a tiny amount of high-speed RAM.

Look at the tiled resource technologies: using this model, 16 MB of high-speed RAM is more than enough to drive 60 fps at 1080p, and the Xbox One goes beyond that with a 32 MB high-speed cache.
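To put those cache sizes in perspective, here is a quick back-of-the-envelope calculation of my own (assuming a standard RGBA8 color target and a 32-bit depth/stencil buffer; these numbers are not from the comment above):

```cpp
#include <cstdio>

// How much of a 32 MB eSRAM cache do 1080p render targets actually use?
// Assumes 4 bytes per pixel for both the color target (RGBA8) and the
// depth/stencil buffer (D24S8) - common formats, chosen for illustration.
int main() {
    const double width = 1920.0, height = 1080.0;
    const double bytesPerPixel = 4.0;
    const double mib = 1024.0 * 1024.0;

    double colorMiB = width * height * bytesPerPixel / mib; // ~7.9 MiB
    double depthMiB = width * height * bytesPerPixel / mib; // ~7.9 MiB

    std::printf("1080p color target: %.1f MiB\n", colorMiB);
    std::printf("1080p depth buffer: %.1f MiB\n", depthMiB);
    std::printf("Used of 32 MiB:     %.1f MiB\n", colorMiB + depthMiB);
    return 0;
}
```

A full 1080p color target plus depth buffer fits in roughly half of the 32 MB eSRAM, which is why a small, fast cache can carry the bandwidth-heavy render target traffic.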

The PS4 does not have a kernel-level GPU scheduler, nor can it get one without replacing its FreeBSD-based OS.

Currently only Windows NT offers kernel-level GPU scheduling, which is why Microsoft switched to WDDM 1.0 in Vista; an early variation of this technology was also used in the Xbox 360.

With the GPU scheduling technologies that ONLY exist in NT, the Xbox One will be able to offer much higher resolution textures without needing to load lower resolution textures and transition between them.

This not only increases texture quality but also improves performance and frees up RAM, as the game doesn't have to manage multiple quality levels for its texture assets.

http://msdn.microsoft.com/en-u.../windows/apps/bg182880.aspx

(These changes are also why DX11.2 will probably never be backported to Windows 7, as its WDDM does not have the newer features needed; the 8.1 kernel and WDDM would have to be strapped onto Windows 7 to get DX11.2 to work.)

Here is a simple video of tiled resources, which is just one significant advantage when used with the GPU scheduler technologies exclusive to NT, and it shows why this matters for gaming.

http://www.youtube.com/watch?v=EswYdzsHKMc
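For readers on the PC side, here is a minimal sketch of how a Direct3D 11.2 application can check for the tiled resources feature discussed above. This uses the public desktop D3D11.2 API for illustration; it is not code from the Xbox One SDK:

```cpp
#include <d3d11_2.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

// Query the driver for hardware Tiled Resources support (a D3D11.2 feature).
int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr)))
        return 1;

    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                &opts, sizeof(opts));

    switch (opts.TiledResourcesTier) {
        case D3D11_TILED_RESOURCES_NOT_SUPPORTED:
            std::printf("Tiled resources: not supported\n"); break;
        case D3D11_TILED_RESOURCES_TIER_1:
            std::printf("Tiled resources: tier 1\n"); break;
        default:
            std::printf("Tiled resources: tier 2 or higher\n"); break;
    }
    device->Release();
    return 0;
}
```

Tiled resources are created with the D3D11_RESOURCE_MISC_TILED flag and mapped page by page via ID3D11DeviceContext2::UpdateTileMappings, which is what lets an engine keep only the visible texture tiles resident in fast memory.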


RAM speed is somewhat irrelevant on NT with the WDDM technologies that were designed to handle shared GPU assets and GPU management, as NT only needs the higher-speed eSRAM cache for the GPU to maintain fluid, higher resolution graphics.

Even PC games that use the extra VRAM virtually created by NT's WDDM are able to load full resolution textures without a loss in GPU performance, as the OS can swap them in and out more efficiently than the game would by keeping assets at multiple resolutions and managing them at different draw distances.

(I wish someone would do an in-depth article on how the WDDM 1.3 in NT (8.1) works and how it correlates to DirectX 11.2, which is also used by the Xbox One. I have yet to see any site do a more technical explanation of the technologies, how they affect gaming, and why they matter to end users. If I get any extra time, I may post one to the forums.)

tomtom404 said,
In reply to a very detailed and relevant comment: I would gladly read an article you write on this subject.

Same here. VERY informative post. I would love to hear more about this.

You're dead on; people need to read up on DX11.2 and the hardware-supported tiling feature. They've accounted for the slower DDR3 (versus GDDR5) bandwidth with these features and other GPU tweaks.

But will it be enough?
The GPU of the PS4 is stronger and the memory bandwidth is a lot higher.
It won't matter for most games, as they will be made for the lowest common denominator.
But for the exclusives this can matter. But it will be years before people are able to take full advantage of the systems.
The PS3 is only recently showing its full power.
And if anyone looks at God of War 2, the PS2 title, it has graphics on par with (if not better than) the first PS3 titles.

And now it's the same architecture as the Xbox (although the 360's tri-core design is based on the Cell CPU) and PCs... It's hard to make any predictions.

Shadowzz said,
But will it be enough?
The GPU of the PS4 is stronger and the memory bandwidth is a lot higher.
It won't matter for most games, as they will be made for the lowest common denominator.
But for the exclusives this can matter. But it will be years before people are able to take full advantage of the systems.
The PS3 is only recently showing its full power.
And if anyone looks at God of War 2, the PS2 title, it has graphics on par with (if not better than) the first PS3 titles.

And now it's the same architecture as the Xbox (although the 360's tri-core design is based on the Cell CPU) and PCs... It's hard to make any predictions.

Of course the 'lowest common denominator' is always a concern, as is whether developers understand the newer technologies or the OS's role in managing the GPU.

However, the good news is that games on the PC today already demonstrate a bump in features and performance, even when the game designers are completely unaware of the technology in Windows 7/8.

In most games, end users with even mediocre GPUs will find on Windows 7, and especially Windows 8/8.1, that they can turn the in-game texture settings up to High/Ultra and the draw distance or LOD to High/Ultra and not lose any FPS, even though the game is now handling far larger assets. (This is most noticeable on systems with integrated graphics or 512 MB or less of dedicated VRAM, as the GPU itself doesn't have the inherent RAM to pull off the higher resolution settings.)

End users can also 'circumvent' this and/or give more control over to Windows to increase game performance. Some games will run faster in 'Fullscreen Windowed' than in direct 'Fullscreen' mode, as this gives the DWM control over final writes to the display, which can often be faster.

Another trick with games that refuse to use the extra RAM offered by the WDDM in Windows 7: the end user can run the game in compatibility mode set to 'Windows XP SP3', and then the game will not notice the WDDM and will accept the full 2-3 GB of RAM the OS is offering it for VRAM.


Virtually every game designed around DX8.1/9, and older games designed for the XP era of gaming, runs faster and can turn textures up to higher levels when running on Windows 7/8 (often 25-40% faster).

This increase should get another bump with Windows 8.1: based on our preliminary testing, both CPU- and GPU-bound games gain 20-30% in performance over Windows 8 (especially on newer video cards able to use WDDM 1.2 or WDDM 1.3 drivers).

So even with a poorly ported game, the Xbox One will at the very least be able to offer a bump in performance and run with higher quality texture settings.


Also remember we do not fully know the changes MS made to the APU, but they have disclosed that they did more than adjust clock speeds and customize the number of stream processors, which was the extent of Sony's involvement with AMD.

So we do know there are several new technologies in the MS version of the APU the Xbox One is using; it is not just an off-the-shelf part. MS hardware engineers were behind the Xenon design, and it changed GPUs for a generation, so there could be a lot of significant changes in the APU design.

It should be evident that any performance estimates or 'technical' information are unreliable, as they do not consider what advances MS has made to the APU design. Even the 'known' eSRAM isn't being considered when you see theoretical performance numbers being thrown around.

Blah blah blah, who cares... Just give us some kickass games; then and only then will both consoles sell great.

They need to stop giving us games like CoD: BO2 - way too much BS watching a video and not enough action. Dumb down the graphics a bit and give us more hours of gameplay instead of today's standard of 6 to 8 hours.

At least now I don't have to read comments on here, Reddit, etc. about how the Xbox would hurt due to less usable memory (as if LCD wasn't enough reason).

...Unless of course the console GPUs use shared RAM... (I would hope the GPUs selected for the consoles have discrete RAM separate from the system RAM...)

Maybe it's me and consoles are completely different than PCs (but I doubt it)...
Most AAA PC games use about 1 GB of RAM... (I lazily looked at one game... Crysis 3)...
So is all this "whose system is better for games" based on the amount of memory available... all rubbish, because no games come close to using enough RAM to worry about?

I'm sure there are games that exceed 1 GB (as I only really tested one game), but I'd be surprised if we are anywhere near needing/using 4 GB of RAM for a game at this point in time...

Correct me if I'm wrong, but if you're a PC gamer, chances are that if you're playing something like Skyrim, you're going to modify the hell out of the game anyway. Throw 2-4K textures, ENB injectors, and various other items into the mix, and suddenly you'll find yourself needing that 4GB unlocker to allow the game to utilize 4 GB of memory.
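For the curious: those "4GB unlocker" tools generally just set the Large Address Aware flag in the game executable's PE header so a 32-bit process can use up to 4 GB of address space on 64-bit Windows. Here is a minimal sketch of the idea, assuming a standard 32-bit PE layout (illustrative only, not any particular modding tool; real patchers also fix up the PE checksum):

```cpp
#include <windows.h>
#include <cstdio>

// Set IMAGE_FILE_LARGE_ADDRESS_AWARE in a 32-bit executable's PE header.
int main(int argc, char** argv) {
    if (argc < 2) { std::puts("usage: laa_patch <game.exe>"); return 1; }

    FILE* f = nullptr;
    if (fopen_s(&f, argv[1], "r+b") != 0) { std::puts("cannot open file"); return 1; }

    IMAGE_DOS_HEADER dos;
    fread(&dos, sizeof(dos), 1, f);                       // read the DOS header
    if (dos.e_magic != IMAGE_DOS_SIGNATURE) { fclose(f); return 1; }

    fseek(f, dos.e_lfanew, SEEK_SET);                     // jump to the NT headers
    IMAGE_NT_HEADERS32 nt;
    fread(&nt, sizeof(nt), 1, f);
    if (nt.Signature != IMAGE_NT_SIGNATURE) { fclose(f); return 1; }

    nt.FileHeader.Characteristics |= IMAGE_FILE_LARGE_ADDRESS_AWARE;

    fseek(f, dos.e_lfanew, SEEK_SET);                     // write the headers back
    fwrite(&nt, sizeof(nt), 1, f);
    fclose(f);
    std::puts("Large Address Aware flag set.");
    return 0;
}
```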

playfulsteve said,
Maybe it's me and consoles are completely different than PCs (but I doubt it)...
Most AAA PC games use about 1 GB of RAM... (I lazily looked at one game... Crysis 3)...
So is all this "whose system is better for games" based on the amount of memory available... all rubbish, because no games come close to using enough RAM to worry about?

It's been a while since AAA PC games used just 1 GB of RAM. And you have to remember the RAM usage indicator shown in Task Manager is only system RAM, NOT the GPU RAM, which is 1-4 GB and contains your textures.

The PS4 and XB1 are unified memory systems: both the CPU and GPU use the same RAM. It's 4.5 GB of TOTAL RAM in the PS4, including what both the CPU and the GPU need.

Although games may not need all that RAM at this point, who knows what you'll see a few years from now when these new consoles have been out for a while?

JaykeBird said,
Although games may not need all that RAM at this point, who knows what you'll see a few years from now when these new consoles have been out for a while?

This, and when they finally start to use threading on x86 properly \o/.

In the end it will be almost the same. Even though the PS3 was touted as having more graphical power, that did not take away potential Xbox buyers. The same will apply here. Both are very capable consoles. My money went to Xbox for the entertainment aspect and the 1080p Kinect.

I miss the days of expansion packs. Remember the N64... I was hoping they'd add a slot for this so in two years they could just add a little more RAM or whatever via the slot.

MrAnalysis said,
I miss the days of expansion packs. Remember the N64... I was hoping they'd add a slot for this so in two years they could just add a little more RAM or whatever via the slot.

With these machines being closer to computers than ever, I don't see why they couldn't simply just do a yearly refresh. XboxTwo, 2014, same price with 25% more power... then XboxThree... etc.

If it can be done with cell phones, tablets, laptops, etc...

Simple: people don't want to need to upgrade a $500 (to them) entertainment machine every year just to be able to play the latest games. There was no technological hurdle that would have kept MS from upgrading the 360 over the course of its life; it was as much a PC as the One is (NT-based kernel, PowerPC CPU, Radeon-based GPU). Consoles are static for many years in terms of their specs specifically so that developers have a static target to develop against, and so that consumers feel it is a good investment, as they will only need to make the hardware purchase once every so many years.

wernercd said,

With these machines being closer to computers than ever, I don't see why they couldn't simply just do a yearly refresh. XboxTwo, 2014, same price with 25% more power... then XboxThree... etc.

If it can be done with cell phones, tablets, laptops, etc...


Because game developers need to be targeting one thing... Having various different units, all with different specs, would be a disaster...

wernercd said,

With these machines being closer to computers than ever, I don't see why they couldn't simply just do a yearly refresh. XboxTwo, 2014, same price with 25% more power... then XboxThree... etc.

If it can be done with cell phones, tablets, laptops, etc...


Don't even suggest this!! This would be a NIGHTMARE for gamers.
Imagine purchasing a game and then being greeted with a screen that says "You need to purchase the 2014 model in order to play this game".

My solution with the expansion pack would still allow games to be played; it would be like having a DirectX 10 card and then putting a DirectX 11 card in. The DirectX 10 card still plays games just fine but misses out on a few features.

They could go the same route here: leave a few details out of games, and possibly a game mode, but nothing more; the games would still be playable.

I guess they do this already, though, with add-ons such as cameras and stuff; you can still play the games without them, but having them makes the experience nicer.

Sraf said,
the 360 over the course of its life; it was as much a PC as the One is (NT-based kernel, PowerPC CPU, Radeon-based GPU).

Can't believe people are still saying this after all these years. The 360 does not use the NT kernel (regardless of what you'll find on the internet). Even MS have said this.

NoClipMode said,

Can't believe people are still saying this after all these years. The 360 does not use the NT kernel (regardless of what you'll find on the internet). Even MS have said this.


Huh wut.
According to Microsoft, it is a common misconception that the Xbox and Xbox 360 use a modified Windows 2000 kernel.[37] They claim that the Xbox operating system was built from scratch but implements a subset of Windows APIs. The idea that it does, indeed, run a modified copy of the Windows kernel still persists in the community.

They denied the OS being a derivative of 2000. I can bet you a thousand dollars that the kernel the 360 OS is built upon is, in fact, NT.

What does it matter, honestly? Lol.

The point of the original post was that Microsoft could've made a new beefed-up version of the Xbox 360 with better specs and stuff, but they didn't.

M_Lyons10 said,

Because game developers need to be targeting one thing... Having various different units, all with different specs, would be a disaster...

Game developers don't need to target one thing... Otherwise Android and PC games wouldn't be where they are... It wouldn't be hard to target the latest console and "down-convert" for older versions... especially considering most console games have PC ports that have to worry about the same thing.


Tha Bloo Monkee said,

Don't even suggest this!! This would be a NIGHTMARE for gamers.
Imagine purchasing a game and then being greeted with a screen that says "You need to purchase the 2014 model in order to play this game".

MS could put in requirements that a game has to support 3 versions of the console... Just like the iPhone supports 2-3 versions back. It would be less of a nightmare than supporting Android, and plenty of companies succeed in that arena across phones, tablets and many OS versions.

Easy? Maybe not... but far from impossible especially if MS keeps things under control and keeps assurances in place.

The side of this rumour that I heard was that yeah, at first games would have 4.5 GB of RAM, but that could be upped to 5.5 GB. I believe Sony did a similar thing with the PS3, where originally something like 192 MB of RAM was reserved for the system, but over time that was reduced as Sony optimised further and further.

Either that, or games that really need the extra RAM will have to forgo certain features - perhaps in-game video sharing or background downloads or something.

Actually, Eurogamer updated their article. The flexible RAM amount is apparently only 512 MB. It's now definitely 5 GB for games on the Xbox One versus 4.5 GB on the PS4, with the possibility of an additional 512 MB.

You're almost certainly right. Developers will target the lowest common denominator, so we'll probably get near-identical games on most systems. If they plan to really use that extra 500 MB of RAM, it'll be in trivial ways you probably won't care about (the XBOne version has extra birds in the sky!).
Then again, the PS4 has more graphical horsepower, so perhaps the XBOne versions, although capable of having more... stuff... will be less capable of rendering it.

Or developers will just not care to push each console to its limit with their multiplats.

Who is to say a firmware update won't optimize this in the future? It will also take years before you see developers pushing the limits on these boxes. People are picking fights over effects that most will not even notice. You have to give it to these clowns... with so many exclusives, we just about have to buy both to play the gems. Heck, I guess it's cheaper than a new laptop.

Patrick Danielson said,
Who is to say a firmware update won't optimize this in the future? It will also take years before you see developers pushing the limits on these boxes. People are picking fights over effects that most will not even notice. You have to give it to these clowns... with so many exclusives, we just about have to buy both to play the gems. Heck, I guess it's cheaper than a new laptop.

DDR3 is really old; my 3-year-old PC uses it, ffs.

It might be "adequate" but it pales in comparison to Sonys offering.

boo_star said,

DDR3 is really old; my 3-year-old PC uses it, ffs.

It might be "adequate" but it pales in comparison to Sonys offering.

And 3 years ago graphics cards were using GDDR5. What's your point? They're both pretty old.

I'd bet your PC does not have DDR3 running at 2133 MHz though. DDR3 also has lower latency than GDDR5. Then there's the 32 MB of eSRAM in the XBO, which might be small, but if used as a frame buffer it can come in very handy.


OK, so what you are saying is that PC gaming and the Xbox One will never achieve the greatness of the PS4 because it uses GDDR5 shared system/graphics memory. That is, unless AMD takes over the market and pretty much all PC CPUs are replaced with their APUs and GDDR5. It would totally change the current architecture. Nobody considers the new OS Sony will need either. It really doesn't matter in the end, because you've made your mind up. 70 FPS compared to 60 FPS makes all the difference. You are loyal to a brand.

In my eyes it won't matter... I will have both. No need to justify when I can choose which I like best after I get my hands on them. No worries about how the devs handle it, LOL. I've seen this speculation for too many years to care now... Voodoo vs. ATI, AMD vs. Intel, Radeon vs. GeForce - blah, blah, blah. What do we get? Maybe a couple extra shadows, ice shavings, or birds. The first Xbox was twice as powerful as the PS2 and the games looked so similar that it didn't matter then either. It's going to be the experience that counts.

boo_star said,

DDR3 is really old; my 3-year-old PC uses it, ffs.

It might be "adequate" but it pales in comparison to Sonys offering.


Yeah - and that Cell processor that Sony is using is going to kick the Xbox's ass too -- it's so much more powerful!!! Oh, wait...

Fezmid said,

Yeah - and that Cell processor that Sony is using is going to kick the Xbox's ass too -- it's so much more powerful!!! Oh, wait...

Have not seen 360s being used in scientific number-crunching situations.

Shadowzz said,

Have not seen 360s being used in scientific number-crunching situations.

Haven't seen gamers care if their game console can be used in scientific number crunching situations. Unless there's a game out there where the most crunched number wins?

Fezmid said,

Haven't seen gamers care if their game console can be used in scientific number crunching situations. Unless there's a game out there where the most crunched number wins?

Cause marketing