AnandTech: Xbox One vs. PS4 - Hardware comparison

I honestly don't see the cloud being useful in many situations. The wildly variable performance and latency of the internet make it completely useless for anything that needs to be kept reasonably in sync with the box.


Wow. No wonder Microsoft wants to push their cloud idea so hard. Because the system is even weaker at gaming than we thought.


768 shaders isn't enough, though. Benchmarks of PC games barely get past 30 FPS at 1080p on the nearest PC equivalent to the Xbox One GPU; those extra 384 shaders are the difference between 30 and 60 FPS at 1080p.

That seems highly unlikely...

(Even considering a linear increase in framerate with the number of shader units, the best you'd get is a 50% improvement, i.e. 45fps versus 30fps. But it doesn't really work that way either)
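For anyone who wants that arithmetic spelled out, here's a minimal sketch, assuming (generously) that framerate scales linearly with shader count and taking the commonly quoted 768 vs. 1152 shader figures as given:

```python
# Back-of-the-envelope check of the shader-count claim above.
# Assumption: framerate scales linearly with shader count (it doesn't in
# practice, so this is the PS4's best case, not a prediction).
xbox_one_shaders = 768
ps4_shaders = 1152            # commonly quoted figure, treated as an assumption

scaling = ps4_shaders / xbox_one_shaders        # 1.5x
baseline_fps = 30                               # the 1080p figure quoted above

print(f"Best-case scaling: {scaling:.2f}x")                      # 1.50x
print(f"Best-case framerate: {baseline_fps * scaling:.0f} fps")  # 45 fps, not 60
```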


That would require an always-on online connection, something that has been established by the internet commentariat as a non-starter.

I think you'll see it used mainly for multiplayer to start, in which case being always online is needed anyway.

After that, if a developer is smart about it for single-player, they should be able to detect whether you're online: if you are, use the cloud for a nice boost; if not, don't. Either way it'll just be a bonus, not night and day for the game, IMO.


Kotaku: Only 90% of Xbox One GPU usable for games, PS4 GPU up to 66% more powerful - http://kotaku.com/th...angel-509597078

You forgot to mention only 6 cores and only 5GB memory?

PSA: the Kotaku article is about game running states, which are similar to app states on WP and W8. This is not a doom-and-gloom article, and it does not mention the PS4 being 66% more powerful (I don't know either way, but it isn't in that article). The 90% GPU access refers to the running state of the game, per the article.


So like it is on the PC? I know for sure my PC is still doing loads of stuff in the background even when I play a game, and the desktop and any open apps are still there in memory, held ready to be displayed when I either close the game or alt-tab out. If you want true multitasking on a console, then some percentage is held back for this exact purpose. If you're fine with that, the benefits are clear: I like the fact that I won't have to quit my game fully if I want to switch over to the dashboard or some other app to check something for a bit and then go back to gaming.


Except the PC does not deliberately hold back a portion of your GPU power if you happen to have something open and idle in the background while playing a game.


Except the PC does not deliberately hold back a portion of your GPU power if you happen to have something open and idle in the background while playing a game.

I may be wrong but I think they're reserving a portion of the GPU for rendering non-game related assets (e.g. the dashboard, a video running in the background).

Anyway, it's clear that the PS4 has superior hardware but I don't think it'll show in multi-platform titles. The real battle for the image quality crown will be waged with exclusives. Microsoft has a software advantage (DirectX 11) and Sony has a hardware advantage (GDDR5 RAM + more powerful GPU).


I wonder what the cost will be for such a console with such expensive memory. Is Sony planning on selling their console at a loss again? Or is the console going to be that expensive?


I wonder what the cost will be for such a console with such expensive memory. Is Sony planning on selling their console at a loss again? Or is the console going to be that expensive?

Sony's camera isn't going to cost as much as the Kinect 2.

Sony most likely hasn't been spending big bucks like MS to secure things like the NFL either (est. $400 million).


I may be wrong but I think they're reserving a portion of the GPU for rendering non-game related assets (e.g. the dashboard, a video running in the background).

Anyway, it's clear that the PS4 has superior hardware but I don't think it'll show in multi-platform titles. The real battle for the image quality crown will be waged with exclusives. Microsoft has a software advantage (DirectX 11) and Sony has a hardware advantage (GDDR5 RAM + more powerful GPU).

DirectX isn't really a software advantage; it's just different. Not to mention that on a console it's far more feasible to cut out the API altogether and go straight to the hardware.


Except the PC does not deliberately hold back a portion of your GPU power if you happen to have something open and idle in the background while playing a game.

I am guessing it does.

Sony's camera isn't going to cost as much as the Kinect 2.

Sony most likely hasn't been spending big bucks like MS to secure things like the NFL either (est. $400 million).

Kinect is a few more sensors and possibly one or two additional chips compared to any standard webcam. Pretty cheap stuff when you buy in volume.


The 10 MB eDRAM die that the Xbox 360 had did some other stuff too: it gave the 360 free 4x FSAA, z-buffering, and alpha blending with zero performance hit. The Xbox One should have the same logic on the 32 MB eSRAM die, and maybe some other stuff too. Looking at just the raw power of the compute units doesn't tell the whole story.


Sony's camera isn't going to cost as much as the Kinect 2.

Sony most likely hasn't been spending big bucks like MS to secure things like the NFL either (est. $400 million).

I was actually talking about the cost of the whole console, not just the camera sensor. Not sure why you narrowed it to the camera only; I didn't know the camera had expensive memory. Anyway, I'm waiting to see how much it's going to cost, because I can see Sony selling at a loss, and it might be years before they can turn a profit. By the same token, I wonder who will turn a profit first, Microsoft or Sony? I think that's where the real winners will be made.


The 10 MB eDRAM die that the Xbox 360 had did some other stuff too: it gave the 360 free 4x FSAA, z-buffering, and alpha blending with zero performance hit. The Xbox One should have the same logic on the 32 MB eSRAM die, and maybe some other stuff too. Looking at just the raw power of the compute units doesn't tell the whole story.

The 10 MB of eDRAM typically wasn't big enough for MSAA at a native resolution of 720p (that's why you hear a lot about tiled rendering on the 360). It's also not quite "free" AA; low cost would probably be more accurate.
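As a rough sanity check on the tiling point, here's a minimal sketch assuming a plain 32-bit color target plus a 32-bit depth/stencil target, with every MSAA sample stored in eDRAM (the real Xenos setup is more involved):

```python
# Rough framebuffer math for 720p with 4x MSAA versus 10 MB of eDRAM.
# Assumes RGBA8 color + D24S8 depth/stencil, all samples resident in eDRAM.
width, height = 1280, 720
samples = 4                   # 4x MSAA
bytes_color = 4               # RGBA8
bytes_depth = 4               # D24S8

color = width * height * samples * bytes_color
depth = width * height * samples * bytes_depth
total_mb = (color + depth) / (1024 ** 2)

print(f"4x MSAA at 720p needs roughly {total_mb:.1f} MB")  # ~28.1 MB
print("Available eDRAM: 10 MB -> render in tiles")
```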

And no, the eSRAM won't be able to magically make the One's GPU on par with the PS4's GPU.


The 10 MB of eDRAM typically wasn't big enough for MSAA at a native resolution of 720p (that's why you hear a lot about tiled rendering on the 360). It's also not quite "free" AA; low cost would probably be more accurate.

And no, the eSRAM won't be able to magically make the One's GPU on par with the PS4's GPU.

According to the VP of engineering at ATI at the time, the 10 MB eDRAM die could do 1280x768 with 4x multisample antialiasing at no performance penalty. I think I would take his word over yours any day of the week.

And I wouldn't be so confident in saying the Xbox One GPU won't be on par with the PS4 GPU.

Take a look at these tests with AA:

http://www.hardocp.c.../2#.UaUaXsDD9dg

A GTX 580 playing F.E.A.R. 3 at 1920x1200 with 4x AA had its framerate HALVED compared to NVIDIA's FXAA technology. Imagine the difference without any AA at all.

The Xbox gets free AA, among other things. That is a huge ****in deal.


I would wager a modern GPU would actually be a lot more heat efficient than those in the current consoles.

They effectively are.

But here we're talking about basically the same GPU, same architecture. The more powerful one will dissipate more heat. The difference might be small, though, since both are mostly mid-range GPUs.


Except the PC does not deliberately hold back a portion of your GPU power if you happen to have something open and idle in the background while playing a game.

It's actually way worse on PC. While developers will be able to take advantage of 90% of the Xbox One's GPU, it's unlikely they're taking advantage of even 70% of most PC video cards. Widely varying configurations + dedicated VRAM + heavy layers of abstraction means a lot of wasted potential.

According to the VP of engineering at ATI at the time, the 10 MB eDRAM die could do 1280x768 with 4x multisample antialiasing at no performance penalty. I think I would take his word over yours any day of the week.

And I wouldn't be so confident in saying the Xbox One GPU won't be on par with the PS4 GPU.

Take a look at these tests with AA:

http://www.hardocp.c.../2#.UaUaXsDD9dg

A GTX 580 playing F.E.A.R. 3 at 1920x1200 with 4x AA had its framerate HALVED compared to NVIDIA's FXAA technology. Imagine the difference without any AA at all.

The Xbox gets free AA, among other things. That is a huge ****in deal.

I don't think we even know that the eSRAM is set up to function the same way as the eDRAM in the 360. Besides, this is speaking strictly of hardware MSAA, something developers have been moving away from in favor of post-process antialiasing for many reasons. So even if it were set up to function exactly the same as on the 360, it could end up being a moot point most of the time anyway.

As for F.E.A.R. 3... yeah, that's often what happens when you enable MSAA in a deferred renderer. That's one of the reasons many modern games don't offer MSAA.
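If it helps, here's a rough illustration of why MSAA is so punishing in a deferred renderer: every G-buffer target has to carry all the samples, so memory (and the bandwidth to fill it) scales with the sample count. The four-target G-buffer layout below is an assumption for illustration, not F.E.A.R. 3's actual one.

```python
# Illustrative G-buffer memory cost with and without 4x MSAA.
# Assumption: four 32-bit render targets (e.g. albedo, normals, depth,
# material parameters) at 1920x1200 -- not any specific game's real layout.
width, height = 1920, 1200
gbuffer_targets = 4
bytes_per_target = 4

def gbuffer_mb(samples: int) -> float:
    return width * height * gbuffer_targets * bytes_per_target * samples / (1024 ** 2)

print(f"No MSAA:  {gbuffer_mb(1):.0f} MB of G-buffer")   # ~35 MB
print(f"4x MSAA: {gbuffer_mb(4):.0f} MB of G-buffer")    # ~141 MB
```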

It's actually way worse on PC. While developers will be able to take advantage of 90% of the Xbox One's GPU, it's unlikely they're taking advantage of even 70% of most PC video cards. Widely varying configurations + dedicated VRAM + heavy layers of abstraction means a lot of wasted potential.

That isn't quite what I meant, and I know about PCs not being able to access their full power for various reasons. What I meant was that the PC doesn't cut you off from some of the performance you do have available just because it can multitask. As in, you get all of that, say, 70%.


I don't think we even know that the eSRAM is set up to function the same way as the eDRAM in the 360.

Exactly, you don't know, which is exactly why I told you not to be so confident that the XB1 GPU won't be on par with the PS4 GPU. There could be a lot of custom logic that takes work away from the compute units.

Besides, this is speaking strictly of hardware MSAA, something developers have been moving away from in favor of post-process antialiasing for many reasons. So even if it were set up to function exactly the same as on the 360, it could end up being a moot point most of the time anyway.

All the post-process antialiasing I've seen causes a texture-blurring cluster****. If MSAA is free on the XB1, I'm sure developers would use that instead, since they don't have to care about a performance penalty. In turn, this would leave the PS4 with shittier quality, which again makes the whole "PS4 GPU is more powerful" argument fall flat on its face.

As for F.E.A.R. 3... yeah, that's often what happens when you enable MSAA in a deferred renderer. That's one of the reasons many modern games don't offer MSAA.

But like I said, if it's there on the Xbox with no penalty, they would use that instead and wouldn't have to worry about a performance hit.


Exactly, you don't know, which is exactly why I told you not to be so confident that the XB1 GPU won't be on par with the PS4 GPU. There could be a lot of custom logic that takes work away from the compute units.

Honestly, something like that is highly unlikely, even if I were to be as overly optimistic as you are. And besides, if there were a whole lot of secret sauce that could make it match or beat the PS4's GPU, don't you think Microsoft would have touted that instead of giving us some completely useless crap like "5 BILLION TRANSISTORS"?

All the post-process antialiasing I've seen causes a texture-blurring cluster****. If MSAA is free on the XB1, I'm sure developers would use that instead, since they don't have to care about a performance penalty. In turn, this would leave the PS4 with shittier quality, which again makes the whole "PS4 GPU is more powerful" argument fall flat on its face.

But like I said, if it's there on the Xbox with no penalty, they would use that instead and wouldn't have to worry about a performance hit.

On the current consoles, blurring can be an issue to some degree with post-AA, due to the relic GPUs they have. But with the higher resolutions and much higher quality implementations available on the PC, which will also likely be usable on the next-gen hardware, that's much less of an issue. Also, the reasons for using post-AA aren't just about performance: standard MSAA actually doesn't work on various things (alpha-tested edges and shader aliasing, for instance).


DirectX isn't really a software advantage; it's just different. Not to mention that on a console it's far more feasible to cut out the API altogether and go straight to the hardware.

It's the standard for game development on PCs, and given the architectural similarities between PC hardware and the next-gen consoles, I'd say it's an advantage (albeit a slight one). To my knowledge, the PS4 will use OpenGL 4.2, which developers use for PCs as well, but it isn't as popular as DirectX 11.1. Cutting out the API entirely to go direct-to-metal is probably the best option from a performance perspective, but I imagine it would make things harder in terms of development.

Anyway, console developers already have low-level access to the hardware, which is why the image quality gap isn't as big as it should be given the performance gap. As for the next generation, it's nice to know that they'll be more future-proof than the current generation. The Xbox 360 supports a specific version of DirectX 9 but, unfortunately, it was released before DirectX 10.


Honestly, something like that is highly unlikely, even if I were to be as overly optimistic as you are. And besides, if there were a whole lot of secret sauce that could make it match or beat the PS4's GPU, don't you think Microsoft would have touted that instead of giving us some completely useless crap like "5 BILLION TRANSISTORS"?

Unlikely based on what? Compute unit counts don't tell you anything at all; it's all architecture, how much work is being done per clock cycle. You can have half the compute units but do twice the work per clock cycle, and in the end you have two similarly performing devices. We've seen Microsoft use some custom logic in their previous GPU, so what makes you think it's unlikely they would do more of the same?

On the current consoles, blurring can be an issue to some degree with post-AA, due to the relic GPUs they have. But with the higher resolutions and much higher quality implementations available on the PC, which will also likely be usable on the next-gen hardware, that's much less of an issue.

Not really. Try 1080p PC gaming; you can see it. It's still an issue, and it bugs the hell out of me.

Also, the reasons for using post-AA aren't just about performance: standard MSAA actually doesn't work on various things.

Sure, it doesn't work on some things, but you can actually combine MSAA and post-AA and get the benefits without the performance hits.


Unlikely based on what? Compute unit counts don't tell you anything at all; it's all architecture, how much work is being done per clock cycle. You can have half the compute units but do twice the work per clock cycle, and in the end you have two similarly performing devices. We've seen Microsoft use some custom logic in their previous GPU, so what makes you think it's unlikely they would do more of the same?

We know they're both using AMD's current GCN architecture (used in the desktop Radeon 7700-7900 series), which means their abilities can be directly compared without issue. And since they're the same architecture, with the One's variant being an even more cut-down one than the PS4's, there's basically no chance of what you describe happening (twice as much work done per clock).
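For a rough sense of how directly comparable they are, here's a minimal sketch of peak shader throughput, assuming the commonly reported shader counts and a ballpark 800 MHz clock for both chips (final retail clocks hadn't been confirmed, so treat the numbers as estimates):

```python
# Peak single-precision throughput for two GCN-based parts.
# Assumptions: 768 vs. 1152 shaders, ~800 MHz for both (unconfirmed clocks).
def gcn_tflops(shaders: int, clock_ghz: float) -> float:
    # Each GCN ALU can issue one fused multiply-add per clock = 2 FLOPs.
    return shaders * 2 * clock_ghz / 1000

xbox_one = gcn_tflops(768, 0.8)     # ~1.23 TFLOPS
ps4 = gcn_tflops(1152, 0.8)         # ~1.84 TFLOPS

print(f"Xbox One: {xbox_one:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS")
print(f"Ratio: {ps4 / xbox_one:.2f}x")    # same 1.5x as the shader-count ratio
```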

With the architectures being the same, that leaves the eSRAM as the only notable difference between the two consoles. And no matter what you want to believe, that simply won't make up for the difference. It may help in some ways, but it won't make them equal.

Not really. Try 1080p PC gaming; you can see it. It's still an issue, and it bugs the hell out of me.

I didn't say it was a complete non-issue; I said it was less of one.

I've also seen post-AA at 1080p quite a bit. I usually find the texture blur quite hard to spot unless I'm actively looking for it.

