AnandTech: Xbox One vs. PS4 - Hardware comparison



Let me ask you this: what are larger textures to you? Please think for a moment before you answer that question. Also, what does more bandwidth get you? What do you need to post-process so much? Have you defined how excess post-processing makes a game better? Shader power? If you have enough shaders to do everything to begin with, what does over-processing get you?
Console games are starving for more power. They always have been, and they still will be this generation. Don't worry, no cycle will be wasted on "over-processing", and no cycle will go unused either. On any console.
Nothing about real-time rendering is hard, and both machines shouldn't break a sweat doing it at 1080p.
No. Both machines will be used at 100% of their power all the time in state-of-the-art AAA titles. If there are any computing resources left, they'll be put to use. If a developer chooses not to, then the next one will, and his game will look better. It's a fierce competition.

Yes. Assuming developers take full advantage of each platform, they'll be able to do more with the PS4 than with the Xbox One, just like they'll be able to do much more with either one of these consoles than with the 360 or PS3. Give engineers more powerful tools and they'll make better products.

Tools don't end with hardware, they begin with great software, engines and intelligent design concepts.

Prerendering is well and good but it's not the magical solution to anything. You cannot pre-render shadows and you cannot pre-render antialiasing, for instance, both of which the PS4 might be able to do somewhat better than Xbox One.

Shadows and anti-aliasing are long solved problems and very easy to do at 1080p.

Yes, I'm a software developer with a bachelor's degree in computer graphics and multimedia. I actually worked at EA with the Frostbite 2 engine. But I suppose if you won't trust PS4/AMD engineers to know what they're doing, then why trust anyone with any kind of degree?

So because you worked at EA, you don't trust MS / AMD engineers? haha

Console games are starving for more power. They always have been, and they still will be this generation. Don't worry, no cycle will be wasted on "over-processing", and no cycle will go unused either. On any console.

You still ignored the question I asked, but anyway, there is a point of diminishing returns. You can claim to want to push that point out as far as you want and throw away as much money as you want, but you can't escape that reality.

No. Both machines will be used at 100% of their power all the time in state-of-the-art AAA titles. If there are any computing resources left, they'll be put to use. If a developer chooses not to, then the next one will, and his game will look better. It's a fierce competition.

Exactly the point I've been stating all along: if you engineer a game for these systems, they're going to look more the same than not.


Tools don't end with hardware, they begin with great software, engines and intelligent design concepts.
Sure...
Shadows and anti-aliasing are long solved problems and very easy to do at 1080p.
You have no idea what you're talking about.
So because you worked at EA, you don't trust MS / AMD engineers? haha
Microsoft has chosen to build a slightly less powerful gaming machine in exchange for, perhaps, lower production costs, thermal envelope, better supply, integrated Kinect, a better TV experience, something else, who knows? It's a tradeoff and they very well know what they're doing; I don't pretend to know or understand exactly what the idea is behind the specs. What's clear is that the Xbox One is a slightly less powerful gaming machine, and that this should mean slightly less spectacular graphics.

Stop and think here for a minute, please.

Let me ask you this: what are larger textures to you? Please think for a moment before you answer that question. Also, what does more bandwidth get you? What do you need to post-process so much? Have you defined how excess post-processing makes a game better? Shader power? If you have enough shaders to do everything to begin with, what does over-processing get you?

Nothing about real-time rendering is hard, and both machines shouldn't break a sweat doing it at 1080p.

What the F... you cannot surely be for real :/


You have no idea what you're talking about.

Prove me wrong then :) Show me: 1) why the design of the Xbox can't possibly do 1080p, in your own words, and 2) why it HAS to be the PS4's design. Do it!

Microsoft has chosen to build a slightly less powerful gaming machine in exchange for, perhaps, lower production costs, thermal envelope, better supply, integrated Kinect and perhaps a better TV experience, I'm not sure. It's a tradeoff and they very well know what they're doing; I don't pretend to know or understand exactly what the idea is behind the specs. What's clear is that the Xbox One is a slightly less powerful gaming machine, and that this should mean slightly less spectacular graphics.

You're the one assuming you know better than Microsoft, yet admitting you don't know why they did what they did, and ignoring every valid reason I've stated page after page that they probably thought of as well. Makes no sense. How do you know there is a tradeoff when neither company has delivered anything playable? How do you know their design isn't going to work, or won't be fully capable at 1080p?


What the F... you cannot surely be for real :/

So you're saying you know more than Microsoft, Microsoft Research, and all of the developers working on next-gen software for the Xbox One?

Here is a paper that shows 1080p real-time rendering on a GTX 680 at 30fps with absolutely NO pre-processing and no engineered system other than a CPU and a discrete video card. It's a solved problem that they're brute-forcing in a tech demo:

http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf

No sane developer is going to push anything that hard any time soon, but they can take what they have learned from this work, apply it logically to their game engine and produce a kick-ass game.

If absolute real-time rendering is your goal, the PS4 will fall flat on its face.

So again, what are we talking about here? This is the type of processing knowledge they implemented on the X360, of all devices; imagine how they can tune it for the current gen on both sides. There is nothing lacking in the Xbox One that removes what they were able to achieve on the 360, so applying the same patterns with exponentially more power means achieving astonishing results without having to brute-force, but by using good engines, good coding and optimizing for the platform.

All said and done, sitting 5 to 8 feet from your screen does have a HUGE impact on the diminishing returns of over-processing.


Anyway, it's been fun. I spend most of my computing time on map-reduce jobs and database systems with gobs of data, but those too are solved problems, and while you can solve some of them with brute-force hardware, sometimes the "engineered" system wins hands down.

And I can't wait to see how they BOTH pan out.

It's just painfully obvious, spec differences aside, that NEITHER has *ANY* problems handling 1080p displays. As with any computing device, time is the biggest compromise, as tomorrow there will always be something bigger, better and faster.


Sorry for off-topic, but I hope you guys will have an announcement to make about independent developers. As it currently stands Xbox One could be the only major gaming platform to not allow self-publishing, and I'd hate to see it lose that share of the cake. XBLIG/XNA had a lot of potential but was poorly managed, hopefully you'll have better plans this time around. :)

I'm not at Microsoft any more so there's no "you guys" :-)

(and I never worked on anything remotely Xbox related)

I'd completely disagree with that. Higher price and better quality wins over cheap every time.

Were you referring to my post? I never mentioned the word "cheap." And your statement is easily proven false. Though as I said, I expect the retail prices to be the same. And quality != raw horsepower of one component. Anyway, the PS3 had more expensive components, it didn't help it "win" anything. Quite the opposite, Sony has been imperiled since the launch of the PS3. Whereas the Xbox business has been growing nicely despite market trends and some initial quality problems (RROD).


Your vision of 3D computer graphics is akin to saying databases only ever do SELECT * FROM `table`.

Developers are bothered with that stuff; I'm the one that matches the data to the technology and builds out the capability and infrastructure to meet the demands of the business and the SLAs we agree to, whether it's scaling out sideways with a distributed system, scaling up vertically with a huge server, going hybrid with Exadata-type appliances, or going at it ourselves.

And it's a lot more than a select statement.

One could spend an infinite amount of money and resources on problems, just as you can throw an infinite amount of money and resources at tech, but any sane business person knows that never works out.

Should I spend money to buy a SAN with 700 disks or 1,100 disks when all I'm doing is trying to distribute a workload to complete a specific job? Am I better off with 700 disks, an optimized flash cache and an optimized process, or should I go for 1,100 just because I can? If I go for 1,100 disks, am I doing it optimally? Same with nodes: if I need a map-reduce job of 4,096 CPUs, am I better off going with 2,000 mid-size nodes or 4,096 small nodes? Where are my REAL bottlenecks and performance problems? Are they real? Are they superficial? Can I solve them through other means? Is it the software? The tool? The job?

Obviously, I think Microsoft thought of all that, and that's the difference between your views and mine. Just as I expect Sony thought of it too. In the end, they will probably meet in the same place, just through different paths.

So how do the PS4 and Xbox One compare to GTX Titan?

Neither comes anywhere near it in capability or specs. That's what, an $1,100 card right now? heh

But at 1080p/60, all three would look the same :)


And it's a lot more than a select statement.

This is exactly what people have been saying to you for the past n pages.

Your obsession with 1080p is exactly as if I were to come into your place of work and tell your boss to fire you because you've over-engineered everything, since all a database ever needs to do is SELECT, while ignoring everyone pointing out the existence of INSERTs, UPDATEs, JOINs, subqueries, etc.

You're seeing one small part of the picture, and what I especially don't get is why you keep arguing when you've admitted you're arguing from a position of ignorance.

There are no views here, only your own ignorance.


You're wasting your time, he clearly is too deeply mired in his love of Microsoft to even pay attention for a second.

The only person he'd listen to is Brandon, but I doubt he'll care to set him straight.

Hah!

I already stated a few pages back my thoughts. Assuming this information is correct (which seems reasonable but not yet certain), then it seems the PS4 will in fact have a fairly straightforward GPU advantage in the form of additional shader units. It's also reasonable to assume that these aren't there just for bragging rights, and in at least some cases there's sufficient capability to keep them busy.

The memory bandwidth question is more complicated. I don't know why Anand hopes the ESRAM is set up as a cache for the main memory. That actually seems like a less-than-ideal usage to me, but then I'm going from limited information. My understanding of the X360 architecture was that its EDRAM was set up to be a perfect place for the frame buffer, which requires a relatively small amount of very fast memory to achieve high-resolution, high-framerate output, and to enable things like anti-aliasing. Unfortunately for the Xbox 360, the world moved to 1080p rather quickly and its 10MB buffer wasn't quite large enough to serve that purpose (assuming at least double buffering) without some more complicated tricks (which reduced the overall effective bandwidth). Despite that, though, using the small but very fast embedded memory as a frame buffer with its own channel to the GPU leaves all the general-purpose memory (and its bandwidth) free for other purposes.

When I read that the X1 had 32MB of ESRAM, I assumed this was primarily to serve the frame buffer. Now that it's of sufficient size for 1080p output (and maybe higher), some of the trickery of using the Xbox 360's EDRAM will go away. If there's some space and bandwidth left, then by all means it could be used as a cache. But the simplest approach seems to be to use it for the frame buffer (and maybe crank up your AA level to saturate it), and then use the general-purpose RAM for everything else. It's easy to argue that the PS4 approach is more flexible, because it is. But whether it's "simpler" is a more complicated question... While more restrictive, a dedicated frame buffer can be simplifying in a different way.
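For what it's worth, the buffer-size arithmetic behind that reasoning is easy to sketch. This is a rough estimate of my own, assuming 32-bit color and ignoring depth/stencil buffers and any compression, so treat the numbers as ballpark only:

```python
# Back-of-envelope frame buffer sizes (my own arithmetic, assuming 32-bit
# color and ignoring depth/stencil; real render target layouts vary).

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2, msaa=1):
    """Approximate size in MB of a swap chain's color buffers."""
    return width * height * bytes_per_pixel * buffers * msaa / (1024 ** 2)

# Xbox 360: a double-buffered 1080p target (~15.8 MB) overflows its 10 MB
# of EDRAM, hence the tiling tricks mentioned above.
print(framebuffer_mb(1920, 1080))

# Xbox One: the same target fits comfortably in 32 MB of ESRAM, and even a
# single 4x MSAA color target (~31.6 MB) just about saturates it.
print(framebuffer_mb(1920, 1080, buffers=1, msaa=4))
```

Which is consistent with the idea that 32MB was sized with a 1080p frame buffer (plus some AA headroom) in mind.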

In general I expect the end result is going to look essentially the same. Undoubtedly you'll see some showcase titles with a very particular effect or supremely complex scene claim that they can only do it on the PS4. The number of cases where these claims are true and for non-Sony titles is probably going to be a very small number. The number of people who can tell the difference, even smaller. I also wouldn't rule out the same happening in the other direction. It's certainly possible some multi-platform games will look better on the X1, for example, if the game is otherwise identical but runs at a higher AA level. Not saying that will happen, just that I wouldn't rule it out given the current information.

Overall, I predict we'll see two similar or identically priced consoles. The PS4 has slightly more graphics grunt but you'll probably never notice. The Xbox comes with Kinect and TV integration and maybe a larger hard drive. Since my daily routine includes far more voice control of Netflix and Hulu while I cook dinner than it does scrutinizing subtle differences in video game water effects, it's an easy decision for me :-)


So if the new Xbox One OS is based on Windows 8, would that mean someone will work out a way to get Xbox games working on a PC? Or turn your Xbox One into a complete Windows 8 Pro machine?


But at 1080p/60, all three would look the same :)

You seem to be fixated on the final rasterization aspect and ignoring the rest. While I find it unlikely that the difference will be substantial, it is fairly straightforward to see that having additional shader units means greater capacity for, well, shading (or anything else you can coax a modern shader unit into doing). This means more detailed effects, more complex meshes, etc. Yes, you're pushing the same number of pixels, but there are more resources available to spend perfecting their arrangement :-)


snip

Not quite the commentary I wanted.

The issue ongoing here is the overriding focus on just one facet of the topic (the framebuffer) when there are a myriad of other tasks that also need to be completed and factored in when talking about bandwidth allocation.

I don't have a comment on either console as I'm never going to buy them; as I said earlier in the thread, I'm quite happy being part of the "PC Gaming Master Race". I'd just like this framebuffer silliness to be finally cleared up.

Whoever "wins" this war, I win too. Because both consoles raise the bottom line for engine development over the next x years.

EDIT: Oh you posted exactly what I wanted after I finished the above and posted it, I love you.


Prerendering is well and good but it's not the magical solution to anything. You cannot pre-render shadows and you cannot pre-render antialiasing, for instance, both of which the PS4 might be able to do somewhat better than Xbox One.

Since I love to be pedantic, I'll point out that AA is probably a bad example. AA is cheap, computationally. I suspect they're evenly matched there (or if anything, perhaps favoring the X1). Though only time will tell for sure.

EDIT: Oh you posted exactly what I wanted after I finished the above and posted it, I love you.

:-)


Prove me wrong then :) Show me: 1) why the design of the Xbox can't possibly do 1080p, in your own words, and 2) why it HAS to be the PS4's design. Do it!

But I never said or implied the Xbox One cannot do 1080p. I actually said that even the 360 could do 1080p, and at 60fps at that, provided a sufficiently simple scene to render. My old Radeon 9600SE from 2004 could do 1080p at 60fps provided all it had to render was a rotating teapot with flat shading. You can achieve the resolution and framerate you want on pretty much any hardware provided you're not rendering something computationally expensive.

Obviously both consoles support 1080p and both consoles will achieve similar framerates and similar level of visual quality. Given its faster hardware, however, games will end up looking somewhat better on PS4.

You're the one assuming you know better than Microsoft, yet admitting you don't know why they did what they did, and ignoring every valid reason I've stated page after page that they probably thought of as well. Makes no sense. How do you know there is a tradeoff when neither company has delivered anything playable? How do you know their design isn't going to work, or won't be fully capable at 1080p?
I never said their design wasn't going to work or not be fully capable at 1080p. Obviously their design will work for what they intend it to do, i.e. great-looking games at 1080p and 30-60fps. That said, PS4 should achieve slightly better visual quality because it can do more calculations per frame. This is the logical conclusion to make from the specs we have. You're trying to argue that the extra power in PS4 will be useless, which is inane.
I'm not at Microsoft any more so there's no "you guys" :-)
Oops, sorry about that. Since when?

I don't understand why people think graphics are going to be important; from what I've seen of both the Xbox and the PS4, there is nothing really spectacular about either console. The best engine I have seen so far has been the Frostbite engine running BF4 on PC. But I don't think graphics will win the console wars next gen.


Oops, sorry about that. Since when?

No worries. About a month now, though I was actually on sabbatical for a few months before that.


What's interesting to me is the idea that the specs of the One really won't matter, depending on how much computing devs offload to the cloud. I read an article on Venturebeat that said that devs could offload AI processing, physics calculations, and even some rendering tasks to the cloud, and over time, the net raw processing power will increase, as MS replaces their servers.

That would require an always-on online connection, something that has been established by the internet commentariat as a non-starter.


The image is still a fixed number of pixels; if you can process every pixel and fill the display with 700 shaders, 400 more won't do squat. Why is that so hard to grasp?

768 isn't enough, though: benchmarks of PC games barely get past 30fps at 1080p on the nearest PC equivalent to the Xbox One GPU we can get. Those extra 384 shaders are the difference between 30 and 60fps at 1080p.
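To put some numbers on the shader-count argument, here's a back-of-envelope budget of my own. The 768/1152 unit counts are the figures being quoted in this thread, and I'm assuming both GPUs clock near 800 MHz; FLOPs per pixel are only a crude proxy, since bandwidth, ROPs and engine design matter at least as much:

```python
# Rough shader budget per pixel per frame (my own numbers; FLOPs are a
# crude proxy -- memory bandwidth, ROPs and engine design all matter too).

def flops_per_pixel(shader_units, clock_hz, width, height, fps):
    # Each GCN ALU can issue one fused multiply-add per clock = 2 FLOPs.
    return shader_units * clock_hz * 2 / (width * height * fps)

CLOCK = 800e6  # assumption: both GPUs clock near 800 MHz

xb1 = flops_per_pixel(768, CLOCK, 1920, 1080, 60)   # ~9.9k FLOPs/pixel
ps4 = flops_per_pixel(1152, CLOCK, 1920, 1080, 60)  # ~14.8k FLOPs/pixel
print(ps4 / xb1)  # 1.5: a constant ratio, whatever the target resolution/fps
```

Note the gap is a fixed 1.5x ratio at any resolution and framerate, so whether it translates into 30fps vs 60fps depends entirely on whether a given game is shader-bound.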


That would require an always-on online connection, something that has been established by the internet commentariat as a non-starter.

I wonder if it can switch on the fly. Use the "cloud" when it's available otherwise limit itself to the box.


No developer is going to want to be seen as having a game that performs differently depending on whether you are connected to the cloud or not. Either it is going to require an online connection and have the cloud processing AI, etc., or it's not going to have it at all.

It's the same reason multiplatform games look very similar if not identical on either console, they can't have a game looking noticeably better on one over the other or people will cry.


This topic is now closed to further replies.