AnandTech: Xbox One vs. PS4 - Hardware comparison



#166 +Brandon Live

Brandon Live

    Seattle geek

  • 9,766 posts
  • Joined: 08-June 03
  • Location: Seattle, WA

Posted 22 May 2013 - 23:17

But at 1080p/60.. all 3 would look the same :)


You seem to be fixated on the final rasterization aspect and ignoring the rest. While I find it unlikely that the difference will be substantial, it is fairly straightforward to see that having additional shader units means greater capacity for, well, shading (or anything else you can coax a modern shader unit into doing). This means more detailed effects, more complex meshes, etc. Yes, you're pushing the same number of pixels, but there are more resources available to spend perfecting their arrangement :-)
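To put rough numbers on that point — same pixel count, bigger per-pixel compute budget — here's a back-of-envelope sketch in Python. The clock speed and ops-per-unit figures are illustrative assumptions, not official specs for either console:

```python
# At a fixed output resolution, extra shader units don't push more pixels;
# they raise the per-pixel compute budget. All figures here are assumed.

def flops_per_pixel(shader_units, clock_hz, pixels, fps, ops_per_unit_per_clock=2):
    """Rough per-pixel FLOP budget for one frame's worth of shading."""
    total_flops = shader_units * clock_hz * ops_per_unit_per_clock
    return total_flops / (pixels * fps)

pixels_1080p = 1920 * 1080
clock = 800e6  # assumed ~800 MHz GPU clock, for illustration only

budget_768 = flops_per_pixel(768, clock, pixels_1080p, 60)
budget_1152 = flops_per_pixel(1152, clock, pixels_1080p, 60)

print(f"768 units:  ~{budget_768:.0f} FLOPs per pixel per frame")
print(f"1152 units: ~{budget_1152:.0f} FLOPs per pixel per frame")
print(f"ratio: {budget_1152 / budget_768:.2f}x")
```

Whatever absolute numbers you plug in, the ratio comes out to 1.5x more shading work available per pixel — spent on effects and mesh complexity rather than resolution.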


#167 Athernar

Athernar

    ?

  • 3,001 posts
  • Joined: 15-December 04

Posted 22 May 2013 - 23:19

snip


Not quite the commentary I wanted.

The ongoing issue here is the overriding focus on just one facet of the topic (the framebuffer), when there are myriad other tasks that also need to be completed and factored in when talking about bandwidth allocation.

I don't have a comment on either console as I'm never going to buy them, as I said earlier in the thread - I'm quite happy being part of the "PC Gaming Master Race" - I'd just like this framebuffer silliness to be finally cleared up.

Whoever "wins" this war, I win too. Because both consoles raise the bottom line for engine development over the next x years.

EDIT: Oh you posted exactly what I wanted after I finished the above and posted it, I love you.

#168 +Brandon Live

Brandon Live

    Seattle geek

  • 9,766 posts
  • Joined: 08-June 03
  • Location: Seattle, WA

Posted 22 May 2013 - 23:31

Prerendering is well and good but it's not the magical solution to anything. You cannot pre-render shadows and you cannot pre-render antialiasing, for instance, both of which the PS4 might be able to do somewhat better than Xbox One.


Since I love to be pedantic, I'll point out that AA is probably a bad example. AA is cheap, computationally. I suspect they're evenly matched there (or if anything, perhaps favoring the X1). Though only time will tell for sure.

EDIT: Oh you posted exactly what I wanted after I finished the above and posted it, I love you.


:-)

#169 Andre S.

Andre S.

    Asik

  • 7,708 posts
  • Joined: 26-October 05

Posted 22 May 2013 - 23:56

Prove me wrong then :) Show me 1) why the design of the Xbox can't possibly do 1080p, in your own language, and 2) why it HAS to be the PS4 design. Do it!

But I never said or implied the Xbox One cannot do 1080p. I actually said that even the 360 could do 1080p, and at 60fps at that, provided a sufficiently simple scene to render. Actually, my old Radeon 9600SE from 2004 could do 1080p at 60fps provided all it had to render was a rotating teapot with flat shading. You can achieve the resolution and framerate you want with pretty much any hardware provided you're not rendering something computationally expensive.
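The arithmetic behind that point is simple: resolution times framerate is just a pixel throughput figure, and it's a modest one by GPU standards.

```python
# Raw pixel throughput needed for 1080p at 60fps, ignoring shading cost.
# This is why resolution/framerate alone says little about hardware power.
width, height, fps = 1920, 1080, 60
pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")  # ~124.4 Mpixels/s
```

Roughly 124 million pixels per second — an order of magnitude below the fill rates even mid-2000s GPUs advertised. The hard part is never filling the pixels; it's how much shading work each pixel gets.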

Obviously both consoles support 1080p and both consoles will achieve similar framerates and similar level of visual quality. Given its faster hardware, however, games will end up looking somewhat better on PS4.

You're the one assuming you know better than Microsoft and yet, admitting you don't know why they did what they did and ignoring every valid reason I've stated page after page that they probably thought of as well. Makes no sense. How do you know there is a tradeoff when neither company has delivered anything playable? How do you know their design isn't going to work or be fully capable at 1080p?

I never said their design wasn't going to work or not be fully capable at 1080p. Obviously their design will work for what they intend it to do, i.e. great-looking games at 1080p and 30-60fps. That said, PS4 should achieve slightly better visual quality because it can do more calculations per frame. This is the logical conclusion to make from the specs we have. You're trying to argue that the extra power in PS4 will be useless, which is inane.

I'm not at Microsoft any more so there's no "you guys" :-)

Oops, sorry about that. Since when?

#170 Melfster

Melfster

    Neowinian Senior

  • 1,699 posts
  • Joined: 04-August 05

Posted 23 May 2013 - 00:03

I don't understand why people think graphics are going to be important. From what I've seen of both the Xbox and PS4, there's nothing really spectacular about either console. The best engine I've seen so far has been the Frostbite engine running BF4 on PC. But I don't think graphics will win the console wars next gen.

#171 +Brandon Live

Brandon Live

    Seattle geek

  • 9,766 posts
  • Joined: 08-June 03
  • Location: Seattle, WA

Posted 23 May 2013 - 00:04

Oops, sorry about that. Since when?


No worries. About a month now, though I was actually on sabbatical for a few months before that.

#172 soniqstylz

soniqstylz

    Neowin Trophy Slore

  • 8,683 posts
  • Joined: 30-September 06
  • Location: In your panty drawer

Posted 23 May 2013 - 03:03

What's interesting to me is the idea that the specs of the One really won't matter, depending on how much computing devs offload to the cloud. I read an article on Venturebeat that said that devs could offload AI processing, physics calculations, and even some rendering tasks to the cloud, and over time, the net raw processing power will increase, as MS replaces their servers.


That would require an always-on online connection, something that has been established by the internet commentariat as a non-starter.

#173 TheLegendOfMart

TheLegendOfMart

    Neowinian Senior

  • 9,281 posts
  • Joined: 01-October 01
  • Location: England

Posted 23 May 2013 - 06:18

The image is still a fixed number of pixels. If you can process every pixel and fill the display with 700 shaders, 400 more won't do squat... why is that so hard to grasp?

768 isn't enough, though. Benchmarks of PC games barely get past 30fps at 1080p on the nearest PC equivalent to the Xbox One GPU; those extra 384 shaders are the difference between 30 and 60fps at 1080p.

#174 BajiRav

BajiRav

    Neowinian Senior

  • 10,660 posts
  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 23 May 2013 - 11:11

That would require an always-on online connection, something that has been established by the internet commentariat as a non-starter.

I wonder if it can switch on the fly. Use the "cloud" when it's available otherwise limit itself to the box.
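That "switch on the fly" idea is essentially a graceful-fallback pattern, which only works for tasks that tolerate degradation. A minimal sketch of the shape it might take — every name here (`compute_ai_paths`, the session dict) is invented for illustration and has nothing to do with the actual Xbox One SDK:

```python
# Hypothetical sketch of "use the cloud when available, otherwise limit
# yourself to the box". Only viable for work where a cheaper local
# approximation is acceptable (e.g. coarser AI planning).

def compute_ai_paths_local(world_state):
    # Cheap local approximation, always available.
    return {"quality": "basic", "paths": len(world_state)}

def compute_ai_paths_cloud(session, world_state):
    # Richer result, but only if the connection is up.
    if not session.get("online"):
        raise ConnectionError("cloud unreachable")
    return {"quality": "enhanced", "paths": len(world_state) * 4}

def compute_ai_paths(session, world_state):
    try:
        return compute_ai_paths_cloud(session, world_state)
    except ConnectionError:
        return compute_ai_paths_local(world_state)

print(compute_ai_paths({"online": False}, ["npc1", "npc2"]))
```

The catch, as the next posts point out, is that players then see two different games depending on their connection — which is exactly why developers may refuse to ship it.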

#175 TheLegendOfMart

TheLegendOfMart

    Neowinian Senior

  • 9,281 posts
  • Joined: 01-October 01
  • Location: England

Posted 23 May 2013 - 11:14

No developer is going to want to be seen shipping a game that performs differently depending on whether you are connected to the cloud. Either it's going to require being online, with the cloud processing AI etc., or it's not going to use the cloud at all.

It's the same reason multiplatform games look very similar if not identical on either console, they can't have a game looking noticeably better on one over the other or people will cry.

#176 Blackhearted

Blackhearted

    .....

  • 3,240 posts
  • Joined: 26-February 04
  • Location: Ohio
  • Phone: Samsung Galaxy S2 (VM)

Posted 23 May 2013 - 11:16

I honestly don't see the cloud being useful in many situations. The wildly variable performance and latency of the internet make it completely useless for anything that needs to be kept reasonably in sync with the box.
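The latency problem is easy to quantify: anything that must land within a single frame has a budget far smaller than a typical internet round trip. The RTT figure below is an assumed mid-range home-broadband value, purely for illustration:

```python
# Why internet latency rules out per-frame cloud work: one frame's budget
# at 60fps is well below a typical round-trip time.
frame_budget_ms = 1000 / 60   # ~16.7 ms per frame at 60fps
typical_rtt_ms = 60           # assumed home-broadband round trip

frames_spanned = typical_rtt_ms / frame_budget_ms
print(f"frame budget: {frame_budget_ms:.1f} ms")
print(f"one cloud round trip spans ~{frames_spanned:.1f} frames")
```

By the time a single request comes back, several frames have already been drawn — so only work that can lag behind by multiple frames (long-horizon AI, background simulation) is even a candidate for offloading.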

#177 OP +Audioboxer

Audioboxer

    Hermit Arcana

  • 36,169 posts
  • Joined: 01-December 03
  • Location: UK, Scotland

Posted 24 May 2013 - 11:22

Kotaku: Only 90% of Xbox One GPU usable for games, PS4 GPU up to 66% more powerful - http://kotaku.com/th...angel-509597078

and Jonathan Blow

Posted Image

#178 Blackhearted

Blackhearted

    .....

  • 3,240 posts
  • Joined: 26-February 04
  • Location: Ohio
  • Phone: Samsung Galaxy S2 (VM)

Posted 24 May 2013 - 11:27

Wow. No wonder Microsoft wants to push their cloud idea so hard. Because the system is even weaker at gaming than we thought.

#179 +Brandon Live

Brandon Live

    Seattle geek

  • 9,766 posts
  • Joined: 08-June 03
  • Location: Seattle, WA

Posted 24 May 2013 - 20:36

768 isn't enough, though. Benchmarks of PC games barely get past 30fps at 1080p on the nearest PC equivalent to the Xbox One GPU; those extra 384 shaders are the difference between 30 and 60fps at 1080p.


That seems highly unlikely...

(Even considering a linear increase in framerate with the number of shader units, the best you'd get is a 50% improvement, i.e. 45fps versus 30fps. But it doesn't really work that way either)
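The arithmetic in that parenthetical is worth spelling out, under the (already generous) assumption that framerate scales linearly with shader count:

```python
# Sanity check on the claim that +384 shaders turns 30fps into 60fps.
# Even assuming (unrealistically) perfect linear scaling with shader count:
base_units, extra_units = 768, 384
base_fps = 30
scaled_fps = base_fps * (base_units + extra_units) / base_units
print(scaled_fps)  # 45.0 — not 60
```

Doubling framerate would require doubling shader throughput (and everything else — bandwidth, ROPs — scaling with it); a 50% increase in units caps the best case at 45fps.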

#180 George P

George P

    Neowinian Senior

  • 19,009 posts
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 25 May 2013 - 11:56

That would require an always-on online connection, something that has been established by the internet commentariat as a non-starter.


I think you'll see it used mainly for multiplayer to start, in which case being always online is needed anyway.

After that, if a developer is smart about it, for SP they should be able to detect whether you're online: if you are, use the cloud for a nice boost; if not, don't. Either way it'll just be a bonus, not night and day for the game, IMO.