
AnandTech: Xbox One vs. PS4 - Hardware comparison


311 replies to this topic

#16 OP Audioboxer

Audioboxer

    Hermit Arcana

  • 35,500 posts
  • Joined: 01-December 03
  • Location: UK, Scotland

Posted 22 May 2013 - 15:15

Engineered systems for the win. We have what, three resolutions to aim for: 720p, 1080p and 4K. Even if you max out at full-aperture 4K, that is 12,746,752 pixels; the Xbox One shouldn't have any problem holding 60fps because of the effective bandwidth of the eSRAM and DDR3 averaging out.

Has the memory actually been confirmed one way or another? I know Sony mentioned GDDR5, but I don't think there was a solid commitment.

GDDR5 is in very limited supply and big $$

1080p won't push DDR3 alone..


It was confirmed at their presser.... You don't think it's a solid commitment?

By the way 4K is for video only, not gaming.
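The resolution arithmetic in the post above can be checked quickly. A minimal sketch (note the 12,746,752 figure corresponds to full-aperture 4K film scans, 4096x3112, while consumer 4K UHD is 3840x2160):

```python
# Pixel counts for the render targets discussed in this thread.
RESOLUTIONS = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "4K full aperture": (4096, 3112),
}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name:17s} {w}x{h} = {w * h:,} pixels")
```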


#17 spacer

spacer

    I'm awesome

  • 6,161 posts
  • Joined: 09-November 06
  • Location: Connecticut, USA
  • OS: Windows 7
  • Phone: Nexus 4

Posted 22 May 2013 - 15:20

Didn't I read somewhere that the Nextbox will reserve 3 GB of RAM for the OS and all of the non-gaming related crap? So isn't that, in and of itself, a huge difference in potential gaming performance between the two? Obviously the PS4 will have some RAM reserved, but I don't remember reading about it needing anywhere close to 3 GB.
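The budget implied by that reservation is simple arithmetic; a quick sketch, assuming the announced 8 GB total per console and the 3 GB Xbox One OS figure being discussed (the PS4's reservation had not been disclosed at this point):

```python
# RAM left for games once the OS takes its reservation.
# 8 GB is the announced total for both consoles; 3 GB is the
# rumoured Xbox One OS reservation discussed in this thread.
def game_ram_gb(total_gb, os_reserved_gb):
    """RAM available to a game once the OS share is carved out."""
    return total_gb - os_reserved_gb

print("Xbox One:", game_ram_gb(8, 3), "GB available to games")
```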

#18 spudtrooper

spudtrooper

    Neowinian Senior

  • 3,095 posts
  • Joined: 19-October 10
  • OS: Windows 8
  • Phone: Nokia 920

Posted 22 May 2013 - 15:20

It was confirmed at their presser.... You don't think it's a solid commitment?

By the way 4K is for video only, not gaming.


if 4k is for video only, then gddr5 is wasted entirely..


#19 OP Audioboxer

Audioboxer

    Hermit Arcana

  • 35,500 posts
  • Joined: 01-December 03
  • Location: UK, Scotland

Posted 22 May 2013 - 15:21

if 4k is for video only, then gddr5 is wasted entirely..


Seriously? :s

Didn't I read somewhere that the Nextbox will reserve 3 GB of RAM for the OS and all of the non-gaming related crap? So isn't that, in and of itself, a huge difference in potential gaming performance between the two? Obviously the PS4 will have some RAM reserved, but I don't remember reading about it needing anywhere close to 3 GB.


Yeah, it's 3 GB

#20 NateB1

NateB1

    Neowinian Senior

  • 1,659 posts
  • Joined: 09-January 07

Posted 22 May 2013 - 15:23

What's interesting to me is the idea that the specs of the One really won't matter, depending on how much computing devs offload to the cloud. I read an article on Venturebeat that said that devs could offload AI processing, physics calculations, and even some rendering tasks to the cloud, and over time, the net raw processing power will increase, as MS replaces their servers.

If this is the case, then the One has a *huge* advantage over the PS4, which makes spec comparisons like this almost irrelevant. Who cares about the CPU/memory speed if there are massive datacenters that can perform computation tasks?

The only thing that might make a slight difference is the GPU - I'm surprised it's so much less powerful than the PS4.


#21 spudtrooper

spudtrooper

    Neowinian Senior

  • 3,095 posts
  • Joined: 19-October 10
  • OS: Windows 8
  • Phone: Nokia 920

Posted 22 May 2013 - 15:23

Didn't I read somewhere that the Nextbox will reserve 3 GB of RAM for the OS and all of the non-gaming related crap? So isn't that, in and of itself, a huge difference in potential gaming performance between the two? Obviously the PS4 will have some RAM reserved, but I don't remember reading about it needing anywhere close to 3 GB.


Does this really matter? Most PC games barely use any memory, and if the OS is already in RAM along with every service and supporting feature for the game, it means the game engine is the only thing being loaded. "Engineered" systems have the benefit of being optimized, so the OS isn't having to guess. Having the OS running all the time while the game remains thin and optimized for resources could be a win-win.

BUT, we shall see! Definitely interested in seeing how these systems compete, but I'm not worried about the RAM situation. I don't know of a single game I play on my PC that uses more than a couple of GB of RAM, if even that much.

#22 BajiRav

BajiRav

    Neowinian Senior

  • 10,332 posts
  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 22 May 2013 - 15:23

if 4k is for video only, then gddr5 is wasted entirely..

Neither console is powerful enough for 4K graphics.

#23 spudtrooper

spudtrooper

    Neowinian Senior

  • 3,095 posts
  • Joined: 19-October 10
  • OS: Windows 8
  • Phone: Nokia 920

Posted 22 May 2013 - 15:28

Seriously? :s


You don't need GDDR5 memory to play back 4K content, and all the decoding is handled by specialized chips anyway. 4K at 24fps is what, 3.8 Gbps uncompressed, and DDR3 can do far more than that? All the external interfaces will handle the load anyway, and we will never see uncompressed 4K during this thing's product lifecycle.. if anything it will be H.264 on Blu-ray discs at high resolution, still compressed and capable of playing back no problem on the PS4 or Xbox One.

With all that said, anyone have an effective rate for straight GDDR5 vs. the effective rate of 32 MB eSRAM + DDR3?

Microsoft is doing something cool to keep cache misses and latency to little or nothing.. didn't they say Kinect does 2 Gbps alone? They've got some magic sauce for sure; can't wait to see the PS4 and Xbox One opened up and know what's inside.

Clarification: video is assumed to be 24fps.. I don't foresee 48fps 4K being at consumer levels for a long time.. you'd need 10GbE to watch that if not on raw film :)
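As a rough answer to the effective-rate question above, here is a back-of-envelope sketch. The bus widths, transfer rates, and eSRAM figure are the numbers eventually confirmed for the shipping hardware, so treat them as assumptions in this pre-launch context:

```python
# Peak theoretical bandwidth for a simple DRAM interface:
# (bus width in bytes) x (transfers per second).
def peak_gb_per_s(bus_width_bits, transfers_per_s):
    return bus_width_bits / 8 * transfers_per_s / 1e9

ps4_gddr5 = peak_gb_per_s(256, 5.5e9)    # 256-bit bus, 5500 MT/s GDDR5
xb1_ddr3 = peak_gb_per_s(256, 2.133e9)   # 256-bit bus, 2133 MT/s DDR3
xb1_esram = 102.0                        # MS's quoted eSRAM figure, GB/s

print(f"PS4 GDDR5:           {ps4_gddr5:6.1f} GB/s")
print(f"Xbox One DDR3:       {xb1_ddr3:6.1f} GB/s")
print(f"Xbox One DDR3+eSRAM: {xb1_ddr3 + xb1_esram:6.1f} GB/s combined peak")
```

Combined peaks are optimistic, of course: the eSRAM is only 32 MB, so a game only sees the higher rate for data that fits in it.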

#24 Cocoliso

Cocoliso

    Neowinian

  • 87 posts
  • Joined: 13-July 09

Posted 22 May 2013 - 15:42

Not really, consoles usually had CPUs that you couldn't get in a standard PC... aka custom Power chips, RISC chips that are custom designed like the N64's, etc..... now we are moving more towards standard CPUs with a few extra on-die features


Moreover, the new Xbox specs look like a PC from two years ago. So I guess PCs will be stuck again with graphics from five years ago, **** you MS!

#25 BajiRav

BajiRav

    Neowinian Senior

  • 10,332 posts
  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 22 May 2013 - 15:45

Not really, consoles usually had CPUs that you couldn't get in a standard PC... aka custom Power chips, RISC chips that are custom designed like the N64's, etc..... now we are moving more towards standard CPUs with a few extra on-die features

The first Xbox used an x86-class processor. The Xbox 360/PS3 are based on PPC (PowerPC 5 class, I think?), which was also used in Macs until 2007. Macs are a type of PC, etc. etc.
Consoles have been specialized PCs is what I was trying to get at.

#26 vetneufuse

neufuse

    Neowinian Senior

  • 16,285 posts
  • Joined: 16-February 04

Posted 22 May 2013 - 15:59

The first Xbox used an x86-class processor. The Xbox 360/PS3 are based on PPC (PowerPC 5 class, I think?), which was also used in Macs until 2007. Macs are a type of PC, etc. etc.
Consoles have been specialized PCs is what I was trying to get at.


The PPC chips in the Xbox 360 and Wii / Wii U were custom designed, not the off-the-shelf chips you got in Macs... they had the same instruction sets, but they also had differences that made them not the same...

The first Xbox only used an Intel Celeron chip, and that was just to save time and money to get a system out, so it was kind of a one-off at the time... everyone else had custom-designed chips.... In the end we are all from the same lines.... CISC or RISC, basically.. heck, even nowadays Intel CISC chips have RISC cores... we went from the old days of making everything custom to today's customizing of the normal.

#27 Cocoliso

Cocoliso

    Neowinian

  • 87 posts
  • Joined: 13-July 09

Posted 22 May 2013 - 16:02

First Xbox was an x86 class processor. Xbox 360/PS3 are based on PPC (5 I think ?) which were also used in Macs until 2007. Macs are a type of PC etc. etc.
Consoles have been specialized PCs is what I was trying to get at.


They were still quite different from any x86 architecture around, especially the PS3 with the Cell CPU. In any case, as a PC user I thought we were finally going to get a jump on the graphics side for a few years, but MS pulls out this console with already-outdated hardware, and we all know what this means: games will be made for the lowest common denominator.

#28 Blackhearted

Blackhearted

    .....

  • 3,172 posts
  • Joined: 26-February 04
  • Location: Ohio
  • Phone: Samsung Galaxy S2 (VM)

Posted 22 May 2013 - 17:45

if 4k is for video only, then gddr5 is wasted entirely..

Lol. To say that GDDR5 is wasted on a gaming machine. You've got to be insane.

What's interesting to me is the idea that the specs of the One really won't matter, depending on how much computing devs offload to the cloud. I read an article on Venturebeat that said that devs could offload AI processing, physics calculations, and even some rendering tasks to the cloud, and over time, the net raw processing power will increase, as MS replaces their servers.

If this is the case, then the One has a *huge* advantage over the PS4, which makes spec comparisons like this almost irrelevant. Who cares about the CPU/memory speed if there are massive datacenters that can perform computation tasks?

The only thing that might make a slight difference is the GPU - I'm surprised it's so much less powerful than the PS4.


In a perfect world, maybe something like that might be feasible. But this ain't a perfect world. There are too many issues that'll stop that whole 'cloud computing' idea from making the Xbox One a more powerful machine.

#29 spudtrooper

spudtrooper

    Neowinian Senior

  • 3,095 posts
  • Joined: 19-October 10
  • OS: Windows 8
  • Phone: Nokia 920

Posted 22 May 2013 - 18:01

Lol. To say that GDDR5 is wasted on a gaming machine. You've got to be insane.


If the machine is designed to only play games up to 1080p and movies at 4K, then yes, GDDR5 is wasted. There aren't enough pixels to justify the bandwidth/fill rate. 1080p/60 is what, ~3 Gbps, and no game developer is wasting cycles redrawing every pixel of every scene; they're modeling textures and lighting already active in memory or in the ROPs.

In a perfect world, maybe something like that might be feasible. But this ain't a perfect world. There are too many issues that'll stop that whole 'cloud computing' idea from making the Xbox One a more powerful machine.


Doesn't make sense. The problems with games aren't really CPU/GPU; it's the fact they're largely scripted, process-driven and event-based - you play one, you've played them all. One of the announced features was dynamic maps and dynamic multiplayer, so the worlds you play would be different each time you play them. This is possible because of the cloud. That's the kind of stuff that will make gaming fun, if you ask me.

The graphics are already amazing, but again, come on, we're talking 1080p. HD is already said and done; we're talking about more interactivity, more personalized experiences, more interaction and more story. More WIN
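The ~3 Gbps figure for 1080p/60 checks out under the usual assumptions (24-bit RGB, no compression); a quick sketch of the arithmetic:

```python
# Uncompressed video bitrate: pixels x bits per pixel x frames per
# second. Assumes 24-bit RGB; real delivery formats use chroma
# subsampling and heavy compression, so actual bitrates are far lower.
def uncompressed_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * bits_per_pixel * fps / 1e9

print(f"1080p @60fps: {uncompressed_gbps(1920, 1080, 60):.1f} Gbit/s")
print(f"4K UHD @24fps: {uncompressed_gbps(3840, 2160, 24):.1f} Gbit/s")
```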

#30 Atomic Wanderer Chicken

Atomic Wanderer Chicken

    Assistant Special Agent Chicken in charge

  • 3,699 posts
  • Joined: 20-August 12
  • Location: Black Mesa Research Facility, USA
  • OS: Windows 95 with Microsoft Plus
  • Phone: Motorola MicroTAC Elite

Posted 22 May 2013 - 18:06

I am more interested in the quality of the components and which console will last longer. LOL, there are 30+ year old Ataris that still work, and many people's Xboxes and PlayStations died within several years of use!


