AnandTech: Xbox One vs. PS4 - Hardware comparison



What the bloody hell are you talking about, do you understand computer hardware AT ALL?

Things like antialiasing, anisotropic filtering, post-processing, dynamic lighting, etc. cannot be done in advance.


These enhancements can all be applied to your textures in advance, because you optimize for a single resolution and the scaling can be done on the fly pretty easily. There is enough image-quality processing power on both platforms to apply the same improvements to every pixel at 1080p.

Textures are resolution-independent; try again.


Microsoft lost that money due to poor product design; it had nothing to do with using overly powerful components. Powerful components do not destroy computers that have been properly designed.

That's just ignoring everything one has to understand about building a system; they learned from those design choices and chose not to make them again. Is it better to risk heat issues by overcompensating, or is it better to achieve design goals with streamlined hardware tuned exactly to specification?


More hardware doesn't make better-looking games if all the hardware is pushing the same 1080p display. A GeForce 680 pushing a 1080p display would be a GeForce 680 wasting a lot of resources.

I'm sorry, but what you said is retarded. A GAME ON THE PS4 CAN RUN WITH BETTER DETAILS AT THE SAME FRAMERATE AND RESOLUTION AS THE XBOX BECAUSE THE HARDWARE HAS HIGHER SPECS.


It's better to design the product properly. As I pointed out, powerful components do not destroy computers; bad design does.


So basically you're just trying to say it's perfectly fine for the Xbox One at 1080p/60, therefore the PS4 is overkill?

I'm saying that the memory speed won't matter; they were different design goals. Sony went with GDDR5 and no eDRAM or eSRAM because they wanted something simpler. Microsoft, on the other hand, can achieve the same effective throughput by leveraging their positive experience with eDRAM while using faster eSRAM, and save money with DDR3. Those are engineering choices. They're going to result in systems that behave nearly identically but have differing ways of achieving their desired end results.
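
For anyone who wants the back-of-the-envelope numbers behind that: peak bandwidth is just bus width times per-pin data rate. A minimal sketch using the commonly reported launch specs (theoretical peaks, not measured throughput; whether DDR3 plus a 32 MB eSRAM pool really matches GDDR5 in practice depends on how much of the working set fits in that 32 MB):

```python
# Back-of-the-envelope peak bandwidth: (bus width in bits / 8) * per-pin data rate in Gbps = GB/s.
# Figures are the commonly reported launch specs, not measured effective throughput.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth of a memory interface, in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

ps4_gddr5   = peak_bandwidth_gb_s(256, 5.5)    # ~176 GB/s
xbone_ddr3  = peak_bandwidth_gb_s(256, 2.133)  # ~68 GB/s
xbone_esram = 102.0                            # GB/s, Microsoft's originally quoted eSRAM figure

print(f"PS4 GDDR5:             {ps4_gddr5:6.1f} GB/s")
print(f"Xbox One DDR3:         {xbone_ddr3:6.1f} GB/s")
print(f"Xbox One DDR3 + eSRAM: {xbone_ddr3 + xbone_esram:6.1f} GB/s (only 32 MB sees the fast path)")
```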


In the end, I think the Xbox sounds like the better deal (and plan).
Sorry for going off-topic, but I hope you guys will have an announcement to make about independent developers. As it currently stands, Xbox One could be the only major gaming platform not to allow self-publishing, and I'd hate to see it lose that share of the cake. XBLIG/XNA had a lot of potential but was poorly managed; hopefully you'll have better plans this time around. :)

I'm sorry, but what you said is retarded. A GAME ON THE PS4 CAN RUN WITH BETTER DETAILS AT THE SAME FRAMERATE AND RESOLUTION AS THE XBOX BECAUSE THE HARDWARE HAS HIGHER SPECS.

If it only takes 700 shaders to process every pixel of a 1080p display, it doesn't matter if you have 1150 of them; only 700 are needed to get the job done.

Having MORE available doesn't mean you can DO more, as you're still working with the same freaking data set.
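
For scale, here is the rough arithmetic behind the shader-count argument, using the round numbers quoted in this thread, an assumed ~800 MHz GPU clock, and 2 FLOPs per ALU per clock (a GCN-style figure). The 1080p pixel count is fixed, but the per-pixel shader budget is not:

```python
# Rough arithmetic: how much shader work a fixed 1080p/60 target leaves per displayed pixel.
PIXELS_PER_FRAME = 1920 * 1080   # 1080p
FPS = 60                         # target frame rate
CLOCK_GHZ = 0.8                  # assumed GPU clock
FLOPS_PER_ALU_PER_CLOCK = 2      # one fused multiply-add per shader ALU per clock

pixels_per_second = PIXELS_PER_FRAME * FPS            # ~124 million pixels/s

for shaders in (700, 1150):                           # round numbers used in this thread
    flops = shaders * CLOCK_GHZ * 1e9 * FLOPS_PER_ALU_PER_CLOCK
    print(f"{shaders} shaders: ~{flops / pixels_per_second:,.0f} FLOPs available per displayed pixel")
```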

Sorry for going off-topic, but I hope you guys will have an announcement to make about independent developers. As it currently stands, Xbox One could be the only major gaming platform not to allow self-publishing, and I'd hate to see it lose that share of the cake. XBLIG/XNA had a lot of potential but was poorly managed; hopefully you'll have better plans this time around. :)

Indie game development won't make or break the platform, and even though Microsoft's solution has never been perfect, I've had a ton of indie games to play and it hasn't been a problem.


If it only takes 700 shaders to process every pixel of a 1080p display, it doesn't matter if you have 1150 of them; only 700 are needed to get the job done.

Having MORE available doesn't mean you can DO more, as you're still working with the same freaking data set.

On the Xbox, one pixel will be a block of color and on the PS4 the same pixel will have bouncing ######.


I'm saying that the memory speed won't matter; they were different design goals. Sony went with GDDR5 and no eDRAM or eSRAM because they wanted something simpler. Microsoft, on the other hand, can achieve the same effective throughput by leveraging their positive experience with eDRAM while using faster eSRAM, and save money with DDR3. Those are engineering choices. They're going to result in systems that behave nearly identically but have differing ways of achieving their desired end results.
It's hard to believe a system with 50% more shaders and much higher system memory bandwidth won't perform significantly better than one that only compensates with a 32 MB cache; and as Anand pointed out, it's not even clear whether that eSRAM will be used as a cache buffer. Of course, you'll keep repeating that the only purpose of bandwidth is being able to achieve 1080p/60fps and anything beyond that is pointless, but I guess I'm speaking for everyone else.

On the Xbox, one pixel will be a block of color and on the PS4 the same pixel will have bouncing ######.

hahahaha


I'm saying that the memory speed won't matter; they were different design goals. Sony went with GDDR5 and no eDRAM or eSRAM because they wanted something simpler. Microsoft, on the other hand, can achieve the same effective throughput by leveraging their positive experience with eDRAM while using faster eSRAM, and save money with DDR3. Those are engineering choices. They're going to result in systems that behave nearly identically but have differing ways of achieving their desired end results.

Yeah, I got that, but why shoot down GDDR5? The price?


If it only takes 700 shaders to process every pixel of a 1080p display, it doesn't matter if you have 1150 of them; only 700 are needed to get the job done.

You really should read up on computer graphics instead of making simplistic arguments here. With more shaders available, you could do more shader passes or longer ones. This all translates into more image fidelity. Video games are not movies. They compute the picture you see on screen in real time. There are many more variables to it than framerate and resolution.

It's hard to believe a system with 50% more shaders and much higher system memory bandwidth won't perform significantly better than one that only compensates with a 32 MB cache; and as Anand pointed out, it's not even clear whether that eSRAM will be used as a cache buffer. Of course, you'll keep repeating that the only purpose of bandwidth is being able to achieve 1080p/60fps and anything beyond that is pointless, but I guess I'm speaking for everyone else.

I don't see why it's that hard to understand. You have a finite resource, a finite goal to achieve. Once you achieve it, anything in excess is simply unused resources.

In the IT/programming world there is a long-standing and little-understood principle that when you have a problem, you don't necessarily throw more at it to get it done. You can throw more developers at a project and it may take longer; you can throw more computing cycles at it and it may take longer. If you look at things from a simpler approach and engineer for those simple solutions, they often work the best.

That absolutely holds true here.

1080p is a fixed resolution. 60fps is the goal for the refresh/frame rate. So you have a fixed pixel count and a fixed throughput of operations you need to achieve; if you exceed those fixed goals, you're just increasing the idle time of wasted resources. You can't make something look better by having two shaders work on the same shading function, as the end result is still the same end result.

Now if you need something LARGER than 1080p and you're looking for screaming refresh rates and processing more data, then yes, with more power you can process more data.

The point is, we're talking about fixed problems here that are easy to engineer for with modern hardware.

Heck, I don't see either the PS4 or the Xbox One having a problem even doing 120 Hz 3D gaming at full resolution; I'm sure the frame interpolation for 3D is designed into the GPU.

You really should read up on computer graphics instead of making simplistic arguments here. With more shaders available, you could do more shader passes or longer ones. This all translates into more image fidelity. Video games are not movies. They compute the picture you see on screen in real time. There are many more variables to it than framerate and resolution.

I'm pretty sure Microsoft designed for some oversampling already. Remember, we're talking about displays where people sit back 5+ feet and all that extra stuff is for naught. There are a ton of variables you ignore by focusing so much on PC gaming and trying to translate it to a console.

What would be the end result of over-processing an image that still only has a fixed pixel fidelity?
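
As a tiny illustration of the two readings being argued here, using the same round shader counts from this thread (illustrative arithmetic only):

```python
# Two ways to read "700 shaders are enough for 1080p/60" when 1150 are available.
NEEDED = 700      # shaders claimed to be "enough" for the fixed 1080p/60 workload
AVAILABLE = 1150  # shaders on the bigger GPU (round numbers from this thread)

# Fixed-workload view: the per-pixel work never grows, so the extra units idle.
print(f"Fixed workload:  utilization = {NEEDED / AVAILABLE:.0%}, the rest sits idle")

# Scaled-workload view: developers spend the surplus on longer or extra shading per pixel.
print(f"Scaled workload: per-pixel shader budget grows by {AVAILABLE / NEEDED - 1:.0%}")
```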


1080p is a fixed resolution. 60fps is the goal for the refresh/frame rate. So you have a fixed pixel count and a fixed throughput of operations you need to achieve; if you exceed those fixed goals, you're just increasing the idle time of wasted resources. You can't make something look better by having two shaders work on the same shading function, as the end result is still the same end result.

The IMAGE present on the screen, however, is NOT FIXED. It has to be processed in real time, and the better the hardware you process it with, the more efficient the process is. It's a very simple premise; why the hell do you fail to grasp it so completely?


You really should read up on computer graphics instead of making simplistic arguments here. With more shaders available, you could do more shader passes or longer ones. This all translates into more image fidelity. Video games are not movies. They compute the picture you see on screen in real time. There are many more variables to it than framerate and resolution.

You're wasting your time; he clearly is too deeply mired in his love of Microsoft to even pay attention for a second.

The only person he'd listen to is Brandon, but I doubt he'll care to set him straight.


The IMAGE present on the screen, however, is NOT FIXED. It has to be processed in real time, and the better the hardware you process it with, the more efficient the process is. It's a very simple premise; why the hell do you fail to grasp it so completely?

The image is still a fixed number of pixels; if you can process every pixel and fill the display with 700 shaders, 400 more won't do squat. Why is that so hard to grasp?

If you have problems with image quality, you can address that with your texture and image pre/post-processing and not really have to apply on-the-fly processing. Again, another beautiful feature of consoles having a single architecture to design for.

I'm off for a beer... you guys have fun! :)

You're wasting your time; he clearly is too deeply mired in his love of Microsoft to even pay attention for a second.

The only person he'd listen to is Brandon, but I doubt he'll care to set him straight.

I don't love Microsoft. I'll own both a PS4 and an Xbox. I love engineering and building systems. I'd be fired if I over-engineered for unneeded capacity (and the same goes for under-engineering).


Because you're wrong and obviously know sweet FA about how computer hardware or 3D rendering actually work.

I don't love Microsoft. I'll own both a PS4 and an Xbox. I love engineering and building systems. I'd be fired if I over-engineered for unneeded capacity (and the same goes for under-engineering).

Then I pity anyone you build systems for because you obviously know sod all about how these things actually work.


I don't see why it's that hard to understand. You have a finite resource, a finite goal to achieve.

No, you do not. A video game is not a movie. It's not pre-determined what the frame will look like. It will only look as good as you can compute it to be given the speed of the hardware in the 1/60th of a second you have to render it.

You could render 10 baddies on-screen or you could render 100. If you're on a slower system you'll render 10, on a faster system you'll render 100. You could render blood on every creature, or just on the ones nearby. You could add specular lighting to those puddles behind you, or could not have the time to. You could do volumetric lighting of the sun rays between the clouds, or you could use a more simple lighting model that takes less resources to process. You could use 4x Multisample Antialiasing for extremely accurate removal of jaggies, or an approximate fullscreen FXAA, or none at all.

Have you ever looked at the graphical options of a game like Crysis 3? There's much more to it than resolution and framerate right? By enabling/disabling the right options, you can achieve 1080p/60fps on pretty much any video card; only, you'll not get the same foliage density, the same quality water rendering, the amount of props, particles, effects, etc.

Video games are not movies. What you see on-screen is what the hardware just computed on-the-fly in 1/30th or 1/60th of a second. Change the hardware, you change what you see. Because the PS4 has faster hardware, it's likely to do better-looking graphics, even though the resolution and framerate might be the same.

I rest my case.
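
A minimal sketch of that last point, with entirely made-up frame-time costs: both configurations hold 1080p/60, but the faster GPU fits more of the optional effects into the same ~16.7 ms frame budget.

```python
# Illustrative only: which optional effects fit into the frame-time headroom left after base rendering.
OPTIONAL_EFFECTS = [               # (name, cost in ms on the slower GPU) -- made-up numbers
    ("4x MSAA",             3.0),
    ("volumetric lighting", 2.5),
    ("dense foliage",       1.5),
    ("specular puddles",    1.0),
]

def effects_that_fit(headroom_ms, speedup):
    """Greedily enable effects until the remaining frame-time headroom runs out."""
    enabled, remaining = [], headroom_ms
    for name, cost in OPTIONAL_EFFECTS:
        cost /= speedup            # a faster GPU finishes each effect sooner
        if cost <= remaining:
            enabled.append(name)
            remaining -= cost
    return enabled

print("slower GPU:", effects_that_fit(headroom_ms=5.0, speedup=1.0))
print("faster GPU:", effects_that_fit(headroom_ms=5.0, speedup=1.5))
```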


I don't love Microsoft. I'll own both a PS4 and an Xbox. I love engineering and building systems. I'd be fired if I over-engineered for unneeded capacity (and the same goes for under-engineering).

What engineer has absolutely zero awareness of the workload of the systems he supposedly works on? Despite it being explained to you multiple times by multiple people, you're being far too pig-headed to accept that there is more to 3D graphics than a damned framebuffer.

It's not about Microsoft, it's not about Sony. It's about the simple fact you're flat out ignorant when it comes to 3D computer graphics.


If you have problems with image quality, you can address that with your texture and image pre/post-processing and not really have to apply on-the-fly processing. Again, another beautiful feature of consoles having a single architecture to design for.
Larger textures require, guess what? More bandwidth! Post-processing requires, guess what? Shader power! It's ALL on-the-fly processing; it's called real-time rendering for a reason. Sigh.
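
To put rough numbers on the texture point, assuming uncompressed RGBA8 textures with a full mip chain (real games use block compression, which shrinks these figures but not how they scale):

```python
# Rough texture-size arithmetic: bigger textures cost memory and, when sampled, bandwidth.
BYTES_PER_TEXEL = 4                              # uncompressed RGBA8

def texture_mb(width, height, mipmaps=True):
    base = width * height * BYTES_PER_TEXEL
    total = base * (4 / 3 if mipmaps else 1)     # a full mip chain adds about one third
    return total / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size} RGBA8: ~{texture_mb(size, size):.0f} MB (doubling resolution quadruples the cost)")
```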

No, you do not. A video game is not a movie. It's not pre-determined what the frame will look like. It will only look as good as you can compute it to be given the speed of the hardware in the 1/60th of a second you have to render it.

You could render 10 baddies on-screen or you could render 100. If you're on a slower system you'll render 10, on a faster system you'll render 100. You could render blood on every creature, or just on the ones nearby. You could add specular lighting to those puddles behind you, or could not have the time to. You could do volumetric lighting of the sun rays between the clouds, or you could use a more simple lighting model that takes less resources to process. You could use 4x Multisample Antialiasing for extremely accurate removal of jaggies, or an approximate fullscreen FXAA, or none at all.

Have you ever looked at the graphical options of a game like Crysis 3? There's much more to it than resolution and framerate right? By enabling/disabling the right options, you can achieve 1080p/60fps on pretty much any video card; only, you'll not get the same foliage density, the same quality water rendering, the amount of props, particles, effects, etc.

Video games are not movies. What you see on-screen is what the hardware just computed on-the-fly in 1/30th or 1/60th of a second. Change the hardware, you change what you see. Because the PS4 has faster hardware, it's likely to do better-looking graphics, even though the resolution and framerate might be the same.

I rest my case.

You can rest your case all you want, but the beauty of consoles is that they are engineered systems. You do design the game to fit the requirements of the platform you are developing for. It's one reason PC gamers were in such an uproar: their games were optimized for the 360 and ported to PC.

I never said they were like movies, but any good developer is going to optimize their entire workflow to take advantage of the platform so that they don't have to do in post-processing what they can do in preprocessing.

And again, we're still working with finite conditions that are very easy to plan and engineer for, especially since the hardware that got us here is well established and well known.

Also, look at the front page: people are so naïve about Microsoft, thinking Microsoft couldn't possibly have engineered the hardware alongside AMD, that they're debating the validity of something AMD posted.

It's mostly because you guys are just so fixated on what you want to hear, rather than the reality thereof.

Do ANY of you work in the computing field? Processing? Scaling? Data operations? Do any of you ever look at things from a mathematical perspective? Do you not realize how different the requirements are for a 1080p screen sitting 5 feet from your face running at 1920x1080? Comparing the engineering goals of a console connected to a TV to those of a video card connected to a computer display, with differing goals and differing viewing angles, is vastly, vastly different.

With a console, you absolutely optimize at all points. You don't need to develop for so many unknowns/resolutions/processors/GPUs/drivers/patch levels/versions... yadda yadda yadda.


Larger textures require, guess what? More bandwidth! Post-processing requires, guess what? Shader power! It's ALL on-the-fly processing; it's called real-time rendering for a reason. Sigh.

Stop and think here for a minute. Please.

Let me ask you this: what are larger textures to you? Please think for a moment when you answer that question. Also, what does more bandwidth get you? What do you need to post-process so much? Have you defined how excess post-processing makes a game better? Shader power? If you have enough shaders to do everything to begin with, what does over-processing accomplish?

Nothing is hard about real-time rendering, and neither machine should break a sweat doing it at 1080p.


You do design the game to fit the requirements of the platform you are developing for.
Yes. Assuming developers take full advantage of each platform, they'll be able to do more with the PS4 than with the Xbox One, just like they'll be able to do much more with either one of these consoles than with the 360 or PS3. Give engineers more powerful tools and they'll make better products.

Pre-rendering is all well and good, but it's not a magical solution to everything. You cannot pre-render shadows and you cannot pre-render antialiasing, for instance, both of which the PS4 might be able to do somewhat better than the Xbox One.

Do ANY of you work in the computing field? Processing? Scaling? Data operations?
Yes, I'm a software developer with a bachelor's in computer graphics and multimedia. I actually worked at EA with the Frostbite 2 engine. But I suppose if you won't trust PS4/AMD engineers to know what they're doing, then why trust anyone with any kind of degree?

I am more interested in the quality of the components and which console will last longer. LOL, there are 30+ year old Ataris that still work, and many people's Xboxes and PlayStations died within several years of use!

This.

