PS4 and Xbox One resolution / frame rate discussion



I don't feel the problem is people saying the PS4 is more powerful; it's more that people undervalue the power of the X1. For example, the 40-50% claim, which is bonkers.

That's the difference in resolution between 900p and 1080p. That's not to say the PS4 is overall that much more powerful, just that it manifests as a substantial difference in resolution.

 

In addition, whenever people discuss DX12, a lot of them erupt in rage, saying it won't make the X1 more powerful than the PS4. We know the GPU is not as good; we're just discussing the impact DX12 will have on the X1.

Many were touting DX12 as a game-changer for the XB1, so in that context it's right to look at comments from developers (and Microsoft itself) about the impact it will have. Obviously any improvement is welcome, especially if it helps drive DX12 adoption on PC for multiplatform titles.

 

Also, the cloud: that one is hilarious to me. The cloud is definitely feasible for gaming applications, just not in real time. It could give a big visual benefit in games that the PS4 couldn't achieve, for example totally destructible cities in Crackdown (if that happens). Then again, it doesn't suddenly mean the GPU is more powerful; it just means you're relieving the CPU of work.

The problem is that we haven't seen any meaningful implementations. Further, developers are free to implement cloud solutions of their own via companies like Google and Amazon, as well as Sony (if it chooses to do so). Microsoft touted it as a major feature, yet it's turned out to be utterly irrelevant. That doesn't mean the situation won't change in the future, but for now it appears it was all hype. Even if the cloud proves to be extremely powerful, which it hasn't so far, it will still be restricted to a limited number of scenarios in a small number of games. Multiplatform games won't utilise it, and they make up the majority of titles.


1. That's the difference in resolution between 900p and 1080p. That's not to say the PS4 is overall that much more powerful, just that it manifests as a substantial difference in resolution.

2. it will still be restricted to a limited number of scenarios in a small number of games.

1. Except based on ocular sciences it's not that big of a difference.

2. You have no way of knowing this.


1. Except based on ocular sciences it's not that big of a difference.

 

Based on mathematical fact, what you say is untrue. 

 

900p (1600 × 900) = 1,440,000 pixels

 

vs

 

1080p (1920 × 1080) = 2,073,600 pixels

 

That is a 44% difference. "ocular science" tells me you're wrong.
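
For anyone who wants to check the arithmetic, here is a minimal sketch of the same calculation (assuming the usual 1600 × 900 and 1920 × 1080 dimensions behind "900p" and "1080p"):

```python
# Pixel-count comparison between 900p and 1080p render targets.
w900, h900 = 1600, 900        # "900p" as commonly used by console games
w1080, h1080 = 1920, 1080     # full HD

pixels_900 = w900 * h900      # 1,440,000
pixels_1080 = w1080 * h1080   # 2,073,600

extra = (pixels_1080 - pixels_900) / pixels_900
print(f"900p:  {pixels_900:,} pixels")
print(f"1080p: {pixels_1080:,} pixels")
print(f"1080p renders {extra:.0%} more pixels than 900p")   # 44%
```

Note that 44% more pixels rendered does not automatically translate into a 44% difference in perceived image quality, which is the point argued over the rest of the thread.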


Based on mathematical fact, what you say is untrue. 

 

900p (1600 × 900) = 1,440,000 pixels

 

vs

 

1080p (1920 × 1080) = 2,073,600 pixels

 

That is a 44% difference. "ocular science" tells me you're wrong.

 

Perhaps you should do some research then rather than narrowing everything down to a single number? Because that number doesn't actually represent what our eyes perceive.


Perhaps you should do some research then rather than narrowing everything down to a single number? Because that number doesn't actually represent what our eyes perceive.

More pixels mean more strain on the hardware. The Xbox One has shown that it can't handle a lot of games at 1080p. Simple facts.


More pixels mean more strain on the hardware. The Xbox One has shown that it can't handle a lot of games at 1080p. Simple facts.

 

 

Numbers don't lie, that's true... but visually the difference is so minuscule it's not worth the debate.

 

Yes the PS4 has more juice under the hood, better hardware and all, sure the numbers say 40%+ better or whatever, but putting the games side by side it doesn't matter...

 

Dirty Larry said it best...


Numbers don't lie, that's true... but visually the difference is so minuscule it's not worth the debate.

 

Yes the PS4 has more juice under the hood, better hardware and all, sure the numbers say 40%+ better or whatever, but putting the games side by side it doesn't matter...

 

Dirty Larry said it best...

 

Of course it matters, there is a difference. A 44% difference.


Of course it matters, there is a difference. A 44% difference.

 

When I put on PS4 games and One games, the difference in graphics from 900p to 1080p isn't as drastic in real-world results.

 

Did you know that Ryse was only 900p before the pros pointed it out?

It still looks considerably better than a lot of native 1080p games out there.


Of course it matters, there is a difference. A 44% difference.

44% more pixels doesn't mean 44% difference in image quality.

In fact, 900p will look better on a native 900p 27-inch display than 1080p does on a 1080p 60+ inch display. Things will just be smaller. But nobody has a 900p display...

What really matters is how much the console needs to stretch the image. 1080 / 900 = 1.2, so every five source lines have to be stretched across six output lines: one line in six has to be invented. Not too hard for the Xbox One to interpolate, to be honest. That's why 900p is used. Anything higher but still short of 1080p would probably give a worse result. At 900p the interpolation is easy enough that you are nowhere near losing 44% of the image quality.

That said, the console still needs to interpolate, and unlike a 4K display, which can scale 1080p by simply doubling the pixels without losing any image quality, 900p to 1080p will see a slight loss in image quality. Nothing too bad, but coupled with the crappy FXAA used by consoles it definitely leads to a blurrier image.
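
To put rough numbers on the scaling described above, here is a minimal sketch in Python (assuming a simple linear scaler; the Xbox One's actual hardware scaler is more sophisticated than this):

```python
# Upscaling 900 source lines to 1080 output lines.
SRC_LINES, DST_LINES = 900, 1080

scale = DST_LINES / SRC_LINES        # 1.2: five source lines stretch to six output lines
invented = DST_LINES - SRC_LINES     # 180 output lines have no direct source line
print(f"scale factor: {scale}")
print(f"invented lines: {invented} (one in every {DST_LINES // invented})")

# With a naive linear mapping, output line y samples the source at y * 900/1080;
# only positions that land exactly on a source line can be copied unchanged,
# the rest are blended from two neighbouring source lines.
exact = sum(1 for y in range(DST_LINES) if (y * SRC_LINES) % DST_LINES == 0)
print(f"output lines copied unchanged: {exact}")   # 180

# A 4K panel, by contrast, shows 1080p at a clean integer ratio (2160 / 1080 = 2),
# which is why simple pixel doubling loses nothing there.
print(f"4K over 1080p is an integer ratio: {2160 % 1080 == 0}")
```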


It's about the weight given to such an opinion. He's perfectly free to make such a comment but it doesn't carry the weight of a major developer like CD Projekt RED, which is an established studio at the cutting edge of games development. Further, the comments made defy existing trends (i.e. that substantially more XB1 games are unable to hit 1080p, which wouldn't be the case if they were 'nearly identical'). It's like asking a McDonald's employee about haute cuisine - they may be knowledgeable but I'd put more weight on the opinion of a three-star Michelin chef. I think that's a perfectly reasonable position.

 

 

That's a poor, poor metaphor.

 

It'll be more like comparing a line cook who works at a huge restaurant with a guy who runs his own small restaurant, where he's the only one employed.

 

The part about McDonald's makes absolutely no sense.


1. Except based on ocular sciences it's not that big of a difference.

That has nothing to do with the claim made. We're talking about the raw performance, not the perceived difference.

 

2. You have no way of knowing this.

As I explained, most games are multiplatform, meaning they won't take advantage of Microsoft-exclusive cloud features as they have to consider all platforms. We've also seen that very few developers have expressed an interest in it, especially outside of XB1 exclusives. It's been 18 months and we haven't had any meaningful cloud implementation, with none that I'm aware of on the horizon. Could that change in the future? Sure, though it seems very unlikely.

 

Numbers don't lie, that's true... but visually the difference is so minuscule it's not worth the debate.

 

Yes the PS4 has more juice under the hood, better hardware and all, sure the numbers say 40%+ better or whatever, but putting the games side by side it doesn't matter...

And yet people will be lining up for the next-next-gen consoles because they run at 4K, just as they were for the XB1 and PS4. As a PC gamer I consider 1080p to be a low resolution and that's without even considering the visual fidelity of current consoles, so the idea that 900p is more than enough is simply wrong. Visual fidelity is one of the main selling points of the current consoles, so it's only fair they be judged on that.


More pixels mean more strain on the hardware. The Xbox One has shown that it can't handle a lot of games at 1080p. Simple facts.

The fact is that the PS4's GPU has better specs.

The XBO can handle all games at 1080p if devs spend enough time (it probably has as many or more first-party AAA exclusives at 1080p).


That has nothing to do with the claim made. We're talking about the raw performance, not the perceived difference.

 

 

As I explained, most games are multiplatform, meaning they won't take advantage of Microsoft-exclusive cloud features as they have to consider all platforms. We've also seen that very few developers have expressed an interest in it, especially outside of XB1 exclusives. It's been 18 months and we haven't had any meaningful cloud implementation, with none that I'm aware of on the horizon. Could that change in the future? Sure, though it seems very unlikely.

 

 

And yet people will be lining up for the next-next-gen consoles because they run at 4K, just as they were for the XB1 and PS4. As a PC gamer I consider 1080p to be a low resolution and that's without even considering the visual fidelity of current consoles, so the idea that 900p is more than enough is simply wrong. Visual fidelity is one of the main selling points of the current consoles, so it's only fair they be judged on that.

 

4K is a much bigger leap in visuals. And PC is a different target altogether, a moving target. Dual 980s or Titans or whatever are not the norm; they're out there, but they're not the norm. The norm is mid-range PCs that still run/look better than the PS4/One (better processing/GPU power), but not drastically better.

 

I'm gonna take a stab at this and say you are not in the norm when it comes to your gaming rig. You are outside of it.


1. That has nothing to do with the claim made. We're talking about the raw performance, not the perceived difference.

 

2. As I explained, most games are multiplatform, meaning they won't take advantage of Microsoft-exclusive cloud features as they have to consider all platforms. We've also seen that very few developers have expressed an interest in it, especially outside of XB1 exclusives. It's been 18 months and we haven't had any meaningful cloud implementation, with none that I'm aware of on the horizon. Could that change in the future? Sure, though it seems very unlikely.

 

1. Then would you be fine if Sony decided to put that raw performance towards something other than 1080p? I think the reality is that the difference is assumed to be important because of an expected difference in an end result. But the reality is the resulting differences aren't that great unless you paint them so. If the perceived difference doesn't matter then this whole debate about resolution doesn't matter. So why is Sony wasting their raw power on something that apparently doesn't matter? Oh right, because 1080p is a buzz word and sells games / consoles to ignorant gamers.

 

2. The majority of games have always been multi-platform; I don't understand how this is relevant. And more assumptions don't make what you're saying here any more relevant, because it's just predictions based on, really, nothing. We can claim there's not really anything out using it, but the reality is that there's just not much out at all on either platform.


1. Then would you be fine if Sony decided to put that raw performance towards something other than 1080p? I think the reality is that the difference is assumed to be important because of an expected difference in an end result. But the reality is the resulting differences aren't that great unless you paint them so. If the perceived difference doesn't matter then this whole debate about resolution doesn't matter. So why is Sony wasting their raw power on something that apparently doesn't matter? Oh right, because 1080p is a buzz word and sells games / consoles to ignorant gamers.

 

2. The majority of games have always been multi-platform; I don't understand how this is relevant. And more assumptions don't make what you're saying here any more relevant, because it's just predictions based on, really, nothing. We can claim there's not really anything out using it, but the reality is that there's just not much out at all on either platform.

 

So we should have just stayed on 720p like last gen? :/ Every new console generation with faster hardware has always meant better graphics. Sony is not wasting anything by allowing developers full access to the GPU's capabilities. 3 GB of RAM is set aside for the OS, and multitasking/suspend & resume all work fine.

 

1080p just so happens to be the native resolution of 90% of the HDTVs out there now. Moving forward from upscaling isn't a waste; it's where we should be, considering 1080p HDTVs have been around for years.

 

If anything they've thought very carefully about the future. Have you seen the recent Project Morpheus SDK update? 60 FPS/120 FPS and low latency.


So we should have just stayed on 720p like last gen? :/ Every new console generation with faster hardware has always meant better graphics. Sony is not wasting anything by allowing developers full access to the GPU's capabilities. 3 GB of RAM is set aside for the OS, and multitasking/suspend & resume all work fine.

 

1080p just so happens to be the native resolution of 90% of the HDTVs out there now. Moving forward from upscaling isn't a waste; it's where we should be, considering 1080p HDTVs have been around for years.

 

If anything they've thought very carefully about the future. Have you seen the recent Project Morpheus SDK update? 60 FPS/120 FPS and low latency.

 

I'm not saying that. You're missing the point of my post. On one hand a 44% difference is a huge difference, but somehow it's not about whether or not we actually notice the difference. Even if we couldn't see the difference, apparently the fact that it's 44% different still matters because... it's 44% different.

 

The difference between 900p and 1080p is far less than 720p vs 1080p. All I'm trying to say is that if the end result isn't what matters, why are we making such a ruckus about a statistic which only affects the end result? The only reason we want 1080p in our games is because the gaming industry has pushed the resolution as some godly thing. Because it's 'native' or whatever. But the truth is most gamers can't actually tell the difference between 1080p and 900p because our eyes are just not that good, at least not without having the differences pointed out to us AND seeing them side by side in stills (not in motion). To me what this means is I might rather have my 1080p games dropped to 900p if it means freeing up '44%' of their resources. I think the PS4 would get even better-looking games by sacrificing in an area that doesn't actually matter beyond a certain point.


I'm not saying that. You're missing the point of my post. On one hand a 44% difference is a huge difference, but somehow it's not about whether or not we actually notice the difference. Even if we couldn't see the difference, apparently the fact that it's 44% different still matters because... it's 44% different.

 

The difference between 900p and 1080p is far less than 720p vs 1080p. All I'm trying to say is that if the end result isn't what matters, why are we making such a ruckus about a statistic which only affects the end result? The only reason we want 1080p in our games is because the gaming industry has pushed the resolution as some godly thing. Because it's 'native' or whatever. But the truth is most gamers can't actually tell the difference between 1080p and 900p because our eyes are just not that good, at least not without having the differences pointed out to us AND seeing them side by side in stills (not in motion). To me what this means is I might rather have my 1080p games dropped to 900p if it means freeing up '44%' of their resources. I think the PS4 would get even better-looking games by sacrificing in an area that doesn't actually matter beyond a certain point.

 

Most people (myself anyway) who say 44% do so as it makes sense of what "180p" is. When we talk about rendering differences it makes much more sense to say it how it is mathematically. No one goes oh the Snapdragon 805 processor is only 5 Snapdragons more than the 800. They say the CPU is "insert x%" faster on the Snapdragon 805. This is the way it works, and has done so since I started building my home computers and researched benchmarks for CPUs/GPUs/Memory.

 

And no, we don't just want 1080p because it's industry jargon. As I pointed out already, many of us just want our TVs to output a native image. Do you run your PC monitor sub-native? Doubt it. Or apply a PC wallpaper that isn't native to your screen resolution? It's satisfying to see what your TV/monitor is capable of actually being used, and agree with me or not, an upscaled image is never going to look as good as a native image.

 

Easier to accept last generation running at 720p on 1080p TVs, or even sub-720p a lot of the time as the hardware was the 1st generation of HD consoles. This time around we've had 1080p TVs since the PS3 and 360 launched. 10~11 years ago? I guess you could say it's my personal opinion MS should have launched with a better GPU, and if costs were an issue, maybe management shouldn't have bundled in Kinect and spent more on the GPU. That is hindsight, and I guess Mattrick walked partly due to that, but unfortunately hindsight isn't much use on a fixed hardware console - You cannot modify the innards down the line to catch up. Sony learned that going with CELL.


Most people (myself anyway) who say 44% do so as it makes sense of what "180p" is. When we talk about rendering differences it makes much more sense to say it how it is mathematically. No one goes oh the Snapdragon 805 processor is only 5 Snapdragons more than the 800. They say the CPU is "insert x%" faster on the Snapdragon 805. This is the way it works, and has done so since I started building my home computers and researched benchmarks for CPUs/GPUs/Memory.

 

And no, we don't just want 1080p because it's industry jargon. As I pointed out already, many of us just want our TVs to output a native image. Do you run your PC monitor sub-native? Doubt it. Or apply a PC wallpaper that isn't native to your screen resolution? It's satisfying to see what your TV/monitor is capable of actually being used, and agree with me or not, an upscaled image is never going to look as good as a native image.

 

Easier to accept last generation running at 720p on 1080p TVs, or even sub-720p a lot of the time as the hardware was the 1st generation of HD consoles. This time around we've had 1080p TVs since the PS3 and 360 launched. 10~11 years ago? I guess you could say it's my personal opinion MS should have launched with a better GPU, and if costs were an issue, maybe management shouldn't have bundled in Kinect and spent more on the GPU. That is hindsight, and I guess Mattrick walked partly due to that, but unfortunately hindsight isn't much use on a fixed hardware console - You cannot modify the innards down the line to catch up. Sony learned that going with CELL.

 

Processor speed does not directly relate to resolution. It's a poor comparison because a faster cpu is exponentially more effective in hardware than what we get out of more pixels on screen. You're comparing different things here, entirely different things.

 

Your eyes can only perceive so much detail. It doesn't matter if there are 1080 vertical pixels or 900 vertical pixels in many standard viewing setups, because your eyes simply cannot tell the difference (and this is a fact). A processor being faster is the elimination of a bottleneck. In hardware, that's significant. But when it comes to screen resolution the bottleneck is more often than not the user's vision, not the screen/GPU. Raw numbers in this scenario are a misrepresentation of the reality of the topic.

 

So unless we'd like to continue ignoring how people's eyes work, why don't we stop trying to make this sound worse than it is? I've yet to see anyone disprove that people almost always cannot tell the difference between 900p and 1080p. So why does 1080p even matter? Why does it matter if it's 'native'? What do you gain out of being able to call it 'native resolution'? A few more pixels, pixels most people can't even tell weren't there? You can claim it'll never look as good as a native image, yet it's scientifically provable (and tested) that people would be hard pressed to notice without the differences being pointed out or without a side-by-side comparison.

 

Say two red cars lean slightly towards yellow and slightly towards green (I mean, barely leaning in either direction, but enough that there's a difference). If you can't tell the difference between them unless you put both vehicles side by side, does the difference really matter? I don't think so. They're both still red. You're effectively saying that you can tell it's not the real 'candy-apple red' even though it's proven that most people can't tell the difference between reproduction and real candy-apple red paints. Your 'native' tag is effectively 'candy-apple' here. Just a title that amounts to a marginal if not unnoticeable change.

 

So what this leads us to is: why are these buzzwords like 'native' and '1080p' so important if we can prove that neither of them plays such a pivotal role as you suggest? Are you really sure it's not just industry jargon that people fail to actually understand? I think it is, and as a result marketing can use it as a tool.


Processor speed does not directly relate to resolution. It's a poor comparison because a faster cpu is exponentially more effective in hardware than what we get out of more pixels on screen. You're comparing different things here, entirely different things.

Your eyes can only perceive so much detail. It doesn't matter if there are 1080 vertical pixels or 900 vertical pixels in many standard viewing setups, because your eyes simply cannot tell the difference (and this is a fact). A processor being faster is the elimination of a bottleneck. In hardware, that's significant. But when it comes to screen resolution the bottleneck is more often than not the user's vision, not the screen/GPU. Raw numbers in this scenario are a misrepresentation of the reality of the topic.

So unless we'd like to continue ignoring how people's eyes work, why don't we stop trying to make this sound worse than it is? I've yet to see anyone disprove that people almost always cannot tell the difference between 900p and 1080p. So why does 1080p even matter? Why does it matter if it's 'native'? What do you gain out of being able to call it 'native resolution'? A few more pixels, pixels most people can't even tell weren't there? You can claim it'll never look as good as a native image, yet it's scientifically provable (and tested) that people would be hard pressed to notice without the differences being pointed out or without a side-by-side comparison.

Say two red cars lean slightly towards yellow and slightly towards green (I mean, barely leaning in either direction, but enough that there's a difference). If you can't tell the difference between them unless you put both vehicles side by side, does the difference really matter? I don't think so. They're both still red. You're effectively saying that you can tell it's not the real 'candy-apple red' even though it's proven that most people can't tell the difference between reproduction and real candy-apple red paints. Your 'native' tag is effectively 'candy-apple' here. Just a title that amounts to a marginal if not unnoticeable change.

So what this leads us to is: why are these buzzwords like 'native' and '1080p' so important if we can prove that neither of them plays such a pivotal role as you suggest? Are you really sure it's not just industry jargon that people fail to actually understand? I think it is, and as a result marketing can use it as a tool.

You missed the point. I'm not comparing resolution to unrelated hardware components like CPU speed; I was explaining why we use mathematical descriptors to compare like-for-like hardware. Being able to support a higher resolution is a benefit of one GPU that is faster than another, whatever percentage faster it is.

As for the rest of your post I will agree to disagree. In my opinion it's complete nonsense to try and state it's a fact we cannot tell the difference between 900p and 1080p.


As for the rest of your post I will agree to disagree. In my opinion it's complete nonsense to try and state it's a fact we cannot tell the difference between 900p and 1080p.

 

As I've said, we can't tell the difference unless we are either shown the difference or are looking at side-by-side comparisons (in stills, and games are never stills). There have been tons of visual studies done on what people can and cannot see at certain distances and resolutions on screen. This is why phone screens have such high PPI now: because we hold them so close, we can easily pick out lower resolutions. But when you're 10-15' away from the screen that becomes far more difficult. People don't realize it, but the reason they stick all those televisions in the store in aisles that are ~5-7' wide is that when you stand closer the picture looks better on higher-res screens, making the more expensive displays seem like better purchases. This is the only reason 4K even becomes relevant these days (i.e. if you're ~4' from your 55" TV).

 

The only reason to boost the resolution any higher than we currently have is for anti-aliasing purposes. A more natural crushing of pixels will surpass anything an artificial anti-aliasing filter can do (which is why some games allow 4K rendering resolutions despite monitors not supporting 4K). I tried this with the recent Lord of the Rings game and it looked fantastic. But then again, I was around ~1.5 feet from my 2560x1080 screen. At any greater distance I'd probably not have noticed the change.

 

So, based on the way human eyes work, I can confidently claim that under normal viewing conditions people really can't tell the difference. And often they see differences that aren't even there.

 

http://www.cnet.com/news/why-4k-tvs-are-stupid/

 

 

This math, or just looking at your TV, tells you that you can't see individual pixels. What's interesting is that a 720p, 50-inch TV has pixels roughly 0.034 inch wide. As in, at a distance of 10 feet, even 720p TVs have pixels too small for your eye to see.

 

 

 

A few years ago I did a TV face-off with trained TV reviewers and untrained participants with Pioneer's Kuro plasma (768p) against several 1080p LCDs and plasmas. Not one person noticed the Kuro wasn't 1080p. In fact, most lauded it for its detail. Why? Its contrast ratio was so much better than on the other TVs that it appeared to have better resolution. The difference between light and dark is resolution. If that difference is more pronounced, as it is on high-contrast ratio displays, they will have more apparent resolution.
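
As a rough sanity check of the pixel-size figure quoted above, the numbers can be reproduced in a few lines, assuming a 50-inch 16:9 panel and the common rule of thumb that normal vision resolves detail down to about one arcminute (the exact threshold varies from person to person):

```python
import math

# Pixel pitch of a 50-inch 16:9 TV at various resolutions, versus the smallest
# detail a viewer with ~1 arcminute of acuity can resolve from 10 feet away.
DIAGONAL_IN = 50.0
ASPECT_W, ASPECT_H = 16, 9

width_in = DIAGONAL_IN * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)   # ~43.6 in

for name, horizontal_pixels in [("720p", 1280), ("900p", 1600), ("1080p", 1920)]:
    print(f"{name}: pixel pitch ~{width_in / horizontal_pixels:.3f} in")

viewing_distance_in = 10 * 12                                   # 10 feet
one_arcminute = math.radians(1 / 60)
resolvable_in = viewing_distance_in * math.tan(one_arcminute)   # ~0.035 in
print(f"smallest resolvable detail at 10 ft: ~{resolvable_in:.3f} in")
```

On those assumptions even the 720p pixel pitch (~0.034 in) sits right at the resolvability threshold at 10 feet, which is the point the quoted article is making; sit closer, or use a larger screen, and the comparison changes.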

I guess I'll find my old SCART cables, put my PS3 in 480p and move my couch back a bit? Sounds like a plan -_-


So, based on the way human eyes work, I can confidently claim that under normal viewing conditions people really can't tell the difference. And often they see differences that aren't even there.

 

http://www.cnet.com/news/why-4k-tvs-are-stupid/

Nonsense. I've seen plenty of 4K TVs and can easily tell them apart from 1080p, even at a distance. I can certainly tell plasmas apart, as their sub-1080p resolutions are very noticeable. I have a 1600p IPS display and can easily see the pixels without putting my face right up to the screen.

 

I remember all the claims that there was no difference between 128k/192k MP3s and WAV files, yet the reality is that there is, if you know what you're listening for.


Nonsense. I've seen plenty of 4K TVs and can easily tell them apart from 1080p, even at a distance. I can certainly tell plasmas apart, as their sub-1080p resolutions are very noticeable. I have a 1600p IPS display and can easily see the pixels without putting my face right up to the screen.

 

I remember all the claims that there was no difference between 128k/192k MP3s and WAV files, yet the reality is that there is, if you know what you're listening for.


This topic is now closed to further replies.