PS4 and Xbox One resolution / frame rate discussion



If this weren't true then 4K wouldn't be what we'd be seeing next. We'd be seeing 1550p as the new HD at 44% more pixels than 1080p.

The reason we're not seeing 1550p and we're going to 4k has nothing to do with what is "substantial". They're going to 4k because companies want to avoid having to run some sort of scaling algorithm to display existing content on an oddball resolution like 1550p, which would likely result in some sort of artifacts. If you have 2 pixels, a red and a blue, and you go to 4 pixels, it's easy to just make two reds and two blues. If you go to 3 pixels then you have a red, maybe a purple in the middle, and a blue. Now your hard edge looks blurry. Or maybe you do two reds and a blue on one line and alternate with one red and two blues on the next; now you have jaggies. Since a lot of content is already 1080p they just doubled it to 2160p for 4k. Your 1080p content looks exactly the same on a 60" 4k TV as it did on a 60" 1080p display (on the 4k you just have a 2x2 block of pixels acting as each of your old pixels). If you try to display 1080p content on a 1550p display, what are you going to do with those extra 470 lines?
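To make the red/blue example concrete, here's a tiny Python sketch (purely my own illustration of nearest-neighbour copying and linear blending, not how any actual TV scaler is implemented): doubling two pixels to four is a clean copy, while stretching them to three forces the scaler to either blend colours (blur) or duplicate pixels unevenly (jaggies).

```python
# Toy version of the red/blue pixel example above (illustration only).
red, blue = (255, 0, 0), (0, 0, 255)
row = [red, blue]

# 2 -> 4 pixels: exact 2x duplication, the hard edge stays sharp.
doubled = [px for px in row for _ in range(2)]
# -> [(255, 0, 0), (255, 0, 0), (0, 0, 255), (0, 0, 255)]

# 2 -> 3 pixels: a non-integer scale has to invent a middle pixel.
def lerp(a, b, t):
    """Linearly blend two RGB colours."""
    return tuple(round(a[i] + (b[i] - a[i]) * t) for i in range(3))

stretched = [red, lerp(red, blue, 0.5), blue]
# -> [(255, 0, 0), (128, 0, 128), (0, 0, 255)]  the hard edge becomes purple, i.e. blur
```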

Whether or not a percentage is an objective fact, numbers can be skewed in meaning by context. And what you guys are doing is tampering with the context and selecting the single data point that helps your side while ignoring the nature of the very topic we are discussing.

 

You guys can care about that 44% difference all you want, but the truth and objectivity is in the entire picture. Not just a single number.

There are two DIFFERENT issues being discussed here.

1) Is there a "substantial" difference between 1080p and 900p

2) Is this difference perceivable by the average person

1) can be true even if 2) is not (and I'm not saying it is or isn't). I personally did not comment on 2) as it's subjective. It also depends on a number of factors like how big the display is, how close you sit to it, etc. No matter if you can personally observe the difference or not, a 30%+ difference in ANYTHING is "substantial" in my book. I'm curious as to what your threshold is since that's apparently not enough. If your criterion is that you have to be able to personally see the difference, then there is no "substantial" difference between anything microscopic either. A single-celled organism isn't substantially different in size from a subatomic particle because you can't see the difference. That's an absurd criterion.

As for point 2), given the topic of this thread it stands to reason that the people who participate in it include a good portion of people who CAN perceive the difference. (Maybe they have HUGE TVs, great vision, sit really close, whatever... it doesn't matter; just because you can't see it doesn't mean NO ONE can.) If you can't and don't care, why do you even read this thread? That's what it's all about. Even if they can't see the difference, though, it's also representative of how much power developers are able to squeeze out of the consoles. People like to know the difference is there even if they don't notice it personally. A lot of people will choose a car with a 180mph top speed over a similarly priced car with a 120mph top speed even though they'll never drive the car over 80mph. Both would be fine, they'll never actually notice the difference, but it's nice for some people to know they got the one that can go faster.


So you have years of experience in a field you claim doesn't exist (or is a lie)? How does that work? [emoji14]

I guess everything that can be done with "clouds" has been done, right?

It is a substantial percentage, but if people couldn't tell that KZ was running at 50% lower resolution on PS4, then 30% lower is going to be even more difficult to spot, especially when most people won't be running every game side by side.

Can we say XB1 is more efficient because it is running 100% more operating systems (200% if you count the hypervisor) on "inferior hardware"?

Percentages are meaningless here because the PS4 is pushing more pixels but lacks the AF that the XB1 is doing. So are more pixels better, or are improved textures better?

Ha ha. You know what I mean. The cloud of course does work but not in the way MS sold it.

Of course things will progress with it but right now and for a good few years what MS promised isn't possible. Maybe next gen.


And it would look better running at 1080p. It looked great on my machine running at 1600p.

 

If Microsoft hadn't decided to force Kinect upon everyone and instead invested in adequate hardware for the XB1 then this wouldn't be an issue.

 

 

The point is, you don't even know until someone tells you.

 

You, or I for that matter, didn't know that a portion of Killzone was sub-1080p until we were told...

 

Nobody knew Ryse was only 900p until we were told...

 

900p vs 1080p... Just isn't that big a deal... On paper it's a BIG deal... But once again, when I pick up a controller and game... I don't notice it at all...

 

Unless it's side by side (even then it's hard to tell) or someone tells you, you usually don't even know...

 

I care more about whether I can play this game than 900p vs 1080p... Like you and others said... I own AC: Unity, and that's just horrible... The difference in looks between PS4 and One is whatever, because both look more than good enough... But the game is just unplayable...

 

Don't get me wrong, I do want a game to be as pretty as it can be... I do want that in my $400 and $500 purchases... But I don't lose any kind of sleep over 900p vs 1080p...

 

If I have to take 900p but the PS Camera or Kinect can add something to the game... I'm gonna go for features all day... The graphical hit is minor compared to an added feature...


The point is, you don't even know until someone tells you.

 

You, or I for that matter, didn't know that a portion of Killzone was sub-1080p until we were told...

 

Nobody knew Ryse was only 900p until we were told...

 

900p vs 1080p... Just isn't that big a deal... On paper it's a BIG deal... But once again, when I pick up a controller and game... I don't notice it at all...

The difference is there and it certainly is noticeable in side-by-side comparisons. An apt analogy would be the difference between a vehicle that does 30mpg and another that does 45mpg - the difference might not be perceivable without comparison but that doesn't mean it's not important.

 

I care more about whether I can play this game than 900p vs 1080p... Like you and others said... I own AC: Unity, and that's just horrible... The difference in looks between PS4 and One is whatever, because both look more than good enough... But the game is just unplayable...

 

Don't get me wrong, I do want a game to be as pretty as it can be... I do want that in my $400 and $500 purchases... But I don't lose any kind of sleep over 900p vs 1080p...

This is about consumers being informed. Some people will care more about exclusives or brand loyalty than performance, which is fine. However, many will care about performance and visual fidelity. If I were interested in buying a console I would certainly pick the PS4 over the XB1, just like last generation I would have chosen the X360 over the PS3. It's not just resolution either - some games on XB1 run at half the framerate, which is even more noticeable.

 

I'm not trying to claim that gaming on the XB1 is like rubbing salt into an open wound. All I'm pointing out is that, exclusives aside, the PS4 offers the better gaming experience. Both consoles are underpowered though.


You posted a screenshot demonstrating the visual difference between 900p and 1080p, which clearly shows that 900p is more blurry. If you want to pretend that resolution isn't important then that's fine but if that's the case then why bother with the next-gen consoles at all? Their primary selling point is their graphical capability. As I pointed out, some XB1 games are running at just 720p - that's unacceptable.

 

 

If you want truth and objectivity then I'll simplify my argument:

 

1) The PS4 outperforms the XB1

2) The XB1 struggles to hit 1080p

 

Happy now?

 

The screenshot shows a difference; whether or not it is significant is a different matter (and frankly relative to the person). There are two issues with the representation here. First, this is a still shot (most games aren't still life as far as I know). Second, I assume you're not looking at this on a television. Higher-PPI displays will magnify the blurriness (PC monitors and phones, to name a few).

 

We bother with next-gen consoles because resolution is not the single factor we judge our purchases on. Next-gen graphics isn't just 1080p or not; it's lighting, physics, particle effects, animation quality, AI, etc. You cannot boil it all down to 1080p.

 

 

The reason we're not seeing 1550p and we're going to 4k has nothing to do with what is "substantial". They're going to 4k because companies want to avoid having to run some sort of scaling algorithm to display existing content on an oddball resolution like 1550p, which would likely result in some sort of artifacts. If you have 2 pixels, a red and a blue, and you go to 4 pixels, it's easy to just make two reds and two blues. If you go to 3 pixels then you have a red, maybe a purple in the middle, and a blue. Now your hard edge looks blurry. Or maybe you do two reds and a blue on one line and alternate with one red and two blues on the next; now you have jaggies. Since a lot of content is already 1080p they just doubled it to 2160p for 4k. Your 1080p content looks exactly the same on a 60" 4k TV as it did on a 60" 1080p display (on the 4k you just have a 2x2 block of pixels acting as each of your old pixels). If you try to display 1080p content on a 1550p display, what are you going to do with those extra 470 lines?

There are two DIFFERENT issues being discussed here.

1) Is there a "substantial" difference between 1080p and 900p

2) Is this difference perceivable by the average person

1) can be true even if 2) is not (and I'm not saying it is or isn't). I personally did not comment on 2) as it's subjective. It also depends on a number of factors like how big the display is, how close you sit to it, etc. No matter if you can personally observe the difference or not, a 30%+ difference in ANYTHING is "substantial" in my book. I'm curious as to what your threshold is since that's apparently not enough. If your criterion is that you have to be able to personally see the difference, then there is no "substantial" difference between anything microscopic either. A single-celled organism isn't substantially different in size from a subatomic particle because you can't see the difference. That's an absurd criterion.

As for point 2), given the topic of this thread it stands to reason that the people who participate in it include a good portion of people who CAN perceive the difference. (Maybe they have HUGE TVs, great vision, sit really close, whatever... it doesn't matter; just because you can't see it doesn't mean NO ONE can.) If you can't and don't care, why do you even read this thread? That's what it's all about. Even if they can't see the difference, though, it's also representative of how much power developers are able to squeeze out of the consoles. People like to know the difference is there even if they don't notice it personally. A lot of people will choose a car with a 180mph top speed over a similarly priced car with a 120mph top speed even though they'll never drive the car over 80mph. Both would be fine, they'll never actually notice the difference, but it's nice for some people to know they got the one that can go faster.

 

All I'm trying to do is set a baseline instead of arbitrarily choosing who does and does not notice a difference. We can't say it's significant or insignificant based on anecdotal evidence. Bringing in televisions is a way to get at what the industry consensus is on a meaningful difference in resolution. Using case-by-case reads of who thinks what is big or small is not going to get us anywhere.

 

We must define what a big difference is in resolution based on a general standard, not what we as individuals perceive. When I say it's not a big difference I speak from the point of view of how different industries have jumped upwards in resolution. 480p -> 720p -> 1080p -> 4K is a good indicator of what the television industry thinks are significant enough changes that everyone can appreciate (be it someone with good or bad vision, etc., based on focus groups and other studies they've done). And since these games are meant to be played on televisions I figured it was only logical to go with the standards of the intended displays.

 

If we can't go by industry trends then what else is there to put us all on the same page? If we want to be truly objective, then we shouldn't be using our personal opinions and circumstances to dictate what the general population should be caring about.

 

I'm sure different people have different levels of vision, seating arrangements, etc. But the way people represent these circumstantial points is as if they are objective truths, which can be misleading from the outside looking in. For those who may look to topics such as these to help with their understanding, and perhaps even their buying, of these consoles, we shouldn't be making wild claims about what is and is not a problem based on our own unique circumstances without making it apparent that those factors are the basis of our opinions.

 

So on the second part of your post we agree that this differentiation is largely subjective. But I disagree that a number like 30% is ever significant out of context. Percentages in and of themselves are an abstraction of a relationship between two numbers. They are meaningless without context. Saying 30% more pixels on-screen means nothing without a visual demonstration of the difference. And even so, these things need a side-by-side comparison for the differences to be apparent. With the HD resolution jumps, the differences were noticeable even without direct, side-by-side exposure to the different display types. In fact, the same can be said of 1080p to 4K, but even then it took a long look for me to pick up the difference between standard HD and UHD on my Mac (a week or so of use before switching back to a non-UHD display).

 

This is very easy to understand, and is a phenomenon that's been happening far before the resolution race. Games from the 90's looked much better at the time than they do now. Most of us weren't even aware of how bad they looked until we'd experienced something better for an extended duration.

 

I think this just goes to show that incremental changes are far less significant than the sum of those changes, and often aren't even noticed by the general population. In the 90's I was completely okay with waiting several minutes for an image to load on my PC but now I get frustrated when it takes more than a couple seconds. Yes, the differences are huge but between then and now things slowly got better. This thread is basically arguing about the difference in image load time between a 10 Mb/s and a 14 Mb/s connection. Yes, 4 Mb/s faster is a 40% increase in speed but is it really that noticeable outside of a large file download? Is 900p vs 1080p really that significant outside of staring at the textures side by side looking for a difference?


All I'm trying to do is set a baseline instead of arbitrarily choosing who does and does not notice a difference...

All I'm trying to do is understand your definition of "substantial". You reject others' use of the word by definitively claiming "900p vs 1080p is not substantial" and dismissing numerical facts like percentages. So I'm trying to get a baseline for your use of the word instead of arbitrarily choosing its value on a case-by-case basis based on your own personal perception.

Whether you can see the difference has NOTHING to do with whether the difference is substantial or not. If you had two 20" TVs and one had an 8,640p display and the other had an 86,400p display then the second one has a "substantially" higher resolution display. There is a HUGE difference between the number of pixels they have, but on a 20" display with that high of a pixel density no one is going to be able to tell the difference. The fact that you can't see the difference doesn't mean it's not a "substantial" one. There is a "substantial" difference between the resolutions of those two TVs.

As for what the TV industry chose, it has nothing to do with the industry getting together and trying to figure out what a "substantial" change would be. There is 480 digital TV because analog was 480 and it's there just for backwards compatibility. They went from that to 720 and 1080 not because they thought the difference was "substantial" but because they went as high as they could go with the bandwidth they had and the compression schemes they could use for digital at the time.

Broadcast is 720p or 1080i (note the p and i), which are roughly equivalent in the bandwidth required, because 720p is 921,600 pixels and 1080i is 1,036,800 pixels per refresh (the i means it's interlaced, so it only refreshes half the 1080 lines each cycle). Now in order to display the 1080i picture the TVs had to have the full 2,073,600 pixels even if only half were refreshed each cycle. So for applications where there wasn't a broadcast bandwidth constraint (such as Blu-Ray players and video game consoles that are connected directly to the TV) they could refresh the entire screen at once, i.e. 1080p. 1080p has thus been the gold standard in content creation for TVs since TV went digital.

Now is really the first time they've tried to update it (480, 720, and 1080 all came out at the same time) with 4k, and I've already explained why they picked that number. Again, this has nothing to do with the video industry getting together and deciding what is "substantial" or not. They didn't go higher because streaming it would use even more bandwidth and the TVs would be even more expensive, and they didn't go any lower because anything that isn't a whole multiple of the existing 1080p content would result in artifacts at the oddball pixel counts.
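The 720p/1080i bandwidth parity above is easy to sanity-check with back-of-the-envelope arithmetic. A rough sketch (my own numbers, ignoring compression, audio and blanking intervals):

```python
# Pixels that have to be delivered per refresh (rough comparison, illustration only).
p720  = 1280 * 720          # progressive: every line of the frame each refresh
i1080 = 1920 * 1080 // 2    # interlaced: only half the lines each refresh
p1080 = 1920 * 1080         # progressive 1080: the full frame each refresh

print(p720, i1080, p1080)   # 921600 1036800 2073600
```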

So again, I'm just trying to understand the way you are using the term "substantial". If it's an individual's opinion, then telling someone else "900p vs 1080p is not substantial" makes no sense. Maybe "900p vs 1080p is not substantial TO YOU", but you can't say it's not substantial to anyone else. If we aren't going to base it on opinion then we need to set a baseline. Does it have to be double the height, like 1080p to 4k, for you to consider it "substantial"? I'm curious as to where that line is for you. Also, if you don't think 900p to 1080p is substantial, does that mean you don't think the difference between 720p and 900p is substantial either? Is it then a true statement that the Xbox One doesn't support substantially higher resolution gaming than the Xbox 360? (Or the PS4 for that matter, as even 1080p isn't double the height of 720p.)


It's not substantial because, objectively, you cannot judge a visual upgrade based on an abstract representation of that difference. 100% more of 1% is still only 2%. You get what I mean? 40% sounds significant on its own. But not in the context of the diminishing visual returns of resolution. I'm sure if television companies could have gotten away with it, they'd have gone with a lower resolution. 4k is not the only even resolution available after 1080p.

If that were true, then why do we have so many disparate computer resolutions in the same aspect ratio? Perhaps computer monitors display pixels differently beyond just their PPI? I don't know. I'm certainly not claiming the numbers were chosen solely on the idea that the difference was noticeable. But if that didn't factor in at all then we should have been seeing more 1080p screens under 39" when those televisions started being made. But no, we still see 720p screens at smaller sizes. Why is that? Why have tiered resolutions beyond 480 and 1080?

When I say substantial, I mean one that actually matters in a practical sense: one that affects the consumer beyond a big or small number and manifests itself in a tangible way.


It's not substantial because, objectively, you cannot judge a visual upgrade based on an abstract representation of that difference. 100% more of 1% is still only 2%. You get what I mean? 40% sounds significant on its own. But not in the context of the diminishing visual returns of resolution.

I honestly don't get what you mean. I know you probably think I'm just trolling you or something but I really don't get what you're saying and I'm honestly trying to understand.

Percentages aren't abstract, they ARE objective. They also have the benefit of scaling with the population they are applied to. 50% of 100 things is 50 things while 50% of 1000 things is 500 things. They're both 50% but 500 is a lot bigger than 50 because 1000 is a lot bigger than 100. The advantage of percentages is that they scale with the population, unlike fixed values, yet you keep saying their relevance decreases as the population increases (your "diminishing returns"). That makes no sense.

I'm sure if television companies could have gotten away with it, they'd have gone with a lower resolution. 4k is not the only even resolution available after 1080p.

4k is 3840x2160 which is exactly 4 1080p screens in a 2x2 grid. It is the lowest whole number multiple of 1080p. Every pixel in any 1080p content just needs to be displayed twice across and twice down, no special algorithms are needed. There is no smaller whole number multiple than 2. They chose the smallest upgrade they could that didn't cause them to have to do some artifact creating method to fill in oddball pixels (or put black borders around 1080p content).

 

If that were true, then why do we have so many disparate computer resolutions in the same aspect ratio? Perhaps computer monitors display pixels differently beyond just their PPI? I don't know.

Computer monitors can be anything. They aren't even all at the same aspect ratio. You can get 16:9 monitors like TVs, you can get 16:10, you can get 4:3, you can get whatever the panel makers decide to make. Computer software tends to scale to different resolutions pretty well. The big driver for what is available is what the panel makers decide to mass produce. There are only so many panel makers; different brands of TVs use the same panels, they just add their own controls, their own CPUs, etc., but the display itself is made by only a few companies. A lot of people seem to want their monitor to match their TV now... a flat screen is a flat screen to them, so 16:9 has become the most popular, but it doesn't have to be that way.

TV is a whole different beast. What a TV's specs are is defined by the content. They display video that doesn't resize, so they have to have agreed-upon standards. What they do is defined by the broadcast standards. NBC, CBS, ABC, etc. broadcast their signal over the air. That is regulated by the FCC, which sells them spectrum to use. Only so much data can be sent using a given compression scheme over a given amount of spectrum. The compression scheme has to be standardized so all TVs can decode it. So that defines a TV resolution. As I explained, at the time they went from analog to digital they picked 720p and 1080i because that's what they had the bandwidth to broadcast. That's as high as they could go given the spectrum they had and the compression scheme they were using. There is nothing like that on computers; no one regulates the content like that, and most programs scale or are windowed, so it can be anything.

I'm certainly not claiming the numbers were chosen solely on the idea that the difference was noticeable. But if that didn't factor in at all then we should have been seeing more 1080p screens under 39" when those televisions started being made. But no, we still see 720p screens at smaller sizes. Why is that? Why have tiered resolutions beyond 480 and 1080?

At first it was because they didn't have the technology to produce the high-PPI displays needed to do 1080p below 39". Over time they developed that, but now sub-39" TVs are considered the "budget" market so they don't use quality panels on them. They're designed to be cheap TVs and so they use less expensive panels. If you want a better picture they want you to buy the more expensive (and more profitable for them) devices.

Broadcast content is still only 720p and 1080i (there is no 1080p broadcast), and I already explained how they have similar bandwidth requirements. 720p exists, instead of there just being 480 (the old analog size) and 1080 (the ideal digital size), because they wanted a resolution where the broadcast video was progressive scanned (paints every line, every frame) instead of interlaced (paints only every other line each frame), and 720p has similar bandwidth requirements to 1080i at the same aspect ratio.


I honestly don't get what you mean. I know you probably think I'm just trolling you or something but I really don't get what you're saying and I'm honestly trying to understand.
Percentages aren't abstract, they ARE objective. They also have the benefit of scaling with the population they are applied to. 50% of 100 things is 50 things while 50% of 1000 things is 500 things. They're both 50% but 500 is a lot bigger than 50 because 1000 is a lot bigger than 100. The advantage of percentages is that they scale with the population, unlike fixed values, yet you keep saying their relevance decreases as the population increases (your "diminishing returns"). That makes no sense.
4k is 3840x2160 which is exactly 4 1080p screens in a 2x2 grid. It is the lowest whole number multiple of 1080p. Every pixel in any 1080p content just needs to be displayed twice across and twice down, no special algorithms are needed. There is no smaller whole number multiple than 2. They chose the smallest upgrade they could that didn't cause them to have to do some artifact creating method to fill in oddball pixels (or put black borders around 1080p content).

 

 

It doesn't make sense because your analogy is flawed. There is a point, all things being equal (screen size, for example), where packing more pixels into something becomes unnoticeable. The resolution gaps between TVs and even computer screens these last few years have been getting larger and larger with each iteration. More and more pixels must be packed in to make more of a difference. Look at this:

 

[Image: chart comparing display resolutions, with 2K and 4K highlighted]

 

Look at the iteration difference between the previous resolutions and 4K. There's even 2K highlighted here, and other diagrams show 3K and 5K. This idea that 4K was the exact right resolution and the only choice for televisions is certainly not true. There are many points in between where they could have selected a perfectly fine resolution that wouldn't be distorted. 1080p isn't exactly four 720p screens, and neither was 720p an exact multiple of 480p. This is the largest jump in resolution for television ever, which you presume is because 4K was the ONLY CHOICE these companies could make, despite being a complete departure from the previous pattern of scale. It doesn't even match PC resolution patterns:

[Image: chart of common PC display resolutions]

 

The above demonstrates just how incorrect your assumption is. There are MANY resolutions between 1080p and 4K which are perfectly usable.

 

To think that it's difficult for them to process video at differing resolutions in this day and age is, frankly, odd to me.


It doesn't make sense because your analogy is flawed.

No, it's not.

There is a point, all things being equal (screen size, for example), where packing more pixels into something becomes unnoticeable.

Which has NOTHING to do with "substantial". This is point 2) I outlined above and NOT what I'm talking about. It has NOTHING to do with point 1) (if something is "substantial" or not.) I'm not disputing this. It's a separate issue and I have no idea why you keep bringing it up.

 

The resolution gaps between TVs and even computer screens these last few years have been getting larger and larger with each iteration. More and more pixels must be packed in to make more of a difference. Look at this:

 

Look at the iteration difference between the previous resolutions and 4K. There's even 2K highlighted here, and other diagrams show 3K and 5K. This idea that 4K was the exact right resolution and the only choice for televisions is certainly not true.

Your pictures are NOT TV resolutions. As I said, computer resolutions can be anything and there are a ton of them. That's what your charts are. TV resolutions have not changed very much at all. In the U.S. there was the NTSC standard and TVs were 480i analog. When TVs went digital they created 480p for compatibility with the older NTSC content and 720p and 1080i for new content. They did both 720p and 1080i for reasons I've already explained related to the "p" and "i". That's all there is. That giant picture you posted has nothing to do with TVs. As for 4k, the term is confusing because it's used by different people for different things. With respect to TVs specifically it means 3840x2160. If you go to your local Best Buy or whatever, EVERY 4k TV there will be 3840x2160. There are no 2k TVs, there are no WSXGA TVs. There is no TV content at those resolutions; that's computer stuff and it can be anything.

There are many points in between where they could have selected a perfectly fine resolution that wouldn't be distorted. 1080p isn't exactly four 720p screens, and neither was 720p an exact multiple of 480p. This is the largest jump in resolution for television ever, which you presume is because 4K was the ONLY CHOICE these companies could make, despite being a complete departure from the previous pattern of scale. It doesn't even match PC resolution patterns:

Of course it doesn't match PC resolution patterns. PC resolutions have NO BEARING on TV resolutions, they can be ANYTHING. If anything the opposite is true. 1920x1080 is a popular PC resolution because they are copying TVs (they do that to save money by mass producing panels for both TVs and monitors). TVs don't care what PC resolutions are at all.

TV didn't go from 480p to 720p to 1080p as you are saying either, you're wrong. TV was 480i analog for most of its existence (in the NTSC countries; it's different in PAL countries). When it went digital recently that was a huge change and they used it to reset and make as big a jump as they could. So they went from 480i (analog) to 1080i (digital). 480p and 720p (both digital) came out AT THE SAME TIME. It wasn't an evolution as you seem to think. They released 3 resolutions at the same time for reasons I've repeatedly explained: 480p (digital) is there for compatibility with the 480i (analog) that was going away. 720p is there to present a progressive scan option that uses similar bandwidth to the 1080 interlaced they were moving to. So really it was just 480i to 1080i; they just gave a couple of extra options. 480i to 1080i is a BIGGER jump than 1080p to 4k, as four 480i screens in a 2x2 grid would only be 960 lines, not 1080. (That's not even really true, because 480i content wasn't typically widescreen.) So you're wrong there as well.

1080 didn't have to be 4 480 screens because the CONTENT was changing. The move from analog to digital was so huge that all the content changed too (NBC, ABC, CBS, etc. changed their broadcast resolution). With 4k (TV) there is NO 4k broadcast standard (which is why the term is so confusing). You aren't going to get 4k NBC, ABC, CBS, etc. on your new 4k TV. So if the broadcast content isn't changing like it did when they went from 480i to 1080i, they need to make sure the TVs look great with the old content. To do that they made the new screens' pixel height and width an exact multiple of the existing content. The lowest whole number multiple is 2, so you get 4k (TV), which is 3840x2160.

There are NOT many points in between they could have selected without using some algorithm that would create artifacts. If I give you a row of 1080 pixels, how are you going to display them on a 1550 pixel screen? If I have a 2160 pixel screen I can just show every pixel you give me twice and there is ZERO distortion. On a same-size TV it will look exactly the same (all other things being equal); the pixel density on the 4k would just be exactly double the 1080p screen's. Now sure, they have algorithms that can scale the image to 1550, but they create artifacts, there is NO WAY to do it with ZERO distortion, and it takes more CPU power than just doubling the pixels. If you're trying to sell a 4k TV for which there is little to no content, people are going to be pretty ticked if the stuff they already own picks up artifacts because you picked some oddball resolution.
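A quick way to see the 1080-to-1550 problem is to count how many screen lines each source line ends up occupying under simple nearest-neighbour mapping. This is just my own illustrative sketch, not any vendor's actual scaler:

```python
# For each source line, count how many destination lines it gets mapped to
# under nearest-neighbour scaling (illustration only).
def line_uses(src_lines, dst_lines):
    uses = [0] * src_lines
    for d in range(dst_lines):
        uses[d * src_lines // dst_lines] += 1
    return uses

print(set(line_uses(1080, 2160)))  # {2}    -> every line shown exactly twice, no artifacts
print(set(line_uses(1080, 1550)))  # {1, 2} -> some lines doubled, some not: uneven, visible artifacts
```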


I can post pretty pictures too!

These are the TV resolutions, including the new UHD (Ultra High Definition) 4k and 8k:

[Image: diagram comparing 8K UHD, 4K UHD, FHD, and SD resolutions]

So down in the bottom left you have SD. The two boxes there are NTSC (480) and PAL (576). Those are the old analog resolutions.

Next you have FHD, also known as 1080p.

Next you have 4k UHD which it clearly states is 2160... exactly 2x FHD.

Next you even have 8k UHD which is 4320... exactly 2x 4k UHD.

I'll even post my source, something you failed to do:

http://en.wikipedia.org/wiki/Ultra-high-definition_television


Is this what this thread has become, arguing about resolution like a broken record?

 

Isn't this thread moderated anymore like it used to be?

 

 

What's to be moderated? As long as the discussion is civil people can rant all they like about resolutions.

 

 

It seems that you can hold out hope for 1080p/60fps on your console of choice, but you'll probably get 1080p/30fps or 900p/30fps. It was probably naive to believe that modern games, considering how demanding they generally are, would consistently have higher quality outputs on the next-gen consoles.

 

If you really want higher res/fps invest in a gaming PC.


I can post pretty pictures too!

These are the TV resolutions, including the new UHD (Ultra High Definition) 4k and 8k:

[Image: diagram comparing 8K UHD, 4K UHD, FHD, and SD resolutions]

So down in the bottom left you have SD. The two boxes there are NTSC (480) and PAL (576). Those are the old analog resolutions.

Next you have FHD, also known as 1080p.

Next you have 4k UHD which it clearly states is 2160... exactly 2x FHD.

Next you even have 8k UHD which is 4320... exactly 2x 4k UHD.

I'll even post my source, something you failed to do:

http://en.wikipedia.org/wiki/Ultra-high-definition_television

 

Let's just agree to disagree. What the meaning of the word 'substantial' really is in this instance is subjective. Whether or not a 40% increase in resolution is a 'substantial' increase, or perhaps 'substantial enough', is relative.

 

Personally, I don't think the number even has any relevance to this discussion because the difference is barely tangible. The part you are adamantly ignoring (whether or not the difference, in general, is noticeable by the common viewer) is the real issue here. 40% more, 2000% more pixels. Whatever the increase may be, if a consumer cannot tell the difference then it's a moot point.

 

This is what I mean by this thing being abstract. It's not directly representative of an end result. If I add 40% more trees to a forest, it's still gonna look like a forest to most people and unless they take their time counting they probably won't even care that there's more trees.

 

We must understand that the original discussion here is 900p vs 1080p. I am sorry I derailed this discussion into a random tangent when all I was attempting to do was give us a working standard. But instead we got swept up in debating a standard (and completely ignoring the whole point of this). And that I suppose we already agree on in a way.

 

There is a non-subjective way to judge whether or not the difference in resolution, visually, is significant without using a case-by-case judgement of individuals. And there are tens upon tens upon tens of articles out there that talk about resolutions and what, when and how different resolutions matter.

 

And not ignoring those, and the realities of the medium (high motion and action), I doubt any common consumer will even notice the differences between 900p and 1080p. Yet here we are, with people claiming the difference is significant and that the PS4 is "40% more powerful", to the point of bastardizing this number into incorrect propaganda because it's their only saving grace when it comes to this discussion. 40% more sounds big, even when it doesn't come out that way in reality or in application.

 

But I guess we can't really say that. Because the number itself, all things removed, is a big number. And because of that, it means the difference suddenly and magically matters just because 40% is a big number all on its own.

 

You guys can keep clinging to your magic number. I really don't care. But it's obvious you're never going to let it go.


Let's just agree to disagree. What the meaning of the word 'substantial' really is in this instance is subjective. Whether or not a 40% increase in resolution is a 'substantial' increase, or perhaps 'substantial enough', is relative.

That would be part of my point. My initial response to you was to your "900p vs 1080p is not substantial" statement, made as if YOU were the defining authority on what is or is not substantial. If it's relative, as you now claim (and I actually agree), then you can't claim something is not substantial to someone else. If it's not relative, as you originally implied, then I was curious what you, as the apparent authority on the subject, had decided was the threshold required for something to be substantial.

 

Personally, I don't think the number even has any relevance to this discussion because the difference is barely tangible.

Personally indeed. The difference may be barely tangible TO YOU but since you now admit that's relative it may not be to others. If it doesn't matter to you then why even bother reading this thread? This thread is for the people who think it is tangible.

 

The part you are adamantly ignoring (whether or not the difference, in general, is noticeable by the common viewer) is the real issue here. 40% more, 2000% more pixels. Whatever the increase may be, if a consumer cannot tell the difference then it's a moot point.

I'm adamantly ignoring that issue because it's relative/subjective. I'm not so arrogant as to say someone is wrong if they say something is substantial, because I understand that, no matter if it is substantial to me or not, it may very well be substantial to them. There is no point arguing over something that's relative/subjective.

 

This is what I mean by this thing being abstract. It's not directly representative of an end result. If I add 40% more trees to a forest, it's still gonna look like a forest to most people and unless they take their time counting they probably won't even care that there's more trees.

So what? If you agree it's relative by saying "most people", then some clearly disagree. If I set up a thread on a message board for those who disagree to gather, then what good is it for you to come into that thread and tell them they are wrong because you don't share their opinion? Just because most people have one opinion doesn't invalidate all others. Most people don't read this thread, yet here we are.

 

We must understand that the original discussion here is 900p vs 1080p. I am sorry I derailed this discussion into a random tangent when all I was attempting to do was give us a working standard. But instead we got swept up in debating a standard (and completely ignoring the whole point of this). And that I suppose we already agree on in a way.

I don't see how you were trying to give any "working standard". That's what you keep saying, but I tried to get a definition from you and all I could get was that YOU can't tell the difference. What YOU can personally perceive is not a good "working standard". What you did was tell someone else their opinion was wrong and try to push your own opinion as a so-called "working standard". Well, your opinion isn't any more (or less) valuable than the one you dismissed.

 

There is a non-subjective way to judge whether or not the difference in resolution, visually, is significant without using a case-by-case judgement of individuals. And there are tens upon tens upon tens of articles out there that talk about resolutions and what, when and how different resolutions matter.

Well if there is, I haven't heard about it from you yet in this thread. The only things I've seen you offer are your OPINION on what is substantial, the fact that YOU PERSONALLY can't perceive the difference, and a completely incorrect understanding of where TV resolutions came from.


Do people really expect 4k gaming to happen and become the norm soon? I just don't see it. It's not really a matter of the hardware, it's more about the TVs. It's going to take at least three years IMO, and that's a best case, for people to own enough 4k TVs in their homes for game makers to aim for that as the de facto goal. For now we're going to stay with 1080p.


Do people really expect 4k gaming to happen and become the norm soon? I just don't see it. It's not really a matter of the hardware, it's more about the TVs. It's going to take at least three years IMO, and that's a best case, for people to own enough 4k TVs in their homes for game makers to aim for that as the de facto goal. For now we're going to stay with 1080p.

 

That's a loaded question. On consoles 4k gaming won't be the norm anytime soon. The Xbox One and the PlayStation 4 won't be running 4k games, though that may be the target for the Xbox Two and PlayStation 5 sometime around 2020ish. Is that "soon"? Even if Nintendo launches a new console to replace the Wii U in the next two years or so and beats the Xbox One and PS4 on specs, I bet it will still target 1080p@60. It just might actually hit it.

 

Computers are a different matter though. Just like 1024x768 was the "standard" computer resolution when 640x480 NTSC (analog) was still the TV broadcast standard, computer games push higher than TVs. Right now 4k gaming is the holy grail people are building SLI rigs to run at, but it won't be "the norm" anytime soon. Now if you limit your scope to gamers who spend more than $200 on a video card and upgrade every couple of years, then sure, they'll probably hit 4k soon, but that's such a small segment of the gaming public that games won't be designed specifically for them; they'll just scale up to that.

 

Even setting aside gaming for a second, 4k really has an uphill battle even on TVs. Television signals are still largely defined by the broadcast standards. The U.S. broadcast standards aren't changing. There is STILL only 720p and 1080i for HD broadcasts. Alliances of hardware manufacturers are standardizing UHD 4k and 8k, but that's just the hardware companies; there are no TV broadcast plans to support that (or even 1080p for that matter). Sure there have been some test markets and such, but those are more tech demos than any serious effort to upgrade the broadcast resolution. The hardware companies are banking on the fact that you'll want your 60" TV to have a better resolution than your 6" phone, and phone resolutions continue to climb. They're banking on content being created at that resolution and streamed via IP instead of broadcast (Netflix, Amazon Prime, Hulu Plus, etc.). They're banking on disc formats that support that resolution, like 4k Blu-Ray.

 

The thing is, there are issues with all of these. With IP streaming, 4k is supported by the new HEVC compression. But unless you have a really great connection you're still going to have to crank up the compression and lose quality (kind of like sliding the quality slider when you save a JPG, if you're familiar with that). So you're probably better off using HEVC on a 1080p signal with super high quality and minimal loss. Likewise, I'm not sure the mass public ever really went from DVD to Blu-Ray. I really don't see them adopting 4k Blu-Ray en masse so soon (if ever, with streaming now). It also hurts that the current gen consoles don't have 4k Blu-Ray drives. Specifically, they lack hardware support for HEVC, as the 4k Blu-Ray and HEVC standards were finalized too late in their development. Now maybe they can do a software solution where they use the main CPU to do whatever the GPU doesn't have the hardware to accelerate, but that means it will be hot, power hungry, and possibly loud, so I'm not sure they'll bother. So that rules out streaming, discs, and broadcasts... which makes it pretty unlikely there will be a mass switch to 4k for TV.

 

So since consumers are unlikely to CHOOSE 4k, the hardware companies are just going to stop making 1080p TVs with their latest tech. 1080p will be for budget devices, but if you want a TV with the latest display technology (OLED for example) or the best network services and such, you'll probably have to get a 4k one whether you like it or not. Then maybe, if enough people eventually have 4k TVs with little or no content, they'll push content developers to provide content at 4k.


That would be part of my point. My initial response to you was to your "900p vs 1080p is not substantial" statement, made as if YOU were the defining authority on what is or is not substantial. If it's relative, as you now claim (and I actually agree), then you can't claim something is not substantial to someone else. If it's not relative, as you originally implied, then I was curious what you, as the apparent authority on the subject, had decided was the threshold required for something to be substantial.

 

Personally indeed. The difference may be barely tangible TO YOU but since you now admit that's relative it may not be to others. If it doesn't matter to you then why even bother reading this thread? This thread is for the people who think it is tangible.

 

I'm adamantly ignoring that issue because it's relative/subjective. I'm not so arrogant as to say someone is wrong if they say something is substantial, because I understand that, no matter if it is substantial to me or not, it may very well be substantial to them. There is no point arguing over something that's relative/subjective.

 

So what? If you agree it's relative by saying "most people", then some clearly disagree. If I set up a thread on a message board for those who disagree to gather, then what good is it for you to come into that thread and tell them they are wrong because you don't share their opinion? Just because most people have one opinion doesn't invalidate all others. Most people don't read this thread, yet here we are.

 

I don't see how you were trying to give any "working standard". That's what you keep saying, but I tried to get a definition from you and all I could get was that YOU can't tell the difference. What YOU can personally perceive is not a good "working standard". What you did was tell someone else their opinion was wrong and try to push your own opinion as a so-called "working standard". Well, your opinion isn't any more (or less) valuable than the one you dismissed.

 

Well if there is, I haven't heard about it from you yet in this thread. The only things I've seen you offer are your OPINION on what is substantial, the fact that YOU PERSONALLY can't perceive the difference, and a completely incorrect understanding of where TV resolutions came from.

 

This is a horrible, horrible misrepresentation of what I was saying.

 

It ignores everything to do with the science and study of resolutions, viewing distances and screen sizes. On top of that, you're pinning it as me somehow saying 'I don't see a difference, therefore there is no difference', which, as I've already stated and you've even quoted, is not what I said. Let me quote myself again for good measure:

 

 

There is a non-subjective way to judge whether or not the difference in resolution, visually, is significant without using a case-by-case judgement of individuals. And there are tens upon tens upon tens of articles out there that talk about resolutions and what, when and how different resolutions matter.

 

This does not exclude me. What I do and don't see is not even a part of my argument. What I am going off of is the history of information provided to those who watch televisions all the time. Things that not only help us buy the right TV for our space, but tell us the advantages and disadvantages of the alternatives. Not to mention the numerous developer commentary about how little 900p vs 1080p actually means visually (while the performance gains are very much worth it... but then again they're evil and liars, right?).

 

Leaving out the fact that game companies have pushed the 1080p bandwagon down our throats as far as they can manage (because that's truly and completely irrelevant to this discussion), I still don't see where people in here get the ability to declare this generation the biggest gap in performance ever in consoles. We knew the differences between the PS3 and 360 at the same point in their life cycles, yet people still adamantly declared its superiority despite the contrary evidence. And now here, roles reversed, there's no leeway at all for Microsoft. It's quite funny how this works out, right?

 

I'm not saying that there's no difference. But the tangibility of the difference between 900p and 1080p (aka the average joe looking at a screen and noticing the difference without looking for it) is simply not there outside of edge cases. And for those of you claiming to notice it, that's all well and fine. I'm sorry it's a problem. But just because you can see the difference does not immediately make it objectively significant. Just as me saying I can't see it does not make it not there.

 

Yes, I am aware the above is a circular point, and both sides will use it to their advantage by saying 'you can't claim that cause it's an opinion and subjective!!!'. But that's called a red herring. It's not part of the discussion, nor does it prove anything either way. So if it doesn't further the discussion, what does that make it? That's right... POINTLESS

 

This argument for 1080p being the only accepted resolution is essentially, "1 person out of 100 noticed the difference and says it matters to them, YOU CAN'T SAY IT'S NOT A BIG PROBLEM NOW!".

 

Yes, let's just ignore the rest of the room who may not have seen it or don't even care either way. They don't matter cause they don't support our argument  :rolleyes:

 

The whole point of trying to find a standard, as I put it, for us all to be on the same page was to erase our subjective and anecdotal evidence from the discussion and go by what the relevant industry says about the topic. But you would have none of that. Tens upon tens of articles talking about how pixels work on different televisions and viewing distances, but no: 40% is a big number and these few guys over here say it matters, so no matter what you bring to the table it still matters!

 

And please tell me where I've made myself out to be an authority? And I'd love to know why I must be an authority to have an argument. Ever heard of 'Argumentum ad verecundiam'?


Back on topic, please.

 

The both of you are going around in circles arguing semantics at this stage.

 

The topic will remain open for those interested in it. Shutting down people's opinions won't be tolerated in here or any topic of the GH. Regardless of which side of the discussion you fall on, at least make it constructive. As always, those posting off topic nonsense will have their posts removed and warned if they continue to ignore the rules set out in OP.

 

Do not reignite the discussion I have just cleaned.


Do people really expect 4k gaming to happen and become the norm soon? I just don't see it. It's not really a matter of the hardware, it's more about the TVs. It's going to take at least three years IMO, and that's a best case, for people to own enough 4k TVs in their homes for game makers to aim for that as the de facto goal. For now we're going to stay with 1080p.

4K gaming is starting to become attainable on high-end PCs (I'm able to run some games at 3620x2263


 

 

The performance gap this generation is objectively, factually the largest there has been in any generation. We're talking about games on the XB1 running at half the resolution and/or half the framerate. As I pointed out, the difference in performance in GTA4 between the PS3 and X360


I'd be inclined to say it is probably the most important factor. It's even greater than exclusivity when you consider there are more multi-platform AAA games than there are exclusives. If I am going to be buying a lot of multi-platform games, or can only afford one console or the other, I am going to want to know which console they are going to run better on and purchase that one.

I was under the impression that people preferred "exclusive unique games" to AAA multi-plats, or is that not the case anymore?

If graphics were the most important factor, why did so many buy the PS3 last generation? It was clearly "inferior hardware" connected to a "subpar online service" running "inferior software".


Playing 4k on high end PCs is one thing, I'm talking about it becoming the norm like 1080p is today. That's not going to happen till the majority has 4k TVs in their homes and 4k monitors on their desks. Like I said, it's not about the hardware in the box, it's about the screen in front of people.

 

At best, if prices are right, we could see 4k become the majority screen in 3-4 years. Otherwise we'll have 1080p sticking around for longer. Those playing 4k games on a PC are the minority at this point in time. Heck I can't even find a good deal on a 1440p monitor in my market, they're all overpriced.


I was under the impression that people preferred "exclusive unique games" to AAA multi-plats, or is that not the case anymore?

If graphics were the most important factor, why did so many buy the PS3 last generation? It was clearly "inferior hardware" connected to a "subpar online service" running "inferior software".

 

 

You might be right. I just rarely have anyone ask me "Hey, which console has the better exclusives?" I always get the generic "which one is better?" question. I interpret better to mean performance wise. If someone asked me which current gen console was performing better, I could only offer one answer at this point.

 

PS3 sales could be because of a variety of factors: existing brand loyalty and familiarity (remember, the PS2 is the biggest-selling system of all time), initial backwards compatibility, Linux support (although I think this would have been a niche), the fact that it launched a year after the 360, which might have given consumers the impression it was 'better' in some vague way, and the Blu-Ray player. I hardly think the PS3 was particularly inferior. The only issue I remember was that some games didn't run as well on the PS3 as they did on the 360. Bayonetta comes to mind as a notoriously bad multiplatform game which ran poorly on the PS3.


Some science on TV screen resolution, optimal seating distances, and when detail starts to be lost:

 

http://www.shawndubravac.com/2013/02/what-is-the-point-of-diminishing-returns-for-tv-screen-sizes/

 

Just goes to illustrate 4K is sort of pointless in many respects (unless you feel like sitting < 4' from your 55" TV). This is also a good demonstration of how much of a difference a resolution can make, and how that difference can easily be nullified by normal seating arrangements in the home. Even at 4x the resolution, you'd probably not notice the difference between 4K and 1080p when sitting greater than 6' from a 55" screen.
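If you want to ballpark this yourself, here's a rough Python sketch of the usual rule of thumb those articles use (20/20 vision resolves roughly 1 arcminute per pixel). The numbers are my own back-of-the-envelope math, not taken from the linked pieces:

```python
import math

# Distance beyond which individual pixels can no longer be resolved by
# ~20/20 vision (about 1 arcminute per pixel) on a 16:9 screen.
# Back-of-the-envelope sketch only.
def max_useful_distance_ft(diagonal_in, vertical_pixels):
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # screen height in inches
    pixel_in = height_in / vertical_pixels            # size of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12       # convert inches to feet

print(round(max_useful_distance_ft(55, 1080), 1))  # ~7.2 ft for 1080p on a 55" screen
print(round(max_useful_distance_ft(55, 2160), 1))  # ~3.6 ft for 4K on the same screen
```

Which lines up with the "sit closer than about 4 feet to a 55" 4K set" point above.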

 

http://gizmodo.com/5280355/guess-what-many-of-you-wasted-money-on-your-1080p-tv-but-theres-hope

 

A little more. This should hopefully help some people gauge whether or not the 900p vs 1080p issue is really going to be a problem.

 

http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance


This topic is now closed to further replies.