PS4 and Xbox One resolution / frame rate discussion


Recommended Posts

Some science on TV Screen Resolution optimal seating distances and when detail starts to be lost.

 

http://www.shawndubravac.com/2013/02/what-is-the-point-of-diminishing-returns-for-tv-screen-sizes/

 

Just goes to illustrate 4K is sort of pointless in many respects (unless you feel like sitting < 4' from your 55" TV). This is also a good demonstration of how much of a difference a resolution can make, and how that difference can easily be nullified by normal seating arrangements in the home. Even at 4x the resolution, you'd probably not notice the difference between 4K and 1080p when sitting greater than 6' from a 55" screen.
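The distance claims above follow from simple visual-acuity geometry: 20/20 vision resolves roughly one arcminute, so once a single pixel subtends less than that, extra resolution is wasted. A quick sketch of that calculation (assuming a 16:9 panel and the 1-arcminute rule of thumb):

```python
import math

def acuity_limit_distance(diag_in, horiz_px, arcmin=1.0):
    """Distance (inches) at which one pixel of a 16:9 screen subtends
    `arcmin` arcminutes -- roughly the limit of 20/20 vision. Beyond
    this distance individual pixels can no longer be resolved."""
    width_in = diag_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    pixel_in = width_in / horiz_px                # width of one pixel
    return pixel_in / math.tan(math.radians(arcmin / 60))

for label, px in (("1080p", 1920), ("4K", 3840)):
    d_ft = acuity_limit_distance(55, px) / 12
    print(f'{label} on a 55" screen: pixels blend beyond ~{d_ft:.1f} ft')
```

This lands close to the figures quoted above: roughly 7 ft for 1080p and under 4 ft for 4K on a 55" set.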

 

http://gizmodo.com/5280355/guess-what-many-of-you-wasted-money-on-your-1080p-tv-but-theres-hope

 

A little more. This should hopefully help some people gauge whether or not the 900p vs 1080p issue is really going to be a problem.

 

http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance
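For concreteness, here is the raw pixel math behind the 900p vs 1080p debate (assuming 900p means the common 1600x900 render resolution):

```python
# Pixel counts behind the "900p vs 1080p" debate.
p900 = 1600 * 900     # 1,440,000 pixels
p1080 = 1920 * 1080   # 2,073,600 pixels
print(f"1080p renders {p1080 / p900:.2f}x the pixels of 900p")  # 1.44x
```

A 44% pixel-count gap is real, but as the links above argue, whether your eyes can resolve it depends on screen size and seating distance.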

 

I think you are comparing apples and oranges, Emn1ty.

 

Gaming can't be compared to general viewing. You are right that if I showed groups of people a video in 900p and a video in 1080p they mightn't be able to distinguish the difference, especially past a certain distance, but gaming isn't like passive viewing. The quality and the frequency of the image displayed and refreshed affect the gamer's performance.

 

None of this is to say you can't have a grand old time at some of these lower resolutions/Hz; I'd argue the quality of the game and the people you play it with are more important than any tech specs, but some things are objectively better. 144Hz monitors exist specifically because of gamers. Higher refresh rates are incredibly important to the hardcore gamer, especially for FPS games.
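To put numbers on the refresh-rate point, the interval between displayed frames shrinks quickly as Hz climbs (simple arithmetic, nothing more):

```python
# Time between displayed frames at common refresh rates.
frame_ms = {hz: 1000 / hz for hz in (30, 60, 120, 144)}
for hz, ms in frame_ms.items():
    print(f"{hz:>3} Hz -> {ms:.2f} ms per frame")
```

Going from 60Hz to 144Hz cuts the interval from about 16.7ms to about 6.9ms, which is the kind of difference competitive FPS players chase.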

 

This stuff mightn't matter to you, and to some degree I agree with you about that. But it isn't entirely a placebo effect. There are objective reasons why some of these tech advancements are better for the gamer.

Link to comment
Share on other sites

Some science on TV Screen Resolution optimal seating distances and when detail starts to be lost.

 

http://www.shawndubravac.com/2013/02/what-is-the-point-of-diminishing-returns-for-tv-screen-sizes/

 

Just goes to illustrate 4K is sort of pointless in many respects (unless you feel like sitting < 4' from your 55" TV). This is also a good demonstration of how much of a difference a resolution can make, and how that difference can easily be nullified by normal seating arrangements in the home. Even at 4x the resolution, you'd probably not notice the difference between 4K and 1080p when sitting greater than 6' from a 55" screen.

 

http://gizmodo.com/5280355/guess-what-many-of-you-wasted-money-on-your-1080p-tv-but-theres-hope

 

A little more. This should hopefully help some people gauge whether or not the 900p vs 1080p issue is really going to be a problem.

 

http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance

4K isn't just about resolution but about image quality and colour depth. Netflix considers HDR to be more significant than the increased resolution. I've been to numerous stores with 4K and OLED displays on the floor and you can spot the difference immediately, not simply close up.

 

As for the links you provide, they only demonstrate how inaccurate such claims are. Each company has its own optimum viewing distance, varying by metres. All I know is that I CAN see the difference between 720p and 1080p.


Your market, UK, is better than mine when it comes to anything over 1080p. The best I can do is a 27" 1440p for


I was under the impression that people preferred "exclusive unique games" to AAA multi-plats, or is that not the case anymore?

If graphics were the most important factor, why did so many buy the PS3 last generation? It was clearly "inferior hardware" connected to a "subpar online service" running "inferior software".

 

I'm not sure exclusive unique games were ever the driving factor.  It's something that's highlighted because it's a clear and obvious difference, and it's certainly pushed by the console market, but most games most gamers play are multiplatform.  Here's a list of the top 10 games in the U.S. for December as an example (from NPD):

  • Call of Duty: Advanced Warfare (Xbox 360, Xbox One, PS4, PS3, PC)
  • Grand Theft Auto V (Xbox One, PS4, Xbox 360, PS3)
  • Madden NFL 15 (Xbox 360, Xbox One, PS4, PS3)
  • Super Smash Bros. (Wii U, 3DS)
  • NBA 2K15 (Xbox 360, Xbox One, PS4, PS3, PC)
  • Minecraft (Xbox 360, Xbox One, PS3, PS4)
  • Far Cry 4 (PS4, Xbox One, Xbox 360, PS3, PC)
  • Just Dance 2015 (Wii, Xbox 360, Wii U, Xbox One, PS4, PS3)
  • Destiny (Xbox One, Xbox 360, PS4, PS3)
  • FIFA 15 (Xbox 360, Xbox One, PS4, PS3, Wii, 3DS, Vita)

There is not one Xbox or PlayStation exclusive in that list.  The only exclusive in there at all is the Nintendo-exclusive Super Smash Bros.  This is just one month, but if you look at other months the results are similar.  Sure, you may be able to cherry-pick a particular month, especially if you focus on the launch month of a marquee exclusive (like Halo 5, which will almost certainly make the list the month it launches), but overall most games most gamers play are multi-platform.

 

If you just pay attention to articles, though, you'd think exclusives were HUGE.  There was a lot of talk going into Christmas about how the Xbox had a stronger exclusives lineup and how Sunset Overdrive was going to be big, etc.  Yet none of those exclusives cracked the top 10.

 

As for the PS3, it didn't have inferior hardware, it was just MUCH more difficult to program for.  It lost big in the beginning because it was WAY too expensive as a result of including the (at the time) very expensive Blu-ray drive, plus it came out about a year later.  Because it was so far behind (both in time and install base) in the beginning, most devs used the more popular system for their development work and then ported multi-platform games to the PS3.  Getting Xbox 360 ports hurt, because to tap the true potential of the PS3 you had to design for its strengths, which were different from the 360's.  It had a steep learning curve that many multi-platform developers just didn't bother trying to tackle; it just wasn't worth the effort to them.  The PS3 never caught up in the U.S., so U.S. developers never really had any reason to change that. Before the current gen launched, many assumed the breakdown would be at least similar to last gen, with the U.S. going heavily Xbox One and the rest of the world going PS4. This is why it's so amazing that the PS4 outsold the Xbox One in the U.S. for so long.

 

Most of the complaints around the PS3 centered on its cost and difficulty to program for, not its hardware performance.  The one big exception is that most devs seem to agree it was a mistake to split the RAM into 256MB system and 256MB graphics (the GPU could also use system RAM) instead of using a unified bank of 512MB and letting the devs make the split as needed. (Something Sony corrected with the PS4's unified RAM.)

 

Xbox 360 did dominate in online services and I'm not sure what you mean by software (the OS?)

 

In the last gen I'd say launch date and price were MUCH larger factors in the Xbox 360 winning the U.S. than either exclusives or graphics.  The Xbox 360 was on the market for a year before the PS3 launched and was cheaper; those are HUGE advantages.  Likewise, this generation price is probably the number one factor: now that the Xbox One is selling for $350 vs. the PS4's $400, it's outselling the PS4 in the U.S.  Launch dates were effectively the same this time, so that didn't matter, but the Xbox One costing more at launch hurt it.  Make the prices the same, though, and the PS4 will outsell the Xbox One due to the better graphics; the difference just isn't enough for most buyers to pay an extra $50 for the PS4, apparently, and exclusives aren't really a factor.


At best, if prices are right, we could see 4k become the majority screen in 3-4 years. Otherwise we'll have 1080p sticking around for longer. Those playing 4k games on a PC are the minority at this point in time. Heck I can't even find a good deal on a 1440p monitor in my market, they're all overpriced.

I think even 3-4 years is very optimistic, as the content isn't going to be there. The consoles can't play 4K games (heck, they can hardly hit 1080p@60fps), and broadcast television isn't going above 1080i any time soon. So what are you going to use your 4K set for?

Most 4K content will be delivered via streaming, but like 4K gamers, people who watch TV via streaming services are a minority. 4K adoption by the masses will likely go hand in hand with the gradual replacement of broadcast television by streaming services, and while I agree that does appear to be the future, and a lot of us techie types on Neowin might already embrace it, saying it will be mainstream with the general public in just 3-4 years seems highly unlikely to me.

The only other major source of 4K content that comes to mind is discs. I just don't see the majority of the public replacing their DVD and existing Blu-ray players with new 4K Blu-ray players. Heck, people were already saying Blu-ray was doomed because of streaming; I don't see how another physical disc format is going to drive content in the future... especially this soon, in just 3-4 years.

I'd bet a majority of people will be watching mostly 1080p content for many years to come. The cool thing about 4K to me is HEVC. While it enables 4K video, you can also encode 1080p with it at about half the bandwidth. That means more people with slower connections will be able to see better-quality 1080p streams, and that's cool.
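A rough sketch of the bandwidth point (the bitrates here are assumed ballpark figures for illustration, not measurements):

```python
# Assumed ballpark bitrate for a decent 1080p H.264 stream.
h264_1080p_mbps = 8.0
# HEVC needs roughly half the bitrate for comparable quality.
hevc_1080p_mbps = h264_1080p_mbps / 2

def gb_per_hour(mbps):
    """Convert a stream bitrate (Mbit/s) to data used per hour (GB)."""
    return mbps / 8 * 3600 / 1000

print(f"H.264: {gb_per_hour(h264_1080p_mbps):.1f} GB/h, "
      f"HEVC: {gb_per_hour(hevc_1080p_mbps):.1f} GB/h")
```

Halving the bitrate matters most on capped or slow connections, which is exactly the audience the post is describing.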


I think you are comparing apples and oranges, Emn1ty.

 

Gaming can't be compared to general viewing. You are right that if I showed groups of people a video in 900p and a video in 1080p they mightn't be able to distinguish the difference, especially past a certain distance, but gaming isn't like passive viewing. The quality and the frequency of the image displayed and refreshed affect the gamer's performance.

 

None of this is to say you can't have a grand old time at some of these lower resolutions/Hz; I'd argue the quality of the game and the people you play it with are more important than any tech specs, but some things are objectively better. 144Hz monitors exist specifically because of gamers. Higher refresh rates are incredibly important to the hardcore gamer, especially for FPS games.

 

This stuff mightn't matter to you, and to some degree I agree with you about that. But it isn't entirely a placebo effect. There are objective reasons why some of these tech advancements are better for the gamer.

 

What does what I posted have to do with refresh rates and frames per second? Let alone motion across the screen? That's an entirely different subject matter and does not apply to resolution in and of itself. Second, it is an objective fact that humans have a limit on the level of detail they can see. While refresh rate and fps are a huge part of how an image moves across the screen, I don't think resolution plays into that at all.

 

We can talk from the perspective of computer monitors all day, but consoles are not designed to be used on computer monitors. They are designed for televisions. Sure, they can be used on computer monitors but I think it's safe to say that most people aren't going to be using them there (there are people who will but they should be understanding of the differences).

 

Again, this is just some information for people to decide for themselves whether resolution matters. Some of it comes straight from the guy who developed the eye exams we all use to get our vision measured. But I suppose even that is just not good enough for so-called gamers.


What does what I posted have to do with refresh rates and frames per second? Let alone motion across the screen? That's an entirely different subject matter and does not apply to resolution in and of itself. Second, it is an objective fact that humans have a limit on the level of detail they can see. While refresh rate and fps are a huge part of how an image moves across the screen, I don't think resolution plays into that at all.

 

We can talk from the perspective of computer monitors all day, but consoles are not designed to be used on computer monitors. They are designed for televisions. Sure, they can be used on computer monitors but I think it's safe to say that most people aren't going to be using them there (there are people who will but they should be understanding of the differences).

 

Again, this is just some information for people to decide for themselves whether resolution matters. Some of it comes straight from the guy who developed the eye exams we all use to get our vision measured. But I suppose even that is just not good enough for so-called gamers.

 

 

 

So what is your point, exactly? Consumers just shouldn't be so picky, and should accept objectively lower quality on one system?

 

How dare consumers be informed and make decisions based on the performance of the device.


So what is your point, exactly? Consumers just shouldn't be so picky, and should accept objectively lower quality on one system?

 

How dare consumers be informed and make decisions based on the performance of the device.

 

People can buy for whatever reason they want. I'd rather give them all the facts than just one side of the argument.


[Benchmark chart: Star Swarm performance under DX11, DX12 and Mantle, from the AnandTech DirectX 12 preview]

 

Source:

 

http://anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3

 

 

Anyone got any opinions on if this is going to be a game changer?

With respect to the Xbox One and the PS4, no.

The consoles already have low-level APIs. The PS4 has one called GNM, and the Xbox One added a competing one a while back that does much of what DX12 does; it's just completely tailored to the Xbox One GPU instead of being a more general API like mainline DirectX.

What DirectX 12 does is bring a low-level, console-like API to the desktop in a general way that can apply to different cards from different vendors. It also reduces power consumption on mobile, because the latest and greatest cell phones now support DX11-level (and soon DX12-level) graphics. This helps them conserve power and extend battery life.

So for PC and mobile it will be a game changer; for consoles... not so much (not saying there won't be SOME improvement, just not what I'd call "game changing"). Of course, for mobile that will depend on how Windows 10 phones actually sell. If they are still < 10% of the market (especially on the high end, as budget phones aren't going to have DX12-capable GPUs) then it's not going to make much difference to mobile either.


I'd say it's too early to say what changes it'll bring for the Xbox One. While the version of DX11 on the Xbox One is less general/high-level compared to its Windows counterpart, that still doesn't mean it's the same as what they've done in DirectX 12. API aside, the update to the Xbox One that brings v12 will also bring an OS update, which brings with it an improved kernel and driver model. The Xbox One still runs Windows, and even if it's not the full desktop version, much is the same.


I'd say it's too early to say what changes it'll bring for the Xbox One. While the version of DX11 on the Xbox One is less general/high-level compared to its Windows counterpart, that still doesn't mean it's the same as what they've done in DirectX 12. API aside, the update to the Xbox One that brings v12 will also bring an OS update, which brings with it an improved kernel and driver model. The Xbox One still runs Windows, and even if it's not the full desktop version, much is the same.

Xbox head Phil Spencer has stated it won't be a massive difference:

Source


"Microsoft promises similar "console-like" efficiency for DirectX 12"

 

Microsoft are saying that DirectX 12 will give the desktop PC a boost, with the graphics API approaching the efficiency of a console. Phil Spencer (head of the Xbox division) has said various times that DirectX 12 won't make much of a difference on the Xbox One.

 

This is one of the benefits of consoles: the graphics APIs were designed specifically for the hardware used and allowed low-level access, which meant a console could run games better than PCs with the same or slightly higher specs. DirectX 12 aims to slightly improve the desktop's graphics APIs.

 

I do find it interesting that Mantle performs better than DirectX 12, though. So AMD has developed a better graphics API than the so-called software king (MS), and they have only been in the graphics industry since 2006, which is when they acquired ATI Technologies.


"Microsoft promises similar "console-like" efficiency for DirectX 12"

 

Microsoft are saying that DirectX 12 will give the desktop PC a boost, with the graphics API approaching the efficiency of a console. Phil Spencer (head of the Xbox division) has said various times that DirectX 12 won't make much of a difference on the Xbox One.

 

This is one of the benefits of consoles: the graphics APIs were designed specifically for the hardware used and allowed low-level access, which meant a console could run games better than PCs with the same or slightly higher specs. DirectX 12 aims to slightly improve the desktop's graphics APIs.

 

I do find it interesting that Mantle performs better than DirectX 12, though. So AMD has developed a better graphics API than the so-called software king (MS), and they have only been in the graphics industry since 2006, which is when they acquired ATI Technologies.

 

I checked the recent benchmarks of DX11, DX12 and Mantle... there is no big difference between Mantle and DX12.

 

Doubtful that Mantle will survive.


I checked the recent benchmarks of DX11, DX12 and Mantle... there is no big difference between Mantle and DX12.

 

Doubtful that Mantle will survive.

Agreed; only a very few games support Mantle. DX12 supports a far wider variety of hardware, while Mantle ATM only supports the latest R7/R8/R9 cards.


This is one of the benefits of consoles: the graphics APIs were designed specifically for the hardware used and allowed low-level access, which meant a console could run games better than PCs with the same or slightly higher specs. DirectX 12 aims to slightly improve the desktop's graphics APIs.

 

I do find it interesting that Mantle performs better than DirectX 12, though. So AMD has developed a better graphics API than the so-called software king (MS), and they have only been in the graphics industry since 2006, which is when they acquired ATI Technologies.

 

Did you even read the article?

Mantle only has to support AMD, whereas DX12 has cross-vendor support, and in the article they actually praise MS for getting performance close. It also promises more than a *slight* improvement for desktop APIs. Did you really read the article?


I do find it interesting that Mantle performs better than DirectX 12, though. So AMD has developed a better graphics API than the so-called software king (MS), and they have only been in the graphics industry since 2006, which is when they acquired ATI Technologies.

 

I find that interesting as well.  Keep in mind, though, that the Mantle drivers are more mature and DX12 is still in active development.

That said, if the release version of DX12 is still slower than Mantle, that's pretty bad.  Despite what some may say, Mantle isn't designed to be exclusive to AMD hardware (from a technical standpoint).  It is written so that nVidia COULD make Mantle drivers for their cards as well (though that will never happen for business reasons, not technical ones).  I believe Intel was even looking into making Mantle drivers at one point.  Mantle is AMD-only just because it's made by AMD, and other companies aren't going to support an API that is controlled by their competition.

 

On the other hand, Microsoft is changing the driver model (WDDM 2.0) and even altering the Windows kernel for DX12... AMD can't do that, yet Mantle outperforms DX12 (currently).  Again, for the time being I'd just chalk that up to DX12 and the supporting drivers being in an immature, pre-release state compared to Mantle, but I agree it's pretty bad if the situation remains at release.


Did you even read the article?

Mantle only has to support AMD, whereas DX12 has cross-vendor support, and in the article they actually praise MS for getting performance close. It also promises more than a *slight* improvement for desktop APIs. Did you really read the article?

Yeah, I saw that too. I have been reading a ton of praise for DX12 from developers, about the dramatic improvements it will bring to games on console and PC. Amazing, considering it has cross-vendor support and is still able to be that efficient. There is nothing else like it.


they have only been in the graphics industry since 2006 which is when they acquired ATI Technologies.

That doesn't make any sense. AMD didn't just buy the ATI brand. They bought *everything*, which includes whatever experience ATI had in graphics.

I find that interesting as well.  Keep in mind, though, that the Mantle drivers are more mature and DX12 is still in active development.

That said, if the release version of DX12 is still slower than Mantle, that's pretty bad.  Despite what some may say, Mantle isn't designed to be exclusive to AMD hardware (from a technical standpoint).  It is written so that nVidia COULD make Mantle drivers for their cards as well (though that will never happen for business reasons, not technical ones).  I believe Intel was even looking into making Mantle drivers at one point.  Mantle is AMD-only just because it's made by AMD, and other companies aren't going to support an API that is controlled by their competition.

 

On the other hand, Microsoft is changing the driver model (WDDM 2.0) and even altering the Windows kernel for DX12... AMD can't do that, yet Mantle outperforms DX12 (currently).  Again, for the time being I'd just chalk that up to DX12 and the supporting drivers being in an immature, pre-release state compared to Mantle, but I agree it's pretty bad if the situation remains at release.

DX12 almost matches Mantle in an alpha state with alpha drivers. I think that is commendable; I'm not sure why you or GotBored need to question Microsoft's software skills. You should read the source article and the general impressions around the web.

On topic: if more games go DX12 (and I hope they do), this will be a game changer for both consoles (as long as Sony keeps up with the DX API). It will be "easier" to port games in both directions.


DX12 almost matches Mantle in an alpha state with alpha drivers. I think that is commendable; I'm not sure why you or GotBored need to question Microsoft's software skills. You should read the source article and the general impressions around the web.

Your alpha point is the same point I was making when I said the API and drivers were immature; I really don't see the difference. I also don't see what I said as questioning MS's software skills; that certainly wasn't my intent.

On topic: if more games go DX12 (and I hope they do), this will be a game changer for both consoles (as long as Sony keeps up with the DX API). It will be "easier" to port games in both directions.

I'm sure more games will go DX12; I don't really see that as being in question. I don't know what you mean by "as long as Sony keeps with the DX API". Sony doesn't use the DirectX API at all on the PlayStation; they have their own APIs tailored to their hardware. One of them is already a low-level, close-to-the-metal API like the one DX12 is intended to provide. It's normal for devs to have low-level access to console hardware. The way it's done is typically specific to the particular console, though, and what DX12 provides is a common API offering similar low-level access in a way that works with different hardware, even hardware made by completely different companies.

Your alpha point is the same point I was making when I said the API and drivers were immature; I really don't see the difference. I also don't see what I said as questioning MS's software skills; that certainly wasn't my intent.

I'm sure more games will go DX12; I don't really see that as being in question. I don't know what you mean by "as long as Sony keeps with the DX API". Sony doesn't use the DirectX API at all on the PlayStation; they have their own APIs tailored to their hardware. One of them is already a low-level, close-to-the-metal API like the one DX12 is intended to provide. It's normal for devs to have low-level access to console hardware. The way it's done is typically specific to the particular console, though, and what DX12 provides is a common API offering similar low-level access in a way that works with different hardware, even hardware made by completely different companies.

One of the things Sony announced at the PS4 reveal was a DX-like API. I don't remember exactly what they said, but their goal was to make it easier for developers familiar with DX. That's what I meant by keeping up with DX12 (the API).

[Benchmark chart: Star Swarm performance under DX11, DX12 and Mantle, from the AnandTech DirectX 12 preview]

 

Source:

 

http://anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3

 

 

Anyone got any opinions on if this is going to be a game changer?

I wouldn't read anything into a benchmark designed specifically to show off low-level APIs. Anyone can create an artificial benchmark that exacerbates an existing bottleneck, but that doesn't have any bearing on real-world implementations. Until we see games requiring DX12, they will be designed with the limitations of DX11 in mind. Regarding consoles specifically, developers already use low-level APIs, meaning the performance gains from DX12 will be minimal on the XB1. Even if the differences are significant, OpenGL has similar low-level APIs and can be implemented on the PS4, meaning the relative difference between the two platforms will remain similar.

 

You're not going to see anywhere close to the level of performance difference shown in that benchmark.
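One way to see why a draw-call stress test like Star Swarm flatters low-level APIs: the frame becomes CPU-bound on submission overhead. A toy model of that bottleneck (the per-call overheads and CPU budget here are assumed for illustration, not measured from any real driver):

```python
def max_draws_per_frame(per_call_us, fps=60, cpu_budget_fraction=0.5):
    """How many draw calls fit in the CPU time budget of one frame,
    given a fixed per-call submission overhead in microseconds."""
    frame_us = 1_000_000 / fps
    return int(frame_us * cpu_budget_fraction / per_call_us)

# Assumed overheads: a high-overhead DX11-style path vs a
# low-overhead DX12/Mantle-style path.
print(max_draws_per_frame(25.0))  # DX11-ish per-call cost
print(max_draws_per_frame(2.5))   # DX12/Mantle-ish per-call cost
```

A 10x drop in per-call overhead allows roughly 10x the draw calls, which is exactly what a benchmark built around tens of thousands of tiny draws will show; a real game that already batches its draws sees far less benefit.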


One of the things Sony announced at the PS4 reveal was a DX-like API. I don't remember exactly what they said, but their goal was to make it easier for developers familiar with DX. That's what I meant by keeping up with DX12 (the API).

 

Sony has two custom APIs for the PS4.  One is a high-level API, like DX (up until 12) and OpenGL, that's familiar and easier for developers to use.  The other is a low-level API that's down to the metal, which makes it faster but also less familiar, since it's specific to the particular hardware in the PS4, and being so low-level it's more difficult to use (devs have to do manually most of the things the higher-level APIs manage for them).  The manual control gives developers more flexibility, but it's more work.  DX12 and Mantle didn't even exist when Sony made those APIs.  Now MS is introducing a similar set of APIs.  DX has always been the high-level one, and that's even continuing... alongside DX12 comes DX11.3, which doesn't get much hype, but it's still a high-level API for developers who don't want to get their hands dirty close to the metal.  DX12 is an entirely new beast for DX: it provides, for the first time on Windows, a low-level API that gives developers across different hardware access similar to the custom APIs consoles have traditionally had.

 

If you'd like to look into it more, the high-level API for the PS4 is called GNMX.  It's similar in capabilities to OpenGL and DX11.x.  The low-level API is called GNM (I know, it stinks that the difference in API names is only one letter... kind of confusing) and is like Mantle and DX12.  Both shipped with the PlayStation 4 at launch.  There have been a number of interviews where actual developers were asked if DX12 is going to give MS a big advantage on the Xbox One over the PS4, and the developers have said no, because GNM on the PS4 already gives them similar low-level hardware access; it's just proprietary and specific to the PS4 instead of being an API that can be used with different graphics cards across Intel, AMD, nVidia, etc.  That's not new or unusual.  Consoles usually give devs low-level access to the hardware.  The unusual thing is that the Xbox One did NOT ship with a low-level API.  Since launch, though, Microsoft has made HUGE improvements in the XDK, stripping out the things in the API that didn't apply to the Xbox One specifically and adding new lower-level features to improve performance, so it's pretty good now, and it will get better with DX12, along with providing a common API for developers that's not just specific to Xbox One hardware.
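The high-level vs low-level split described above can be sketched as a toy (these classes are hypothetical illustrations of the two styles, not GNM/GNMX or DX12 code):

```python
class HighLevelDraw:
    """GNMX/DX11-style: the runtime tracks and manages state per call."""
    def __init__(self):
        self.pipeline = None

    def draw(self, mesh, pipeline):
        # The API silently rebinds state on every call -- convenient,
        # but the hidden bookkeeping costs CPU time per draw.
        if pipeline != self.pipeline:
            self.pipeline = pipeline  # implicit state change
        return f"drew {mesh} with {pipeline}"

class LowLevelDraw:
    """GNM/DX12-style: the app records explicit commands, then submits once."""
    def __init__(self):
        self.commands = []

    def record(self, command):
        # No per-call validation; the app is responsible for correct ordering.
        self.commands.append(command)

    def submit(self):
        n, self.commands = len(self.commands), []
        return f"submitted {n} commands in one batch"
```

The trade-off is the one the post describes: the high-level style is easier to use, while the low-level style moves the bookkeeping (and the opportunity to optimise it) into the game's own hands.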

 

Microsoft even has a shading language called High Level Shader Language (HLSL) as part of DX, and it's proprietary, but Sony made a similar one for the PS4 called PlayStation Shader Language (PSSL), so developers have similar capabilities (OpenGL has GLSL).  Again, they had it from launch.  Since GNM, GNMX, and PSSL are all Sony's own APIs that they control, they can tweak and add to them as they like if devs need something the hardware supports that isn't already there.  By and large, devs seem to be happy with what's there, unlike the initial reaction to the Xbox API, which a lot felt was too high-level to open up the full potential of the hardware.  Again, though, MS has made HUGE improvements on that front in the XDK updates, so it's not much of an issue anymore. It's hard to get specific about the Sony APIs, though, because they're under NDA. The public doesn't know EXACTLY what's in them, and the devs who have access can't talk about specifics, but their general impression seems to be uniformly good.


This topic is now closed to further replies.