PS4 and Xbox One resolution / frame rate discussion



On cross-platform games, though, where both platforms use the same models, lighting, textures, etc., if the resolution and frame rate are identical then I have to assume the devs slacked on the game (such as ignoring the extra compute units on the PS4) or intentionally held the PS4 back for parity. Again, to me the issue isn't whether they hit 1080p and 60fps; it's whether the developer has taken advantage of the strengths of each platform (like the Dragon Age tweet said).

I agree, but Destiny is an example of a cross-platform game that started with a 1080p/900p difference between the consoles, and then the X1 version got a bump to bring it in line. So for a time, it seemed Bungie was getting the result you would prefer to see, and you could take it as evidence they were getting the most out of the PS4. Then the bump came for the X1 and suddenly it looked like they had slacked on the PS4. I think a big difference is that Bungie never made a big deal about it.

The reason 1080p@60fps is important, though, is that 60Hz is the standard refresh rate of most HDTVs (historically tied to the power frequency in the U.S. and other NTSC countries) and 1920x1080 is their native resolution. So a game running at 1920x1080 and 60fps doesn't require any interpolation or scaling; it runs completely native on most displays, which makes it a nice IDEAL to strive for, but it's not the end of the world if a game misses it. We aren't going to hit 4K resolution or 120+ fps on AAA titles this generation, the hardware just can't handle it, so 1080p@60fps makes the most logical target to shoot at.
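To put numbers on that, here's a quick sketch (my own illustration, not from the post above) of why any render resolution other than the panel's native one forces the scaler to interpolate:

```python
# A minimal sketch: the scale factor from render resolution to a native
# 1920x1080 panel. A factor of exactly 1.0 maps 1:1 to the panel with no
# filtering; anything else means blending neighbouring pixels.

NATIVE = (1920, 1080)

for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    sx, sy = NATIVE[0] / w, NATIVE[1] / h
    note = "native, no scaling" if (sx, sy) == (1.0, 1.0) else f"needs {sx:.2f}x upscale"
    print(f"{w}x{h}: {note}")
```

The 1.2x factor for 900p is the awkward one: it's not a clean integer, so every output pixel is a blend of its neighbours, which is where the softness comes from.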

I know why people want it; that doesn't change the reality.

The new consoles can't offer a steady stream of 1080p/60 games. The ps4 can do more, but it can't promise that.


The Evil Within

 

Resolution is the first order of business: the developer opts for an extra-wide aspect ratio, presumably to reduce the game's rendering load, though HUD elements are displayed in this region. Based on what's rendered inside the letterbox, however, we're left with a 1920x768 resolution on PlayStation 4 and a meagre 1600x640 on Xbox One. In effect, it's the now-familiar 1080p vs 900p set-up, but the intrusive borders cut the actual rendering resolution down significantly. Only 71 per cent of the screen's real estate is actually used for gameplay, and the aspect ratio utilised is actually a higher 2.5:1 rather than the 'cinematic' 2.35:1.
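For reference, the article's figures check out; a quick bit of arithmetic (mine, not Digital Foundry's) on the reported framebuffer sizes:

```python
# Pixels actually rendered inside the letterbox versus a full 1080p frame,
# plus the resulting aspect ratios.

full_1080p = 1920 * 1080
ps4 = 1920 * 768
xb1 = 1600 * 640

print(f"PS4 letterbox: {ps4:,} px rendered, aspect {1920/768:.2f}:1")
print(f"XB1 letterbox: {xb1:,} px rendered, aspect {1600/640:.2f}:1")
print(f"Rows used for gameplay on a 1080-line panel: {768/1080:.0%}")
```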

 

However, the main concern here is frame-rate - and first impressions are not great. The very first scene in which the player is given control is an appropriately foreboding moment, not just in terms of horror but also performance. Neither version manages to run consistently and the narrow field of view combined with jittery camera motion only serve to exacerbate the issue. The closest analogue would have to be the original console release of Resident Evil 6, where the camera feels much too close to the player while the frame-rate regularly dips below 30fps.

How far does it go? Unfortunately nearly every major sequence, particularly outdoors, is fraught with dips that interrupt the action, resulting in a jarring, jerky experience. Scenes not unlike Resident Evil 4's seminal village sequence see dips into the low 20s. The tight camera work and unsteady frame-rate lead to some incredibly off-putting moments. Thankfully, when the game does return to more enclosed tunnels the frame-rate tends to jump back to a more steady 30fps but, more often than not, the experience feels choppy and inconsistent.

What's worse, on Xbox One it almost feels as if the renderer is out of sync with the game simulation. These particular issues don't appear in the performance metrics, yet the problem is very much present. As a result, even when the game is rendering at 30fps, it sometimes feels worse than it should; comparing matched Xbox One and PlayStation 4 clips (available via the source article) makes this clear. Both versions render at a locked 30fps, but something's clearly very wrong with the Xbox One build. The numbers alone suggest that both versions produce similar performance metrics, with some scenes even operating a touch smoother on Xbox One, but in practice the Microsoft version feels worse. Ultimately, neither version feels particularly smooth during normal gameplay.
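A small aside on why the numbers can look fine while the game feels bad: an fps counter reports the average, but what you feel is frame pacing, the spacing between individual frames. A toy illustration (my own, not DF's methodology):

```python
# Two captures can share the same average frame rate while one feels far
# worse, because the spacing between frames differs even when the fps
# counter does not.

def stats(frame_times_ms):
    mean = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000 / mean
    worst_jitter = max(abs(t - mean) for t in frame_times_ms)
    return avg_fps, worst_jitter

steady = [33.3] * 60          # evenly paced 30fps
uneven = [16.7, 50.0] * 30    # same 30fps average, badly paced

for name, trace in (("steady", steady), ("uneven", uneven)):
    fps, jitter = stats(trace)
    print(f"{name}: {fps:.1f} fps average, worst pacing error {jitter:.1f} ms")
```

Both traces average 30fps; only the first one feels like it.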

 

 

As it stands, the results aren't looking great for either console, but the PS4 version takes the lead for the moment with its higher resolution and smoother update when running at 30fps. It is odd that certain scenes do operate with a slight advantage on Xbox One, however, and we'll have to play further to see how the game trends in later sections. Anyone sensitive to frame-rate problems is going to struggle with this game and should probably go for the PC version, provided they have the specs to handle it.

If you're worried about buying the PC game based on Bethesda's alarmingly high recommended specs, at least we have some good news there. A 4GB graphics card doesn't seem to be required at all for 1080p gameplay; we've run the game just fine on the mid-range GTX 760, while even the entry-level enthusiast GTX 750 Ti offers console-style frame-rates at 1080p on max settings. As for hitting a consistent 60fps, it's here that the unoptimised nature of the PC port becomes apparent: we'll have more on that in a later update.

 

 

 

Source: http://www.eurogamer.net/articles/digitalfoundry-2014-the-evil-within-performance-analysis


The Evil Within doesn't sound like a good game, though. I don't know first hand, but I never thought about playing it to begin with.


It's actually a really good game so far (I'm on chapter 7). For fans of RE4/Silent Hill and TLoU.

 

vcfan, I wasn't aware of Kinect functionality in The Evil Within. I haven't seen anyone talk about it.


From the Giant Bomb podcast: they received an email from a Ubisoft Unity developer who offered to provide proof, and GB ran it through their reputable programmer source, who says it checks out. It's around 2hr 23mins in:

http://www.giantbomb.com/podcasts/download/1034/Giant_Bombcast_10_14_2014-10-14-2014-1451096026.mp3

- Getting the game to 900p was a bitch.

- The game is so big it took months to even get it to 720p 30fps.

- The game was at 9fps nine months ago.

- 900p 30fps was only achieved a few weeks ago.

- The PS4 couldn't do 1080p 30fps for our game, despite whatever people or Sony and MS might say.

- Yes, we have a deal with MS; yes, we don't want people fighting. But with concessions from MS, backing out of hardware reserves not once but twice, the difference between the two consoles is only 1 or 2 fps.

- Locking the framerate was a conscious decision to stop people from bitching, but it didn't seem to work in the end.

- The game is crazy optimized for such a young point in the next-gen consoles' life.

- Mordor has next-gen gameplay, but not graphics like Unity does.

- We started the game early for next gen. Sony wanted to push graphics for the next-gen consoles, so that's what we did.

- 50% of the CPU is used to process prepackaged info, like prebaked global illumination and lighting. (GB's source says this all makes sense.)

- "CPU bound by the AI" is true, but not entirely.

- The lighting and effects are the best of any game you've seen, including Infamous and others.

- The game is 50GB, filling the entire Blu-ray to the edge; half is lighting data. (GB's source agrees with this as well.)

It seems the earlier Ubisoft rep's statement about locking specs was about locking the framerate, not the resolution. Either way, locked is better, and the difference unlocked is minor at 1-2fps between the two consoles.

Another interesting takeaway: it seems Microsoft released some more CPU/GPU reserves recently.


- The PS4 couldn't do 1080p 30fps for our game, despite whatever people or Sony and MS might say.

- Yes, we have a deal with MS; yes, we don't want people fighting. But with concessions from MS, backing out of hardware reserves not once but twice, the difference between the two consoles is only 1 or 2 fps.

I don't buy either of these. I DO believe the game is CPU bound, but that's not an excuse; it's an admission of poor design choices. If it's CPU bound then by definition the GPU is not being fully utilized, and if it's not being fully utilized on the Xbox One then it certainly isn't being stressed on the more powerful PS4. This implies a poor game design, because going in developers knew the Jaguar CPUs on these consoles were weak (it's a low-power tablet, mini-PC, micro-server CPU design, not a performance-oriented part), but the idea was that with the general-purpose programmability of the GPGPUs a lot of what was previously done on the CPU would be moved to the GPU instead. So yeah, the PS4 can't do 1080p 30fps because they're not fully utilizing the GPU; if they were, the difference between the consoles would be significantly more than 1 or 2 fps. As it is, since the GPU on the PS4 isn't being used properly for "next-gen" console games, the 1 to 2 fps difference is probably in the Xbox One's favor and explained by the 10% clock boost MS did right before launch. The GPUs' general-purpose compute capability seems to be at best underutilized here and at worst completely ignored, with the GPU being used "prior-gen" style for graphics tasks only.
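To make the bottleneck argument concrete, here's a toy model (entirely my own sketch, with made-up numbers): if CPU and GPU work overlaps across frames, frame time is roughly whichever of the two is slower, so a CPU-bound game hides any GPU gap.

```python
# Toy bottleneck model: frame time ~ max(CPU time, GPU time) per frame.
# If the CPU is the bottleneck, a faster GPU buys nothing, which is why a
# CPU-bound game shows a tiny fps gap between consoles regardless of GPU power.

def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs, for illustration only.
cpu_bound = {"Xbox One": (40.0, 24.0), "PS4": (40.0, 17.0)}
for console, (cpu_ms, gpu_ms) in cpu_bound.items():
    print(f"{console}: {fps(cpu_ms, gpu_ms):.1f} fps (CPU-bound)")

# Offload part of the CPU work to GPU compute and the GPU gap reappears.
offloaded = {"Xbox One": (28.0, 33.0), "PS4": (28.0, 24.0)}
for console, (cpu_ms, gpu_ms) in offloaded.items():
    print(f"{console}: {fps(cpu_ms, gpu_ms):.1f} fps (after GPGPU offload)")
```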

- 50% of the CPU is used to process prepackaged info, like prebaked global illumination and lighting. (GB's source says this all makes sense.)

- The game is 50GB, filling the entire Blu-ray to the edge; half is lighting data. (GB's source agrees with this as well.)

Prebaked/prepackaged info doesn't sound so "next-gen" to me. Dynamic/realtime global illumination would be more "next-gen", and if the game were computing it on the fly it wouldn't need so much storage. No doubt the prebaked stuff looks better, but if it's taking up that much processing and that much space, one has to wonder if it's really worth it. I guess we'll find out when we're blown away by how much better it looks than Shadow of Mordor with its "not next-gen" graphics, as they claim.
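Rough back-of-envelope on why baked lighting eats disk space (all numbers below are hypothetical, just to show the scaling): per-texel lightmap storage grows with world area, no matter how cheap it is to read back at runtime.

```python
# Hypothetical figures only: baked lighting storage scales with world area
# times lightmap density, independent of runtime cost.

bytes_per_texel = 8           # hypothetical: HDR lightmap + directionality
texels_per_metre = 16         # hypothetical lightmap density per axis
world_area_m2 = 10_000_000    # hypothetical: ~10 km^2 of playable space

texels = world_area_m2 * texels_per_metre ** 2
size_gb = texels * bytes_per_texel / 1e9
print(f"~{size_gb:.0f} GB of baked lighting data")
```

So 25GB of lighting data for a large open world isn't implausible; it's the cost of trading runtime computation for storage.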

It seems like most people are just brainwashed to think that 1080p is the de facto standard for better gaming. At what cost? An empty street like InFamous Second Son's, with barely any NPCs. You can create a 1080p version of any game, but there's a ton of stuff that's just as important. How is it possible that Ryse at 900p won the SIGGRAPH award for Best Real-Time Graphics? Sometimes dropping the resolution actually helps with picture quality, because you can do a lot more with each pixel. Heck, even the director of ISS said in an interview that resolution sometimes must be sacrificed to keep frame rate and special effects.

http://www.thesixthaxis.com/2014/03/11/infamous-second-sons-director-explains-why-the-game-runs-at-30fps/
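The per-pixel argument is easy to quantify (my arithmetic, not from the interview): at a fixed GPU budget, the fewer pixels you shade, the more shader work each one can receive.

```python
# Per-pixel shading budget at 900p versus 1080p, assuming a fixed GPU budget
# per frame.

native = 1920 * 1080
sub = 1600 * 900

print(f"1080p: {native:,} pixels per frame")
print(f"900p:  {sub:,} pixels per frame ({sub / native:.0%} of 1080p)")
print(f"Shader budget per pixel at 900p: {native / sub:.2f}x that of 1080p")
```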

I don't think XB1 gamers have much to worry about in terms of being "50% weaker", considering FH2 is one of the few next-gen games with 4xMSAA. Remind me, what is DC's AA solution? The future is bright for XB1. Look on the bright side: Minecraft is 1080p on both consoles.

http://www.eurogamer.net/articles/digitalfoundry-2014-the-making-of-forza-horizon-2

The part I've bolded is absolutely correct but the key thing you're missing is that the PS4 has to make this sacrifice much less often. Best of both worlds!

At the end of the day they wanted to push the visual look of the game (lighting, textures, scale and so on) higher, and not just the resolution. We've seen before that resolution and graphics fidelity/quality don't have to go hand in hand. You can have a game with last-gen looks at 1080p and 60fps, while a new game that looks better runs at 900p and 60fps, or 900p and 30fps, and so on. From all the videos I've seen, not only is Unity a big open-world city with lots of dynamic events going on (just walking down a street you can go into a number of buildings and start lots of side missions on the fly), but it looks great. The graphics between Unity at 900p@30fps and AC4 at 1080p@30fps are night and day.

 

If you really care about the raw numbers, resolution and frames, then you go PC. In the end, though, I expect a very good-looking, next-gen-looking game that I'll enjoy playing through.


That's funny given Ubisoft's comment about 60fps being better for shooters... and then going and giving Far Cry 30fps...

 

Hue hue hue. This crap can't get funnier.


The part I've bolded is absolutely correct but the key thing you're missing is that the PS4 has to make this sacrifice much less often. Best of both worlds!

 

I think the number of times it happens shouldn't even be the issue. The fact that both consoles are having obvious trouble reaching the goal of 1080p/60fps should be the issue, not whether the PS4 or X1 has fewer or more problems with it at this point. Bottom line: for this gen it's an unrealistic goal (and personally I'm fine with that).

 

We can't have both, so we either sacrifice one for the other or something else gets sacrificed in their place. Personally I'm fine with a little less resolution or framerate, or even less detailed games, to get those two where they need to be. But requiring every game to be 1080p/60fps is an impossible standard this early in the consoles' lifespans. It took a year and a half last generation before anyone got close to pushing the limits of the consoles (Gears of War), so I think we need to be patient here.


I think the number of times it happens shouldn't even be the issue. [...] But requiring every game to be 1080p/60fps is an impossible standard this early in the consoles' lifespans.

 

Actually, per the last round-up summary, the PS4 was able to reach 1080p/60fps on almost 3/4 of the games available for it.

The X1, on the other hand, struggled, with only about 1/4 of its games able to achieve 1080p/60fps.

 

Just to clarify for other posters: the PS4 is in fact the more powerful system.

 

Info from RAD Game Tools:

             CPU        CPU and GPU
PS4          2.3 ms     1.6 ms
Xbox One     2.3 ms     2.3 ms
PC           1.3 ms     1.4 ms

PC specs: 2.8GHz Core i5 (4 cores) with an AMD R9 290X
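Reading that table as the per-frame cost of one fixed workload, here's what the numbers imply against a 60fps frame budget (my arithmetic, not RAD's): offloading to the GPU cuts the PS4's cost by about 30%, while the Xbox One sees no gain.

```python
# Express each per-frame cost from the RAD table as a share of a 60fps
# frame budget (16.7 ms).

BUDGET_60FPS_MS = 1000 / 60

timings_ms = {
    "PS4 (CPU only)": 2.3, "PS4 (CPU and GPU)": 1.6,
    "Xbox One (CPU only)": 2.3, "Xbox One (CPU and GPU)": 2.3,
    "PC (CPU only)": 1.3, "PC (CPU and GPU)": 1.4,
}

for config, ms in timings_ms.items():
    print(f"{config}: {ms} ms = {ms / BUDGET_60FPS_MS:.1%} of a 60fps frame")
```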

 

Ubisoft's own console power representation graph:

 

UKB5kTO.png

 

I will also mention that it's true pixels and fps aren't the only factors that represent a game's graphics. That being said, most of the comparisons posted here also indicate that the PS4 beats the X1 in draw distances, shadows, reflections, game physics, on-screen objects, etc. in most games, or at the very least matches the X1 version. Not sure why people bring this up all the time, but it has been proven time and time again; when both consoles were released and the specs were given, it should have been obvious.

 

Maybe Microsoft can pay publishers to continue optimizing games until they match a semi-optimized version on PS4, and give other types of support, but we haven't seen a single multiplatform game run better on the X1, and I don't expect to see that happen for the life of these consoles. The PS4's GPU is just much better than the one the X1 used, GDDR5 RAM is better than DDR3 for gaming, and so on.
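On the memory point, the usual back-of-envelope (publicly reported specs as I recall them; treat the figures as approximate) is peak bandwidth = bus width in bytes times effective transfer rate:

```python
# Peak main-memory bandwidth from bus width and effective transfer rate.
# Figures below are the commonly reported console specs, not measurements.

def peak_bandwidth_gbs(bus_bits, transfer_rate_gts):
    return bus_bits / 8 * transfer_rate_gts  # bytes/transfer x GT/s = GB/s

ps4_gddr5 = peak_bandwidth_gbs(256, 5.5)    # 8GB GDDR5 @ 5.5 GT/s
xb1_ddr3 = peak_bandwidth_gbs(256, 2.133)   # 8GB DDR3-2133

print(f"PS4 GDDR5: {ps4_gddr5:.0f} GB/s peak")
print(f"XB1 DDR3:  {xb1_ddr3:.0f} GB/s peak (plus a 32MB eSRAM pool on-die)")
```

The eSRAM narrows the gap for render targets that fit in 32MB, but the main pools really are that far apart.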


Soooo... checking around on the PSN store a bit, I came upon the Sleeping Dogs Definitive Edition. The fine print says the game renders at 720p.

Er, wah?


Actually, per the last round-up summary, the PS4 was able to reach 1080p/60fps on almost 3/4 of the games available for it. [...] The PS4's GPU is just much better than the one the X1 used, GDDR5 RAM is better than DDR3 for gaming, and so on.

 

 

All of this is true... Very much true... PS4 has the guts to get things done....

 

But after seeing Ryse....WOW* It really doesn't matter...  And after seeing Halo 2 get the "Anniversary Spit Polish".... all I can say is.... WOW*....

 

Drive Club... WOW*

inFamous: SS...... WOW*

FH2.... WOW*....

 

 

 

 

*WOW= AMAZING!!!!!

 

3rd party developers may not want to bother, but it has been shown that the One can hold its own in all categories.

 

Microsoft should (they really should) just ramp up 1st party games and really show what the One can do.


I understand that some people want this thread open to keep the "truth" out there so they can compare numbers, but for pretty much everyone other than the pixel-counters this thread is useless.

 

I can understand if people were discussing this in the last generation:

76110_0_org.jpg

 

But now we're at this:

xbox-one-vs-ps4.jpg

maxresdefault48.jpg

2606645-xbone_destiny_beta.jpg

 

If a game starts hiccuping, then that's news, but all of this "x amount of pixels vs y amount of pixels" when the games compare so closely is stupid.


The differences this generation are bigger than last generation's...

 

Ground Zeroes was a straight-up 720p vs 1080p difference. Not really a good game to put into a montage like the one above to try and make them look similar. Those screencaps look like they've been cut and pasted from a 480p YouTube video as well.

 

Either way, the "truth", as you put it, is the truth. I don't know why people are still coming into this topic to say it's "useless", or, as we have it now, that "it only mattered last generation"...


The differences this generation are bigger than last generation's... [...] Those screencaps look like they've been cut and pasted from a 480p YouTube video as well.

 

Yeah was gonna say, if you're going to try and play down the differences, don't use MGS as an example :laugh: It's probably the poster boy for the topic at hand.


No! That's my point! I put it there on purpose. Even with that difference, you look at the screenshots and they're very similar. Yes, that's the one that may be numerically the most different, but it still looks great at 720p. There have been games that people said looked great; then someone counted the pixels and suddenly the game sucked.


Here's some screenshots of the "outrage" over MGS:

 

The sign in this scene:

PS4_XboxOne_04-670x382.gif?eaa32f

 

Looks like this when zoomed in and compared!

PS4_XboxOne_05a.gif?eaa32f

 

OMG!!!

 

:huh: And this is a game that, like you pointed out, has a "huge" difference. These days it's mostly between 900p and 1080p.


Here's some screenshots of the "outrage" over MGS: the sign comparison shots. [...] :huh: And this is a game that, like you pointed out, has a "huge" difference.

 

The Xbox One version of that sign looks very blurry compared to the PS4 version.


Digital Foundry: Hands-on with COD: Advanced Warfare multiplayer

 

As played at this year's Gamescom and EGX on Xbox One, multiplayer in Advanced Warfare is a very different beast to the work-in-progress Seoul campaign mission shown at Microsoft's E3 event.

 

 

 

Running at a native 1600x900 at both events (a boost from the E3-era 882p), little has changed in Xbox One's rendering setup for multiplayer, suggesting this number is a lock for all modes in the final release.

 

 

 

But what about performance? The multiplayer mode brings colossal improvements over the taxing Seoul stage shown at this year's E3. After concerns it might be dragged down by the same visual ambition, we instead get a mostly solid 60fps experience. Even with 16 players tussling for map domination on Ascend's wave-wracked shores, Sledgehammer Games' engine holds to the 60fps line very firmly indeed. With only two cases of singular dropped frames, this multiplayer build is in an encouraging state as is, though we note that when the engine is (very rarely) taxed to its limits, an adaptive v-sync kicks in, producing screen tear.

The good news? After analysing over 23,000 frames of footage we record only three actual torn frames - essentially invisible during the run of play.
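Putting those figures in perspective (my arithmetic, not DF's):

```python
# How rare three torn frames in 23,000 actually is at 60fps.

frames, torn, fps = 23_000, 3, 60

minutes = frames / fps / 60
print(f"{frames:,} frames @ {fps}fps ~= {minutes:.1f} minutes of footage")
print(f"Torn frames: {torn / frames:.4%}, about one every {minutes / torn:.1f} minutes")
```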

 

http://www.eurogamer.net/articles/digitalfoundry-2014-hands-on-with-cod-advanced-warfare-multiplayer

 

 

Excellent. I'm glad the MP framerate is already great.

