PS4 and Xbox One resolution / frame rate discussion



That's fine as a personal standard for being impressed. But the reality is that you're claiming they are lying about it being useful, so the burden of disproving them falls on you. You have no way of knowing what 'should' or 'should not' happen in that technical demo. A blind assumption isn't good enough to make that claim.

 

So I have to disprove them, but they don't have to prove anything? I'm sorry, but you can turn this around all you want; this demo doesn't prove anything. The non-cloud portion of it runs suspiciously badly, with the framerate dropping to 15 FPS without anything major happening. We have absolutely no proof this couldn't run without the cloud, and we have no proof it can run with the cloud in a real gaming scenario with enemies, AI, current-gen graphics, latency and so on.

 

I want to see real games. Tech demos, in my experience, are always unrealistic and always look better than the final product.

Link to comment
Share on other sites

Showan, the link talks about FPS and screen tearing...

 

 

There is a performance and visual difference. 

 

 
 

 

Last generation, besides the obvious Skyrim comparison, we were discussing 640/680p vs 720p, and 90% of the time it was 20~25 FPS vs 30 FPS as well.

 

This gen it's 900p vs 1080p, and in some cases 50~60 FPS vs 30~40 FPS. So I don't know how you can say this is the smallest gap it's ever been without seriously fudging your maths.

 

 

I have played a ton of games on both systems, as I own both systems... What I read vs what I play are not one and the same...

 

Once again, the game is more than playable...

 

Skyrim made you want to take the PS3 and throw it out the window...

Link to comment
Share on other sites

We absolutely CAN say it was hype, as there hasn't been any worthwhile implementation and there is no evidence that developers intend to implement it widely. The issue is more with the way that Microsoft presented it.

 

Further, the performance gap this generation is the LARGEST we've ever seen. We saw XB1 games (Pro Evolution 5, Call of Duty: Ghosts, Metal Gear Solid V: Ground Zeroes, Golf Club) running at 720p while their PS4 counterparts ran at 1080p.

Link to comment
Share on other sites

Where did I say they should stop development? I'm saying they should stop talking about all the endless possibilities and instead deliver them, or wait for third parties to deliver them. As far as marketing goes, MS should focus on what has been done, not what might be done, and let the devs experiment with it and see what happens. I mean, enough with the talk about cloud possibilities; it's been over a year now, and we want to see real applications of it, not demos.

 

I can't wait to see what the future holds too. I think the idea is neat. I just don't like people saying the cloud can do this and that just because MS said so. I think latency is a major concern, and while I might be mistaken, I don't recall MS saying how they plan to overcome this problem for real gaming applications outside of online gameplay and perpetual worlds.

 

 

Look back at what you wrote... This is the equivalent of "can't walk and chew gum at the same time"... Microsoft is a big enough company that more than one thing can be done at the same time...

 

And it's been others who keep bringing it up... not Phil and team...

 

Why focus only on "what's been done"? That's called stagnation... If that were the case, I wouldn't want to bother with my PS4 or One... I want envelopes pushed...

 

Not just "oh it's prettier"....

Link to comment
Share on other sites

So I have to disprove them, but they don't have to prove anything? I'm sorry, but you can turn this around all you want; this demo doesn't prove anything. The non-cloud portion of it runs suspiciously badly, with the framerate dropping to 15 FPS without anything major happening. We have absolutely no proof this couldn't run without the cloud, and we have no proof it can run with the cloud in a real gaming scenario with enemies, AI, current-gen graphics, latency and so on.

I want to see real games. Tech demos, in my experience, are always unrealistic and always look better than the final product.

I'm not saying you have to disprove them, but a lack of proof from them does not make it a lie. Whether or not you think something is suspicious is not a fact we can go by. It's a hunch, and it has no place in an objective discussion.

If you don't believe their evidence, then that's fine. But if you want to claim otherwise then you have to put forth some counter evidence other than "I just think it looks suspicious".

Link to comment
Share on other sites

People are still using launch titles to compare power differences, when early titles still used the slow, user-mode debug D3D driver and not the high-performance, low-level driver.

 

Beyond the seventh CPU core revelation is the existence of two separate graphics drivers for the Xbox One's onboard Radeon hardware: we know about the mono-driver - Microsoft's GPU interface designed to offer the best performance from the hardware - but there was also the user-mode driver (UMD), something that you'll see referenced throughout this piece.

Yes, remarkably Microsoft had two GPU drivers in circulation, all the way up to May 2014 when the user-mode driver was finally consigned to the dustbin. The mono-driver becomes the key to improved performance for future Xbox One games but the version utilised for launch titles would have been somewhat sub-optimal compared to the version in circulation today. One section in the SDK in this period gleefully exclaims "Tear No More!" - a feature that seems to see the introduction of v-sync and adaptive v-sync support.

http://www.eurogamer.net/articles/digitalfoundry-2015-evolution-of-xbox-one-as-told-by-sdk-leak

No, the benchmark should start from at least the June 2014 SDK update, which freed up resources locked by Kinect regardless of whether the developer made use of Kinect voice or skeletal tracking.

I've made a list of the resolution and framerate comparisons of all the big-name multiplatform titles released from August 2014 onwards, which means these titles had the potential to use the new SDK and the freed-up CPU, GPU and bandwidth allocations.

Same Resolution

---------

Diablo III: 1080p both, negligible framerate differences

Destiny: 1080p both, negligible framerate differences

Sleeping Dogs DE: 1080p both, PS4 framerate advantage

AC Unity: 900p both, Xbox One framerate advantage

GTA V: 1080p both, Xbox One framerate advantage driving, PS4 framerate advantage combat

The Crew: 1080p both, Xbox One framerate advantage

Saints Row: 1080p both, PS4 framerate advantage

Different Resolutions

--------------------

Metro Redux: 1080p PS4, 912p Xbox One, framerate negligible

Shadow of Mordor: 1080p PS4, 900p Xbox One, framerate negligible

The Evil Within: 1080p + black bars PS4, 900p + black bars Xbox One, framerate negligible

COD AW: 1080p PS4, 1360x1080 - 1080p dynamic res Xbox One, Xbox One framerate advantage (SP only)

Far Cry 4: 1080p PS4, 1440x1080 Xbox One, Xbox One framerate advantage

Games using Kinect and no access to extra resources

---------------------------------------------------

Alien Isolation: 1080p both, PS4 framerate advantage

Dragon Age: 1080p PS4, 900p Xbox One, Xbox One framerate advantage

Dying Light (not confirmed): 1080p PS4, 1536x1080 Xbox One (not confirmed), PS4 framerate advantage

Link to comment
Share on other sites

I'm not saying you have to disprove them, but a lack of proof from them does not make it a lie.

 

Of course not. It's just the computer engineer in me that is very skeptical about some of the claims made. At this point I don't even know anymore whether the claims were made by fans or MS lol

Link to comment
Share on other sites

At this point I don't even know anymore whether the claims were made by fans or MS lol

 

The claims were made by fans, same as the DirectX 12, Kinect-resources, ESRAM, etc. claims.

 

But with that being said, MS does make some ambiguous statements that fans like to interpret in their own way and try to make seem like huge game changers. Nothing has come to fruition yet, but they can always hope, I guess.

Link to comment
Share on other sites

Looks vs playability are not the same... If I can play a game and see what you see, and the only difference is that you see it with a pinch more beauty, that's a "whatever" deal to me...

 

I've turned down plenty of prettiness for different features (voice)...

 

The differences being brought up are minor... but they're being blown up to epic proportions...

 

A slight visual difference vs "can I play this game without going crazy" are not the same...

We're not talking about slight visual differences - we're talking about games on the PS4 having 45-100% higher resolution and better framerates. Microsoft has had to slash the price of the XB1 in order to remain competitive, as well as dropping supposedly integral features like Kinect.

 

At the end of the day as long as people are informed about the situation it doesn't matter. If someone wants to buy the XB1 because it has what they perceive to be better exclusives or a better online experience and they don't mind some multiplatform games running at half the resolution then that's fine. Some people genuinely don't care about resolution or minor framerate drops. However, many of us consider the performance issues with the XB1 to be unacceptable.

Link to comment
Share on other sites

We're not talking about slight visual differences - we're talking about games on the PS4 having 45-100% higher resolution and better framerates. Microsoft has had to slash the price of the XB1 in order to remain competitive, as well as dropping supposedly integral features like Kinect.

 

At the end of the day as long as people are informed about the situation it doesn't matter. If someone wants to buy the XB1 because it has what they perceive to be better exclusives or a better online experience and they don't mind some multiplatform games running at half the resolution then that's fine. Some people genuinely don't care about resolution or minor framerate drops. However, many of us consider the performance issues with the XB1 to be unacceptable.

 

 

I have quite a few games for both systems... And neither has framerate drops to the point where it makes you say... "Can this (insert console here) handle this game?"...

What videogame sites give you is an exaggerated view of how a game is.

 

I have MGS: Ground Zeroes on PS4, not because of how it looks, but because I've always gotten Metal Gear games on PlayStation platforms... But I also got to play MGS:GZ on the One, and the game still looks darn good.

 

Here is the thing: most of us don't even know when a game falls short on the visuals until we are told.

 

Nobody knew that Ryse was 900p until they were told.  

 

... Dying Light is playable, and other games are still more than playable on the One. If you want a game to be a little prettier, then sure, go for the PS4 version...

 

If I can play the game and it looks good enough, I'm good with it. I'm not knocking PlayStation at all (I've owned every single one ever made :D )... But there is a difference between playable and visual...

 

Never once do these games become unplayable on Xbox One.

 

Whereas on PS3 Skyrim is near unplayable, and Mass Effect was not that good of an experience either...

Link to comment
Share on other sites

Never once do these games become unplayable on Xbox One.

 

Whereas on PS3 Skyrim is near unplayable, and Mass Effect was not that good of an experience either...

Assassin's Creed: Unity was.

Link to comment
Share on other sites

I'm using my smartphone and the difference is obvious straight away.

This is why some people are 'caught up with this'. Not everybody is the same as you.

Link to comment
Share on other sites

I think it's obvious why I said it. No tangible proof and I've years of experience in that field. The cloud can't improve graphical performance.

No games have shown it (yet), that's true. But saying you have experience doesn't invalidate potential.

It's not about making the Xbox One more powerful on its own... It's about offloading graphical calculations/computations where latency is non-critical to external servers. Which is possible. In fact, that's the whole point of cloud computing: not having to do it yourself.
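To make the "offload the latency-tolerant work" idea concrete, here's a minimal sketch of the pattern being described (plain Python, standard library only; every function name is a hypothetical stand-in, not an actual Xbox One or Azure API): the frame loop never blocks on the network, it just swaps in the richer remote result whenever it happens to arrive.

```python
# Minimal sketch of offloading latency-tolerant work to a server while keeping
# latency-critical work local. All names are hypothetical illustrations.
from concurrent.futures import ThreadPoolExecutor, Future
import time

def request_cloud_simulation(region_id: int) -> dict:
    """Stand-in for a network call returning expensive, non-urgent results
    (e.g. background destruction physics or precomputed lighting)."""
    time.sleep(0.5)                          # pretend round-trip latency
    return {"region": region_id, "detail": "high"}

def cheap_local_fallback(region_id: int) -> dict:
    return {"region": region_id, "detail": "low"}

executor = ThreadPoolExecutor(max_workers=4)
pending: dict[int, Future] = {}

def update_frame(region_id: int) -> dict:
    # Latency-critical work (input, camera, animation) would run here every
    # frame, entirely on the console. The remote request is fired once and
    # then polled, never waited on.
    if region_id not in pending:
        pending[region_id] = executor.submit(request_cloud_simulation, region_id)
    job = pending[region_id]
    if job.done():
        return job.result()                  # richer cloud result once it lands
    return cheap_local_fallback(region_id)   # never stall the frame waiting

for frame in range(60):                      # ~2 seconds of a 30 fps loop
    state = update_frame(region_id=7)
    time.sleep(1 / 30)
print(state)                                 # by now the cloud result has arrived
```

Whether a real engine can hide that kind of round-trip for anything visually important is exactly the open question being argued in this thread.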

 

It's not yet shown in a retail game, but that doesn't make it any less feasible.

How you fail to recognize this, I don't understand. Why don't you explain why it can't do this?

Link to comment
Share on other sites

I'm using my smartphone and the difference is obvious straight away.

This is why some people are 'caught up with this'. Not everybody is the same as you.

 

You realize that smartphones have high-pixel-density screens, right? And you also realize that you have to be really close to your phone in general. These games are made for televisions - screens you sit between 6' and 14' away from. Also, seeing a difference does not equate to a major difference, just a noticeable one, and one that's only noticeable because the game is sitting still. In motion the resolution differences would be minor, if visible at all.

 

I can go do some math on the resolution and make it look really bad. 1600x900 has roughly 70% of the pixels of 1920x1080. Yet... surprisingly, that's not apparent at all in the visuals. It looks, at least to me, like a marginal difference at best. Slightly blurrier textures are not that big a deal to me.

 

This is probably why we don't see steps between 1080p and 4K in televisions: the difference between 1080p and something in between would be barely noticeable, but when you bump it all the way up to 2160p it becomes noticeable.
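To put some rough numbers on the viewing-distance point, here's a back-of-the-envelope sketch: horizontal pixels per degree of visual angle for a few resolutions, assuming a 50-inch 16:9 TV viewed from 8 feet (screen size and distance are arbitrary assumptions for illustration). The commonly quoted rule of thumb is that 20/20 vision resolves about 60 pixels per degree.

```python
# Pixels per degree of visual angle for common resolutions on an assumed
# 50-inch 16:9 TV viewed from 8 feet. Purely illustrative numbers.
import math

def pixels_per_degree(h_pixels: int, diagonal_in: float, distance_in: float,
                      aspect=(16, 9)) -> float:
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)           # physical width
    h_fov_deg = math.degrees(2 * math.atan((width_in / 2) / distance_in))
    return h_pixels / h_fov_deg

for label, h in [("720p", 1280), ("900p", 1600), ("1080p", 1920), ("2160p", 3840)]:
    ppd = pixels_per_degree(h, diagonal_in=50, distance_in=8 * 12)
    print(f"{label}: {ppd:.0f} pixels per degree")
```

On those particular assumptions, 900p and 1080p both land above the ~60 ppd mark, which is one way of reading why the gap is easy to miss from the couch and obvious on a phone held a foot from your face; sit closer or use a bigger screen and the numbers drop.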

Link to comment
Share on other sites

900p vs 1080p is not substantial. It's lower, yes, but not by any means substantial. You can keep throwing around percentages because it makes things look worse than they are, but that doesn't change the reality that it's not that big of a deal. I mean, look at the image here. I had to zoom all the way in on my MBP Retina display to even see the difference.

 

http://i2.minus.com/iZUB0VgplbRBy.png

900p = 1,440,000 pixels

1080p = 2,073,600 pixels

 

Going from 900p to 1080p is a 44% increase in resolution, which certainly is substantial. The image you provided highlights the difference well, as the scene on the left is noticeably more blurry. I certainly didn't have to zoom in to see the difference, as you claimed.
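For anyone who wants to check those figures, the arithmetic is easy to verify (a quick Python sketch):

```python
# Pixel counts behind the 900p vs 1080p figures quoted above.
px_900p = 1600 * 900       # 1,440,000
px_1080p = 1920 * 1080     # 2,073,600

print(f"1080p has {px_1080p / px_900p - 1:.0%} more pixels than 900p")       # ~44%
print(f"900p renders {1 - px_900p / px_1080p:.0%} fewer pixels than 1080p")  # ~31%
```

The same gap reads as a 44% increase in one direction and a roughly 31% cut in the other, which is where the "44%" and "more than 30%" figures elsewhere in the thread both come from.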

Link to comment
Share on other sites

900p = 1,440,000 pixels

1080p = 2,073,600 pixels

 

Going from 900p to 1080p is a 44% increase in resolution, which certainly is substantial. The image you provided highlights the difference well, as the scene on the left is noticeably more blurry. I certainly didn't have to zoom in to see the difference, as you claimed.

 

I think we can agree that, instead of arguing over whether you notice it or not, the notion that this generation is somehow the smallest gap it's ever been is just untrue. It's the biggest gap, and the eye of the beholder can decide whether they notice it or not.

Link to comment
Share on other sites

900p vs 1080p is not substantial.

I'm curious as to what your definition of "substantial" is then. More than 30% lower resolution is pretty big, if that doesn't count as substantial to you then what does? So there isn't a "substantial" difference between 720p and 900p in your view either?

You can keep throwing around percentages because it makes things look worse than they are, but that doesn't change the reality that it's not that big of a deal.

Percentages are objective facts. Your threshold for what is or is not "substantial" is subjective, and if a difference of more than 30% doesn't make the cut, then it is probably outside the norm. If I pay $100 for something and you pay $70, I'd say you got a substantially lower price than me.

I mean, look at the image here. I had to zoom all the way in on my MBP Retina display to even see the difference.

So you don't care; others do. Why are you even reading this thread, then? This thread is about the differences for those who do care. Also, whether or not you can see something has no bearing on whether it's "substantial". An electron is "substantially" smaller than a single-celled organism. I can't see the difference between them at all, but that doesn't mean there isn't a "substantial" difference.
Link to comment
Share on other sites

900p = 1,440,000 pixels

1080p = 2,073,600 pixels

 

Going from 900p to 1080p is a 44% increase in resolution, which certainly is substantial. The image you provided highlights the difference well, as the scene on the left is noticeably more blurry. I certainly didn't have to zoom in to see the difference, as you claimed.

 

That's my point: a 44% increase isn't substantial. The number is misleading, because if a mere 44% increase were substantial, then why aren't television companies marketing 1550p as the next big thing? They're not; they're marketing double that.

 

A bigger number is not always the answer. The reality is that resolution is a world of diminishing returns: the more pixels you have, the more you have to add to get the same level of benefit. 480p -> 720p is a bigger improvement in visual fidelity than 720p -> 1080p. Not to mention we aren't factoring in the upscaling being done by the systems for anything sub-1080p.

 

Just because a number looks big doesn't mean it is a good representation of reality. You can't omit other factors.

 

I'm curious as to what your definition of "substantial" is then. More than 30% lower resolution is pretty big, if that doesn't count as substantial to you then what does? So there isn't a "substantial" difference between 720p and 900p in your view either?

Percentages are objective facts. Your threshold for what is or is not "substantial" is subjective, and if a difference of more than 30% doesn't make the cut, then it is probably outside the norm. If I pay $100 for something and you pay $70, I'd say you got a substantially lower price than me.

So you don't care; others do. Why are you even reading this thread, then? This thread is about the differences for those who do care. Also, whether or not you can see something has no bearing on whether it's "substantial". An electron is "substantially" smaller than a single-celled organism. I can't see the difference between them at all, but that doesn't mean there isn't a "substantial" difference.

 

One cell vs two cells is significant. 5 billion cells vs 10 billion cells is not. As I said above, resolution is a world of diminishing returns: the more pixels present in your base resolution, the more pixels you need to produce a noticeable difference. If this weren't true then 4K wouldn't be what we'd be seeing next. We'd be seeing 1550p as the new HD, with 44% more lines than 1080p.

 

Whether or not a percent is an objective fact, numbers can be skewed in meaning by context. And what you guys are doing is tampering with the context and selecting the single data point that helps your side while ignoring the nature of the very topic we are discussing.

 

You guys can care about that 44% difference all you want, but the truth and objectivity are in the entire picture, not just a single number.
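For context on the diminishing-returns argument, here are the raw pixel multipliers between the resolution steps mentioned in this thread. These are pixel counts only; how visible each step is at a given screen size and viewing distance is a separate question.

```python
# Raw pixel-count multipliers between common resolution steps.
steps = [
    ("480p",  854,  480),   # widescreen SD
    ("720p",  1280, 720),
    ("900p",  1600, 900),
    ("1080p", 1920, 1080),
    ("1440p", 2560, 1440),
    ("2160p", 3840, 2160),  # "4K"
]

for (name_a, w_a, h_a), (name_b, w_b, h_b) in zip(steps, steps[1:]):
    ratio = (w_b * h_b) / (w_a * h_a)
    print(f"{name_a} -> {name_b}: {ratio:.2f}x the pixels")

print(f"1080p -> 2160p overall: {(3840 * 2160) / (1920 * 1080):.0f}x the pixels")
```

(1550p, for what it's worth, would have about 44% more lines than 1080p but roughly double the pixels, which is part of why both sides keep quoting the same numbers at each other.)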

Link to comment
Share on other sites

I think it's obvious why I said it. No tangible proof and I've years of experience in that field. The cloud can't improve graphical performance.

So you have years of experience in a field you claim doesn't exist (or is a lie)? How does that work? :p

I guess everything that can be done with "clouds" has been done, right?

 

I'm curious as to what your definition of "substantial" is then. More than 30% lower resolution is pretty big, if that doesn't count as substantial to you then what does? So there isn't a "substantial" difference between 720p and 900p in your view either?

Percentages are objective facts. Your threshold for what is or is not "substantial" is subjective, and if a difference of more than 30% doesn't make the cut, then it is probably outside the norm. If I pay $100 for something and you pay $70, I'd say you got a substantially lower price than me.

So you don't care; others do. Why are you even reading this thread, then? This thread is about the differences for those who do care. Also, whether or not you can see something has no bearing on whether it's "substantial". An electron is "substantially" smaller than a single-celled organism. I can't see the difference between them at all, but that doesn't mean there isn't a "substantial" difference.

It is a substantial percentage, but if people couldn't tell that KZ was running at 50% lower resolution on PS4, then 30% lower is going to be even harder to spot, especially when most people won't be running every game side by side.

Can we say XB1 is more efficient because it is running 100% more operating systems (200% if you count the hypervisor) on "inferior hardware"?

Percentages are meaningless here because the PS4 is pushing more pixels but lacks the AF that the XB1 is doing. So are more pixels better, or improved textures?

Link to comment
Share on other sites

That's my point: a 44% increase isn't substantial. The number is misleading, because if a mere 44% increase were substantial, then why aren't television companies marketing 1550p as the next big thing? They're not; they're marketing double that.

You posted a screenshot demonstrating the visual difference between 900p and 1080p, which clearly shows that 900p is more blurry. If you want to pretend that resolution isn't important then that's fine but if that's the case then why bother with the next-gen consoles at all? Their primary selling point is their graphical capability. As I pointed out, some XB1 games are running at just 720p - that's unacceptable.

 

Whether or not a percent is an objective fact, numbers can be skewed in meaning by context. And what you guys are doing is tampering with the context and selecting the single data point that helps your side while ignoring the nature of the very topic we are discussing.

 

You guys can care about that 44% difference all you want, but the truth and objectivity are in the entire picture, not just a single number.

If you want truth and objectivity then I'll simplify my argument:

 

1) The PS4 outperforms the XB1

2) The XB1 struggles to hit 1080p

 

Happy now?

Link to comment
Share on other sites

You posted a screenshot demonstrating the visual difference between 900p and 1080p, which clearly shows that 900p is more blurry. If you want to pretend that resolution isn't important then that's fine but if that's the case then why bother with the next-gen consoles at all? Their primary selling point is their graphical capability. As I pointed out, some XB1 games are running at just 720p - that's unacceptable.

 

 

If you want truth and objectivity then I'll simplify my argument:

 

1) The PS4 outperforms the XB1

2) The XB1 struggles to hit 1080p

 

Happy now?

 

 

Why does Ryse still look better than most games to have come out?  Only 900p...

 

How come nobody knew that Ryse was 900p until they were told?...

Link to comment
Share on other sites

Why does Ryse still look better than most games to have come out?  Only 900p...

 

How come nobody knew that Ryse was 900p until they were told?...

And it would look better running at 1080p. It looked great on my machine running at 1600p.

 

If Microsoft hadn't decided to force Kinect upon everyone and instead invested in adequate hardware for the XB1 then this wouldn't be an issue.

Link to comment
Share on other sites

This topic is now closed to further replies.