PS4 and Xbox One resolution / frame rate discussion


Recommended Posts

1408 is a multiple of 128 (and of every power of 2 below it, of course), so 1408x792 makes sense as a resolution from a hardware/software perspective. Still a bit weird to use a resolution that isn't native to any display whatsoever, i.e. even on 720p displays it'll suffer scaling. :/
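The alignment argument is easy to check with a couple of lines of Python (just the arithmetic, nothing platform-specific):

```python
# 1408x792: why the width "makes sense" and where the height comes from.
width = 1408
assert width % 128 == 0   # 1408 = 11 * 128, so every power of 2 up to 128 divides it

height = width * 9 // 16  # matching height at a 16:9 aspect ratio
print(height)             # 792

# Pixel counts relative to the usual suspects
for name, (w, h) in [("720p", (1280, 720)), ("792p", (1408, 792)),
                     ("900p", (1600, 900)), ("1080p", (1920, 1080))]:
    print(name, w * h)
```

So 792p pushes about 21% more pixels than 720p, but still well under half of 1080p.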

Link to comment
Share on other sites

It's running on the Source engine after all. I'm not quite sure why they can't hit 1080p/60 or at least 900p/60. I have a hard time believing it has anything to do with the X1 hardware; a PC with similar specs could do it.

I really have to question why you have a hard time believing it is the hardware when the games currently released say otherwise. The only games that are 1080p/60 on the One are sports or racing games, both of which have basically static environments. So I ask sincerely, based on what we have seen so far: why is it so hard to believe they cannot do 1080p/60 on a game that has much more dynamic environments and a whole lot more going on?


If the blame lies with Respawn, considering the noise MS have made about Titanfall, you'd have thought they'd have already shipped in a platoon of XB1 engineers last year.

 

Right now hitting something strange like 792 seems like a move to simply get as far away from 720p as possible, as then the internet really would shut down for a day. The "real" COD team running at a lower res than the rushed-out-the-door Ghosts just doesn't sit right :/ I'm fairly confident it will have a day-one 900p patch at worst; it just has to.

 

MP is usually the least taxing as well. KZ:SF managed 1080p/60FPS in its MP, whereas the SP was 30 unlocked (the recent patch now lets you lock it to 30).

Well, in my experience MP modes are ALWAYS dumbed down from the SP graphics. In some cases, like Battlefield, they use different engines altogether. Titanfall is having to do a whole lot more in its multiplayer than COD. The Killzone developers already said they turned down the graphical fidelity in MP to be able to hit 60FPS, because 60FPS is way more important in MP.


I really have to question why you have a hard time believing it is the hardware when the games currently released say otherwise. The only games that are 1080p/60 on the One are sports or racing games, both of which have basically static environments. So I ask sincerely, based on what we have seen so far: why is it so hard to believe they cannot do 1080p/60 on a game that has much more dynamic environments and a whole lot more going on?

 

 

I said 1080p or 900p.

 

Question me all you like; it doesn't change my question of why they might be having issues. Why is it so wrong to think that there might be specific games with issues beyond the hardware?

 

Again, it's using the Source engine, something with a well-known performance history on the PC. Having played plenty of Source engine games myself, they aren't usually known for requiring very high-end PC hardware. I know they could be using custom bits that make it more demanding, but based on the specs of the X1 compared to a similar PC build, which can play Source games well above 720p/60, I just thought it was strange.

 

But hey, if this is just another example of the X1's poor hardware, then so be it. 


If the blame lies with Respawn, considering the noise MS have made about Titanfall, you'd have thought they'd have already shipped in a platoon of XB1 engineers last year.

 

Right now hitting something strange like 792 seems like a move to simply get as far away from 720p as possible, as then the internet really would shut down for a day. The "real" COD team running at a lower res than the rushed-out-the-door Ghosts just doesn't sit right :/ I'm fairly confident it will have a day-one 900p patch at worst; it just has to.

 

 

So wait, are you agreeing with me that it could be that they ran out of time to properly optimize, rather than a hard limit of the hardware?


Why do I get the impression that this game is somehow rushed?

 

It's running on the Source engine after all. I'm not quite sure why they can't hit 1080p/60 or at least 900p/60. I have a hard time believing it has anything to do with the X1 hardware; a PC with similar specs could do it.

 

I saw an interview that was put up today, and I got the impression that this game was originally a 360 title that was moved to the X1 after it was announced. I wonder if that led to a serious lack of time to get a proper next-gen version completed.

It is based on a heavily modified Source engine; they wrote a bunch of custom additions to it. They have also said that the resolution might increase in the shipped version (I doubt it, but it might be possible given the rumored 8% bump).

This is also a very busy shooter, and even the PS4 did not hit 1080p in BF4. (CoD: Ghosts was just a crap game.)


Well, in my experience MP modes are ALWAYS dumbed down from the SP graphics. In some cases, like Battlefield, they use different engines altogether. Titanfall is having to do a whole lot more in its multiplayer than COD. The Killzone developers already said they turned down the graphical fidelity in MP to be able to hit 60FPS, because 60FPS is way more important in MP.

 

 

That's what I was getting at: MP almost always sacrifices something from SP in order to hit the golden 60FPS for an FPS.

 

So wait, are you agreeing with me that it could be that they ran out of time to properly optimize, rather than a hard limit of the hardware?

 

I find it hard to believe that a game MS have gone all-in on will ship at a resolution of 792p. Last gen we were doing 720p/60FPS. At what cost, who knows; maybe it will just take optimization, or maybe they will cut back on textures (like TR on XB1), but I'm confident they'll do something to get to 900p. I'd be genuinely surprised if the day-one playable copy is running at this resolution - maybe it will be a day-one patch like AC4 on the PS4.

 

I believe the alpha was actually 720p, so it's come up a little.

 

And this is cheap, but it is a little funny :p

 

Titanfall1Funny.gif


I find it hard to believe that a game MS have gone all-in on will ship at a resolution of 792p. Last gen we were doing 720p/60FPS. At what cost, who knows; maybe it will just take optimization, or maybe they will cut back on textures (like TR on XB1), but I'm confident they'll do something to get to 900p. I'd be genuinely surprised if the day-one playable copy is running at this resolution - maybe it will be a day-one patch like AC4 on the PS4.

 

I believe the alpha was actually 720p, so it's come up a little.

 

And this is cheap, but it is a little funny :p

 

Titanfall1Funny.gif

 

 

 

Ah ok. That makes sense

 

Well you never miss a chance to rub it in people's faces :laugh:


To be honest, I'm personally pleased that they're trying to bump up the res on Titanfall. I'm honestly confused as to why it can't at least run at 900p, considering the actual visual fidelity of the game. Hopefully it'll hit that in the final build, as rumored. I'm guessing they'll be able to squeeze a lot more out with the updates due this month and next, along with the 'behind the scenes talk'.

 

If it stays around what it is now, I'll be massively disappointed, but you're not going to see me running this on a PC.



It's crazy how different multi-plats are this time around. I'm going to guess the difference here is between just above ~30fps and a locked 60fps.

 

It boggles the mind how much silicon space MS wasted on the eSRAM. I don't get why they didn't simply use that space for more GPU, which could have given a 22-CU device. If they had placed the eSRAM off-chip they could have added more of it, around 28 MB, and done some wonderful post-processing with full 1080p frames. That would include free AA, while keeping some of the advantages of having DDR and using new DX features like PRT and tiling, etc.

 

eSRAM is awesome for post, and since they have two pools of memory, they have separate channels and can use the full BW for post. With it not being able to fit a full 1080p frame with a high z-buffer, though, the DMAs would have to be hammered for a pretty 1080p game, which would need a lot of engine work that isn't being done this early on. This creates extra waiting in the engine, which adds those crucial milliseconds that drop frames horribly.
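The "doesn't fit" point is easy to sanity-check with back-of-envelope math. Assuming a 32-bit color target and a 32-bit depth buffer (a simplification; real engines vary a lot with HDR targets, MSAA, and deferred G-buffers):

```python
# Back-of-envelope framebuffer sizes vs. 32 MB of eSRAM.
# Assumes 4 bytes per pixel for both color and depth; real engines vary.
def frame_bytes(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel

esram = 32 * 1024 * 1024                  # 32 MiB of eSRAM

color = frame_bytes(1920, 1080)           # ~7.9 MiB color target
depth = frame_bytes(1920, 1080)           # ~7.9 MiB depth/stencil
print((color + depth) / 2**20)            # ~15.8 MiB: a simple 1080p frame fits...

# ...but a deferred setup with four render targets plus depth does not:
gbuffer = 4 * color + depth
print(gbuffer / 2**20, gbuffer <= esram)  # ~39.6 MiB, doesn't fit
```

So a bare 1080p frame squeezes in, but anything heavier forces the DMAs to shuttle buffers between the pools, which is the extra engine work being described.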


So Kojima says some nice words about the PS4 in a video produced by SCEE?

 

Whoop-te-doo.

 

This blind focus on 1080p/60 is sad. Resolution says (next to) nothing about how a game looks.


So Kojima says some nice words about the PS4 in a video produced by SCEE?

 

Whoop-te-doo.

 

This blind focus on 1080p/60 is sad. Resolution says (next to) nothing about how a game looks.

I don't get this, honestly I don't.


You are thinking 2010 to today; I am thinking of the first few years, when the PS3 "didn't have any games" - remember those days? :p

 

Hey now, Warhawk came out in 2007 and it's still one of my favourite games from last gen! (I was still playing it when I got my YLOD over last Christmas :p)

 

As for the discussion about playing PS3 games regardless of performance, I personally got the PS3 for the exclusives and played most of the multi-platform titles on my 360.


It's crazy how different multi-plats are this time around. I'm going to guess the difference here is between just above ~30fps and a locked 60fps.

 

It boggles the mind how much silicon space MS wasted on the eSRAM. I don't get why they didn't simply use that space for more GPU, which could have given a 22-CU device. If they had placed the eSRAM off-chip they could have added more of it, around 28 MB, and done some wonderful post-processing with full 1080p frames. That would include free AA, while keeping some of the advantages of having DDR and using new DX features like PRT and tiling, etc.

 

eSRAM is awesome for post, and since they have two pools of memory, they have separate channels and can use the full BW for post. With it not being able to fit a full 1080p frame with a high z-buffer, though, the DMAs would have to be hammered for a pretty 1080p game, which would need a lot of engine work that isn't being done this early on. This creates extra waiting in the engine, which adds those crucial milliseconds that drop frames horribly.

 

Pretty good schoolboy illustration of the problems MS has with the eSRAM:

 

N3iMLE1.jpg


Pretty good schoolboy illustration of the problems MS has with the eSRAM:

 

N3iMLE1.jpg

 

I think that's the wrong direction; that funnel should be upside down :D


I think that's the wrong direction; that funnel should be upside down :D

 

Nope, the point is that the bottleneck kicks in: regardless of there being eSRAM, the slower speed of DDR3 RAM cannot match GDDR5. You can cram as much into the eSRAM as you like, but it'll bottleneck, and what you're feeding in cannot pass through at the same rate.
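For what it's worth, the peak-bandwidth gap is simple arithmetic on the publicly quoted launch specs (theoretical peaks, not measured throughput):

```python
# Rough peak-bandwidth comparison from the publicly quoted launch specs.
# These are theoretical peaks, not measured throughput.
def peak_gb_s(transfers_mt_s, bus_bits):
    # effective transfer rate (MT/s) x bus width in bytes, scaled to GB/s
    return transfers_mt_s * (bus_bits / 8) / 1000

ddr3  = peak_gb_s(2133, 256)   # Xbox One main DDR3 RAM: ~68 GB/s
gddr5 = peak_gb_s(5500, 256)   # PS4 unified GDDR5 RAM: ~176 GB/s
print(round(ddr3, 1), round(gddr5, 1))  # 68.3 176.0
```

The eSRAM's own bandwidth is much higher than either, but there's only 32 MB of it, which is why the main-RAM figure still matters for everything that doesn't fit.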


Nope, the point is that the bottleneck kicks in: regardless of there being eSRAM, the slower speed of DDR3 RAM cannot match GDDR5. You can cram as much into the eSRAM as you like, but it'll bottleneck, and what you're feeding in cannot pass through at the same rate.

I get that, but I meant the data will be flowing through the RAM into the eSRAM, i.e. the eSRAM will starve rather than bottleneck.

Although I have no idea how game development works, so there is that.


I get that, but I meant the data will be flowing through the RAM into the eSRAM, i.e. the eSRAM will starve rather than bottleneck.

Although I have no idea how game development works, so there is that.

 

Sorry, my bad!!


Pretty good schoolboy illustration of the problems MS has with the eSRAM:

N3iMLE1.jpg

I've been seeing this diagram being thrown around, but I don't get it. It's hardly representative of what's happening.

The two pools of RAM are there for different purposes. The eSRAM is there for the functionality that takes advantage of its speed. That is mostly doing post and holding the frame for the ROPs to fill out to the screen, since the ROPs work primarily on input from RAM, which is what the eSRAM is ideally suited for. It's not as if the pools feed off each other and everything in eSRAM flows through DDR; the eSRAM may stream textures from DDR depending on what the developers do regarding AA.

It also has DMAs, which move memory blocks between the pools without using CPU cycles. It has four of them, which is a lot. These are needed because the eSRAM can't hold a full 1080p frame with a high z-buffer, so some frames have to be moved back and forth between the RAM pools by the DMAs.


Pretty good schoolboy illustration of the problems MS has with the eSRAM:

 

N3iMLE1.jpg

 

Gotta love diagrams that were clearly drawn by someone who has never had any experience with how caching works. :rolleyes:

The eSRAM acts in a similar fashion to a CPU's on-chip caches - by the logic in the diagram, CPUs should eliminate their caches and instead fit more cores on the silicon, relying on main RAM alone, as there is *only* 2-8 MB of cache, which is *clearly* bottlenecked by the >8 GB of DDR3 RAM in the system!
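The cache analogy can be made concrete with a toy model. This is a hypothetical direct-mapped cache simulation (the sizes and access patterns are made up, nothing to do with the actual eSRAM hardware), just to show why a small fast pool in front of a big slow one pays off when accesses reuse data:

```python
# Toy direct-mapped cache: a small fast memory in front of a big slow one.
# Purely illustrative -- all sizes and access patterns here are made up.
LINE = 64    # bytes per cache line
LINES = 512  # number of lines -> a 32 KiB cache

def hit_rate(addresses):
    cache = {}                          # line index -> stored tag
    hits = 0
    for addr in addresses:
        index = (addr // LINE) % LINES  # which cache line this address maps to
        tag = addr // (LINE * LINES)    # which region of memory is resident there
        if cache.get(index) == tag:
            hits += 1
        else:
            cache[index] = tag          # miss: fetch the line from slow RAM
    return hits / len(addresses)

# A linear walk reuses each 64-byte line for 16 four-byte reads: 15/16 hit.
print(hit_rate(range(0, 1 << 20, 4)))        # 0.9375
# A 64 KiB stride maps every access to the same line with a new tag: all misses.
print(hit_rate(range(0, 1 << 26, 1 << 16)))  # 0.0
```

The point being: a cache's usefulness depends on the access pattern, not on its size relative to main memory, which is exactly what the funnel diagram gets wrong.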


The problem with the current XB1 eSRAM is that it's not big enough.

Here is my basic drawing, which does not account for all the factors, so it's really generalized. ibsb.png

 

Old SDK? Are you suggesting Microsoft is giving out an unoptimized SDK to registered game devs?


That's what is happening to a degree. The new SDK will bring improvements to DX and the driver level of the Xbox One, and it will also allow devs to use the eSRAM properly without much effort.

 

And as others have said, the funnel diagram is stupid and wrong on so many levels.


The problem with the current XB1 eSRAM is that it's not big enough.

 

Old SDK? Are you suggesting Microsoft is giving out an unoptimized SDK to registered game devs?

The SDK has been heavily leaked/rumored to feature a poor implementation of DX on the console. Along with the 8% that is rumored to be gained through the updates over these months, there are still a lot of future changes coming to boost performance.


This topic is now closed to further replies.