PS4 and Xbox One resolution / frame rate discussion



That's what is happening to a degree: the new SDK will bring improvements to DX and the driver level of the Xbox One, and it will also allow devs to use the eSRAM properly without much effort.

 

And as others have said, the funnel diagram is stupid and wrong on so many levels.

I agree, the funnel diagram isn't the best way to show this. Plus, I simplified mine way too much by separating data and graphics, but I drew it that way because of the difference in speed: since it's GDDR5 RAM, it wouldn't be as good for anything but graphics.
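To put rough numbers on that speed difference, here's a quick sketch using the publicly quoted specs for both consoles (256-bit buses, GDDR5 at 5500 MT/s vs DDR3 at 2133 MT/s); the eSRAM's on-die bandwidth sits on top of the DDR3 figure:

```python
# Theoretical peak bandwidth = bus width (in bytes) x transfer rate.
# These are the publicly quoted specs, not measured numbers.
def peak_bandwidth_gbs(bus_bits: int, mtps: int) -> float:
    """Peak bandwidth in GB/s for a bus_bits-wide bus at mtps mega-transfers/s."""
    return (bus_bits / 8) * mtps / 1000

print(f"PS4 GDDR5 (256-bit @ 5500 MT/s): {peak_bandwidth_gbs(256, 5500):.0f} GB/s")  # ~176 GB/s
print(f"XB1 DDR3  (256-bit @ 2133 MT/s): {peak_bandwidth_gbs(256, 2133):.0f} GB/s")  # ~68 GB/s
```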


I agree, the funnel diagram isn't the best way to show this. Plus, I simplified mine way too much by separating data and graphics, but I drew it that way because of the difference in speed: since it's GDDR5 RAM, it wouldn't be as good for anything but graphics.

 

Ah right, so do you think having the DDR3 in there is a wiser decision because of all the media functionality, whereas since the PS4 is really just pushing graphics it doesn't matter too much? Do you think MS deliberately designed the system that way, or because they thought GDDR5 would be too expensive to be viable (as rumored)?


Ah right, so do you think having the DDR3 in there is a wiser decision because of all the media functionality, whereas since the PS4 is really just pushing graphics it doesn't matter too much? Do you think MS deliberately designed the system that way, or because they thought GDDR5 would be too expensive to be viable (as rumored)?

I think cost must have been a major factor for them, because they wanted 8GB from the start, whereas Sony just got lucky in the end (2GB in the first rumor --> 8GB at reveal).


I think cost must have been a major factor for them, because they wanted 8GB from the start, whereas Sony just got lucky in the end (2GB in the first rumor --> 8GB at reveal).

 

Yeah, well, they were going with 4GB initially, which would have killed the console; the OS takes up 3.5GB of RAM (unless they've improved it), so that shows a complete lack of foresight. The 8GB is why they're losing money on every console; although GDDR5 is cheaper now, it's still quite expensive as RAM goes.


Yeah, well, they were going with 4GB initially, which would have killed the console; the OS takes up 3.5GB of RAM (unless they've improved it), so that shows a complete lack of foresight. The 8GB is why they're losing money on every console; although GDDR5 is cheaper now, it's still quite expensive as RAM goes.

 

I thought both companies were selling the console for pretty much what it cost to make?


I think cost must have been a major factor for them because they wanted 8GB from the start whereas Sony just got lucky in the end.(2GB in first rumor --> 8GB at reveal)

Yup, this is the biggest thing you have to consider. Sony got extremely lucky with the availability of GDDR.

 

MS was clear on 8GB from the start, which is why they went down the DDR route.


I thought both companies were selling the console for pretty much what it cost to make?

 

MS is either breaking even or making a small profit, and Sony is losing a bit on every console. It's not much, I don't think, but it's still a loss.


Yup, this is the biggest thing you have to consider. Sony got extremely lucky with the availability of GDDR.

 

MS was clear on 8GB from the start, which is why they went down the DDR route.

 

No doubt.

 

But it's not just about RAM; MS chose a GPU that isn't as good as the PS4's.


No doubt.

 

But it's not just about RAM; MS chose a GPU that isn't as good as the PS4's.

Only because ~40% of the silicon is dedicated to eSRAM. In terms of silicon area, the X1 annihilates the PS4; it's the 2nd-largest chip produced.

 

I'm not entirely knowledgeable in this area, but I truly don't understand why they dedicated so much silicon to it rather than having it off-die. The X1 could have been compared with a Titan if they hadn't used all that space.


Only because ~40% of the silicon is dedicated to eSRAM. In terms of silicon area, the X1 annihilates the PS4; it's the 2nd-largest chip produced.

 

But that's not making anything better for MS. If anything, all it's done is increase the size of the console.


But that's not making anything better for MS. If anything, all it's done is increase the size of the console.

Well, it makes the eSRAM solution that bit better, so yes. It doesn't really correlate with how big the console is, although the chip obviously produces more heat. They just played it safe.


But that's not making anything better for MS. If anything, all it's done is increase the size of the console.

I guess it's still debatable whether it's making anything better for MS. eSRAM was helpful last gen, so it could be again.

It's clear now that MS had a lot of room to improve the SDK and drivers. The comments that are starting to show up from developers point in that direction, and MS themselves have mentioned it as well.

The amount of eSRAM is too small for a developer to easily fit a 1080p frame into, so they have to leverage various features meant to work around that issue, but it's time-consuming. If MS's plan is to improve the SDK/drivers so that developers no longer have to manage it by hand and it's just handled behind the scenes, then maybe things improve over time.
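A rough back-of-the-envelope sketch of why a 1080p frame is a squeeze, assuming simple 4-byte-per-pixel render targets (RGBA8 colour, D24S8 depth; a simplification, real setups vary):

```python
# Rough sketch: 1080p render-target sizes vs the Xbox One's 32 MB of eSRAM.
# Assumes 4 bytes per pixel per target (e.g. RGBA8, D24S8) -- a simplification.
def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Approximate size of one render target in MB (1 MB = 10^6 bytes)."""
    return width * height * bytes_per_pixel / 1e6

color = target_mb(1920, 1080)  # ~8.3 MB colour buffer
depth = target_mb(1920, 1080)  # ~8.3 MB depth/stencil buffer
print(f"1080p colour + depth: {color + depth:.1f} MB of 32 MB")  # ~16.6 MB

# A deferred renderer with four G-buffer targets plus depth overshoots badly:
print(f"4 G-buffers + depth: {4 * color + depth:.1f} MB")  # ~41.5 MB > 32 MB
```

So even a basic setup eats roughly half the eSRAM before anything else, which is why developers end up juggling what lives in it or dropping the resolution.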

As far as the GPU choice goes, I've always wondered if MS and Sony really knew the exact hardware the other had chosen to use.


Only because ~40% of the silicon is dedicated to eSRAM. In terms of silicon area, the X1 annihilates the PS4; it's the 2nd-largest chip produced.

 

I'm not entirely knowledgeable in this area, but I truly don't understand why they dedicated so much silicon to it rather than having it off-die. The X1 could have been compared with a Titan if they hadn't used all that space.

 

I think it's because it'll make the transfer crazy fast, and I bet the latency is as low as possible. I think the DDR3 is the only RAM off-chip, so integrating the eSRAM into the chip makes it lightning fast. The data hasn't got to travel across to the SoC or anything; want something out of eSRAM, boom, it's there, haha.


Talk from those in the Titanfall beta makes me think this is not a graphics powerhouse of a game, and it could use more optimization. It seems to be more taxing than it should be on PC for a Source-based game.


Talk from those in the Titanfall beta makes me think this is not a graphics powerhouse of a game, and it could use more optimization. It seems to be more taxing than it should be on PC for a Source-based game.

 

Hopefully I'll get my beta key on 17th Feb on PC. They could still be doing optimisations, but I know how well Frostbite 3 runs on my computer, so I'll have a look at the graphics, textures, AA and framerate... maybe that could shed some light on whether it needs more optimisation, but like I said, they could be running optimisations from now till release, so it could change.


Hopefully I'll get my beta key on 17th Feb on PC. They could still be doing optimisations, but I know how well Frostbite 3 runs on my computer, so I'll have a look at the graphics, textures, AA and framerate... maybe that could shed some light on whether it needs more optimisation, but like I said, they could be running optimisations from now till release, so it could change.

 

Titanfall doesn't use Frostbite 3, it uses Source.


Titanfall doesn't use Frostbite 3, it uses Source.

 

 

Exactly. The devs chose it to save time, but the downside seems to be that they're not going to get high graphical fidelity out of it. Not that that's a big failure, since it also sounds like people are really enjoying the game.


Titanfall doesn't use Frostbite 3, it uses Source.

 

Yeah, I know, but umm, yeah, it doesn't translate, does it? lol. I think what I'm trying to say is that I know the visuals I can get out of BF4's Frostbite 3, so I could try comparing them, and the framerate, to Source to see if the engine or the coding is letting them down... then again, I'll probably just ignore all of it and play the game!

Exactly. The devs chose it to save time, but the downside seems to be that they're not going to get high graphical fidelity out of it. Not that that's a big failure, since it also sounds like people are really enjoying the game.

 

I think a lot of launch titles close to release are using old engines just to save time, and/or because they didn't know what the specs of the consoles were; it's hard to develop for something you know nothing about, even though DICE did a good job of anticipating them with its engine.


It's probably more accurate to say that it is based on Source.

 

It's not based on it; it uses the Source engine. You can say it's a modified Source engine, but every game apart from the first one to use it modifies the engine.

It's slowly been improving over the years through incremental updates, but it hasn't been updated to Source 2.0 yet.

 

As for the reasoning behind using a 10-year-old game engine: "Respawn chose to build Titanfall on the Source game engine early in their production cycle due to their developers' familiarity and its ability to maintain 60 frames per second on the Xbox 360"


It's not based on it; it uses the Source engine. You can say it's a modified Source engine, but every game apart from the first one to use it modifies the engine.

It's slowly been improving over the years through incremental updates, but it hasn't been updated to Source 2.0 yet.

 

As for the reasoning behind using a 10-year-old game engine: "Respawn chose to build Titanfall on the Source game engine early in their production cycle due to their developers' familiarity and its ability to maintain 60 frames per second on the Xbox 360"

If you are quoting Wikipedia, why not quote the whole thing? This is literally the same paragraph.

 

The company built upon the engine during development in features such as lighting, rendering, visibility, networking, and tools pipelines.[31] The game also uses Microsoft's cloud computing for multiplayer servers, physics, and artificial intelligence.

 

So pretty much what I said?


So pretty much what I said?

 

No, 'based on' implies a new engine. It's the same engine, only modified, like all other games that use it.

 

But feel free to misinterpret the meaning; I can't be bothered arguing over trivial things. Also, I didn't post the quote you did because it doesn't state anything new; I already said it was a modified engine, and I didn't feel the need to throw in the over-hyped marketing catchphrase 'cloud computing' in my post.


But feel free to misinterpret the meaning; I can't be bothered arguing over trivial things. Also, I didn't post the quote you did because it doesn't state anything new; I already said it was a modified engine, and I didn't feel the need to throw in the over-hyped marketing catchphrase 'cloud computing' in my post.

"based on" implies the same thing as "built upon" and is not the same as "it uses".

 

If you can't be bothered about this, why bother getting into the discussion?

You can call cloud computing whatever you want; I included it because it's relevant to the game's engine (physics and AI, as mentioned). It doesn't matter if you think it's just a buzzword.


Titanfall is on the block for consoles and PC @ Eurogamer

 

What we can confirm for this build, though, is an internal resolution of 1408x792, with a pass of 2x MSAA to tackle aliasing before any upscale to your preferred resolution. This pixel count is something the team is happy to verify - even with indications that it could end up around the 900p mark for the final product. It may not be a gargantuan number as-is, but it pulls the game away from the ho-hum 1280x720 that was on most people's bingo cards - a step up, if not a remarkable one, in terms of the final image.

 

 

However, it all changes once you buckle into a Titan, and in this build we see lengthy passages of play (particularly by the end of a mission) falling within the 35-45fps range. Neither one of the levels on show is especially worse than the other in this regard, and it's clearly the barrage of alpha effects that ends up pressing the hardware too far.

 

 

It's also impossible to ignore the tearing that creeps up during such dips. Adaptive v-sync is in play, which taps in any time the engine detects a frame going over budget and missing a slot within its 60Hz refresh. Temporarily removing this lock helps to make control over the action feel smoother than it otherwise would, but at a big - and regular - cost to the overall presentation. As a game with a heavy focus on competitive multiplayer, these are clearly optimisation problems that need tackling before the game goes to market in March - with screen-tear high on the list.

 

 

That said, it's unfortunate that once you stop to have a look around at the arena, the environment appears so clinical. Much of Fracture, for example, is built to a very strict and rigid wireframe, with no evidence of next-gen technologies such as tessellation to round off the more egregious corners. Shading is also largely missing, outside of baked-in shadows and ambient occlusion that fades in as we near objects. The two maps shown so far are vibrant and well-suited to the wall-run-and-gunning gameplay, but without these extra layers of detail the whole world comes across as a little plain, lacking in dynamic destruction and enhanced environmental detail and animation. It's clear where the emphasis is: Titanfall relies upon the sheer intensity of its action for its measure of spectacle.

 

 

All of which leads us back to the assessment of the Xbox One version's fate. Titanfall is only a month shy of its final March release, and the tussle in this early beta is clearly between performance and effects-work. Given Respawn's reputation in the business, we'd hope frame-rate does indeed prove to be king. However, with recent suggestions of the internal resolution being pushed up higher to the 1600x900 mark, it's not clear where the GPU power can be found to maintain a large boost in pixel count while at the same time clearing up the frame-rate issues we find in the current 1408x792 version.

 

 

Source: http://www.eurogamer.net/articles/digitalfoundry-2014-titanfall-beta-tech-analysis
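To put the article's pixel counts in perspective, here's the arithmetic for the resolutions it mentions, relative to the beta's 1408x792:

```python
# Pixel counts for the resolutions in the DF analysis, relative to 1408x792.
beta_pixels = 1408 * 792  # 1,115,136 px
for w, h in [(1280, 720), (1408, 792), (1600, 900), (1920, 1080)]:
    px = w * h
    print(f"{w}x{h}: {px:>9,} px ({px / beta_pixels:.2f}x the beta)")
```

The mooted 1600x900 target is roughly 29% more pixels per frame, which is why the article doubts the GPU headroom exists while the 35-45fps dips remain. And here's a minimal sketch of the adaptive v-sync policy it describes (hypothetical pseudologic, not Respawn's actual code): sync when a frame fits the 60Hz budget, present immediately (tearing) when it runs over:

```python
# Adaptive v-sync sketch: tear only on frames that blow the 60 Hz budget.
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per slot at a 60 Hz refresh

def present_policy(frame_time_ms: float) -> str:
    """Wait for vblank only when the frame met its budget; otherwise tear."""
    if frame_time_ms <= FRAME_BUDGET_MS:
        return "wait for vblank (no tearing)"
    return "present immediately (tearing, but less added latency)"

for ft_ms in (14.0, 16.5, 22.0, 28.0):  # 22-28 ms ~ the 35-45fps dips DF saw
    print(f"{ft_ms:4.1f} ms -> {present_policy(ft_ms)}")
```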

