Jimquisition: Ubisoft Talks Bollocks About Framerate And Resolution



Honesty. How hard is it to be honest?

"We locked it at 30 FPS because it is a current gen consoles are very limited in terms of power"

Why is the above statement hard?

 

I would imagine that they would immediately lose any financial and marketing support that MS and Sony provide if they say that.


I would imagine that they would immediately lose any financial and marketing support that MS and Sony provide if they say that.

 

I'd really appreciate it if you stopped wandering the forums spreading this like it was fact.


 

 

 

[Screenshot: store listing showing the PC version at $50 with "Games for Windows" branding]

 

Games for Windows, really??? I thought every company had dropped that stupid label.

 

Anyway, great that the PC version is only $50, since the game will already be old by then.


I'd really appreciate it if you stopped wandering the forums spreading this like it was fact.

 

There is a reason why I started the sentence with "I would imagine".


This is an unofficial developer response on the issue, very enlightening tbh:

I'm happy to enlighten you guys because way too much ###### about 1080p making a difference is being thrown around. If the game is as pretty and fun as ours will be, who cares? Getting this game to 900p was a BITCH. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps. The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago.

The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say. Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles. So yes, locking the framerate is a conscious decision to keep people bullshiting, but that doesn't seem to have worked in the end. Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there.

What's hard is not getting the game to render at this point, it's making everything else in the game work at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define a next gen like no other game before. Mordor has next gen system and gameplay, but not graphics like Unity does. The proof comes in that game being cross gen.

Our producer (Vincent) saying we're bound with AI by the CPU is right, but not entirely. Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting. The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others. Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data.

 

Craps on a few people's 'parity' parade. Internet makes me laugh.
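For anyone wondering what the "baked global illumination" and "pre-packaged lighting data" in that quote actually are: the lighting is computed offline and shipped as data the engine only looks up at runtime. Here's a minimal conceptual sketch in C++ of that idea; the probe grid, its layout and the nearest-probe lookup are assumptions for illustration, not Ubisoft's actual format.

#include <algorithm>
#include <array>
#include <cstdio>
#include <vector>

// Hypothetical "baked GI" data: incoming light precomputed offline and stored for
// a grid of probe points covering the level.
struct Probe { std::array<float, 3> rgb; };

struct BakedGIVolume {
    int nx, ny, nz;               // probes along each axis
    float cellSize;               // world-space spacing between probes
    std::vector<Probe> probes;    // nx * ny * nz entries, authored at build time
};

// At runtime, "global illumination" is reduced to a lookup: the expensive light
// transport already happened offline, which is why the shipped lighting data can
// be enormous and why streaming/unpacking it still costs CPU time.
Probe sampleBakedGI(const BakedGIVolume& v, float wx, float wy, float wz) {
    int x = std::clamp(static_cast<int>(wx / v.cellSize), 0, v.nx - 1);
    int y = std::clamp(static_cast<int>(wy / v.cellSize), 0, v.ny - 1);
    int z = std::clamp(static_cast<int>(wz / v.cellSize), 0, v.nz - 1);
    return v.probes[(z * v.ny + y) * v.nx + x];   // nearest probe, no filtering
}

int main() {
    BakedGIVolume v{64, 16, 64, 2.0f,
                    std::vector<Probe>(64 * 16 * 64, Probe{{0.2f, 0.2f, 0.3f}})};
    Probe p = sampleBakedGI(v, 10.5f, 4.0f, 33.7f);
    std::printf("baked irradiance at point: %.2f %.2f %.2f\n",
                p.rgb[0], p.rgb[1], p.rgb[2]);
    return 0;
}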


Every other publisher isn't releasing a $250 million game, so it's a bad comparison.

That's all the more reason the game SHOULD be released on PC simultaneously! The bigger the budget, the easier it is to work on a multiplatform release. Activision manages to release new Call Of Duty games (another multi-billion-dollar franchise) simultaneously; Bethesda does with Elder Scrolls. Even if one were to accept a small delay, it is absolutely inexcusable to be talking about more than a year.

 

Look, I'd like a PC version just as much as the next person, but I'm not about to crucify a company that's done a great job year after year. They are a company, they are seeking profits, and you'd be naive to think anything differently. That doesn't mean any of this is intentional though, but rather based on prioritization. You serve the ones that pay the bills.

They are perfectly entitled to pursue profit at the expense of PC gamers, just as I'm free not to support them. It doesn't seem financially prudent to repeatedly snub a major platform, supported by the fact that other developers don't do that.

 

[Screenshot: store listing showing the PC version at $50 with "Games for Windows" branding]

That's because the PC doesn't have any licensing fees like consoles.

 

Getting back on topic, the point is that developers have to stop lying and making up excuses. The only reason for 30fps is that the XB1 and PS4 can't handle 60fps; otherwise you'd have seen a lot more games on the PC running at only 30fps. The platforms that have the power to run at 60fps (or higher) do so, and that's not a coincidence. When games on the 'next-gen' consoles are struggling to achieve 1080p it's obvious that they're not running at 30fps for creative reasons.
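To put rough numbers behind that (simple back-of-the-envelope arithmetic, nothing sourced from Ubisoft): a 30fps target gives an engine roughly double the per-frame time budget of 60fps, and 1080p pushes about 44% more pixels per frame than 900p.

#include <cstdio>

int main() {
    // Per-frame time budget implied by a framerate target.
    const double ms30 = 1000.0 / 30.0;   // ~33.3 ms per frame at 30 fps
    const double ms60 = 1000.0 / 60.0;   // ~16.7 ms per frame at 60 fps

    // Pixels per frame at the resolutions being argued about.
    const long px900  = 1600L * 900L;    // 1,440,000 pixels (900p)
    const long px1080 = 1920L * 1080L;   // 2,073,600 pixels (1080p)

    std::printf("30 fps budget: %.1f ms, 60 fps budget: %.1f ms\n", ms30, ms60);
    std::printf("1080p draws %.0f%% more pixels per frame than 900p\n",
                100.0 * (static_cast<double>(px1080) / px900 - 1.0));
    return 0;
}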


I don't doubt a lot of devs work really hard to squeeze everything they can out of the consoles. It is the PR and suit wanks who spin nonsense.

 

They should say they are proud of and respect just how hard their devs work to get it up to the standard it is, instead of offering nonsense about cinematic aesthetics when, as theyarecomingforyou put it, it's really about a performance constraint.


This is an unofficial developer response on the issue, very enlightening tbh:

 

Craps on a few people's 'parity' parade. Internet makes me laugh.

 

This is directly from Alexis Vaisse's Ubisoft talk at GDC Europe on making use of GPGPU to offload CPU-intensive calculations on each console -- in this case, cloth simulation.

 

[Chart: PS4 GPU vs Xbox One GPU cloth-simulation results from Ubisoft's GDC Europe slides]

 

By Ubi's own internal numbers, the PS4's GPU blows the X1's out of the water. So if the guy you're quoting actually exists, what we'll get is really really really REALLY good cloth simulation at 900p for, as Sterling put it, "the 10 additional NPCs on the screen that I won't give a ###### about"? Again, this will be the first AC game I pass on, mostly because of very public PR blunders and obvious ######.


That's all the more reason the game SHOULD be released on PC simultaneously! The bigger the budget, the easier it is to work on a multiplatform release. Activision manages to release new Call Of Duty games (another multi-billion-dollar franchise) simultaneously; Bethesda does with Elder Scrolls. Even if one were to accept a small delay, it is absolutely inexcusable to be talking about more than a year.

Throwing money at something doesn't equate to a win. Otherwise, Microsoft would be running circles around Sony. :P

 

I don't feel like arguing as I'm still at work, but you're free to disagree with me. I just don't think 2 months is worth the upset. /shrug


This is directly from Alexis Vaisse's Ubisoft talk at GDC Europe on making use of GPGPU to offload CPU-intensive calculations on each console -- in this case, cloth simulation.

 

[Chart: PS4 GPU vs Xbox One GPU cloth-simulation results from Ubisoft's GDC Europe slides]

 

By Ubi's own internal numbers, the PS4's GPU blows the X1's out of the water. So if the guy you're quoting actually exists, what we'll get is really really really REALLY good cloth simulation at 900p for, as Sterling put it, "the 10 additional NPCs on the screen that I won't give a ###### about"? Again, this will be the first AC game I pass on, mostly because of very public PR blunders and obvious ######.

So, you're telling me Ubisoft are wrong and poor developers, using Ubisoft's own technical material as the source for your argument?

 

If they had enough headroom left in the GPUs, they would have done this. The fact that they only got to 900p on the PS4 a few weeks ago, doesn't that tell you something?

 

Come on guys.


This is an unofficial developer response on the issue, very enlightening tbh:

 

Craps on a few people's 'parity' parade. Internet makes me laugh.

That's an extremely poorly formatted repost of what was said in another thread.  I'll link my response here.

 

https://www.neowin.net/forum/topic/1198885-ps4-and-xbox-one-resolution-frame-rate-discussion/page-89#entry596615157

 

NO "next-gen" console game should be CPU limited this generation, that's just poor game design.  Because of GPGPUs both Sony and MS intentionally used weak CPUs ("Jaguar" Cores were designed for low power tablets and the like NOT performance parts) and beefed up the GPUs to handle non-graphics related compute capabilities (at the same time as graphics tasks) so that tasks that were previously done on the CPU would now be done on the GPU instead.  Being CPU bound means the CPU, not the GPU is being pushed to it's limit.  That's an indication that they aren't moving enough (or any? non-graphics tasks to the GPU.  This generation of console was designed so that just about everything should be moved to the GPU until the GPU can't take any more.  EVERY console game should be GPU limited.  The CPU should be doing only those things that absolutely can't be done on the GPU and any overflow after the GPU is maxed out.  From that quote:

 

"MS and Sony wanted to push graphics first, so that's what we did.  I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting"

 

What MS and Sony meant was that they wanted developers to push the GPU first. That's absolutely NOT what they did if the game is CPU limited, and using 50% of your extremely limited CPU processing budget to process GRAPHICS (global illumination lighting) is a HORRIBLE design decision. So the platform holders tell you to push the graphics (meaning the GPU) because the CPU sucks, and your solution is to use the incredibly weak CPU to do baked global illumination lighting (graphics), which ends up CPU-limiting your game so it can't even push the powerful GPUs to their limits... are you serious?!? Not only that, but this wonderful CPU-draining, GPU-bottlenecking baked global illumination lighting takes up nearly half of the game's 50 gigs:

 

"Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data."

 

Seems like a pretty horrible design decision to me.
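To illustrate that argument with made-up numbers (the millisecond costs below are hypothetical, not measurements from Unity or either console): when CPU and GPU work overlap, frame time is set by whichever processor finishes last, so hosting GPU-capable work such as lighting-data processing on the weak CPU can leave the GPU waiting.

#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical per-frame costs in milliseconds -- purely illustrative.
    double gameplayOnCpu   = 10.0;  // AI, scripting, animation logic (CPU-only here)
    double lightingPrepCpu = 16.0;  // processing baked lighting data on the CPU
    double lightingPrepGpu = 4.0;   // the same work expressed as a compute pass
    double renderingGpu    = 20.0;  // the normal graphics workload

    // Case 1: lighting prep kept on the CPU.
    double cpu1 = gameplayOnCpu + lightingPrepCpu;   // 26 ms
    double gpu1 = renderingGpu;                      // 20 ms
    double frame1 = std::max(cpu1, gpu1);            // CPU-bound: 26 ms

    // Case 2: lighting prep moved to GPU compute.
    double cpu2 = gameplayOnCpu;                     // 10 ms
    double gpu2 = renderingGpu + lightingPrepGpu;    // 24 ms
    double frame2 = std::max(cpu2, gpu2);            // GPU-bound: 24 ms

    std::printf("CPU-hosted lighting prep: frame %.0f ms (CPU-bound)\n", frame1);
    std::printf("GPU-hosted lighting prep: frame %.0f ms (GPU-bound)\n", frame2);
    return 0;
}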


 

 

 

 

 

Let's see how many PC gamers can run GTA V at a stable 60fps.

I'm not sure about GTA V, but I run Battlefield 4 on Ultra at about 90 fps on my PC, sometimes 110 fps depending on where I'm "looking" in the map. It was locked at 60 fps in the config, but I bumped the cap up to 150.


So, you're telling me Ubisoft are wrong and poor developers, using Ubisoft's own technical material as the source for your argument?

 

If they had enough headroom left in the GPUs, they would have done this. The fact that they only got to 900p on the PS4 a few weeks ago, doesn't that tell you something?

It tells us that they limited the PS4 version to provide parity with the XB1 version, hence the controversy. If that wasn't the case they wouldn't have said anything about parity, they would have instead said that they worked to the strengths of each platform. Given the substantial performance differences in other games it seems like a remarkable coincidence that both versions would turn out exactly the same after Ubisoft said it targeted parity.

 

It was either a misguided attempt to defuse the controversy surrounding the difference in resolution between the XB1 and PS4, or they received incentives from Microsoft. I would be surprised if it wasn't the latter.


It tells us that they limited the PS4 version to provide parity with the XB1 version, hence the controversy. If that wasn't the case they wouldn't have said anything about parity, they would have instead said that they worked to the strengths of each platform. Given the substantial performance differences in other games it seems like a remarkable coincidence that both versions would turn out exactly the same after Ubisoft said it targeted parity.

It was either a misguided attempt to defuse the controversy surrounding the difference in resolution between the XB1 and PS4, or they received incentives from Microsoft. I would be surprised if it wasn't the latter.

I think Asmodai has the right idea here. The CPU power of both consoles is fairly similar, which means that if the game is truly CPU bound... then parity really means nothing here. It seems like an overall lack of planning, and Ubisoft making a poor call on where to focus the game's resources.

 

That being said, it could just be that there was no convenient way to leverage the GPUs for that processing. But unlike PCs and previous console generations, where the meat was (and is) in the CPU, the new consoles just aren't built that way.


This is an unofficial developer response on the issue, very enlightening tbh:

 

Craps on a few people's 'parity' parade. Internet makes me laugh.

 

Still doesn't make their original response any less ######.


Take a look at this article:

 

http://www.worldsfactory.net/2014/10/15/ps4-gpgpu-doubles-xb1-gpgpu-ubisoft-test

 

It talks about Ubisoft's own presentation at GDC Europe about how the CPUs of these consoles are weak and the GPUs are made to do tasks formerly done on the CPU.

They use their cloth simulation algorithm as an example of something formerly done on CPUs being ported to the GPU.

 

Here are the results for doing it via CPU:

(dancers per 5ms CPU time)

 

Xbox 360: 34

PS3: 105 (using SPUs)

PS4: 98

Xbox One: 113

 

You'll note the PS4 is actually slower than even the PS3 because the PS4 is NOT designed to do heavy lifting on the CPU.  Even the Xbox One barely outperforms the PS3.  The Xbox One advantage over the PS4 is likely from the clock boost it got just before launch.

 

Now when they move the cloth physics simulation to the GPU (which you can't do on the prior-gen consoles, as they lack the necessary compute shaders), we see what this generation is SUPPOSED to be about:

 

Xbox One: 830

PS4: 1600

 

Despite having only 50% more compute units, by Ubisoft's own test the PS4 almost doubles the Xbox One's performance. Even the slower "next gen" console is nearly 8x the performance of the fastest prior gen. That is what "next-gen" games are supposed to be about. Developers are supposed to be pushing darn near everything to the GPU until it is maxed out. NO "next gen" game should be CPU limited, because anything and everything that can be should be running on the GPU; every console game should be GPU bound. If the developer is doing that on both consoles there should be a significant difference in performance, not just 1 to 2 fps. If they are not, there is an awful lot of Sony GPU hardware not being used and the game isn't very "next gen", because the CPUs just aren't that powerful compared to last gen (in the case of the PS4 it's actually SLOWER!).
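To make it concrete why cloth simulation ports so cleanly to compute shaders, here's a generic Verlet-style particle update in C++ (a textbook sketch, not Ubisoft's actual solver): each particle's update depends only on its own state, so the loop body maps directly to one GPU thread per particle, which is exactly the kind of embarrassingly parallel work their benchmark measures.

#include <cstdio>
#include <vector>

struct Particle {
    float x, y, z;      // current position
    float px, py, pz;   // previous position (Verlet integration stores no velocity)
};

// One simulation step for a sheet of cloth particles under gravity. On the CPU this
// is a serial loop; in a compute shader the loop disappears and each iteration
// becomes one GPU thread, which is why throughput scales with compute units rather
// than with CPU clock speed.
void stepCloth(std::vector<Particle>& cloth, float dt) {
    const float gy = -9.81f;                         // gravity, m/s^2
    for (Particle& p : cloth) {
        float nx = 2.0f * p.x - p.px;
        float ny = 2.0f * p.y - p.py + gy * dt * dt;
        float nz = 2.0f * p.z - p.pz;
        p.px = p.x; p.py = p.y; p.pz = p.z;          // current becomes previous
        p.x = nx;  p.y = ny;  p.z = nz;              // Verlet position update
    }
    // A real solver would follow this with distance-constraint passes between
    // neighbouring particles (also expressible as parallel GPU passes); omitted here.
}

int main() {
    std::vector<Particle> cloth(64 * 64, Particle{0.f, 1.f, 0.f, 0.f, 1.f, 0.f});
    for (int frame = 0; frame < 30; ++frame) stepCloth(cloth, 1.0f / 30.0f);
    std::printf("particle 0 height after 1 second: %.3f m\n", cloth[0].y);
    return 0;
}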

 

It's amazing to me that Ubisoft does a demo on how important it is to move formerly CPU based tasks to the GPU at GDC Europe and then announces a CPU bound "next-gen" game.


So, you're telling me Ubisoft are wrong and poor developers, using Ubisoft's own technical material as the source for your argument?

 

If they had enough headroom left in the GPUs, they would have done this. The fact that they only got to 900p on the PS4 a few weeks ago, doesn't that tell you something?

 

Come on guys.

 

No I'm telling you I'm calling ###### on the "anonymous dev" until sourced.  The game has been in development for four years, among ten studios, always with the intent of it being "next-gen".  They've been developing middleware for GPGPU calculations.  And they just got to 720p with a steady framerate a few months ago?  Yeah, no.


No I'm telling you I'm calling ###### on the "anonymous dev" until sourced.  The game has been in development for four years, among ten studios, always with the intent of it being "next-gen".  They've been developing middleware for GPGPU calculations.  And they just got to 720p with a steady framerate a few months ago?  Yeah, no.

The source is the podcast; listen to it. I'd rather believe him than some out-of-context quote which you all don't even know is true. I know you're going to quote me and give a reason why I'm being ridiculous, because armchair developers who don't even know what software architecture means will tell me otherwise.

 

NO "next-gen" console game should be CPU limited this generation, that's just poor game design.  

It would be when you've got the CPUs of a mid-range notebook. A mid-range i5 would smash these to pieces. If you've got a CPU thread which takes 32ms, you're CPU bound. If your GPU is already taking, say, 24ms, there's no way you can offload onto the GPU. They've been working on GPGPU calculations, blah blah blah, as if the GPU isn't doing anything in the game already.


It's amazing to me that Ubisoft does a demo on how important it is to move formerly CPU based tasks to the GPU at GDC Europe and then announces a CPU bound "next-gen" game.

 

There is a simple reason for that: the time and expense of optimization. You can't just build an IP on an improperly tested environment; you must test thoroughly so you don't risk damaging that IP with a bad port or a badly implemented, bug-ridden game, and that costs time and money. This is fairly new tech, and as more discoveries are made in this field, better-optimized games will be released that take full advantage of the next-gen consoles' hardware. They could have given this game every optimization they could make, but that would take time and money (if they wanted to release the game in a timely fashion); instead they took a safe approach.


It would be when you've got the CPUs of a mid-range notebook. A mid-range i5 would smash these to pieces. If you've got a CPU thread which takes 32ms, you're CPU bound. If your GPU is already taking, say, 24ms, there's no way you can offload onto the GPU. They've been working on GPGPU calculations, blah blah blah, as if the GPU isn't doing anything in the game already.

Once again, the GPUs this cycle are GPGPUs and, via compute shaders, are able to process most, if not all, tasks the CPU can. If you've got a CPU thread which takes 32ms and your GPU is not being pushed to the max, then you should be moving whatever that thread is doing to the general-purpose compute shaders. You should keep moving things to the GPU until the GPU is your limiting factor. EVERY console game should be GPU bound. The CPUs ARE weak; they're not even mid-range notebook weak as you said, they're tablet weak. That was intentional, because you're NOT supposed to do JACK on them. You're supposed to use the compute shaders to do darn near everything; THAT'S the innovation of this console generation. If you're CPU bound, you're still writing your game like a last-gen game, just with a better graphics card, and completely missing the point of compute shaders.
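As a toy sketch of that "keep moving work until the GPU is the limiting factor" idea (the task names and millisecond costs are invented; a real engine would drive this from profiler captures, not a static list):

#include <cstdio>
#include <string>
#include <vector>

struct Task {
    std::string name;
    double cpuMsCost;     // cost if it runs on the CPU
    double gpuMsCost;     // cost as a compute pass (< 0 means it can't be moved)
    bool onGpu = false;
};

int main() {
    // Hypothetical frame workload -- illustrative numbers only.
    std::vector<Task> tasks = {
        {"draw submission",  6.0, -1.0},   // must stay on CPU
        {"cloth simulation", 8.0,  1.0},
        {"crowd steering",   7.0,  2.0},
        {"light culling",    5.0,  1.5},
    };
    double gpuMs = 18.0;  // baseline graphics work already on the GPU

    auto cpuMs = [&] {
        double t = 0.0;
        for (const Task& k : tasks) if (!k.onGpu) t += k.cpuMsCost;
        return t;
    };

    // Greedily move portable tasks to the GPU while doing so shortens the frame.
    for (Task& k : tasks) {
        if (k.gpuMsCost < 0.0 || k.onGpu) continue;
        if (cpuMs() > gpuMs + k.gpuMsCost) {
            k.onGpu = true;
            gpuMs += k.gpuMsCost;
            std::printf("moved %s to the GPU\n", k.name.c_str());
        }
    }
    std::printf("frame: CPU %.1f ms, GPU %.1f ms -> %s-bound\n",
                cpuMs(), gpuMs, cpuMs() > gpuMs ? "CPU" : "GPU");
    return 0;
}

With these made-up costs the loop stops once moving more work would just shift the bottleneck onto the GPU, which is the "GPU bound" end state being argued for above.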

There is a simple reason for that: the time and expense of optimization. You can't just build an IP on an improperly tested environment; you must test thoroughly so you don't risk damaging that IP with a bad port or a badly implemented, bug-ridden game, and that costs time and money. This is fairly new tech, and as more discoveries are made in this field, better-optimized games will be released that take full advantage of the next-gen consoles' hardware. They could have given this game every optimization they could make, but that would take time and money (if they wanted to release the game in a timely fashion); instead they took a safe approach.

 

Well if you don't have the time or money to optimize the game to take full advantage of the hardware then you should NOT be running around making statements like:

 

"The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say."

 

"from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define a next gen like no other game before. Mordor has next gen system and gameplay, but not graphics like Unity does."

 

"Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did."

 

The PS4 can't handle 1080p 30fps because they're not fully utilizing the compute shaders on its GPU and they've bottlenecked their game engine on the weak CPU. That's just poor design. It's not like the CPU changed to a weaker version at the last minute. It would be one thing if they didn't take into account the MHz jump MS made at the last minute, or the extra RAM Sony added at the last minute, but developers knew from the start that the CPU would be weak in favor of compute units; they did a talk on it themselves at GDC. That wasn't a NEW revelation at the time.

 

CPU bound this console generation is NOT "crazily optimized". Maybe compared to "Ubisoft games in the past", but if that's the case it bodes very poorly for how well prior games have been optimized.

 

As for the "push graphics first", the point the platform holders were trying to make (both MS and Sony) is the CPUs are WEAK, use the GPU!  So by making the game CPU bound they did NOT do that as he claims.  In fact not only did they not do it but they did graphics related tasks on the weak CPU that bottlenecked the system so it can't even fully utilize the GPU, thats pretty much the EXACT OPPOSITE of the point MS and Sony were trying to make.  The CPUs on this generation console aren't significantly more powerful than the PS3 (the PS4 is even weaker by their own tests) so being CPU bound pretty much means you're making a prior gen game... just with a better fixed function graphics card.  This should not happen in a true "next gen" game.  The GPU should always be the bottleneck and if it's not then more tasks need to be moved from the weak CPU to the strong (comparatively) GPU.  That's the beauty of Compute shaders, they can do pretty much everything a CPU can, which is the real innovation of "next gen" systems and why their CPUs are so poor.


Well if you don't have the time or money to optimize the game to take full advantage of the hardware then you should NOT be running around making statements like:

 

"The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say."

 

"from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define a next gen like no other game before. Mordor has next gen system and gameplay, but not graphics like Unity does."

 

"Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did."

 

The PS4 can't handle 1080p 30fps because they're not fully utilizing the compute shaders on its GPU and they've bottlenecked their game engine on the weak CPU. That's just poor design. It's not like the CPU changed to a weaker version at the last minute. It would be one thing if they didn't take into account the MHz jump MS made at the last minute, or the extra RAM Sony added at the last minute, but developers knew from the start that the CPU would be weak in favor of compute units; they did a talk on it themselves at GDC. That wasn't a NEW revelation at the time.

 

CPU bound this console generation is NOT "crazily optimized". Maybe compared to "Ubisoft games in the past", but if that's the case it bodes very poorly for how well prior games have been optimized.

 

As for the "push graphics first", the point the platform holders were trying to make (both MS and Sony) is the CPUs are WEAK, use the GPU!  So by making the game CPU bound they did NOT do that as he claims.  In fact not only did they not do it but they did graphics related tasks on the weak CPU that bottlenecked the system so it can't even fully utilize the GPU, thats pretty much the EXACT OPPOSITE of the point MS and Sony were trying to make.  The CPUs on this generation console aren't significantly more powerful than the PS3 (the PS4 is even weaker by their own tests) so being CPU bound pretty much means you're making a prior gen game... just with a better fixed function graphics card.  This should not happen in a true "next gen" game.  The GPU should always be the bottleneck and if it's not then more tasks need to be moved from the weak CPU to the strong (comparatively) GPU.  That's the beauty of Compute shaders, they can do pretty much everything a CPU can, which is the real innovation of "next gen" systems and why their CPUs are so poor.

 

This is much, much more complex than it seems. One thing you are spot on about: "Maybe compared to 'Ubisoft games in the past', but if that's the case it bodes very poorly for how well prior games have been optimized." Unfortunately Ubisoft aren't the only dev house in this boat... *cough* Crytek *cough*


Once again, the GPUs this cycle are GPGPUs and, via compute shaders, are able to process most, if not all, tasks the CPU can. If you've got a CPU thread which takes 32ms and your GPU is not being pushed to the max, then you should be moving whatever that thread is doing to the general-purpose compute shaders. You should keep moving things to the GPU until the GPU is your limiting factor. EVERY console game should be GPU bound. The CPUs ARE weak; they're not even mid-range notebook weak as you said, they're tablet weak. That was intentional, because you're NOT supposed to do JACK on them. You're supposed to use the compute shaders to do darn near everything; THAT'S the innovation of this console generation. If you're CPU bound, you're still writing your game like a last-gen game, just with a better graphics card, and completely missing the point of compute shaders.

You have to have guaranteed resources for the tasks you're going to be doing. With the examples I've mentioned above, if you have a 32ms CPU thread and offload 25% of that work to the GPU, that leaves, say, 24ms for the CPU thread to process. Then your GPU is going to be at 32ms. You've also got to think about the dependencies between tasks and the order they run in. For example, you don't want to tell the GPU to start rendering the environment when the AI hasn't even been calculated. That's also assuming the GPU isn't already full, which it will be, since Ubisoft use GPGPU for cloth simulation in Unity.
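Working those numbers through (the 32ms thread and 25% offload are the hypothetical figures from the post above; the GPU starting at 24ms is implied by it ending up at 32ms): offloading only helps while the GPU has headroom, and dependent passes can't overlap at all.

#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical figures from the post above.
    double cpuMs = 32.0;              // one heavy CPU thread per frame
    double gpuMs = 24.0;              // GPU already busy with rendering + cloth
    double moved = 0.25 * cpuMs;      // offload 25% of the CPU work (8 ms)

    double newCpu = cpuMs - moved;    // 24 ms left on the CPU
    double newGpu = gpuMs + moved;    // 32 ms on the GPU

    // If the passes can overlap, the frame is paced by the slower processor...
    std::printf("overlapped: before %.0f ms, after %.0f ms\n",
                std::max(cpuMs, gpuMs), std::max(newCpu, newGpu));
    // ...and if the GPU pass depends on the CPU result (e.g. rendering can't start
    // until the AI has placed the NPCs), the costs simply add up instead.
    std::printf("serialized: before %.0f ms, after %.0f ms\n",
                cpuMs + gpuMs, newCpu + newGpu);
    return 0;
}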


This topic is now closed to further replies.