Jimquisition: Ubisoft Talks Bollocks About Framerate And Resolution



It's testing the system, pushing its limits. You have to test the waters before you dive in. I don't really understand why you are phrasing this as if they are intentionally letting themselves get screwed over by limited hardware. You're painting a picture that I honestly don't think is the case. Regardless of what their PR guys spit out, they have nothing to do with the development of the actual game.

I don't think they're "intentionally letting themselves get screwed over by limited hardware". I think they're choosing not to further optimize the games for the consoles because it costs more time and money than running around telling everyone how well optimized their game is, how other games aren't next gen like theirs, and how the game consoles aren't capable of more, using those as excuses for the lower-than-expected performance that results from their failure to optimize for the specific strengths of each console.

Furthermore, this thread is called "Ubisoft talks bollocks..." so it IS about "what their PR guys spit out". A lot of what I was responding to was a quote allegedly from a Ubisoft Unity developer who emailed the GiantBomb podcast, so he does have something to do with the development of the actual game.

"Well optimized" is relative to previous games. It may very well be optimized, but again, we are only so far into the lifecycle of the console. There may be more optimizations they can make in the future. I feel you are very much overreacting to this game.

Well, you shouldn't be saying the platforms aren't capable of something when you've simply not done the optimization required to get there. That makes it sound like the hardware is inadequate when in fact it's just that you haven't had the time/money to do more optimization this early in the product cycle (which is legit, but that's NOT what they're saying).


Yes, technically 10% of the GPU is more FLOPS (or whatever basic arithmetic operation) than the CPU, but depending on what it is your game does, again, it may not be feasible, or may even be counter-productive, to try to shoehorn intrinsically serial operations onto the GPU. 10% of that CPU is way faster than 10% of the GPU at doing certain things, i.e. not massively parallel number crunching but any complex algorithm with lots of intermediate steps and branching logic. I mean, I get your point, but you can't really reduce this to a question of the "weaker part limiting the stronger part". There's more raw power in the GPU but it's not as available as the CPU's; it's more accurate to think of the two as different, complementary parts than simply a weak and a strong one.
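To make the serial-versus-parallel distinction above concrete, here is a toy CUDA sketch (purely illustrative, with made-up names; it has nothing to do with Ubisoft's actual engine). The kernel is the kind of independent per-element work a GPU chews through; the second function is the kind of dependency-laden, branchy logic that gains little from thousands of threads:

```cuda
#include <cstdio>

// Data-parallel work maps well onto the GPU: thousands of independent
// threads each do the same small computation with no cross-thread deps.
__global__ void scaleVertices(float* verts, float s, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) verts[i] *= s;
}

// Serial, branchy logic with data-dependent steps (think script logic or
// pathfinding) gains little from a GPU: each iteration depends on the
// previous one, so it cannot fan out across threads.
float simulateSerial(const float* state, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; ++i) {
        if (state[i] > acc) acc = state[i] * 0.5f;   // branch + dependency
        else                acc += state[i];
    }
    return acc;
}

int main() {
    const int n = 1 << 20;
    float* verts;
    cudaMallocManaged(&verts, n * sizeof(float));     // unified memory
    for (int i = 0; i < n; ++i) verts[i] = 1.0f;

    scaleVertices<<<(n + 255) / 256, 256>>>(verts, 2.0f, n);  // GPU-friendly
    cudaDeviceSynchronize();

    printf("verts[0]=%f serial=%f\n", verts[0], simulateSerial(verts, n));
    cudaFree(verts);
    return 0;
}
```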

I think you're too hung up on a single set of operations and we're drifting way off into the theoretical here. Even if one type of operation can't be moved to the GPU (though Ubisoft themselves stated pretty much everything can), I seriously doubt NOTHING the CPU is doing in a CPU-limited game can be moved. Everything you do move frees up more resources for the things that can't, thus helping to relieve the bottleneck. Again, that's all theoretical though.

 

The Ubisoft Unity developer quote I replied to specifically said "50% of CPU used to process prepackaged info, like prebaked global illumination and lighting." Last I checked, global illumination and lighting were not only something compute units can do but even something that fixed-function GPUs could do. If you're using 50% of a weak CPU to do graphics-related tasks, and it's CPU-binding your game so your GPU can't be fully utilized, that's a poor design. Why on earth would you move a graphics-related task (lighting) from the strong GPU to the weak CPU?
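For reference, applying already-baked lighting is classically GPU-shaped work: every pixel or texel is independent. A minimal, hypothetical sketch of that shape (again, not Ubisoft's code; the engine's real pipeline is unknown):

```cuda
#include <cstdio>

// Hypothetical per-texel application of prebaked lighting: multiply the
// surface color by a precomputed irradiance value. Independent per-element
// work like this is what GPUs (even old fixed-function ones) are built for.
__global__ void applyLightmap(const float* albedo, const float* baked,
                              float* out, int numTexels) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < numTexels) out[i] = albedo[i] * baked[i];
}

int main() {
    const int n = 1024;
    float *albedo, *baked, *out;
    cudaMallocManaged(&albedo, n * sizeof(float));
    cudaMallocManaged(&baked,  n * sizeof(float));
    cudaMallocManaged(&out,    n * sizeof(float));
    for (int i = 0; i < n; ++i) { albedo[i] = 0.8f; baked[i] = 0.5f; }

    applyLightmap<<<(n + 255) / 256, 256>>>(albedo, baked, out, n);
    cudaDeviceSynchronize();
    printf("out[0] = %f\n", out[0]);   // 0.8 * 0.5 = 0.4
    cudaFree(albedo); cudaFree(baked); cudaFree(out);
    return 0;
}
```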


I think you're too hung up on a single set of operations and we're drifting way off into the theoretical here. Even if one type of operation can't be moved to the GPU (though Ubisoft themselves stated pretty much everything can), I seriously doubt NOTHING the CPU is doing in a CPU-limited game can be moved. Everything you do move frees up more resources for the things that can't, thus helping to relieve the bottleneck. Again, that's all theoretical though.

 

The Ubisoft Unity developer quote I replied to specifically said "50% of CPU used to process prepackaged info, like prebaked global illumination and lighting." Last I checked, global illumination and lighting were not only something compute units can do but even something that fixed-function GPUs could do. If you're using 50% of a weak CPU to do graphics-related tasks, and it's CPU-binding your game so your GPU can't be fully utilized, that's a poor design. Why on earth would you move a graphics-related task (lighting) from the strong GPU to the weak CPU?

Apparently they've chosen to pre-compute lighting instead of calculating it on the fly, probably because they figured out that they could make the game look better this way. If they ran the lighting in real-time on the GPU they'd become GPU-bound, but if the game looked worse for it that's not an improvement, is it? Now, I'm not sure what exactly keeps the CPU busy 50% of its time with pre-baked lighting. Is it just deciding what/where/how to apply it, something of the sort? Probably some complicated logic that's not easily parallelizable. Note the dev didn't say "50% of the time is dedicated to lighting"; he said "prepackaged info like illumination and lighting". So there's some other info being processed as well, and we don't know which is taking more time or how they arrive at that 50%. Perhaps the GPU is already quite busy with other things and they don't have the headroom on the GPU to do it there.
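One speculative illustration of how "processing prepackaged info" could still cost CPU time even though the lighting itself was baked offline: per-frame visibility decisions, decoding, and upload staging are serial bookkeeping rather than shading math. Everything below is a made-up sketch, not anything known about Unity's engine:

```cuda
#include <cstdint>
#include <cstdio>
#include <vector>

// Made-up structures: the CPU walks the scene's baked-lighting tiles each
// frame, decides which ones are needed, decodes them, and stages uploads.
struct BakedTile { uint32_t id; bool resident; };

static bool isVisible(uint32_t id) { return id % 4 == 0; }  // stand-in test

static void decompressAndUpload(const BakedTile& t) {
    // real code would decode the tile and copy it into GPU memory
    printf("uploading baked tile %u\n", t.id);
}

int main() {
    std::vector<BakedTile> tiles;
    for (uint32_t i = 0; i < 8; ++i) tiles.push_back({i, false});

    for (auto& t : tiles) {                    // per-frame CPU bookkeeping
        if (!t.resident && isVisible(t.id)) {
            decompressAndUpload(t);
            t.resident = true;                 // cached from now on
        }
    }
    return 0;
}
```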

 

Anyway, we don't know and we don't have the performance traces to tell what's going on so your criticism seems to rest on assumptions that no one can prove or refute.


Apparently they've chosen to pre-compute lighting instead of calculating it on the fly, probably because they figured out that they could make the game look better this way. If they ran the lighting in real-time on the GPU they'd become GPU-bound, but if the game looked worse for it that's not an improvement, is it?

If all you care about is lighting then no. Moving from baked lighting on the CPU to dynamic lighting on the GPU would free up the CPU to do more of whatever NON-graphics things it was doing. Maybe add even more AI or more complex behaviors; I have no idea what else the CPU is doing in Unity, but whatever it is, it could get better if the lighting wasn't taking 50%.

Furthermore, dynamic lighting may not look as good, but it's... dynamic, so it changes based on what is happening, whereas baked lighting is static. Baked lighting is what computer games did before GPUs were even capable of doing dynamic lighting; it's not exactly "next-gen". Now, I'm not saying baked lighting looks worse. You can have a giant server farm doing tons of calculations when baking your lighting that your GPU couldn't hope to do in real time, so your lighting could look way better, but it's an odd thing to bottleneck your next-gen game on.
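The static-versus-dynamic distinction is easy to see in code. A toy, hypothetical contrast (host-side for brevity, with unnormalized toy vectors): baked lighting is a lookup of a value frozen at bake time, while dynamic lighting re-evaluates a simple N.L term every frame, so it can react to a moving light:

```cuda
#include <cstdio>

struct Vec3 { float x, y, z; };

// Baked: whatever the offline bake (possibly a server farm) computed.
float shadeBaked(const float* lightmap, int texel) {
    return lightmap[texel];                 // frozen at bake time
}

// Dynamic: recomputed per frame, so it tracks a moving light source.
float shadeDynamic(Vec3 n, Vec3 lightDir) {
    float ndotl = n.x*lightDir.x + n.y*lightDir.y + n.z*lightDir.z;
    return ndotl > 0.0f ? ndotl : 0.0f;
}

int main() {
    float lightmap[1] = {0.73f};            // baked offline, static forever
    Vec3 normal = {0.0f, 1.0f, 0.0f};
    for (int frame = 0; frame < 3; ++frame) {
        Vec3 sun = {0.0f, 1.0f - 0.3f * frame, 0.3f * frame};  // light moves
        printf("frame %d: baked %.2f, dynamic %.2f\n",
               frame, shadeBaked(lightmap, 0), shadeDynamic(normal, sun));
    }
    return 0;
}
```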

I have no doubt that if they're using 50% of the CPU and almost half of the 50GB of game data on lighting, Unity will have super impressive lighting. I'm just saying that's a poor allocation of resources, because great lighting doesn't make a great game. Obviously extremely poor lighting can be detrimental to a game, but if they do whatever everyone else is doing with respect to lighting it's not extremely poor, and I seriously doubt that when Unity comes out gamers are going to be talking about how amazing the lighting was.

Perhaps the GPU is already quite busy with other things and they don't have the headroom on the GPU to do it there.

They do have the headroom on the PS4. They've told us the game is CPU limited; that's not speculation. That means the CPU is preventing the GPU from being fully utilized. Even in the best-case scenario where it's 100% CPU / 99% GPU on the Xbox One (it can't be 100% GPU or it wouldn't be CPU limited like Ubi said), the PS4 has roughly 50% more GPU power (Ubisoft's own cloth demo had it at double the performance when you take into account the RAM bandwidth and such). That's not speculation either. So the PS4 GPU has around 30% headroom. That is a significant chunk, and that's what people are mad about in this "parity" dispute. So sure, it takes time and money and effort to optimize to use that 30%, and it's PS4 specific (if the Xbox One is at 99% GPU it can't handle whatever you're going to move on the PS4 to use that 30%). It also probably doesn't mean much for a PC version, where gamers typically have strong CPUs so that lighting isn't bottlenecking the system.
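The arithmetic behind that headroom claim, with the poster's own assumed figures made explicit (these are assumptions from this thread, not measured data):

```cuda
#include <cstdio>

int main() {
    // Assumptions from the post above, not measurements:
    double xboxGpuBusy = 0.99;  // best case: XB1 GPU almost fully occupied
    double ps4Ratio    = 1.5;   // assumed PS4/XB1 GPU throughput ratio

    // The same GPU workload occupies proportionally less of the faster GPU.
    double ps4GpuBusy = xboxGpuBusy / ps4Ratio;
    printf("PS4 GPU busy: %.0f%% -> ~%.0f%% headroom\n",
           ps4GpuBusy * 100.0, (1.0 - ps4GpuBusy) * 100.0);
    return 0;   // prints: PS4 GPU busy: 66% -> ~34% headroom
}
```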

I think Ubi just decided it wasn't worth the effort to do PS4-specific optimizations. I'm not in the camp that thinks it's out of malice or because MS paid them off or whatever; I just think they did a cost/benefit analysis and decided it wasn't worth it. That IS speculation, but that's the best possible scenario I can come up with, not any of the conspiracy stuff going around. That IS what people are referring to as "parity" though. They designed the system for the lowest common denominator (Xbox One), thereby holding back the performance of the stronger system (PS4), and now they perform nearly identically. I'm sure they spent a great deal of time optimizing the performance on the Xbox One.


If they ran the lighting in real-time on the GPU they'd become GPU-bound

We don't know that's the case. I'm not a lighting expert, but one would think that drivers and GPUs are optimised for that sort of work. Precomputing a ton of lighting data and having to process it all with the CPU seems like a bad design decision if they're already struggling to allocate enough resources to the AI and other parts of the game. Unless, of course, it was an early decision, and by the time everything else was in place it was too late to scrap it due to time constraints and budget, in which case they're trying to make the best out of a bad situation. Either way, it seems like a self-imposed limitation and they only have themselves to blame for it.

 

but if the game looked worse for it that's not an improvement, is it?

I think most gamers would agree that 1080p @ 60fps is preferable to slightly better lighting. Mordor looks pretty damn good, does it not? And that doesn't use precomputed, CPU-expensive lighting.

 

Now, I'm not sure what exactly keeps the CPU busy 50% of its time with pre-baked lighting. Is it just deciding what/where/how to apply it, something of the sort? Probably some complicated logic that's not easily parallelizable. Note the dev didn't say "50% of the time is dedicated to lighting"; he said "prepackaged info like illumination and lighting".

Yes, it's hard to be sure without seeing the code. However, "helping out the rendering", as the developer described it, appears to suggest that additional CPU cycles are being allocated to doing some GPU-related work. Whether or not that could in fact be accomplished by executing it on the GPU isn't known, but it sounds like a problem in the design itself to me.

 

Perhaps the GPU is already quite busy with other things and they don't have the headroom on the GPU to do it there.

I honestly find that hard to believe when it's running at 900p @ 30fps. Either it's poorly designed, or the GPU is being underutilised (especially on the PS4).

 

Anyway, we don't know and we don't have the performance traces to tell what's going on so your criticism seems to rest on assumptions that no one can prove or refute.

That's true. We do, however, have Ubisoft's history of targeting the weakest platform (Xbone) and reducing visual fidelity (Black Flag / Watch Dogs) to cater to it.

I found this an interesting read from another Ubisoft developer.
 

Ubisoft recently found its foot in its mouth when talking about how well Assassin's Creed: Unity runs on Xbox One and PlayStation 4, but it's not gonna make the same mistake when it comes to The Division.

Executive producer Fredrik Rundqvist at The Division developer Massive made it clear that his team is squeezing everything they can from the newish systems. In an interview with website PlayStation Universe, Rundqvist explained that Massive began The Division with the new consoles specifically in mind. It's no surprise that a developer would come out and say its game is doing something really well, but Ubisoft is likely skittish about permitting its developers to talk after recent issues with Assassin's Creed. The Division producer also baked in a bit of an excuse for why Ubisoft's other games might not look as good.

"We developed both the engine and the game specifically for this new generation of consoles," said Rundqvist. "So we didn't have any problems at all, it was perfectly built for that."

Rundqvist pointed out that this is different from a game like Assassin's Creed, which runs on an older technology that Ubisoft upgraded for Sony's and Microsoft's new systems.

"Ours was built from the ground up [for PlayStation 4 and Xbox One]," said Rundqvist, who went on to specifically praise Sony's tech. "[The Division] takes full advantage of the PlayStation 4. It's an amazing machine."

Last week, Ubisoft dealt with some unhappy fans after Assassin's Creed: Unity producer Vincent Pontbriand explained the game is 900 lines of horizontal resolution and 30 frames per second on both Xbox One and PlayStation 4. The reason he gave was to avoid "all of the debates." Fans thought this meant Ubisoft purposefully held back the PS4 version of the game, although Ubisoft came out and denied that is the case.

While Rundqvist is taking pride in maximizing the PlayStation and Xbox One, that doesn't mean the game will run at 1,080 lines of horizontal resolution and 60 frames per second, which is an ideal that many gamers want to see. Instead, the studio has previously said that it wants to lock The Division in at 30 frames per second. That will enable the developer to put more effort into the visual fidelity.

While Rundqvist and his team at Massive are working to build an amazing looking game right now, they are also looking forward to the future.

"With a console generation there is a lot of growth once people learn to use the system," said Rundqvist. "I think the same will happen with the PS4. We will be able to push much more from the consoles when we learn to use all the details. They are incredibly powerful by default but there is more to get from them."

Source: http://venturebeat.com/2014/10/20/ubisoft-the-division-takes-full-advantage-of-ps4-xbox-one/


If all you care about is lighting then no. Moving from baked lighting on the CPU to dynamic lighting on the GPU would free up the CPU to do more of whatever NON-graphics things it was doing. Maybe add even more AI or more complex behaviors; I have no idea what else the CPU is doing in Unity, but whatever it is, it could get better if the lighting wasn't taking 50%.

 

Ugh... this is why I said I wasn't getting into a discussion about this. Everything you say rests on implicit, unprovable assumptions. We don't know that the pre-baked lighting is taking 50% CPU; we know a dev said 50% of the CPU is busy with "prepackaged information" that includes lighting. How much of that is lighting, we don't know. We don't know in what ways the game could be improved with that 50% CPU, assuming it could be entirely put on the GPU with no overhead, assuming the GPU could do the task efficiently enough to not hamper what it's already doing in that game, assuming the GPU still has a lot of room to do it... We don't know how much better the pre-baked lighting looks relative to what could realistically be computed on the fly, or how important it is to the game experience according to the designer's intentions, which we don't know either. It's all assumptions and no data.

 

Not going to get into more of your or simplezz's arguments at this point.


We don't know that the pre-baked lighting is taking 50% CPU; we know a dev said 50% of the CPU is busy with "prepackaged information" that includes lighting. How much of that is lighting, we don't know.

First, that's from a developer on the game, not just some random Joe speculating. Second, when people say "I have a bunch of stuff, including X", the point of saying "including X" is to give an example representative of the whole without having to explicitly list everything in the bunch. So yeah, we don't know it's specifically 50%, but the point doesn't hinge on it being EXACTLY 50%. Maybe it's 42%, maybe it's 37%; the EXACT number isn't that important, yet you seem to be hung up on the fact we don't know it. Unless you are actually contending that his "including lighting" comment really did mean just some tiny amount, like 1% of that 50%, in which case I guess we'll just have to agree to disagree here. I don't think it's wild speculation to assume the few things he chose to explicitly list represent a significant portion of that 50%, and personally I think it's unreasonable to expect him to list out every item and what its percentage is in order for the point to be valid.

We don't know in what ways the game could be improved with that 50% CPU

Nor is it necessary to know that. We DO know that the other systems running on the CPU would be able to use more CPU if a significant portion of that 50% were freed up. Unless your contention is that NOTHING else running on the CPU would see ANY benefit from suddenly having a bunch more CPU available to it? I DID do some rampant speculation on what the areas that improve might be, but that entire section could have been removed from the post and the point would still be valid. I also made a point to make clear "I have no idea what else the CPU is doing in Unity, but whatever it is, it could get better". Again, it's NOT necessary to know EXACTLY what areas would improve to know that there would be improvement.

assuming it could be entirely put on the GPU with no overhead,

It doesn't have to be ENTIRELY put on the GPU, and it doesn't have to have NO overhead. If that 50% drops to 10%, that 10% may well be overhead, but the program still benefits. You make it sound as if not getting the entire 50% makes the whole idea pointless. I really don't understand why you're so hung up on the specific numbers; the general concept will suffice.
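Toy arithmetic for that "50% drops to 10%" scenario, using hypothetical numbers only: in a CPU-limited frame, removing CPU work shortens the whole frame even if the offload leaves some overhead behind:

```cuda
#include <cstdio>

int main() {
    // Hypothetical numbers, not measurements:
    double frameMs  = 33.3;            // a CPU-bound frame at 30 fps
    double before   = 0.50 * frameMs;  // CPU share on "prepackaged info"
    double after    = 0.10 * frameMs;  // residual overhead after offload

    double newFrame = frameMs - before + after;
    printf("frame: %.1f ms -> %.1f ms (%.0f fps -> %.0f fps)\n",
           frameMs, newFrame, 1000.0 / frameMs, 1000.0 / newFrame);
    return 0;   // ~33.3 ms -> ~20.0 ms, i.e. 30 fps -> ~50 fps
}
```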

assuming the GPU could do the task efficiently enough to not hamper what it's already doing in that game,

We don't have to assume a GPU can do lighting; that's one of their basic functions, and even fixed-function GPUs can do that without using general-purpose compute units at all. Also, baked lighting is static; it doesn't DO anything in the game but look pretty. You can't hamper it from DOING nothing.

assuming the GPU still has a lot of room to do it...

We don't have to assume that at all. We KNOW the GPU in the PS4 is significantly more powerful than the Xbox One's. If the Xbox One GPU is able to run something, we KNOW the PS4 can do that exact same task with a lot of room to spare. Again, I'll leave out the numbers since you fixate on them and the EXACT values aren't important, but it's not a tiny gap between the GPU power of the consoles.

We don't know how much better the pre-baked lighting looks relative to what could realistically be computed on the fly

No, we don't. Nor did I at any point say or imply that on-the-fly lighting would look better. In fact, I made a point to explicitly say I have no doubt Unity will have super impressive lighting. Now THAT is speculation, because I haven't seen it obviously, but I'm speculating to their benefit. You act as though not knowing which lighting looks better makes my greater point invalid. How can that be possible when I even admitted that the baked lighting can look far better?

Again, if all you care about is lighting, and it isn't a problem that it doesn't change, then baked is absolutely the way to go; I'm agreeing with that. There is much more to a game than lighting though, and if you're spending a significant portion (note I didn't say 50% EXACTLY) of your CPU power on a normally GPU-oriented graphics task like lighting, and your game is CPU limited with a significant portion of your GPU resources still available (which we know is true on the PS4 if the Xbox One is capable of running the same thing), then it's a bad design for the PS4. If you strive for "parity" then it makes total sense, but if you're going to claim it's NOT "parity" and the PS4 CAN'T do more and call out other games from other developers as not "next-gen", then that's bull.


First, that's from a developer on the game, not just some random Joe speculating. Second, when people say "I have a bunch of stuff, including X", the point of saying "including X" is to give an example representative of the whole without having to explicitly list everything in the bunch. So yeah, we don't know it's specifically 50%, but the point doesn't hinge on it being EXACTLY 50%. Maybe it's 42%, maybe it's 37%; the EXACT number isn't that important, yet you seem to be hung up on the fact we don't know it. Unless you are actually contending that his "including lighting" comment really did mean just some tiny amount, like 1% of that 50%, in which case I guess we'll just have to agree to disagree here. I don't think it's wild speculation to assume the few things he chose to explicitly list represent a significant portion of that 50%, and personally I think it's unreasonable to expect him to list out every item and what its percentage is in order for the point to be valid.

 

Define what "significant" means. Perhaps everything in the list is 5-15%? If you have 20 things, they could all average out to similar shares. Or even if lighting and illumination took up around 15% and the rest of it was miscellaneous things, the fact that it's 15% vs 1% makes it noteworthy, but it's still not 45%. The issue here is we don't know all the numbers. We can't assume that "significant" means 45% or 30%... we only have the whole, and how that breaks down we do not actually know.

 

 

Nor is it necessary to know that. We DO know that the other systems running on the CPU would be able to use more CPU if a significant portion of that 50% were freed up. Unless your contention is that NOTHING else running on the CPU would see ANY benefit from suddenly having a bunch more CPU available to it? I DID do some rampant speculation on what the areas that improve might be, but that entire section could have been removed from the post and the point would still be valid. I also made a point to make clear "I have no idea what else the CPU is doing in Unity, but whatever it is, it could get better". Again, it's NOT necessary to know EXACTLY what areas would improve to know that there would be improvement.

 

This point is moot if we cannot prove that there are indeed operations that would benefit from being moved off the CPU to the GPU. You're basically saying, "Well... if we took the squares out of the square hole we'd have more room in the hole." Well, those squares have to go somewhere... and you're just conveniently dismissing the possibility that the other hole (GPU) may be round and not be ideal for your squares.

 

 

It doesn't have to be ENTIRELY put on the GPU, and it doesn't have to have NO overhead. If that 50% drops to 10%, that 10% may well be overhead, but the program still benefits. You make it sound as if not getting the entire 50% makes the whole idea pointless. I really don't understand why you're so hung up on the specific numbers; the general concept will suffice.

We don't have to assume a GPU can do lighting; that's one of their basic functions, and even fixed-function GPUs can do that without using general-purpose compute units at all. Also, baked lighting is static; it doesn't DO anything in the game but look pretty. You can't hamper it from DOING nothing.

 

Again, assuming there are operations that make sense to be pushed to the GPU. It doesn't matter if you save 10 or 20% of your CPU workload if it turns into a net increase on the GPU. If you take up an additional 20% of resources with the move and that's worth it, then fine. But really you're just assuming that doing these things will either be beneficial or weren't already tested and failed to deliver. You are riding on ignorance. You can't claim that doing the lighting real time vs. baked lighting will be more performant. You can't claim with certainty that the realtime lighting vs baked lighting will be more visually impressive/appealing. You are just saying, "Well, maybe if they did it this way!" as if that really has any substance. It does not.

 

We don't have to assume that at all. We KNOW the GPU in the PS4 is significantly more powerful than the Xbox One's. If the Xbox One GPU is able to run something, we KNOW the PS4 can do that exact same task with a lot of room to spare. Again, I'll leave out the numbers since you fixate on them and the EXACT values aren't important, but it's not a tiny gap between the GPU power of the consoles.

No, we don't. Nor did I at any point say or imply that on-the-fly lighting would look better. In fact, I made a point to explicitly say I have no doubt Unity will have super impressive lighting. Now THAT is speculation, because I haven't seen it obviously, but I'm speculating to their benefit. You act as though not knowing which lighting looks better makes my greater point invalid. How can that be possible when I even admitted that the baked lighting can look far better?

 
Exact values are everything in optimization. We can't clearly make claims about their ability to optimize without knowing what they've done to optimize it, or what hurdles they've run into during that process.
 
 

Again, if all you care about is lighting, and it isn't a problem that it doesn't change, then baked is absolutely the way to go; I'm agreeing with that. There is much more to a game than lighting though, and if you're spending a significant portion (note I didn't say 50% EXACTLY) of your CPU power on a normally GPU-oriented graphics task like lighting, and your game is CPU limited with a significant portion of your GPU resources still available (which we know is true on the PS4 if the Xbox One is capable of running the same thing), then it's a bad design for the PS4. If you strive for "parity" then it makes total sense, but if you're going to claim it's NOT "parity" and the PS4 CAN'T do more and call out other games from other developers as not "next-gen", then that's bull.

You presume to know more than Ubisoft on how to optimize their games? They've already said they are still learning the hardware and that there's more to be squeezed out of it. But I think you're connecting the dots with anecdotes and conclusions you've personally drawn to fit some image you want to paint of the company. I agree, the marketing department is a bunch of morons. But you can't blame the developers for the marketing minions, who have no say/part in what they do.


I think most gamers would agree that 1080p @ 60fps is preferable to slightly better lighting. Mordor looks pretty damn good, does it not? And that doesn't use precomputed, CPU-expensive lighting.

You are wrong in thinking that 1080p/60fps is preferable to lighting. I would say overall Image Quality is more important than a single metric. You should check out the Halo 3 footage in Halo MCC. They improved lighting in addition to bumping it to 1080p/60fps. If the PS4 has to drop to sub-1080p for better lighting in any game, I hope developers do that instead of going 1080p/60fps because of fanboy pressure.

GG had to go below 1080p for Killzone: SF to achieve their target image quality, and I think it is OK because that's the best they could do for a launch game. It would have been foolish for them to compromise image quality for hitting true 1080p/60fps.


You are wrong in thinking that 1080p/60fps is preferable to lighting.

 

So actual GAMEPLAY in a GAME is more important than the visual appearance of said GAME? What the ######?

 

I must say, I am so happy that I do not own a console so I don't have to deal with ###### like this. Next-gen console generation that is supposed to last for 10 years... hue hue hue!


So actual GAMEPLAY in a GAME is more important than the visual appearance of said GAME? What the ######?

 

I must say, I am so happy that I do not own a console so I don't have to deal with ###### like this. Next-gen console generation that is supposed to last for 10 years... hue hue hue!

Did you read my post before hitting reply?

You are wrong in thinking that 1080p/60fps is preferable to lighting. I would say overall Image Quality is more important than a single metric.

Did you read my post before hitting reply?

 

I did read it. My point still stands. It's a GAME, not a visual novel or a movie. Screw resolution; responsiveness and fluidity of gameplay are the most important things.

 

I messed up the first sentence of my previous post. It should be "less important", not "more important". Makes the question look stupid, but I can't edit.


You are wrong in thinking that 1080p/60fps is preferable to lighting. I would say overall Image Quality is more important than a single metric. You should check out the Halo 3 footage in Halo MCC. They improved lighting in addition to bumping it to 1080p/60fps. If the PS4 has to drop to sub-1080p for better lighting in any game, I hope developers do that instead of going 1080p/60fps because of fanboy pressure.

GG had to go below 1080p for Killzone: SF to achieve their target image quality, and I think it is OK because that's the best they could do for a launch game. It would have been foolish for them to compromise image quality for hitting true 1080p/60fps.

I'd say 60fps is more important than resolution from a gameplay standpoint. As elenarie said, responsiveness and fluidity are paramount in most games. Without them, a game can end up feeling sluggish and slow to respond.
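For what it's worth, the responsiveness argument has simple arithmetic behind it: frame time is the reciprocal of frame rate, so every frame of pipeline latency costs twice as much at 30fps as at 60fps. A toy sketch (the three-frames-of-latency figure is an illustrative assumption, not a measured value):

```cuda
#include <cstdio>

int main() {
    const int rates[2] = {30, 60};
    for (int k = 0; k < 2; ++k) {
        double frameMs = 1000.0 / rates[k];
        // assume, purely for illustration, ~3 frames of end-to-end latency
        printf("%d fps: %.1f ms/frame, ~%.0f ms input-to-display\n",
               rates[k], frameMs, 3.0 * frameMs);
    }
    return 0;   // 30 fps: 33.3 ms/frame (~100 ms); 60 fps: 16.7 ms (~50 ms)
}
```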

Clearly, in the case of AC: Unity, Ubisoft made an early design decision to prerender lighting, and it's come back to haunt them because it's bound the game to the CPU. Now they're struggling to find cycles to process NPCs and AI. Visual and gameplay compromises have been made. That's how I see it, anyway.


Define what significant means.

"Significant" means I was intentionally avoiding specific numbers so no one gets hung up on them, as what they are EXACTLY isn't necessary to demonstrate the point. I thought I spelled that out pretty clearly.

This point is moot if we cannot prove that there are indeed operations that would benefit from being moved off the CPU to the GPU.

No, it's not, because I'm not claiming the exact same operation can be moved. We DO know GPUs can do lighting, and we DO know that because of hardware differences the PS4 will have a good deal of GPU headroom over anything the Xbox One is doing.

You're basically saying, "Well... if we took the squares out of the square hole we'd have more room in the hole."

Sure, that's a true statement. If you take the squares out of the square hole you do indeed have more room in that hole. Are you saying that's not the case?

Well, those squares have to go somewhere... and you're just conveniently dismissing the possibility that the other hole (GPU) may be round and not be ideal for your squares.

We DO know that the GPU is capable of doing lighting and we DO know that the PS4 has GPU headroom over anything the Xbox One is doing.

You can't claim with certainty that the realtime lighting vs baked lighting will be more visually impressive/appealing.

No, you can't, nor have I. Once again, I AM NOT and have NEVER made that claim. In fact, I've repeatedly said that moving it to the GPU will almost certainly be worse. I don't know why you guys keep bringing up this straw man argument. Really, it just makes your position look weak. Instead of debating the substance of my argument you are throwing out straw men and arguing over whether 50% is EXACTLY correct or not. It doesn't matter to the greater point. Are you going to say my argument is invalid next because of grammar or spelling mistakes in my post?

You presume to know more than Ubisoft on how to optimize their games?

No, I don't. This is another straw man. I've already stated I don't think it's out of malice, and I've NEVER stated it was out of ignorance. At NO POINT did I claim to know more than Ubisoft on how to optimize their games.

I agree, the marketing department is a bunch of morons. But you can't blame the developers for the marketing minions, who have no say/part in what they do.

First, the quote I responded to in this thread was the one from the alleged Unity developer. Now maybe that quote was B.S. and wasn't from an actual Unity developer, but I haven't really heard anyone question that. My comments were on the substance of that quote. That developer was trying to put out the fire created by the marketing department. This thread is called "Ubisoft talks bollocks about..." so it is VERY MUCH about how B.S. their media claims are, NOT how competent their programmers are.

Again, I've said before and I'll say again: I'm sure that given more time and money the developers could optimize the game better. I'm NOT questioning their skill at all. It's their claim that the PS4 CAN'T run their game at a higher rez that I take issue with; their claim that it's not possible, despite what the platform holders (BOTH Sony and MS were called out in the developer's quote) might say. They make it sound like the hardware isn't capable, instead of them just not having the time/money to do the required optimizations. They also throw the Shadow of Mordor developers under the bus with their claim that it lacks "next-gen" graphics compared to Unity. This thread is about the B.S. they are SAYING, and THAT is what I'm taking issue with. It's about them making up stupid excuses (bollocks, as the subject puts it) as to why there is virtually no performance difference between the PS4 and Xbox One versions despite the PS4's hardware advantage.


"Significant" means I was intentionally avoiding specific numbers so no one gets hung up on them, as what they are EXACTLY isn't necessary to demonstrate the point. I thought I spelled that out pretty clearly.

 

As I said, we can't avoid specifics in this argument. Avoiding them just supports unfounded conclusions such as the ones you are making. Your point relies on this ignorance.

 

 

"Significant" means I was intentionally avoiding specific numbers so no one gets hung up on them, as what they are EXACTLY isn't necessary to demonstrate the point. I thought I spelled that out pretty clearly.

No, it's not, because I'm not claiming the exact same operation can be moved. We DO know GPUs can do lighting, and we DO know that because of hardware differences the PS4 will have a good deal of GPU headroom over anything the Xbox One is doing.

Sure, that's a true statement. If you take the squares out of the square hole you do indeed have more room in that hole. Are you saying that's not the case?

We DO know that the GPU is capable of doing lighting and we DO know that the PS4 has GPU headroom over anything the Xbox One is doing.

No, you can't, nor have I. Once again, I AM NOT and have NEVER made that claim. In fact, I've repeatedly said that moving it to the GPU will almost certainly be worse. I don't know why you guys keep bringing up this straw man argument. Really, it just makes your position look weak. Instead of debating the substance of my argument you are throwing out straw men and arguing over whether 50% is EXACTLY correct or not. It doesn't matter to the greater point. Are you going to say my argument is invalid next because of grammar or spelling mistakes in my post?

 

You're claiming that there are any operations that can be moved, and just assuming that moving the lighting over would be of benefit. Just because the PS4 has more GPU power than the X1 doesn't change that. I don't even understand why you keep bringing that up as if the difference somehow matters when the discussion is not really about that. Well, it kind of is. All I'm saying is, building the game in two fundamentally different ways for different consoles is not cost effective. And if they cannot get the same look and feel out of both games by doing so then it's pointless. Again, you are merely saying, "Well... there's space here, we should use it!" without any consideration as to how things work. Yes, lighting can be done on the GPU. Will it be better? Well, you've already said it probably won't be, so I don't really see where your issue is. It's almost as if your entire point is, "Well, they could do it this way and it might work." This doesn't mean anything; it's again anecdotal and not based on facts. And just because you refuse to use facts in your argument to avoid the truth doesn't mean you get a free pass.

 

We are discussing something that relies on the facts.

 

Optimization isn't a vague science.

 

 

No, I don't. This is another straw man. I've already stated I don't think it's out of malice, and I've NEVER stated it was out of ignorance. At NO POINT did I claim to know more than Ubisoft on how to optimize their games.

 
Then why do you keep bringing up the point that the PS4 has so much more power in the GPU? I don't really see this as part of the discussion anymore. It's as if you think that because of said difference your proposals are suddenly justifiable, just because the PS4 can presumably brute-force past the limitations due to the gap. Perhaps the gap isn't really the problem at all?
 

First, the quote I responded to in this thread was the one from the alleged Unity developer. Now maybe that quote was B.S. and wasn't from an actual Unity developer, but I haven't really heard anyone question that. My comments were on the substance of that quote. That developer was trying to put out the fire created by the marketing department. This thread is called "Ubisoft talks bollocks about..." so it is VERY MUCH about how B.S. their media claims are, NOT how competent their programmers are.

 

This thread itself is about the PR statement, so I'm talking within that context. But you cannot really believe their marketing department controls any of the decisions made. How they spin it may be BS, but the decisions themselves could have had very understandable reasoning behind them. We cannot assume that marketing incompetence begets developer incompetence.

 

 

Again, I've said before and I'll say again: I'm sure that given more time and money the developers could optimize the game better. I'm NOT questioning their skill at all. It's their claim that the PS4 CAN'T run their game at a higher rez that I take issue with; their claim that it's not possible, despite what the platform holders (BOTH Sony and MS were called out in the developer's quote) might say. They make it sound like the hardware isn't capable, instead of them just not having the time/money to do the required optimizations. They also throw the Shadow of Mordor developers under the bus with their claim that it lacks "next-gen" graphics compared to Unity. This thread is about the B.S. they are SAYING, and THAT is what I'm taking issue with. It's about them making up stupid excuses (bollocks, as the subject puts it) as to why there is virtually no performance difference between the PS4 and Xbox One versions despite the PS4's hardware advantage.

 
So... you understand that they are still going through growing pains, but not when it comes to the PS4? And you say you have no bias or malice. They have already stated that they are still working with the new hardware and that both systems still have more power to show. Again, you're bringing in marketing babble as a basis for their development decisions; that somehow, because Ubi is spinning it in a ###### way, the developers are incompetent and poor at optimizing (for the PS4).
 
It was also discussed time, and time, and time again on these forums and in the general community that the difference in hardware between the PS4 and X1 would be much smaller than it looked on paper (which has consistently been the case with all hardware, ever). So I'm not sure why you're so upset with similar performance, other than that you think the PS4 should be better because you want it to be better. The fact is, in this case, it isn't.

You're claiming that there are any operations that can be moved, and just assuming that moving the lighting over would be of benefit.

I'm not sure I follow you here. Yes, I'm claiming that moving things the CPU is doing off the CPU will benefit whatever remains on the CPU.

Just because the PS4 has more GPU power than the X1 doesn't change that. I don't even understand why you keep bringing that up as if the difference somehow matters when the discussion is not really about that. Well, it kind of is.

Again, I'm not sure what you mean here. This thread IS really about the "bollocks" claims made by Ubisoft with respect to the reason the PS4 version of Unity does not perform more than 1 or 2 frames per second differently from the Xbox One version. As such, the relative strength of the PS4 is absolutely relevant to the conversation.

All I'm saying is, building the game in two fundamentally different ways for different consoles is not cost effective. And if they cannot get the same look and feel out of both games by doing so then it's pointless.

Have you even read my posts before responding to them? I said the same thing as your first sentence. I said I didn't think it was out of malice or because MS paid them, and that I thought it was just because of time/money. It appears we both agree on what the reason REALLY is; the point here is that's NOT what they're saying the reason is. They're making up "bollocks" excuses instead and strongly denying it has anything to do with "parity". What is being referred to as "parity" is what your second statement there describes, "the same look and feel out of both"; I totally agree that's what it is, I've said that. THAT is exactly what people are mad about: the Xbox One "holding back" the PS4 so they have the same look and feel. THAT is what Ubisoft is denying. Their denials and B.S. excuses as to why that isn't the case are what this thread is about and what I've been talking about. If they just came out and admitted that they wanted the versions to look the same so they didn't do any PS4-specific optimizations, then this thread wouldn't even exist.

So... you understand that they are still going through growing pains, but not when it comes to the PS4?

NO, once again that is an incorrect characterization of my argument. I understand they are still going through growing pains for BOTH the Xbox One and the PS4. I'm saying their P.R. AND developer-quote denials of it being inadequate optimization (due to the early state of this console generation, time/money, or whatever) and/or their denial of "parity" (keeping the Xbox One and PS4 versions with the same "look and feel") being a factor are B.S. THAT is what this thread is about.

And you say you have no bias or malice.

No, I didn't. I never used the word bias. I personally don't believe any human being is free of bias. I think everyone's opinions have at least some influence on the things they do or say, so I would never make that claim. That's a philosophical idea that's not really important to this conversation other than to emphasize there is NO WAY I made that claim. It's ANOTHER straw man argument.

As for malice, I DID use that word, but it was NOT in reference to me. I said I don't believe Ubisoft is intentionally holding the performance of the PS4 version back out of malice, as in dislike for the PS4 or because MS paid them to. Some people DO seem to believe that, and I was trying to make it clear that I was NOT one of them. Oddly enough, I believe I followed that with my thoughts about it likely just being a cost/benefit analysis, which you seemingly agreed with above. I'm not sure how you saw the "malice" part but missed the cost/benefit part.

That somehow, because Ubi is spinning it in a ###### way, the developers are incompetent and poor at optimizing (for the PS4).

Not this straw man again. I have REPEATEDLY and EXPLICITLY stated that I am NOT saying the developers are incompetent. NEVER have I said that. I've specifically said I believe they are perfectly capable of doing more optimizations (even PS4-specific ones) if given the time/money to do so and if the desire for platform "parity" is removed.

I'm saying their P.R. AND developer-quote denials of it being inadequate optimization (due to the early state of this console generation, time/money, or whatever) and/or their denial of "parity" (keeping the Xbox One and PS4 versions with the same "look and feel") being a factor are B.S. THAT is what this thread is about.

 

I understand what you are saying: you are angry (and others are) that the PS4 doesn't use the extra resources it has to outperform the X1, despite the fact that the developer has stated it's not a matter of GPU but of CPU optimization. You then suggested they move CPU processing tasks to the GPU to give more headroom and to utilize the PS4's more powerful GPU. And despite arguments to the contrary about how that would not be practical, you've carried on dismissing them by trying not to concentrate on specifics (which do, indeed, matter).

 

You may think you aren't saying they are incompetent, but you are implying as much. By suggesting their choices for optimization are poor, and that they "should have done it this way", you are already demonstrating a level of distrust in their decision making. If you really felt they'd done their jobs right then we wouldn't be sitting here debating the intricacies of GPU vs. CPU processing. But to some degree you feel the game was built improperly, on top of disliking the way the results the developers delivered were portrayed by their PR department. On the one hand you say the developers are good; on the other you are telling them how to do their jobs.

 

And on top of this you are injecting the word "parity" into it as if it's some evil nemesis of the gaming world. Parity means a lot of things, not just targeting the lowest common denominator. To be honest, they couldn't have targeted the lowest common denominator, since both consoles have the same limitation in regards to this game: the CPU. That in and of itself lends credence to the idea that the X1 didn't hold this game back, despite your inclination that the PS4's version should be better because of a more powerful GPU.

 

Instead of trying to paint some imaginary, sensationalized story to drill into Ubi, we should be looking at what we know. And from what we've seen based on developer commentary, I see no reason to be taking stabs at how the game performs. Sure, we can take stabs at the marketing guys, but not the game itself.


you are angry (and others are) that the PS4 doesn't use the extra resources it has to outperform the X1.

I wouldn't say I'm angry, and it's not that the PS4 doesn't outperform the X1 that annoys me. What annoys me, and what caused me to comment on this thread, was the denial that it's because of "parity" or time/money constraints that it doesn't, and instead the excuses like being "more cinematic", having "next-gen graphics" compared to Mordor, and "the PS4 can't handle it", as if the hardware were fully utilized and incapable of doing more. Again, it's the PR and that one developer's specific statement I'm taking issue with, NOT the developers' skill.

You may think you aren't saying they are incompetent, but you are implying as much.

I implied no such thing. Now you're going to tell me what I meant? In the event my wording was open to some misunderstanding, though, I have REPEATEDLY and EXPLICITLY stated that I am NOT saying they are incompetent.

By suggesting their choices for optimization are poor, and that they "should have done it this way", you are already demonstrating a level of distrust in their decision making.

I'm not demonstrating distrust of their skill. I'm saying those decisions were likely made in order to keep the versions the same, i.e. "parity", and/or due to time/money constraints, which is NOT what their PR or that developer quote was saying. You can be the greatest programmer in the world, but if you're TOLD to keep the versions the same, or you're not given the time/money to perform enough optimizations, you can release a product that is poorly optimized. It does NOT necessarily reflect on the skill of the developers, and while it could, I've taken great pains to explicitly exclude that possibility.

And on top of this you are injecting the word "parity" into it as if it's some evil nemesis of the gaming world.

Again, this is simply not true. AGAIN, I've said that if their PR came out and said the versions were the same because they wanted to keep both platforms looking as identical as possible (or they decided not to optimize further so they could get the games in our hands sooner), then I would NOT have posted here. The point is that they are DENYING that's a factor, saying the PS4 CAN'T perform better, and offering other "bollocks" explanations; it's those denials I'm commenting on.

I'm not demonstrating distrust of their skill. I'm saying those decisions were likely made in order to keep the versions the same, i.e. "parity", and/or due to time/money constraints, which is NOT what their PR or that developer quote was saying. You can be the greatest programmer in the world, but if you're TOLD to keep the versions the same, or you're not given the time/money to perform enough optimizations, you can release a product that is poorly optimized. It does NOT necessarily reflect on the skill of the developers, and while it could, I've taken great pains to explicitly exclude that possibility.

 

This is the issue: we don't know why, or for what reason. You are assuming it is those reasons when you don't actually know what the reasons are. Yes, you can be the best programmer in the world and get told to do something, but there are other levels of management within the development team and the corporation. There are deadlines, and there are timelines: timelines set by those doing the work, and deadlines made based on those estimated timelines. There's a lot going on that we know nothing about, and jumping to conclusions and insinuating anything specific happened (be it positive or negative) is already misrepresenting things.


This is the issue: we don't know why, or for what reason.

No, we don't, and I've never claimed to. That doesn't mean we can't refute any reason they give, though.

You are assuming it is those reasons

No, I'm not. I'm agreeing those are actually believable reasons. Did you not just say a few posts back "building the game in two fundamentally different ways for different consoles is not cost effective", i.e. it's not worth the time or effort, and "if they cannot get the same look and feel out of both games by doing so then it's pointless", i.e. they might be going for parity? I'm trying to AGREE with you on those points and again underscore that I've NEVER contradicted them. I am NOT saying I KNOW with certainty that those are the specific reasons. The point here isn't that I KNOW what the reasons are; it's that the ones given by the P.R. and that one developer quote are "bollocks".

Yes, you can be the best programmer in the world and get told to do something, but there are other levels of management within the development team and the corporation. There are deadlines, and there are timelines: timelines set by those doing the work, and deadlines made based on those estimated timelines.

Yes, that was MY point; I agree 100% with this. Never have I said the contrary, and that's exactly why your straw man about me saying the developers were incompetent is such "bollocks".

No, we don't, and I've never claimed to. That doesn't mean we can't refute any reason they give, though.

No, I'm not. I'm agreeing those are actually believable reasons. Did you not just say a few posts back "building the game in two fundamentally different ways for different consoles is not cost effective", i.e. it's not worth the time or effort, and "if they cannot get the same look and feel out of both games by doing so then it's pointless", i.e. they might be going for parity? I'm trying to AGREE with you on those points and again underscore that I've NEVER contradicted them. I am NOT saying I KNOW with certainty that those are the specific reasons. The point here isn't that I KNOW what the reasons are; it's that the ones given by the P.R. and that one developer quote are "bollocks".

Yes, that was MY point; I agree 100% with this. Never have I said the contrary, and that's exactly why your straw man about me saying the developers were incompetent is such "bollocks".

 

The reason I got the impression that you were not trusting the expertise of the devs is based on several statements you've made in this thread that contradict what you are saying now.

 

 

"Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data."

 

Seems like a pretty horrible design decision to me.

 

It's amazing to me that Ubisoft does a demo at GDC Europe on how important it is to move formerly CPU-based tasks to the GPU, and then announces a CPU-bound "next-gen" game.

 

CPU bound this console generation is NOT "crazily optimized". Maybe compared to "Ubisoft games in the past", but if that's the case, that bodes very poorly for how well prior games have been optimized.

 

These comments tell me you think they did a bad job optimizing the game. Perhaps you were exaggerating, or your opinion has changed since you said these things, but it's why I've come to the conclusion that you think the devs made bad decisions.


The reason I got the impression that you were not trusting the expertise of the devs is based on several statements you've made in this thread that contradict what you are saying now.

I understand how my wording could have been interpreted that way. That's why I took great pains to EXPLICITLY spell out that this is NOT what I mean, right from the beginning, and I've stated it repeatedly since. I'm not sure why this is still being brought up.

These comments tell me you think they did a bad job optimizing the game. Perhaps you were exaggerating, or your opinion has changed since you've said these things but it's why I've come to the conclusion you think the devs made bad decisions.

I DO think the game is poorly optimized for the PS4 if it's performing nearly identically to the Xbox One. I do NOT think that's because of a lack of skill on the part of the developers, nor have I ever said that. Yes, my statements, when taken out of context, can give that impression; I'm fully aware of that, and so I have repeatedly and explicitly stated that is not the case, so that there can be no confusion on that point.

I don't claim to know specifically why it's poorly optimized for the PS4, but I agree that a cost/benefit analysis (time/money) of doing more (PS4-specific?) optimizations and/or a desire for "parity" COULD possibly be explanations (NOT that I'm saying I KNOW they are specifically). I was NOT exaggerating, and my opinion has NOT changed since I said those things. Nothing I've said recently conflicts with any of those quotes in any way.

Again, the point here is that the P.R. statements and the specific developer quote posted in these comments are "bollocks".


I DO think the game is poorly optimized for the PS4 if it's performing nearly identically to the Xbox One.

 

This is what I don't get. They are both suffering the same limitation, and that means it doesn't matter which platform. Your claim here is based on nothing tangible other than "the PS4 has a better GPU so the PS4 version should be better." That isn't realistic.


This topic is now closed to further replies.