Jimquisition: Ubisoft Talks Bollocks About Framerate And Resolution



This is what I don't get. They are both suffering the same limitation, which means the platform doesn't matter. Your claim here is based on nothing tangible other than "The PS4 has a better GPU so the PS4 version should be better." That isn't realistic.

The fact that they both suffer the same limitation doesn't mean it's not poorly optimized for the PS4. I don't need anything more tangible than one platform having superior hardware to the other. As an extreme example, if I have a game that runs 640x480@30fps on my 386 and it still runs 640x480@30fps on my Core i7, then it's poorly optimized for the Core i7. Now, THAT is EXTREMELY exaggerated, and I'm NOT saying the difference between the consoles is even remotely that large, but it illustrates the point that you CAN make the optimization claim based on nothing more tangible than a hardware performance difference.

 

If you agree that the PS4's GPU/memory is significantly more capable, i.e. that it's not a trivial difference between the Xbox One and PS4 (I'm trying not to use numeric examples here so we don't get hung up on them), then if both systems are running the same thing at effectively the same performance, the more powerful system is being underutilized and is thus poorly optimized for the stronger system. Now, there may be good reasons for that. I don't know what specific reasons are at play with Ubi, but as we've said, possible explanations could be that they decided it wasn't worth more time/money to do PS4-specific optimizations, that they intentionally kept the games the same ("parity"), or any number of other things. But Ubi P.R. and that developer's quote are denying those two explanations and offering instead a string of "bollocks" ones. That's what this thread is about... that the specific excuses Ubi P.R. and that developer's quote are offering are "bollocks". NOT that the developers are incompetent, NOT that there is no possible reasonable explanation for near-identical performance despite hardware differences, JUST that those specific public statements were B.S.


The fact that they both suffer the same limitation doesn't mean it's not poorly optimized for the PS4. I don't need anything more tangible than one platform having superior hardware to the other.

 

Even when that superiority isn't where the problem lies? Come on, we can't seriously be watering down the facts to this. It's a CPU issue, not a GPU issue. The GPU is where the consoles differ the most and that isn't where the problem is. 


Even when that superiority isn't where the problem lies? Come on, we can't seriously be watering down the facts to this. It's a CPU issue, not a GPU issue. The GPU is where the consoles differ the most and that isn't where the problem is. 

A game is poorly optimized for the PS4 if it is designed in such a way as to allow the weak CPU to bottleneck the GPU to a significant degree. We KNOW the GPU is bottlenecked by not just a trivial amount but to a significant degree, because Ubisoft has told us even the Xbox One is CPU-limited, which means even its weaker GPU is not being fully utilized (though it may be extremely close, and if so then it IS well optimized FOR THE XBOX ONE, because it isn't realistic that you will fully utilize BOTH the GPU and CPU EXACTLY). If the Xbox One isn't fully utilizing its GPU and the PS4 has a significantly stronger GPU, then we know the PS4's is being significantly underutilized. If ANY game is significantly underutilizing the GPU on the PS4, it is poorly optimized for the PS4.

 

If I write a game that uses 100% CPU and 20% GPU on my machine, it is poorly optimized for the machine it is running on. Especially when the machine was specifically designed with a weak CPU because the manufacturer wanted me to use the GPU as much as possible instead. Now again, there are possibly good reasons for the decision not to optimize to take better advantage of that machine; I'm not saying there aren't. But whether the reasons are good or not, it's still poorly optimized for it.
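To put the "100% CPU, 20% GPU" case in concrete terms, here's a minimal sketch of the usual frame-time model (all numbers invented for illustration, not anyone's real profiling data):

#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical per-frame costs in milliseconds -- NOT real console figures.
    double cpu_ms = 30.0; // CPU-side work (simulation, AI, draw-call submission)
    double gpu_ms = 6.0;  // GPU-side work (the actual rendering)

    // CPU and GPU run largely in parallel, so the frame takes as long
    // as the slower of the two.
    double frame_ms = std::max(cpu_ms, gpu_ms);
    std::printf("fps: %.1f, GPU busy: %.0f%%\n",
                1000.0 / frame_ms, 100.0 * gpu_ms / frame_ms);

    // Doubling GPU speed changes nothing while the CPU is the bottleneck.
    gpu_ms /= 2.0;
    frame_ms = std::max(cpu_ms, gpu_ms);
    std::printf("fps with a 2x faster GPU: %.1f\n", 1000.0 / frame_ms);
    return 0;
}

With those made-up numbers, both runs land at ~33fps, because the frame time is set entirely by the CPU.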


A game is poorly optimized for the PS4 if it is designed in such a way as to allow the weak CPU to bottleneck the GPU to a significant degree. We KNOW the GPU is bottlenecked by not just a trivial amount but to a significant (define significant) degree, because Ubisoft has told us even the Xbox One is CPU-limited, which means even its weaker GPU is not being fully utilized (though it may be (might not) extremely close, and if so (what if it isn't) then it IS well optimized FOR THE XBOX ONE, because it isn't realistic that you will fully utilize BOTH the GPU and CPU EXACTLY). If the Xbox One isn't fully utilizing its GPU and the PS4 has a significantly stronger GPU, then we know the PS4's is being significantly underutilized. If ANY game is significantly underutilizing the GPU on the PS4, it is poorly optimized for the PS4. (How do you know it's being underutilized? I thought you said you weren't accusing them of doing a poor job? Cause this is how this comes off. If you know a better way to optimize then by all means send them an email or something)

 

If I write a game that uses 100% CPU and 20% GPU on my machine, it is poorly optimized for the machine it is running on. (Depends on what the game is doing) Especially when the machine was specifically designed with a weak CPU because the manufacturer wanted me to use the GPU as much as possible (possible being the key word) instead. Now again, there are possibly good reasons for the decision not to optimize to take better advantage of that machine; I'm not saying there aren't. But whether the reasons are good or not, it's still poorly optimized for it.

 

Basically what I'm getting here is lots of guesses and nothing concrete. These are all nothing but assumptions made to tell a story that may just not be the case. We aren't here to tell stories, but to discuss reality.


Basically what I'm getting here is lots of guesses and nothing concrete. These are all nothing but assumptions made to tell a story that may just not be the case. We aren't here to tell stories, but to discuss reality.

I use "significant" because if I use approximate numbers you'll nit pick the numbers and ignore the greater point, if the number is 49.8% or 31.2% SPECIFICALLY isn't important.  The point in saying it's "significant" is just to indicate that it's NOT a tiny number like "1%" or "0.0001%" which would be an "insignificant" difference.

 

The statement in the parentheses with respect to Xbox One performance, where I DO use "may be" and "if so", was a side note about Xbox One optimization and doesn't support or negate my comments on the PS4 (in hindsight I should have just left it out, since you decided to fixate on it). You're right, though: we don't KNOW if it's optimized well for the Xbox One either, but I was giving them the benefit of the doubt on that particular uncertainty and saying it may be. Maybe it IS poorly optimized for the Xbox One as well, if that's your point. My point was that even in the best-case scenario, if it was PERFECTLY optimized for the Xbox One (100% CPU, 100% GPU), and the PS4 performs identically, then we know it's not PERFECTLY optimized for the PS4, because the more powerful GPU is being underutilized.

 

We DO KNOW that a GPU is being underutilized if it's doing the same tasks at the same speed as a weaker GPU. That's not speculation, it's FACT. If you take a budget GFX card and a top-end GFX card, even from the same series by the same manufacturer, and you run a game on the same system replacing just the graphics card, and it performs identically, then it is FACT that the game is underutilizing the stronger GFX card. That's not speculation, it's not even opinion; I really don't understand how you can even argue that's "guesses" or "nothing concrete". That is absolutely concrete and there aren't ANY guesses.
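To make the card-swap test concrete, here's a sketch with a hypothetical helper and hypothetical measurements; only the inference matters, not the numbers:

#include <cmath>
#include <cstdio>

// Same game, same settings, same system; only the graphics card differs.
// If the stronger card yields no measurable gain, the game was never
// GPU-limited, so the stronger card's extra capacity is sitting idle.
bool stronger_gpu_underutilized(double fps_weak, double fps_strong) {
    return std::fabs(fps_strong - fps_weak) < 1.0; // within measurement noise
}

int main() {
    // Hypothetical result: both cards average 30fps.
    std::printf("%s\n", stronger_gpu_underutilized(30.0, 30.0)
                            ? "stronger card underutilized"
                            : "game is GPU-bound");
    return 0;
}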

 

Once again you act as though my saying it's underutilized means the developers are doing a poor job. I've already gone over this repeatedly. The developers doing a poor job is NOT the only possible explanation for hardware being underutilized, no matter how much you may wish it were so. There are any number of reasons hardware may be underutilized and I've given examples, examples you've even agreed with as possibilities, but there is no point listing them again here... especially since you'll then just make the B.S. claim that I'm saying I KNOW that whatever I choose as an example is DEFINITELY the case, when that's not what I'm saying at all.


I use "significant" because if I use approximate numbers you'll nit pick the numbers and ignore the greater point, if the number is 49.8% or 31.2% SPECIFICALLY isn't important.  The point in saying it's "significant" is just to indicate that it's NOT a tiny number like "1%" or "0.0001%" which would be an "insignificant" difference.

 

I'm not fixating on the numbers. I'm fixating on the fact that we don't know the numbers, and inserting random guesses about that number (maybe not random, but ones that favor your argument) won't get us anywhere, nor will it validate your point. This entire statement is a hypothetical. We can't make claims on hypotheticals, and we certainly can't point fingers.

 

 

The statement in the parentheses with respect to Xbox One performance, where I DO use "may be" and "if so", was a side note about Xbox One optimization and doesn't support or negate my comments on the PS4 (in hindsight I should have just left it out, since you decided to fixate on it). You're right, though: we don't KNOW if it's optimized well for the Xbox One either, but I was giving them the benefit of the doubt on that particular uncertainty and saying it may be. Maybe it IS poorly optimized for the Xbox One as well, if that's your point. My point was that even in the best-case scenario, if it was PERFECTLY optimized for the Xbox One (100% CPU, 100% GPU), and the PS4 performs identically, then we know it's not PERFECTLY optimized for the PS4, because the more powerful GPU is being underutilized.

 

We DO KNOW that a GPU is being underutilized if it's doing the same tasks at the same speed as a weaker GPU. That's not speculation, it's FACT. If you take a budget GFX card and a top-end GFX card, even from the same series by the same manufacturer, and you run a game on the same system replacing just the graphics card, and it performs identically, then it is FACT that the game is underutilizing the stronger GFX card. That's not speculation, it's not even opinion; I really don't understand how you can even argue that's "guesses" or "nothing concrete". That is absolutely concrete and there aren't ANY guesses.

 

In a perfect world where there is data to support this idea, then yes. But the reality is we don't know the numbers. "Underutilized" assumes more could (theoretically) be done on the GPU (or that there's a practical method in place for doing so). If there isn't a realistic way of using more of the GPU, I don't think the game is underutilizing it. Just because League of Legends performs the same on a $700 budget gaming rig as on an $1800 monster (the only difference being FPS, which won't matter past 60) doesn't mean the game isn't optimized because it isn't using my $800 GPU to its full potential. I just have a bigger graphics card than the game requires for its GPU-based processes. You are imposing an arbitrary standard on how these games should perform, which is unfair.

 

 

Once again you act as though my saying it's underutilized means the developers are doing a poor job. I've already gone over this repeatedly. The developers doing a poor job is NOT the only possible explanation for hardware being underutilized, no matter how much you may wish it were so. There are any number of reasons hardware may be underutilized and I've given examples, examples you've even agreed with as possibilities, but there is no point listing them again here... especially since you'll then just make the B.S. claim that I'm saying I KNOW that whatever I choose as an example is DEFINITELY the case, when that's not what I'm saying at all.

 

Then stop presuming to know more than those paid to do something professionally. It took YEARS for PC/console developers to really begin taking advantage of multicore CPUs. And here we are, in the first generation of consoles that demands a GPU focus, and we're getting our panties in a wad over them not adapting in the first 12 months of the consoles' lifespans. These things take time. PC developers don't have to deal with this, and last gen the CPU was in excess of the GPU. The way games have been made up until now does not conform to the new architecture, and that means developers need to feel things out.

 

You seem to think that you can, at one end of the spectrum, say the devs are doing a fine job but, at the other end of the spectrum, question their decisions in the game. Yes, time and money matter. Deadlines exist for that reason, but even then you aren't really sure they didn't try the methods you suggested and find there was no practical way to get it to work in a reasonable amount of time. In fact, I doubt they are the only developer with this issue, and it seems that Ubisoft's marketing statement is giving the development team undue scrutiny.

 

The real problem with your point of view is that it hinges on the idea that PS4 > X1 and therefore the game for some unknown reason must perform better on the PS4 or else it's not optimized. That is an unfair standard to set. It's like saying that one car should go farther than the other because it can go faster (without considering the fuel efficiency of the vehicles at hand and assuming that's okay because we don't know what their fuel efficiency is).


I'm not fixating on the numbers. I'm fixating on the fact that we don't know the numbers, and inserting random guesses about that number (maybe not random, but ones that favor your argument) won't get us anywhere, nor will it validate your point. This entire statement is a hypothetical. We can't make claims on hypotheticals, and we certainly can't point fingers.

It's NOT hypothetical that the PS4's GPU has more compute units. It's NOT hypothetical that the PS4 has faster memory. Based on those REAL hardware differences, it is NOT hypothetical that the PS4's GPU outperforms the Xbox One's, and NOT by a small amount like 1% or 0.0001%. Just because I can't give you the EXACT number doesn't mean I can't know the PS4's GPU is faster; that's a ridiculous claim.

"Underutilized" assumes more could (theoretically) be done on the GPU (or that there's a practical method in place for doing so).

No, it doesn't. A GPU is being underutilized if it's NOT being fully utilized. If the GPU is being worked at 100% and the CPU is at 98%, then the CPU hardware is being underutilized. That's FACT. The HARDWARE is not operating at its full capability. Now, if it's just 2% off like that, then it's NOT significant, because it's not realistic to use EXACTLY 100% of both. We know more than a small amount of the PS4's GPU is going unused, though, because its GPU is much more powerful than the Xbox One's, so it can do anything the Xbox One is doing with a lot of room to spare. That's FACT.

Just because League of Legends performs the same on a $700 budget gaming rig as on an $1800 monster (the only difference being FPS, which won't matter past 60) doesn't mean the game isn't optimized because it isn't using my $800 GPU to its full potential.

This is unbelievably wrong. Read the headline subject of this post! It specifically says FRAMERATE, which is FPS; it absolutely DOES matter past 60. If your $700 budget gaming rig runs League of Legends at the exact same framerate and resolution (with all other game settings being the same) as an $1800 monster, then the $1800 rig IS being underutilized; that's FACT. The monster should be able to churn out much higher FPS, and if you locked it at 60 then you're underutilizing it. Now, Unity here is locked at 30fps, so the PS4 should be able to go higher than that, at the very least due to the better GPU. Even if your "won't matter past 60" were true, it hasn't even gotten to 60.
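And a framerate lock literally throws the headroom away. Here's a sketch of a generic 30fps frame limiter (not Ubisoft's code; the per-frame work is a placeholder): whatever time is left in the budget after the frame's work is simply slept away, and that sleep IS the unused hardware.

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(33333); // ~30fps frame budget

    for (int frame = 0; frame < 300; ++frame) {
        const auto start = clock::now();
        // simulate(); render();  // placeholders for the real per-frame work
        std::this_thread::sleep_until(start + budget); // idle until the next frame
    }
    return 0;
}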

Then stop presuming to know more than those paid to do something professionally.

I'm not; that's the straw man argument you keep throwing out. I've repeatedly said that's not the case.

It took YEARS for PC/console developers to really begin taking advantage of multicore CPUs. And here we are, in the first generation of consoles that demands a GPU focus, and we're getting our panties in a wad over them not adapting in the first 12 months of the consoles' lifespans. These things take time. PC developers don't have to deal with this, and last gen the CPU was in excess of the GPU. The way games have been made up until now does not conform to the new architecture, and that means developers need to feel things out.

The entire above statement is a POSSIBLE reason WHY the hardware is being underutilized (one that has nothing to do with developer skill). I've already agreed this is a valid reason, so I don't know why you keep stating it like it disproves anything I've said. It's another of your straw men. Furthermore, when I say the same thing, you attack me for claiming I KNOW it's the case, when all I'm saying is that it's one possibility, the same exact one you keep citing.

You seem to think that you can, at one end of the spectrum, say the devs are doing a fine job but, at the other end of the spectrum, question their decisions in the game.

Exactly like you did above. The devs can be doing a fine job, but the game can still underutilize the hardware because it's early in the console cycle and they haven't had TIME to learn the ins and outs needed to fully utilize it. Not that I'm saying I KNOW that's the EXACT reason for this game in particular; it's one possible example of how it's false to claim that saying the devs are good and saying the hardware is underutilized somehow conflict. There is no conflict there. I can say both, YOU have said both; I don't see what your issue is.

Wow :laugh: I would just add here that any expectation of a multi-platform title fully optimized for a particular platform is impractical and mostly foolish for the annual releases of the likes of AC and CoD.


Wow :laugh: I would just add here that any expectation of a multi-platform title fully optimized for a particular platform is impractical and mostly foolish for the annual releases of the likes of AC and CoD.

 

You could say that for a poor little team of a few guys, not an international corporation with over 9,000 employees.


You could say that for a poor little team of a few guys, not an international corporation with over 9,000 employees.

Ubisoft has 9000 employees working on AC: Unity?

They are not going to fully optimize for either console, and this is nothing new. If a developer does, then more power to them, but all Ubi cares about is the release date, so optimization is probably not a word in their release plan.


Ubisoft has 9000 employees working on AC: Unity?

They are not going to fully optimize for either console, and this is nothing new. If a developer does, then more power to them, but all Ubi cares about is the release date, so optimization is probably not a word in their release plan.

 

I dunno, it seems they care more about talking ###### than working on the game to hit the release date.


Funny part of this debate is, their PC minimum spec is so out there I'll be playing it on consoles and rolling my eyes at you guys (not that I wasn't already.)

 

Whatever they're doing on PC is apparently destroying the CPU.  Curtain call for DX11, it seems.


Funny part of this debate is, their PC minimum spec is so out there I'll be playing it on consoles and rolling my eyes at you guys (not that I wasn't already.)

 

Whatever they're doing on PC is apparently destroying the CPU.  Curtain call for DX11, it seems.

 

Don't worry, we'll be playing much better stuff. There is no need to even roll eyes, that is so last gen. Hue hue hue.


Don't worry, we'll be playing much better stuff. There is no need to even roll eyes, that is so last gen. Hue hue hue.

Sure, for the ten people who even meet the system specs, I'm sure it'll be much better.


Wow :laugh: I would just add here that any expectation of a multi-platform title fully optimized for a particular platform is impractical and mostly foolish for the annual releases of the likes of AC and CoD.

I agree with this as well. But this falls into the category of what the community calls "parity". There ARE good reasons for so-called "parity", but the issue here is that Ubisoft specifically denied that a failure to optimize the game for the PS4 (due to the newness of this console generation, time, etc.) or a desire to have the versions be the same ("parity") were the reasons the PS4 and Xbox One versions are both 900p@30fps. Instead they claimed the PS4 was incapable of performing better, that 30fps is more cinematic, and a string of other "bollocks" excuses. This thread is about how B.S. the excuses Ubisoft P.R. and the anonymous Unity developer have put forward are. If they had just come out and said what you did, this thread wouldn't exist.


I really hope the game turns out decent (looks and gameplay)... to make all these assumptions a moot point...

 

I really honestly hope it looks stunning for it not to be in full 1080p... so that it can be let go...........for now  :/


It's NOT hypothetical that the PS4's GPU has more compute units. It's NOT hypothetical that the PS4 has faster memory. Based on those REAL hardware differences, it is NOT hypothetical that the PS4's GPU outperforms the Xbox One's, and NOT by a small amount like 1% or 0.0001%. Just because I can't give you the EXACT number doesn't mean I can't know the PS4's GPU is faster; that's a ridiculous claim.

This is a straw man; you've misrepresented the argument by backpedaling and changing what we were talking about. We were not discussing the PS4's power difference compared to the X1. We were discussing the amount of resources certain elements of the game were taking up: how much CPU the lighting/illumination effects were using. Please don't try to slip out of the actual discussion.

 

I agree with this as well. But this falls into the category of what the community calls "parity". There ARE good reasons for so-called "parity", but the issue here is that Ubisoft specifically denied that a failure to optimize the game for the PS4 (due to the newness of this console generation, time, etc.) or a desire to have the versions be the same ("parity") were the reasons the PS4 and Xbox One versions are both 900p@30fps. Instead they claimed the PS4 was incapable of performing better, that 30fps is more cinematic, and a string of other "bollocks" excuses. This thread is about how B.S. the excuses Ubisoft P.R. and the anonymous Unity developer have put forward are. If they had just come out and said what you did, this thread wouldn't exist.

 

I don't see it as a failure when the only way to optimize it further is to keep working with the hardware... which can easily be done on another game. It's not necessary to hold back a game just because you can't get every ounce out of a system. This "parity" term is a scapegoat... a buzzword people are using to highlight something incorrectly. Unless you think they should have spent another year on the game to try to optimize it further, how does this even qualify as any kind of "failure"? Maybe if it were 30fps, 900p, and dropping frames with last-gen graphics it'd be a failure. But this most definitely is not a failure.


We were not discussing the PS4's power difference compared to the X1.

Yes, we were... at least I was. If a power difference exists between two pieces of hardware, and a program doing the same thing runs the same on both, then the more powerful system is being underutilized. That's true whether it's gaming rigs running League of Legends (your example) or the Xbox One and PS4 running AC: Unity (what this thread started about). That's been one of my key points from the beginning; it hasn't changed.

 

I have no idea what YOU were talking about, because you keep contradicting yourself. One moment the PS4 is NOT being underutilized, and the next you're explaining why it IS underutilized (new console generation, keeping the versions the same, etc.). Plus you kept creating straw men and "disproving" things I never even said (such as that I know more than the developers). You seem to want so badly to lump me in with the people who are mad about the PS4 version not being better, but I'm just annoyed that "Ubisoft talks bollocks" about the reasons for it.


Yes, we were... at least I was. If a power difference exists between two pieces of hardware, and a program doing the same thing runs the same on both, then the more powerful system is being underutilized. That's true whether it's gaming rigs running League of Legends (your example) or the Xbox One and PS4 running AC: Unity (what this thread started about). That's been one of my key points from the beginning; it hasn't changed.

 

I have no idea what YOU were talking about, because you keep contradicting yourself. One moment the PS4 is NOT being underutilized, and the next you're explaining why it IS underutilized (new console generation, keeping the versions the same, etc.). Plus you kept creating straw men and "disproving" things I never even said (such as that I know more than the developers). You seem to want so badly to lump me in with the people who are mad about the PS4 version not being better, but I'm just annoyed that "Ubisoft talks bollocks" about the reasons for it.

 

Let's set a definition here. Underutilizing something, to me, means there was something that was not tried to improve performance; i.e., something that was available to them that they did not do. Seeing as the consoles are new, they may have done all they could to get things working smoothly. Does that qualify as "underutilizing"? To me it does not. That sounds like they did the best they could with what they had, what they wanted to accomplish, and what they knew how to do. If you want to describe that as a "failure", then I don't know what a success would be in your world.

 

Second, we were discussing the amount of resources the lighting was taking up. The question of whether or not the X1 is weaker than the PS4 was never an actual discussion. I merely mentioned how your crutch is the idea that the mere existence of a difference means the PS4 game should be better than the X1 version. Please go back and read what we were talking about instead of feigning ignorance about the discussion. It's all there to be seen, plain as day.


Ubisoft has 9000 employees working on AC: Unity?

They are not going to fully optimize for either console, and this is nothing new. If a developer does, then more power to them, but all Ubi cares about is the release date, so optimization is probably not a word in their release plan.

 

No, but there are 10 dev studios working on it.

 

And to quote the Ubi dev "this game is crazy optimized!"


Let's set a definition here. Underutilizing something, to me, means there was something that was not tried to improve performance; i.e., something that was available to them that they did not do. Seeing as the consoles are new, they may have done all they could to get things working smoothly. Does that qualify as "underutilizing"? To me it does not. That sounds like they did the best they could with what they had, what they wanted to accomplish, and what they knew how to do. If you want to describe that as a "failure", then I don't know what a success would be in your world.

Underutilize: to utilize less than fully or below the potential use

http://www.merriam-webster.com/dictionary/underutilize

That's a nice custom definition you have there, but I was going by what the word actually means. Maybe if you had just posted your own thoughts in the thread instead of attacking mine, you'd have some leeway to redefine what words mean, but since your post was a direct reply to mine, maybe you should have figured out what I meant (or what the word actually means) before attacking my post. The PS4 hardware IS underutilized. It is less than fully used. It is below its potential use. We KNOW this because it's more powerful than the Xbox One, so if it's doing the same thing with the same performance as the Xbox One, it IS being underutilized. This early in the generation MOST games underutilize the hardware; just because it's common doesn't change the fact that it IS in fact being underutilized. That isn't a problem in itself, though; the problem, and what this thread is about, is that Ubisoft actively DENIED that an optimization issue resulted in the similar performance. That DENIAL is "bollocks". THAT is, and always has been, my point.

Second, we were discussing the amount of resources the lighting was taking up. The question of whether or not the X1 is weaker than the PS4 was never an actual discussion. I merely mentioned how your crutch is the idea that the mere existence of a difference means the PS4 game should be better than the X1 version. Please go back and read what we were talking about instead of feigning ignorance about the discussion. It's all there to be seen, plain as day.

I really don't give a crap how many resources the lighting takes up specifically. That was NEVER a central point. The only reason it came up in my posts at all is that it's what the Ubisoft developer (not me) used as an excuse for why the game is CPU-bound. The fact that it's CPU-bound is important because it's further evidence that the PS4's GPU is underutilized. If the game is CPU-bound even on the Xbox One, with its slightly stronger CPU and significantly weaker GPU, then there has to be a large amount of GPU power not being used on the PS4. That's not speculation, it's FACT.

FACT: If game X is CPU-bound, it is BY DEFINITION underutilizing its GPU.

FACT: If game X has the same settings and is running at the same resolution and framerate on two GPUs of different power, then the more powerful GPU is being underutilized. (The more powerful GPU should manage, at the very least, a higher framerate at the same settings as the weaker one.)

FACT: Ubisoft has told us AC: Unity is CPU-bound. That's not speculation; that's direct from the developer, who would know.

FACT: The PS4 has a more powerful GPU than the Xbox One.

FACT: The PS4 and the Xbox One both run AC: Unity at 900p@30fps. Again, not speculation; that's direct from the developer.

Given those FACTS, we KNOW the PS4 is being underutilized; no speculation there. Again, that isn't in itself a problem, especially this early in the console generation, EXCEPT that Ubisoft came out and made statements such as the PS4 "couldn't handle" more, that 30fps is more cinematic, etc. It is those "bollocks" statements, from both the PR and that dev quote, that this thread is about and all I've ever been arguing.
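If it helps, that chain can be spelled out mechanically. A sketch; the premises are the relative claims above, with no magnitudes attached:

#include <cassert>

int main() {
    // Premises, as stated above (relative claims only; exact numbers unknown).
    const bool cpu_bound      = true; // per the Ubisoft developer
    const bool ps4_gpu_faster = true; // more compute units, faster memory
    const bool same_output    = true; // both versions run 900p@30fps

    // If the CPU sets the frame rate, the GPU has idle time by definition;
    // a faster GPU producing identical output has even more of it.
    const bool xb1_gpu_has_headroom  = cpu_bound;
    const bool ps4_gpu_underutilized =
        xb1_gpu_has_headroom && ps4_gpu_faster && same_output;

    assert(ps4_gpu_underutilized);
    return 0;
}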

You're so hung up on the lighting stuff, on whether I can quote specific figures, on straw man arguments, etc., that you can't see the forest for the trees. Look at the title of this thread: does it say ANYTHING about lighting? If the lighting stuff that came up is all you care about, then fine, I'll concede that point... whatever your position was, I don't really care. It doesn't change the FACTS I've outlined above.


Underutilize: to utilize less than fully or below the potential use

http://www.merriam-webster.com/dictionary/underutilize

That's a nice custom definition you have there, but I was going by what the word actually means. Maybe if you had just posted your own thoughts in the thread instead of attacking mine, you'd have some leeway to redefine what words mean, but since your post was a direct reply to mine, maybe you should have figured out what I meant (or what the word actually means) before attacking my post. The PS4 hardware IS underutilized. It is less than fully used. It is below its potential use. We KNOW this because it's more powerful than the Xbox One, so if it's doing the same thing with the same performance as the Xbox One, it IS being underutilized. This early in the generation MOST games underutilize the hardware; just because it's common doesn't change the fact that it IS in fact being underutilized. That isn't a problem in itself, though; the problem, and what this thread is about, is that Ubisoft actively DENIED that an optimization issue resulted in the similar performance. That DENIAL is "bollocks". THAT is, and always has been, my point.

 

Except where computer hardware is concerned, using 100% of anything is a bad idea. You want to use as much as you need. Optimization is about efficiency, not using 100% of the resources. The problem here is that you are using these words as synonyms when they are not. Optimization can often result in underutilization if we want to use textbook definitions, but that does not mean the game is not running well or is not optimized well. When I define my word, I'm defining it within the context of our discussion. We are discussing optimization and whether or not the game is using the hardware necessary (i.e., not underutilizing the hardware) for its purposes. So unless you really believe games should be using 100% of the GPU and CPU on anything they run on to be optimized, my statement still stands.
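To illustrate the headroom point with made-up numbers: at ~100% average CPU load, any frame that spikes above average blows the 33ms budget and drops a frame, while a lower average absorbs the same spike.

#include <cstdio>

int main() {
    const double budget_ms = 33.3; // ~30fps frame budget
    const double spike     = 1.2;  // a frame 20% heavier than average

    const double average_loads_ms[] = {33.0, 26.0}; // ~100% vs ~80% of budget
    for (double avg_ms : average_loads_ms) {
        const double worst_ms = avg_ms * spike;
        std::printf("avg %.0f ms -> worst %.1f ms: %s\n", avg_ms, worst_ms,
                    worst_ms > budget_ms ? "dropped frame" : "absorbed");
    }
    return 0;
}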

 

I was not redefining a word for the purpose of proving you wrong. I was doing so in an effort to put us on the same page. But instead you'd rather pull out a dictionary, completely disregard that effort, and attack me for it.

 

 

I really don't give a crap how many resources the lighting takes up specifically. That was NEVER a central point. The only reason it came up in my posts at all is that it's what the Ubisoft developer (not me) used as an excuse for why the game is CPU-bound. The fact that it's CPU-bound is important because it's further evidence that the PS4's GPU is underutilized. If the game is CPU-bound even on the Xbox One, with its slightly stronger CPU and significantly weaker GPU, then there has to be a large amount of GPU power not being used on the PS4. That's not speculation, it's FACT.

1. FACT: If game X is CPU-bound, it is BY DEFINITION underutilizing its GPU.

2. FACT: If game X has the same settings and is running at the same resolution and framerate on two GPUs of different power, then the more powerful GPU is being underutilized. (The more powerful GPU should manage, at the very least, a higher framerate at the same settings as the weaker one.)

3. FACT: Ubisoft has told us AC: Unity is CPU-bound. That's not speculation; that's direct from the developer, who would know.

4. FACT: The PS4 has a more powerful GPU than the Xbox One.

5. FACT: The PS4 and the Xbox One both run AC: Unity at 900p@30fps. Again, not speculation; that's direct from the developer.

Given those FACTS, we KNOW the PS4 is being underutilized; no speculation there. Again, that isn't in itself a problem, especially this early in the console generation, EXCEPT that Ubisoft came out and made statements such as the PS4 "couldn't handle" more, that 30fps is more cinematic, etc. It is those "bollocks" statements, from both the PR and that dev quote, that this thread is about and all I've ever been arguing.

You're so hung up on the lighting stuff, on whether I can quote specific figures, on straw man arguments, etc., that you can't see the forest for the trees. Look at the title of this thread: does it say ANYTHING about lighting? If the lighting stuff that came up is all you care about, then fine, I'll concede that point... whatever your position was, I don't really care. It doesn't change the FACTS I've outlined above.

 

First of all, let me state that it's quite counterintuitive to begin facts with the word "IF". In fact, by definition the word IF means there is a variable, and thus it is not a fact. But I've already mentioned this on several occasions, and you continue to state possibility as fact.

 

1. Only if there were a way to make greater use of the GPU, and to benefit from that usage. You cannot just say, "You aren't using enough, use more," when there isn't any other way to use it. I may have 30 muffins, but I can only eat 5 of them. Sure, I've got room in my lungs, but you can't breathe muffin.

2. Except both consoles are "underutilizing" their GPUs, so the point is moot. If one glass is half full and the other is three-quarters full, they are both still not full.

3. OK; that doesn't change anything I've said.

4. What does this matter if the consoles are both CPU-BOUND? A fact does have to be somewhat relevant to the conversation for it to matter.

5. Already addressed above; it's due to the games being CPU-BOUND, nothing to do with GPU power.

 

This thread is about their B.S. claim, and the technical issues that led up to this result are entirely relevant. You were the one proposing that they should have optimized their game differently. You brought up specs and numbers, then began to backpedal on those arbitrary figures, and now you're trying to back out of the discussion entirely, claiming it's not the topic. You made it part of the topic, and the FACT is, it is indeed relevant to the topic.

