Ubisoft: 30 frames per second feels "more cinematic"



I don't think it's the 30fps specifically that has people upset.  It's the quote "We decided to lock them at the same specs to avoid all the debates and stuff" that has people mad.  The point is that there is a general consensus that the PS4 has more powerful hardware.  If the Xbox One can run the game at 900p@30fps, then the PS4 should be able to do 1080p@30fps or even 900p@60fps, all other things being equal (same textures, lighting, effects, etc.).  If the announced resolutions had been 900p@30fps on the PS4 and 720p@30fps on the Xbox One, I don't think there would be this backlash, even though those are both 30fps and sub-1080p.  That quote isn't vague; it directly states that they held the PS4 back for parity.  And after the Watch_Dogs PC fiasco, rightly or wrongly, Ubisoft doesn't have a lot of credibility with gamers, so gamers are unlikely to believe the backpedaling they're doing now.

 

The most believable explanation I've heard is that the game is CPU-bound: their AI is so demanding that it's the bottleneck, so the extra graphics power of the PS4 doesn't help.  This of course totally ignores that modern GPUs are GPGPUs, and at least some of whatever is bottlenecking the CPU could probably be moved to the extra compute units on the PS4; that's what they're there for.  That's not as easy, though, and it isn't cross-platform, so they're not going to do it.  So instead the PS4 GPU sits underutilised, constrained by the engine design.  And if they're going to claim it's the AI causing this, it had better be some SUPER impressive AI, because Shadow of Mordor's AI (the nemesis system) is AMAZING, it's a similar sort of game, and it still manages to hit 1080p on the PS4.
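To illustrate the kind of restructuring GPU compute (or even plain SIMD on the CPU) wants, here's a minimal, hypothetical Python/NumPy sketch: the same crowd-AI proximity check written per-NPC versus as one data-parallel batch.  The names and numbers are made up; the point is only that batched work of this shape is what maps onto compute units, and per-NPC loops are what doesn't.

```python
import numpy as np

NUM_NPCS = 10_000          # hypothetical crowd size
ALERT_RADIUS = 15.0        # hypothetical "notice the player" distance

rng = np.random.default_rng(0)
npc_positions = rng.uniform(0, 1000, size=(NUM_NPCS, 2))  # x, z coordinates
player_position = np.array([500.0, 500.0])

# Serial, per-NPC style: easy to write, hard to offload.
def alerted_npcs_serial():
    alerted = []
    for i in range(NUM_NPCS):
        dx, dz = npc_positions[i] - player_position
        if (dx * dx + dz * dz) ** 0.5 < ALERT_RADIUS:
            alerted.append(i)
    return alerted

# Data-parallel style: one operation over the whole crowd.
# This is the shape of work a GPU compute unit (or SIMD) handles well.
def alerted_npcs_batched():
    deltas = npc_positions - player_position          # shape (N, 2)
    distances = np.linalg.norm(deltas, axis=1)        # shape (N,)
    return np.nonzero(distances < ALERT_RADIUS)[0]

assert set(alerted_npcs_serial()) == set(alerted_npcs_batched().tolist())
```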


I'm also curious, what is the standard fps of a television these days anyway? Isn't it 50Hz for refresh rates? If the standard isn't 60fps, isn't all this moot?


I'm also curious, what is the standard fps of a television these days anyway? Isn't it 50Hz for refresh rates? If the standard isn't 60fps, isn't all this moot?

I just recently bought a TV at the end of last year, so I can tell you from shopping around that most TVs (at least here in the US) start at 60Hz and go up from there.


I'm also curious, what is the standard fps of a television these days anyway? Isn't it 50Hz for refresh rates? If the standard isn't 60fps, isn't all this moot?

 

What is even more stupid is that the "cinematic" experience does not depend on the framerate at all. They could render at 120fps and still make the animations feel like a stupid low-budget movie.


This of course totally ignores that modern GPUs are GPGPUs

Of course, if you use a modern GPU's GPGPU capability you're also cutting into its rendering power, which makes the whole technical argument circular.  Using the GPU harder would make it harder to hit 1080p, for the same reason the weaker CPU does.

 

I'm sure future games will optimize better for either console, but for now, with cross-platform releases, you're looking at the best results for the least effort.
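To put rough, hypothetical numbers on the budget trade-off described above: at 30fps the whole frame has about a 33ms budget, and every millisecond of GPU time spent on offloaded AI/physics is a millisecond not spent on rendering.  A toy calculation (all figures invented for illustration):

```python
TARGET_FPS = 30
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS   # ~33.3 ms per frame

# Invented numbers purely for illustration.
render_cost_1080p_ms = 28.0   # hypothetical GPU time to render one 1080p frame
offloaded_ai_ms = 7.0         # hypothetical GPU compute time if AI moves off the CPU

gpu_time_needed = render_cost_1080p_ms + offloaded_ai_ms
print(f"Frame budget: {FRAME_BUDGET_MS:.1f} ms")
print(f"GPU time needed: {gpu_time_needed:.1f} ms -> "
      f"{'fits' if gpu_time_needed <= FRAME_BUDGET_MS else 'misses'} 30fps")
```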


I'm also curious, what is the standard fps of a television these days anyway? Isn't it 50Hz for refresh rates? If the standard isn't 60fps, isn't all this moot?

ATSC, using H.264/MPEG-4 AVC High Profile at Level 4.2, supports up to 1080p@60Hz.  (Hz, i.e. cycles per second, corresponds to frames per second in games.)

 

Most LCD displays, including computer monitors and laptop panels, also have a refresh rate of 60Hz (you can get performance ones that go higher).

 

For TVs, 60Hz is considered low; 120Hz and 240Hz are not uncommon.  120Hz was needed for 3D so that each eye would see a standard 60Hz image even though it only sees every other frame (one eye sees the even frames at 60Hz while the other sees the odd frames at 60Hz, for a total of 120Hz output from the TV).  Even without 3D it's useful, because real cinema recordings are often 24fps, which doesn't divide evenly into 60Hz, so artifacts are possible.  At 120Hz the TV can simply show each frame of 60Hz content for two cycles, or each frame of 24Hz content for five cycles, and it works out with no need for interpolation or anything else that might generate artifacts.  Then, of course, as with any spec, TV makers got into an arms race, so now there are 240Hz, 360Hz, 720Hz, etc. sets, but they are all still multiples of the base 60.
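A quick way to see the divisibility point, as a small sketch (the refresh rates are just the common ones mentioned above):

```python
# For each panel refresh rate, how many refresh cycles does one source frame get?
# A non-integer result means the TV must repeat frames unevenly (e.g. 3:2 pulldown)
# or interpolate, which is where judder/artifacts come from.
for refresh_hz in (60, 120, 240):
    for content_fps in (24, 30, 60):
        cycles_per_frame = refresh_hz / content_fps
        note = "even" if cycles_per_frame.is_integer() else "uneven (pulldown needed)"
        print(f"{content_fps:>2}fps content on a {refresh_hz:>3}Hz panel: "
              f"{cycles_per_frame:.2f} cycles/frame -> {note}")
```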


I just recently bought a TV at the end of last year, so I can tell you from shopping around that most TVs (at least here in the US) start at 60Hz and go up from there.

It's 50 in the UK I think.  Dunno about Europe/etc.


It's 50 in the UK I think.  Dunno about Europe/etc.

Forgot about that.  Internationally it follows the utility (mains) frequency, which is generally 60Hz in the NTSC regions such as the U.S. and parts of Asia, and 50Hz in the PAL regions such as Europe.  For that reason, in addition to the 24, 30, and 60Hz broadcast formats, PAL-region standards also use 25 and 50Hz.


Oh, bull. If you're going to pull the cinematic claim, do it at 24fps like movies are usually filmed; then at least you can make that argument... this is just "we can't handle it, so here's our PR spin" BS.


Oh, bull. If you're going to pull the cinematic claim, do it at 24fps like movies are usually filmed; then at least you can make that argument... this is just "we can't handle it, so here's our PR spin" BS.

the fact that Naughty Dog made 60fps optional in Last of Us makes these claims even less believable/acceptable


the fact that Naughty Dog made 60fps optional in Last of Us makes these claims even less believable/acceptable

 

Why? This, imo, is quite anecdotal. Just because a company that's known for a cinematic game didn't do it doesn't make the reason invalid.

 

Taking a look at some television specs, it does seem they can easily output 60fps. But this trick is already used on Blu-ray movies, where the frame rate is artificially kept low to stay closer to a cinematic experience. So seeing it in games shouldn't be a huge surprise (even if games are not filmed; there'd be little reason to lower a film-based movie's frame rate, since it's already filmed at 24fps and converted up to 30fps for HDTVs).
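For reference, that "24fps converted up to 30fps" step is usually 3:2 pulldown: alternate film frames are held for three, then two video fields, so 24 film frames fill 60 fields (30 interlaced frames) per second.  A tiny sketch of the cadence:

```python
# 3:2 pulldown: map 24 film frames per second onto 60 video fields (30 interlaced frames).
# Alternate film frames are held for 3 fields, then 2 fields: 12*3 + 12*2 = 60.
def pulldown_fields(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2
        fields.extend([frame] * hold)
    return fields

one_second_of_film = list(range(24))        # film frame indices 0..23
fields = pulldown_fields(one_second_of_film)
print(len(fields))                          # 60 fields -> 30 interlaced video frames
print(fields[:10])                          # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```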

 

I assume the above practice is mostly used for CGI movies such as Pixar, etc.

 

I don't know if this is really an excuse from Ubisoft or not, but I think there are games where 30 vs. 60fps really does not matter. Certain games depend on frame rate (mostly fighters and action/RPG games), but outside of those I see little reason to grab my pitchfork over a 30fps game... especially if it's an artistic decision.


Also, there was plenty in the movie that was rendered; to say it was all filmed is incorrect. It may be that people are used to certain framerates, but doesn't that in part mean something? Convention is just as important as advancement at times, and if your players/viewers get a "weird" feeling from your product, then there's a problem. More is not always better.

When you film something you're sampling real-life motion, which creates natural motion blur that largely compensates for a low framerate. That's why movies can get away (to some extent) with only 24fps. Video games typically produce sharp images that only give a good illusion of motion at a much higher framerate, due to the very way our eyes track movement. Games can emulate some computationally inexpensive motion blur, but it's a poor approximation, and it doesn't solve the input lag and unresponsiveness of a low framerate. That's what I meant by the filmed/rendered distinction. Of course parts of movies are also rendered, but at a quality level that convincingly reproduces film, with accurate motion blur and so on.
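As a rough illustration of why filmed 24fps holds up better than rendered 24fps, here's a toy calculation of the blur a film camera bakes into each frame versus the zero blur of an instantaneous game render (the shutter angle and on-screen speed are made-up example values):

```python
FPS = 24
SHUTTER_ANGLE_DEG = 180.0          # a common film shutter; exposure = half the frame time
OBJECT_SPEED_PX_PER_S = 1200.0     # hypothetical on-screen speed of a moving object

frame_time_s = 1.0 / FPS
exposure_time_s = frame_time_s * (SHUTTER_ANGLE_DEG / 360.0)

# Film: the object smears across this many pixels within a single frame,
# which our eyes read as continuous motion.
film_blur_px = OBJECT_SPEED_PX_PER_S * exposure_time_s

# A typical game frame is an instantaneous sample: no smear at all, so the object
# appears to jump this many pixels between consecutive frames.
game_step_px = OBJECT_SPEED_PX_PER_S * frame_time_s

print(f"Film at {FPS}fps: ~{film_blur_px:.0f}px of motion blur per frame")
print(f"Game at {FPS}fps: 0px of blur, {game_step_px:.0f}px jump between frames")
```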


Sounds to me like selecting the lowest common denominator (aka laziness). Those specs are clearly targeted at the Xbone. They locked fps and resolution so that it would work on both consoles without any extra work (N)


Sounds to me like selecting the lowest common denominator (aka laziness). Those specs are clearly targeted at the Xbone. They locked fps and resolution so that it would work on both consoles without any extra work (N)

If I recall correctly the Xbox One has a slightly better CPU, and if Ubisoft are to be believed (who knows, they like digging themselves into holes), the game is bottlenecked by the CPU, so if anything it's the PS4 that is the lowest common denominator.

 

The AI in this game had better be damned impressive...


Oh, bull. If you're going to pull the cinematic claim, do it at 24fps like movies are usually filmed; then at least you can make that argument... this is just "we can't handle it, so here's our PR spin" BS.

 

Aren't DVDs 30fps?


If I recall correctly the Xbox One has a slightly better CPU, and if Ubisoft are to be believed (who knows, they like digging themselves into holes), the game is bottlenecked by the CPU, so if anything it's the PS4 that is the lowest common denominator.

I suppose that would explain it. It's strange though, because up to now, we've only heard of GPU bottlenecks.

The AI in this game had better be damned impressive...

:laugh: I won't hold my breath.

I don't even touch a game unless it's 4k/120fps. Why would anyone settle for less than that?

 

We might as well start doing that in a few years. Nothing can excuse the pathetic "next-gen" consoles.


I don't even touch a game unless it's 4k/120fps. Why would anyone settle for less than that?

 

LOL. And the cost of a machine that can actually run modern games maxed out at that res and fps... :rolleyes:


Why do people automatically assume that, and that Ubisoft is suddenly lying to everyone now?  This is the same company that pushed out a patch for AC4 on the PS4 that bumped the res up to 1080p and never bothered to go back and work on the XB1 version. But now with Unity they've changed their stance? Why?  And please, let's leave out the crazy idea that one company or the other has started paying developers to do this; that's just silly.

 

In this case, I don't think it's a "moneyhatting" situation.  I will say that last year (and in years previous), AC was co-marketed with Sony and usually had Sony-exclusive missions.  In fact, I think it was originally set to be a PlayStation-exclusive franchise, but Sony didn't want to pay for it.  And last year, the game was patched up to 1080p on PS4.

 

This year, the game is co-marketed with MS.  Ubi would look kind of bad if the game being advertised during the holidays ran better on the other console.  So while I don't think MS went to them and said "nerf the PS4 version", I do think Ubi didn't want the debate.  Where they ######ed up was to admit they didn't want the debate, then try to pull some "well, there's a CPU bottleneck and we have lots of AI" when the GPU on the PS4 is made to offload things from the CPU like physics and AI.

 

The proof I offer is another major open world game coming out for both consoles with tons of interactive AI -- GTA V, which is 1080p on PS4 (haven't seen what the X1 resolution is confirmed to be).  However, Rockstar dropped the "30fps is cinematic" crap too.


"well, there's a CPU bottleneck and we have lots of AI" when the GPU on the PS4 is made to offload things from the CPU like physics and AI.

 

No idea if the NPCs have to calculate the meaning of life in AC, but the AI that's actually behind those NPCs is, quite frankly, very similar to the AI that was in the first game.


Well, another high-profile example of this happening would be Destiny. At the last minute, the X1 gained parity with the PS4 version, but I didn't see any negative backlash. It's not the numbers that enrage people; it's the comments made by Ubisoft.

 

It all comes down to how Ubisoft delivered the message, honestly. Bungie didn't make a big deal out of it and it kind of just passed by. Ubisoft said way too much and are now paying for it.

