Digital Foundry: Hands-on with PlayStation 4



It's Sony's time to shine. After a generation that kicked off with compromised multi-platform ports mixed in with some ground-breaking first-party efforts on PS3, there's a sense that the company could retake a position of supremacy in the console arena by virtue of the new console's rock-solid core specs. And that's before we factor in the goodwill pouring out from enthusiast gamers over its decision to stick with existing DRM standards. Unlike many of Microsoft's show-floor demos this E3, the PS4 demos all ran on genuine hardware, giving us a far more honest insight into the console's potential, even if it made for a brace of nail-biting "will it? won't it?" moments during the conference.

 

Not everything goes to plan, but when it comes to the games it's clear there's a carefully plotted agenda for the PS4's tentative late 2013 launch. While Xbox One banner titles like Forza Motorsport 5 and Killer Instinct are running at a blistering native 1080p at 60 frames per second on day one, there's nothing truly equivalent competing on the Sony side in terms of fighters and racing sims. However, between Infamous: Second Son and Killzone: Shadow Fall, the bases are thoroughly covered for first- and third-person action titles, while Mark Cerny's mysterious platformer Knack brings some light-hearted respite to the launch lineup.

 

The question is, with its faster unified GDDR5 memory and beefier GPU, does Sony's first wave of titles actually bring the 1080p60 dream into focus? Or do these games more accurately represent a continuation of the established console 30fps template? During our time at E3, we got a chance to test almost every PS4 title on display, allowing us to get a feel for where development stands for each of these core titles, plus some of the third-party efforts too.

 

Source


I'm overly disappointed that all the games were running around the 20fps-30fps mark. This close to release, the dev kits really should be stable enough to provide 60fps without hesitation. It appears the developers are really struggling.


The release date is coming pretty quick and it seems that both MS and Sony have a ton of work to do.

As much as I want to buy a new console at launch, I'm starting to think that both will be extremely rushed in order to make the holiday season, and will be plagued with issues.


I'm also surprised and "disappointed" that the PS4 games were not running at a full 60fps. I put "disappointed" in quotation marks because 30fps is fast enough that humans can't tell the difference, but the games should have been consistently hitting 30fps with no dips. That's kind of a let down.


I'm also surprised and "disappointed" that the PS4 games were not running at a full 60fps. I put "disappointed" in quotation marks because 30fps is fast enough that humans can't tell the difference, but the games should have been consistently hitting 30fps with no dips. That's kind of a let down.

:s Uh, that is just so very wrong.


I'm also surprised and "disappointed" that the PS4 games were not running at a full 60fps. I put "disappointed" in quotation marks because 30fps is fast enough that humans can't tell the difference, but the games should have been consistently hitting 30fps with no dips. That's kind of a let down.

 

 

The human eye can technically detect up to 1000 fps, but the commonly "accepted" number is 150 fps.


:s Uh, that is just so very wrong.

 

 

The human eye can technically detect up to 1000 fps, but the commonly "accepted" number is 150 fps.

 

How many we can technically see vs how many are needed in order to create the illusion of motion are two different questions. In the context of games (like my comment), 30 is more than enough to do this. So I don't understand why people believe 60 will look better. I have seen games run at both, and it's almost impossible to tell the difference.

 

Side-note: I wonder if the 30fps cap is Sony's solution to keep the hardware from overheating? There were rumors about the Xbone overheating, and they both have similar hardware. I wonder if this is related?


Don't forget a lot of the demos at E3 are month-old builds, so many of them are from around the beginning of May. Considering the launch games will go gold around mid-October, that still leaves five and a half months of development. Not a lot, but still more than enough to do some tuning.


I'm overly disappointed that all the games were running around the 20fps-30fps mark. This close to release, the dev kits really should be stable enough to provide 60fps without hesitation. It appears the developers are really struggling.

 

I thought it was built around an x86 processor with a Radeon GPU - identical to how PC games have been made for the past decade. No idea how struggling is possible.

How many we can technically see vs how many are needed in order to create the illusion of motion are two different questions. In the context of games (like my comment), 30 is more than enough to do this. So I don't understand why people believe 60 will look better. I have seen games run at both, and it's almost impossible to tell the difference.

 

Side-note: I wonder if the 30fps cap is Sony's solution to keep the hardware from overheating? There were rumors about the Xbone overheating, and they both have similar hardware. I wonder if this is related?

 

60fps is better for 'high action' where you wouldn't get as much blur. Also, at 60fps you can drop frames and it will still appear quite smooth; drop any frames at 30fps and it stutters.
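
Rough numbers, as a minimal sketch: assume a 60Hz vsynced display, where the image can only change on a refresh boundary, so one missed deadline holds the old frame for one extra refresh (~16.7ms). None of this is exact for every game, it's just the arithmetic:

    # Frame-time arithmetic on an assumed 60Hz vsynced display.
    # Assumption: a missed deadline holds the old frame one extra refresh.
    REFRESH_MS = 1000 / 60  # one refresh: ~16.7ms

    for target_fps in (60, 30):
        steady_ms = 1000 / target_fps       # frame time when the game keeps up
        hitch_ms = steady_ms + REFRESH_MS   # frame time when one deadline slips
        print(f"{target_fps}fps: steady {steady_ms:.1f}ms, "
              f"one slip {hitch_ms:.1f}ms (momentarily {1000 / hitch_ms:.0f}fps)")

    # 60fps: steady 16.7ms, one slip 33.3ms (momentarily 30fps)
    # 30fps: steady 33.3ms, one slip 50.0ms (momentarily 20fps)

A slip at 60fps still leaves you at a momentary 30fps, which most people read as smooth; a slip at 30fps holds one image for 50ms, a momentary 20fps, which reads as a visible hitch.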

30fps would use half the graphics power, one would assume, so it would produce half the related waste heat. It's a very safe bet they locked it to that to prevent any system failures during an initial showing demo. If not, that's not a reassuring sign for 4K resolutions. Microsoft's demos were running on PCs, so they don't exactly count.


I thought it was built around an x86 processor with a Radeon GPU - identical to how PC games have been made for the past decade. No idea how struggling is possible.

 

60fps is better for 'high action' where you wouldn't get as much blur. Also, at 60fps you can drop frames and it will still appear quite smooth; drop any frames at 30fps and it stutters.

30fps would use half the graphics power, one would assume, so it would produce half the related waste heat. It's a very safe bet they locked it to that to prevent any system failures during an initial showing demo. If not, that's not a reassuring sign for 4K resolutions. Microsoft's demos were running on PCs, so they don't exactly count.

 

Just because it uses a PC architecture doesn't mean it's like developing on a PC; it's still completely different. If the console isn't stable and finalized properly with the right SDK tools, you're going to see the result you're seeing now.


Here's a question for those knowledgeable on these things... Why is it always 30fps or 60fps? Why not 40fps, 45fps or 50fps?

 

I'm not bothered myself, as I barely notice the difference from 30fps to 60fps. I'd be happy with there just being no slowdowns in any titles (looking at you, Codemasters).

 

I also wonder if anything will be gained when the games are run as finished products on finished hardware. My little experience with developing a Windows 8 game says yes, but a WinJS game is far removed from anything on consoles.


How many we can technically see vs how many are needed in order to create the illusion of motion are two different questions. In the context of games (like my comment), 30 is more than enough to do this. So I don't understand why people believe 60 will look better. I have seen games run at both, and it's almost impossible to tell the difference.

 

Side-note: I wonder if the 30fps cap is Sony's solution to keep the hardware from overheating? There were rumors about the Xbone overheating, and they both have similar hardware. I wonder if this is related?

A game running at 60 FPS will look better than one running at 30 FPS because the motion will look smoother. It's less noticeable with real-time strategy games because it's harder to notice huge changes in motion from an aerial viewpoint. But with first-person shooters it's easier to notice, so it's usually masked by motion blur. You can test this yourself in a game like Counter-Strike: Global Offensive or Team Fortress 2. Play a match and set the max frame-rate to 60 FPS. Move around and pay attention to the fluidity of movement. And then set it to 30 FPS. You'll notice less fluidity of movement when you look up and down or side to side.

 

The main issue with 30 FPS though is that any drop in frame-rate is more noticeable than at 60 FPS. If developers can make their games run at 30 FPS regardless of what's going on, then most players won't be able to tell what the frame-rate is (unless they play the game at 60 FPS too, for reference). We're still far away from that with the PS4, and presumably the XB1 too.

 

I wonder how DICE are going to pull off 60 FPS with a game like Battlefield 4 though.


This is why I see no real rush to buy the systems, more so with the PS4, since lots of the games (well, any exclusives I would care about) aren't out until sometime in 2014, which could be this time 2014 or holiday 2014, so who knows. I'm sure the developers can crank up the FPS, but the key is whether it's going to be smooth and constant or jump all over the place. If it's not smooth and constant with only minor dips, then all that "raw horsepower GPU" hoopla fans are talking about will be for nothing.

 

You could also chalk it up to being the start of the generation, yet I really, really doubt complexity of coding for the PS4 will be a good excuse like it was for the PS3. We're talking about a generic PC here, even more so than what MS has done with the XB1. Developers should be more than able to optimize the heck out of their games for the PS4 from very early on.


This is why I see no real rush to buy the systems, more so with the PS4, since lots of the games (well, any exclusives I would care about) aren't out until sometime in 2014, which could be this time 2014 or holiday 2014, so who knows. I'm sure the developers can crank up the FPS, but the key is whether it's going to be smooth and constant or jump all over the place. If it's not smooth and constant with only minor dips, then all that "raw horsepower GPU" hoopla fans are talking about will be for nothing.

 

You could also chalk it up to being the start of the generation, yet I really, really doubt complexity of coding for the PS4 will be a good excuse like it was for the PS3. We're talking about a generic PC here, even more so than what MS has done with the XB1. Developers should be more than able to optimize the heck out of their games for the PS4 from very early on.

That's the drawback of launch titles. They'll always be the worst in terms of image quality and performance. And that's one reason why I don't plan on buying a next-gen console in 2013. I'll wait until late 2014 at the earliest. My best friend plans on doing the same thing. We've considered simply upgrading our gaming PCs to tide us over until more next-gen titles are released.


Don't forget a lot of the demos at E3 are month-old builds, so many of them are from around the beginning of May. Considering the launch games will go gold around mid-October, that still leaves five and a half months of development. Not a lot, but still more than enough to do some tuning.

I thought there was a rumor that Xbox One was "6 months" behind schedule, yet it had the only 1080p 60fps game on demo (Forza 5).

That's the drawback of launch titles. They'll always be the worst in terms of image quality and performance. And that's one reason why I don't plan on buying a next-gen console in 2013. I'll wait until late 2014 at the earliest. My best friend plans on doing the same thing. We've considered simply upgrading our gaming PCs to tide us over until more next-gen titles are released.

This can be easily validated by checking out how CoD2 looks today; IIRC it was a launch title for the Xbox 360. :)


I'm not really surprised, honestly; half these games aren't finished and are still getting optimised for the finished hardware (while they would have had dev kits for a while, the actual finished hardware itself is quite new).

The human eye can technically detect up to 1000 fps, but the commonly "accepted" number is 150 fps.

Human eyes don't see in distinct "frames" though; depending on the person and the situation, you can get people who can see extremely quick visual changes (something like 1/1200th of a second).

How many we can technically see vs how many are needed in order to create the illusion of motion are two different questions. In the context of games (like my comment), 30 is more than enough to do this. So I don't understand why people believe 60 will look better. I have seen games run at both, and it's almost impossible to tell the difference.

...

It's certainly not "impossible" to see the difference between 30fps and 60fps; the difference is huge. Look at all the complaints about The Hobbit because it went from 24fps to 48fps (and so the motion was much smoother, too smooth for some people). The only reason we consider such low framerates to be "ok" is because they're either blurred to hell and back, or we're just used to them (as is the case with games on consoles; we've had the last 7-8 years to get used to them struggling to hit 30fps).


Here's a question for those knowledgeable on these things... Why is it always 30fps or 60fps? Why not 40fps, 45fps or 50fps?

 

I'm not bothered myself, as I barely notice the difference from 30fps to 60fps. I'd be happy with there just being no slowdowns in any titles (looking at you, Codemasters).

 

I also wonder if anything will be gained when the games are run as finished products on finished hardware. My little experience with developing a Windows 8 game says yes, but a WinJS game is far removed from anything on consoles.

Refresh rates on televisions are typically 60Hz, so games target frame rates that divide evenly into the refresh: 60fps, or half that at 30fps.
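
To see why something like 45fps doesn't work, here's a minimal sketch, assuming a 60Hz panel with vsync, where a finished frame waits for the next refresh boundary before it can appear (swap_refreshes is just a made-up helper for the illustration):

    REFRESH_HZ = 60  # a typical TV refreshes 60 times per second

    def swap_refreshes(fps, n_frames=8):
        """Refresh index at which each of the first n frames appears,
        assuming vsync: a finished frame waits for the next refresh."""
        # integer ceiling of i * REFRESH_HZ / fps (avoids float rounding)
        return [(i * REFRESH_HZ + fps - 1) // fps for i in range(1, n_frames + 1)]

    for fps in (60, 30, 45):
        swaps = swap_refreshes(fps)
        holds = [b - a for a, b in zip([0] + swaps, swaps)]
        print(f"{fps}fps -> each frame held for {holds} refreshes")

    # 60fps -> each frame held for [1, 1, 1, 1, 1, 1, 1, 1]  (even: smooth)
    # 30fps -> each frame held for [2, 2, 2, 2, 2, 2, 2, 2]  (even: smooth)
    # 45fps -> each frame held for [2, 1, 1, 2, 1, 1, 2, 1]  (uneven: judder)

At 45fps, some frames stay on screen for one refresh and some for two, and that uneven cadence is visible as judder; 40fps and 50fps have the same problem on a 60Hz set.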


Human eyes don't see in distinct "frames" though; depending on the person and the situation, you can get people who can see extremely quick visual changes (something like 1/1200th of a second).

 

I never said eyes see in "frames".  I simply stated the medical fact that human eyes can distinguish the difference in fps up to 150 and the brain has the capacity to perceive up to 1000 fps.

 

The person I quoted said humans can't tell the difference of anything over 30 fps, and that's completely false.

How many we can technically see vs how many are needed in order to create the illusion of motion are two different questions. In the context of games (like my comment), 30 is more than enough to do this. So I don't understand why people believe 60 will look better. I have seen games run at both, and it's almost impossible to tell the difference.

 

The brain and the eyes are two different entities.  The brain can perceive data coming in at up to 1000 fps, but the eyes generally can't tell the difference over 150 fps.  Maybe you don't have great vision, but it's extremely easy to see the difference between 30 fps and 60 fps.

 

http://boallen.com/fps-compare.html


My point was and is that yes, you CAN see differences if you're specifically looking for them or are looking at them side-by-side. But who is actually doing that while they're playing a game? I certainly don't. I'm more interested in what's happening in the game rather than noticing a bit of motion blur when you turn too fast. That's especially true with multiplayer games.

 

So I don't see what the big deal is that the PS4 launch games are "only" running at 1080p30. It will still look great, and as long as it's smooth, I don't care.

 

Would it be better/nicer if they were running at 1080p60? Of course, but they're not all of a sudden sh-- just because they aren't.


My point was and is that yes, you CAN see differences if you're specifically looking for them or are looking at them side-by-side. But who is actually doing that while they're playing a game? I certainly don't. I'm more interested in what's happening in the game rather than noticing a bit of motion blur when you turn too fast. That's especially true with multiplayer games.

 

So I don't see what the big deal is that the PS4 launch games are "only" running at 1080p30. It will still look great, and as long as it's smooth, I don't care.

 

Would it be better/nicer if they were running at 1080p60? Of course, but they're not all of a sudden sh-- just because they aren't.

 

It matters because all the competition's games are running at 1080p 60fps, and that's been confirmed. This includes Ryse, Forza 5, MGS5, Titanfall and ALL of EA's games including BF4. They stated it numerous times in their press conference.

 

All of the PS4's games are struggling to hit 30fps. It's a very worrying issue this late in the development cycle, it really is.


This topic is now closed to further replies.