
PS4 and Xbox One resolution / frame rate discussion


#1381 vcfan

    Doing the Humpty Dance

  • 5,128 posts
  • Joined: 12-June 11

Posted 22 October 2014 - 01:17

That's not how it works...

How what works? The two engines render side by side, according to the developer:
 
http://www.eurogamer...rosoft-confirms
 

H2A is running higher res textures, geo, characters and animation AND running the OG engine at the same time

He then revealed exactly what Halo 2: Anniversary is running simultaneously, before suggesting that if the game did not run the original engine the resolution could, in theory, be boosted:

"Two game (graphics) engines - the OG H2 and H2A, and the original audio (music and FX) and completely new music and FX. And the switch is instantaneous. If it weren't running the OG engine it could in theory run at a higher resolution but that's not the intended nature of the project. It's designed to be a remake that lets you switch between the two instantaneously. Now you can feel one way or another about that, but that is indeed the intent."

 
And actually, Halo 2 classic does run at 1080p, so the game is rendering both a 1920x1080 frame and a 1328x1080 frame 60 times per second (3248x1080 at 60fps).
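
To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python; the two resolutions and the 60fps figure come from the posts above, the rest is plain arithmetic:

    # Pixel throughput of Halo 2: Anniversary's dual engines, using the
    # resolutions quoted above. A sketch, not official figures.
    H2A_FRAME = 1328 * 1080        # remastered engine's render target
    OG_FRAME = 1920 * 1080         # classic engine at full 1080p
    FPS = 60

    both = (H2A_FRAME + OG_FRAME) * FPS    # pixels shaded per second
    single = 1920 * 1080 * FPS             # one native-1080p game at 60fps

    print(f"dual engines: {both / 1e6:.0f} Mpix/s")     # ~210 Mpix/s
    print(f"single 1080p: {single / 1e6:.0f} Mpix/s")   # ~124 Mpix/s
    print(f"ratio: {both / single:.2f}x")               # ~1.69x the work

So running both engines costs roughly 1.69x the raw pixel work of a single native-1080p title, which is consistent with Frankie's point that dropping the OG engine would free headroom for a higher resolution.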


#1382 BajiRav

    Neowinian Senior

  • 10,734 posts
  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 22 October 2014 - 03:09

Basically, a five-year-old PC graphics card has more power than the XB1. The CPU is also seriously underpowered. It used to be that consoles had cutting-edge specs and were heavily subsidised over the lifecycle of the console; this generation they're basically budget PCs. It's shocking that Watch_Dogs only runs at 792p on the XB1, as that's simply not 'next-gen'.

 
I think you are way off-topic for this thread. This thread is not about how the PS4 and XB1 are weak compared to cutting-edge PCs. Let's keep the PC master race stuff out of this thread.
This thread was specifically created for comparing the relative performance of the PS4 and XB1.
 

That's not how it works...

That's exactly what Frankie posted on NeoGAF: two graphics engines and two audio streams. Halo: CE Anniversary didn't do that, and hence had a slight blackout period when the framebuffer switch occurred. This time they are running everything all the time and therefore there's no blackout.
Edit: Just realized that vcfan's link already quotes Frankie's post; including it here:
 

Two game (graphics) engines - the OG H2 and H2A, and the original audio (music and FX) and completely new music and FX. And the switch is instantaneous. If it weren't running the OG engine it could in theory run at a higher resolution but that's not the intended nature of the project. It's designed to be a remake that lets you switch between the two instantaneously. Now you can feel one way or another about that, but that is indeed the intent.
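
A minimal sketch of why the switch can be instantaneous, assuming (as Frankie's quote says) that both engines run every frame; the class and method names here are hypothetical, purely for illustration:

    # Hypothetical sketch: both engines simulate and render every frame;
    # the "switch" only changes which finished frame gets presented.
    # Because the inactive engine never stops running, there is no load
    # pause or blackout when the player toggles.
    class DualEngineLoop:
        def __init__(self, og_engine, h2a_engine):
            self.engines = [og_engine, h2a_engine]
            self.active = 1                  # 0 = OG Halo 2, 1 = remaster

        def toggle(self):
            self.active ^= 1                 # instant: just flip an index

        def frame(self, dt, display):
            frames = [e.render(e.update(dt)) for e in self.engines]
            display.present(frames[self.active])  # only one is shown

This would also explain the Halo: CE Anniversary blackout mentioned above: if only the active engine runs, a toggle has to spin the other one up before it has a frame to present.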



#1383 theyarecomingforyou

    Tiger Trainer

  • 16,656 posts
  • Joined: 07-August 03
  • Location: Terra Prime
  • Profession: Jaded Sceptic
  • OS: Windows 10 Preview
  • Phone: Galaxy Note 3 with Galaxy Gear

Posted 22 October 2014 - 09:48

I think you are way off-topic for this thread. This thread is not about how the PS4 and XB1 are weak compared to cutting-edge PCs. Let's keep the PC master race stuff out of this thread.

This topic is about the framerate and resolution of 'next-gen' consoles, which is exactly what I've been discussing. People were expecting 1080p to be the baseline for these consoles, not the 720p of games like Dead Rising 3 or the 792p of Watch_Dogs. As the video I posted shows, it doesn't take much of a performance increase to achieve that, and it was possible with a five-year-old graphics card. My point isn't that PCs perform better (which you would expect given how much more expensive they are) but what could or should have been done to improve the consoles. If console refreshes were released each year, like phones and tablets, then those buying one now would be getting much better performance. Now that consoles are based on PC hardware, there isn't anything preventing that.

 

Unfortunately, what we're seeing now is developers coming under pressure from Microsoft to hit 1080p. Blizzard had Diablo 3 running at 900p on the XB1 to deliver a smooth experience, but Microsoft demanded that they increase it to 1080p, which resulted in framerate drops. Just because a title runs at 1080p on both platforms doesn't mean the performance is the same. Framerate drops really break immersion and ruin the experience for me. Microsoft doesn't want the bad press that comes with sub-1080p titles, but the alternative is more framerate drops, which are generally more noticeable to gamers. It's a strategy that might work against it.
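
Rough numbers on that trade-off (a sketch: the pixel ratio is plain arithmetic from the two resolutions, while the 15 ms frame cost is a hypothetical figure for illustration):

    # Extra pixel work going from 900p to 1080p, and what it does to a
    # 60fps budget if GPU cost scales with pixel count (a simplification;
    # real games are not purely fill-rate bound).
    ratio = (1920 * 1080) / (1600 * 900)
    print(f"{ratio:.2f}x the pixels")            # 1.44x

    budget_ms = 1000 / 60                        # 16.7 ms per frame
    cost_900p = 15.0                             # hypothetical GPU cost
    cost_1080p = cost_900p * ratio               # ~21.6 ms: misses 60fps
    print(f"{cost_1080p:.1f} ms vs a {budget_ms:.1f} ms budget")

A title sitting comfortably inside the 16.7 ms budget at 900p can blow straight past it at 1080p, which is exactly the Diablo 3 situation described above.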



#1384 Emn1ty

    Web Programmer

  • 2,805 posts
  • Joined: 09-April 06
  • Location: Irvine, CA
  • OS: Windows 8.1, OSX Mavericks
  • Phone: Droid Razr

Posted 22 October 2014 - 15:17

People were expecting 1080p to be the baseline for these consoles, not the 720p of games like Dead Rising 3 or the 792p of Watch_Dogs.

 

As far as I know, the baseline has on more than one occasion been 900p or higher. I'm not sure where you're getting that 720p and 792p are the baseline/target.

 

 

As the video I posted shows, it doesn't take much of a performance increase to achieve that, and it was possible with a five-year-old graphics card. My point isn't that PCs perform better (which you would expect given how much more expensive they are) but what could or should have been done to improve the consoles. If console refreshes were released each year, like phones and tablets, then those buying one now would be getting much better performance. Now that consoles are based on PC hardware, there isn't anything preventing that.

 

Unfortunately, what we're seeing now is developers coming under pressure from Microsoft to hit 1080p. Blizzard had Diablo 3 running at 900p on the XB1 to deliver a smooth experience, but Microsoft demanded that they increase it to 1080p, which resulted in framerate drops. Just because a title runs at 1080p on both platforms doesn't mean the performance is the same. Framerate drops really break immersion and ruin the experience for me. Microsoft doesn't want the bad press that comes with sub-1080p titles, but the alternative is more framerate drops, which are generally more noticeable to gamers. It's a strategy that might work against it.

 

And yet the gaming community refuses to accept sub-1080p titles, but on the flip side wants a perfect 60fps. There's no PR win here. Perhaps that's the publishers' and console makers' fault for making it appear that this would be the standard this gen. But it's not.

 

We can throw around anecdotal videos (yes, that comparison is anecdotal) about how PCs from five years ago are more powerful than the PS4/XB1 all day, but that doesn't really represent the architecture in the consoles or compare them fairly. I think the power expectations of the gaming community here were set way too high: expecting $1,200-rig performance out of a $400 console.



#1385 Seketh

    Neowinian

  • 283 posts
  • Joined: 20-March 10

Posted 22 October 2014 - 16:06

As far as I know, the baseline has on more than one occasion been 900p or higher. I'm not sure where you're getting that 720p and 792p are the baseline/target.

 

 

 

And yet the gaming community refuses to accept sub-1080p titles, but on the flip side wants a perfect 60fps. There's no PR win here. Perhaps that's the publishers' and console makers' fault for making it appear that this would be the standard this gen. But it's not.

 

We can throw around anecdotal videos (yes, that comparison is anecdotal) about how PCs from five years ago are more powerful than the PS4/XB1 all day, but that doesn't really represent the architecture in the consoles or compare them fairly. I think the power expectations of the gaming community here were set way too high: expecting $1,200-rig performance out of a $400 console.

 

I think the gaming community (and I share the same opinion) expected that, so many years after the 360 and PS3, we would jump from 720p/30fps to 1080p/60fps.

 

And no, sub-1080p resolutions aren't acceptable given that we're entering the era of 4K. 900p is a fair trade for 60fps, but for me 900p and below is noticeable. It depends on the engine, but I can tell the difference between 1080p and 900p.
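
One reason 900p can be visible on a 1080p panel is the non-integer upscale; a quick sketch of the arithmetic:

    # 900p on a 1080p display needs a 1.2x upscale per axis. With a
    # non-integer factor the source and display pixel grids only align
    # every 5 source rows (6 display rows), so the scaler interpolates
    # everywhere in between, which reads as softness. Plain arithmetic,
    # no measured data.
    for src in (720, 900, 1080):
        print(f"{src}p -> 1080p: {1080 / src:.2f}x upscale")
    # 720p -> 1.50x, 900p -> 1.20x, 1080p -> 1.00x (native, no scaling)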

 

(Then again, that's why instead of buying an Xbox One or a PS4, I bought a PC. 1080p at 60fps FTW.)

 

If this gen can't deliver 1080p at 60fps, then it has failed. But it's not surprising, considering the crappy per-core efficiency of the AMD APU.

 

And when Microsoft talked about balance, they were right. The PS4's GPU is bottlenecked by its CPU; engines that need CPU power will show that, and I believe future engines will hit that wall really fast. It's an assumption, but I don't think it's unrealistic considering the current AMD APU offering.



#1386 Emn1ty

    Web Programmer

  • 2,805 posts
  • Joined: 09-April 06
  • Location: Irvine, CA
  • OS: Windows 8.1, OSX Mavericks
  • Phone: Droid Razr

Posted 22 October 2014 - 17:46

I think the gaming community (and I share the same opinion) expected that, so many years after the 360 and PS3, we would jump from 720p/30fps to 1080p/60fps.

 

And no, sub-1080p resolutions aren't acceptable given that we're entering the era of 4K. 900p is a fair trade for 60fps, but for me 900p and below is noticeable. It depends on the engine, but I can tell the difference between 1080p and 900p.

 

(Then again, that's why instead of buying an Xbox One or a PS4, I bought a PC. 1080p at 60fps FTW.)

 

If this gen can't deliver 1080p at 60fps, then it has failed. But it's not surprising, considering the crappy per-core efficiency of the AMD APU.

 

And when Microsoft talked about balance, they were right. The PS4's GPU is bottlenecked by its CPU; engines that need CPU power will show that, and I believe future engines will hit that wall really fast. It's an assumption, but I don't think it's unrealistic considering the current AMD APU offering.

 

I think I can agree here. But in terms of 4K, it won't be a standard in televisions for some time. Last gen was 720p because the majority of televisions at the time weren't higher-res than 720p, and it took more than half of last gen's lifespan for 1080p to become prevalent in homes. To say 4K is the new standard when there are only a couple of TVs (which cost thousands of dollars more than their 1080p counterparts) and a couple of monitors (which are, IIRC, TN rather than IPS panels, and more than double the price of similar lower-resolution monitors) is a huge stretch.
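
For scale, here's what 'entering 4K' means in raw pixel terms (plain arithmetic; 3840x2160 UHD is assumed since that's what consumer 4K TVs use, and 792p is assumed 16:9, i.e. 1408x792):

    # Pixel counts of the resolutions discussed in this thread,
    # relative to 1080p.
    base = 1920 * 1080
    for name, (w, h) in {"720p": (1280, 720), "792p": (1408, 792),
                         "900p": (1600, 900), "1080p": (1920, 1080),
                         "4K UHD": (3840, 2160)}.items():
        print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")
    # 4K UHD comes out at 4.00x: four times the pixel work of a target
    # the current consoles already struggle to hit.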



#1387 George P

    Neowinian Senior

  • 19,353 posts
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 22 October 2014 - 18:28

How many people are really going to game at 4K? I keep seeing it posted that we're at 4K gaming, but I don't see that as more than a niche segment of the PC master race: the guys who run SLI at this point, or spend $500+ on a video card.

 

For the casual, or rather in this case the average, gamer (console players and those who don't tinker with their PC all the time) to get to 4K, a number of things have to happen. First, lots of homeowners will have to get 4K TVs, and those aren't going to be the majority of TVs in homes anytime soon, IMO; for now, 1080p will remain the default HD resolution. Second, more PC gamers will have to get 4K monitors too. My 24" is 1080p, I'm not looking to run at anything other than its native res, and I can't find well-priced 1440p monitors in my market yet, which is a shame.

 

On top of that, neither of the new consoles can do 4K gaming, so we'll have to wait for the next-next gen, the PS5 and XB One+ or whatever they call them, before we can look at that as a possibility. These things are going to be around for at least five years (well, four, since this first year is almost over), so 2018 for new consoles at best. Maybe then we'll be looking at 4K, but again, will enough people have 4K TVs and monitors by then? I don't know if that will be the case.



#1388 BajiRav

    Neowinian Senior

  • 10,734 posts
  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted Today, 01:58

I think the gaming community (and I share the same opinion) expected that, so many years after the 360 and PS3, we would jump from 720p/30fps to 1080p/60fps.

 

And no, sub-1080p resolutions aren't acceptable given that we're entering the era of 4K. 900p is a fair trade for 60fps, but for me 900p and below is noticeable. It depends on the engine, but I can tell the difference between 1080p and 900p.

 

(Then again, that's why instead of buying an Xbox One or a PS4, I bought a PC. 1080p at 60fps FTW.)

 

If this gen can't deliver 1080p at 60fps, then it has failed. But it's not surprising, considering the crappy per-core efficiency of the AMD APU.

 

And when Microsoft talked about balance, they were right. The PS4's GPU is bottlenecked by its CPU; engines that need CPU power will show that, and I believe future engines will hit that wall really fast. It's an assumption, but I don't think it's unrealistic considering the current AMD APU offering.

I would hazard a guess and say that the majority of games on the 360/PS3 were not even 720p. Is there even a single 1080p/60fps game with AAA graphics on either console?