PS4 and Xbox One resolution / frame rate discussion



It should be optional, as that eliminates the problem.

 

It depends on how much work is put into them. Clearly a lot of work has been put into Halo 2: Anniversary Edition - it's a massive improvement over the original - but it's not up to AAA standards.

 

Last generation, developers were required to target 720p, with some exceptions allowed. This generation has been much less strict, and that has led to a lot of problems for Microsoft, which has been the worst affected. It's a balance between developer freedom and consumer expectations. Microsoft's brand has been damaged by allowing low-resolution releases.

 

I don't disagree, but the end result for consumers is 'next-gen' consoles that aren't enough of an improvement over the last generation, especially when we're talking about games running at 720p / 792p. The visual fidelity is increased but it's not enough of a jump. I would much rather see the console manufacturers move to a Steam Machines model, whereby you have an open platform or at least multiple models to choose from. That's what was originally intended for the Xbox. That way, if someone is happy with 720p they can buy the budget model, whereas those who want 4K can buy the premium model. All the settings would be optimised automatically, so it wouldn't be like PC gaming where you have to manually configure everything.

 

The idea of fixed-spec consoles that last seven years is incredibly outdated and I don't see a future in it, not at the rate that mobile phones and tablets are improving. Already you're starting to see Android machines moving into the living room, and Steam Machines are around the corner. Microsoft and Sony have a lot more competition from all directions this generation. They're struggling to hit 1080p while 4K gaming is starting to take off on PC.

I'm not sure about the rest of the world, but in the US of A the hardcore gamer is not the majority buyer... it's little Timmy's and little Suzie's parents and the casual gamers...

Going to a model with "too many options" will be more of a headache than a good thing. Us geeks who know this stuff are the minority, not the majority. You want a world that is basically too pricey for a lot of people to join in on.

$400 is still a back breaker for a lot of people the world over.

These consoles just happened to be developed and released when the world wasn't in the best economic state. And investors aren't going to take a hit of millions of dollars early on, with no guarantee that the money will be made back and then some.

You keep talking about 1080p/60 for every game. It just isn't going to happen.

Ryse has shown me that it's fine if it doesn't reach that goal all the time.


I'm not sure about the rest of the world, but in the US of A the hardcore gamer is not the majority buyer... it's little Timmy's and little Suzie's parents and the casual gamers...

Going to a model with "too many options" will be more of a headache than a good thing. Us geeks who know this stuff are the minority, not the majority. You want a world that is basically too pricey for a lot of people to join in on.

I'm not talking about 'too many' options. What I'm proposing is something similar to that of mobile phones - various models with different performance levels. Just because some people are happy with budget phones like the Lumia 520 doesn't mean that others won't want an iPhone 6 Plus or a Galaxy Note 4. At the moment there are no premium options for consoles in terms of performance.


At the moment there are no premium options for consoles in terms of performance.

 

They perform better than a $600 PC would for gaming; that in and of itself makes them worth the money in my opinion. If you want 1080p/60fps on a PC you're going to be shelling out $400 minimum for a GPU and CPU that can do it (not to mention the rest of the hardware you have to buy to make the PC work).

 

Consoles have been, and always will be, budget gaming rigs. They are more powerful per dollar spent. It may not be as revolutionary this gen as it was last, but honestly I don't think that was the goal of this gen. It's sad, true, but I prefer looking at things as they are and not as we perceive they should be.


What does that have to do with anything? :huh: I was commenting on the video of Halo 2: Anniversary Edition, which is graphically weak (even allowing for YouTube's video compression) and runs at 1328x1080 (which is fewer pixels than 900p). Considering it doesn't push the graphical envelope, that's pretty poor.

 

It's rendering two engines simultaneously (classic and anniversary) so you can instantly switch between them. Essentially it is rendering 2656x1080 pixels at 60fps.


It's rendering two engines simultaneously (classic and anniversary) so you can instantly switch between them. Essentially it is rendering 2656x1080 pixels at 60fps.

That's not how it works...


That's not how it works...

How what works? The two engines render side by side, according to the developer.

 

http://www.eurogamer.net/articles/2014-10-07-halo-2-anniversary-campaign-isnt-quite-1080p-microsoft-confirms

 

H2A is running higher res textures, geo, characters and animation AND running the OG engine at the same time

He then revealed exactly what Halo 2: Anniversary is running simultaneously, before suggesting that if the game did not run the original engine the resolution could, in theory, be boosted:

"Two game (graphics) engines - the OG H2 and H2A, and the original audio (music and FX) and completely new music and FX. And the switch is instantaneous. If it weren't running the OG engine it could in theory run at a higher resolution but that's not the intended nature of the project. It's designed to be a remake that lets you switch between the two instantaneously. Now you can feel one way or another about that, but that is indeed the intent."

 

And actually, Halo 2 classic does run at 1080p, so the game is rendering both a 1080p frame and a 1328x1080 frame 60 times per second (3248x1080 at 60fps).
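To put rough numbers on those pixel counts, here's a quick back-of-the-envelope sketch. It only adds up the framebuffer sizes quoted above (1328x1080 for the remastered engine, 1920x1080 for classic) and says nothing about the actual rendering cost of either engine:

```python
# Back-of-the-envelope arithmetic only: it reproduces the pixel counts quoted
# in the thread and ignores the real GPU cost of either engine.
H2A_FRAME = 1328 * 1080      # remastered engine resolution, as reported
CLASSIC_FRAME = 1920 * 1080  # original engine, assumed native 1080p here
FPS = 60

native_1080p = 1920 * 1080                      # 2,073,600 px
combined_per_frame = H2A_FRAME + CLASSIC_FRAME  # 3,507,840 px

print(f"H2A frame:          {H2A_FRAME:,} px ({H2A_FRAME / native_1080p:.0%} of 1080p)")
print(f"Both frames:        {combined_per_frame:,} px per tick")
print(f"Pixels per second:  {combined_per_frame * FPS:,}")
```

Even on that crude measure, the combined output works out to roughly 1.7x the pixels of a single native 1080p frame.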


Basically, a five-year-old PC graphics card has more power than the XB1. The CPU is also seriously underpowered. It used to be that consoles had cutting-edge specs and were heavily subsidised over the lifecycle of the console - this generation they're basically budget PCs. It's shocking that Watch_Dogs only runs at 792p on XB1, as that's simply not 'next-gen'.

 

I think you are way off-topic for this thread. This thread is not about how PS4 and XB1 are weak compared to cutting edge PCs. Let's keep the PC master race stuff out of this thread.

This thread was specifically created for comparing relative performance of PS4 and XB1.

 

That's not how it works...

That's exactly what Frankie posted on NeoGAF. Two GFX engines and two audio streams. Halo: CEA didn't do that and hence had a slight blackout period when the framebuffer switch occurred. This time they are running everything all the time, and therefore there's no blackout.

Edit: Just realized that vcfan's link already quotes Frankie's post; including it here anyway:

 

Two game (graphics) engines - the OG H2 and H2A, and the original audio (music and FX) and completely new music and FX. And the switch is instantaneous. If it weren't running the OG engine it could in theory run at a higher resolution but that's not the intended nature of the project. It's designed to be a remake that lets you switch between the two instantaneously. Now you can feel one way or another about that, but that is indeed the intent.


I think you are way off-topic for this thread. This thread is not about how PS4 and XB1 are weak compared to cutting edge PCs. Let's keep the PC master race stuff out of this thread.

This topic is about the framerate and resolution of 'next-gen' consoles, which is exactly what I've been discussing. People were expecting 1080p to be the baseline for these consoles, not the 720p of games like Dead Rising 3 or the 792p of Watch_Dogs. As the video I posted shows, it doesn't take much of a performance increase to achieve that, and it was possible with a five-year-old graphics card. My point isn't that PC performs better - which you would expect given how much more expensive they are - but what could or should have been done to improve the consoles. If console refreshes were released each year like phones and tablets then those buying one now would be receiving much better performance. Now that they're based on PC hardware there isn't anything preventing that.

 

Unfortunately, what we're seeing now is developers coming under pressure from Microsoft to hit 1080p. Blizzard had Diablo 3 running at 900p on XB1 to deliver a smooth experience, but Microsoft demanded that they increase it to 1080p, which resulted in framerate drops. Just because a title runs at 1080p on both platforms doesn't mean the performance is the same. Framerate drops really break immersion and ruin the experience for me. Microsoft doesn't want the bad press that comes from sub-1080p titles, but the alternative is more framerate drops, which are generally more noticeable to gamers. It's a strategy that might work against it.
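To put rough numbers on the resolutions being thrown around here, below is a quick sketch of the per-frame pixel counts. It assumes the commonly reported 1408x792 for Watch_Dogs and the usual 16:9 figures for the others, and it ignores engine and anti-aliasing differences entirely:

```python
# Rough pixel-count comparison of the resolutions discussed in this thread.
# 1408x792 is the commonly reported Watch_Dogs XB1 resolution (assumption);
# the rest are the standard 16:9 figures.
resolutions = {
    "720p":  (1280, 720),
    "792p":  (1408, 792),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels:>9,} px  ({pixels / full_hd:.0%} of 1080p)")

# The 900p -> 1080p bump reportedly requested for Diablo 3:
print(f"1080p pushes {full_hd / (1600 * 900) - 1:.0%} more pixels per frame than 900p")
```

On that count alone, going from 900p to 1080p means pushing about 44% more pixels per frame, which goes some way towards explaining the framerate drops.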


People were expecting 1080p to be the baseline for these consoles, not the 720p of games like Dead Rising 3 or the 792p of Watch_Dogs.

 

As far as I know, the baseline has on more than one occasion been 900p+. Not sure where you're getting that 720p and 792p are the baseline/target.

 

 

As the video I posted shows, it doesn't take much of a performance increase to achieve that, and it was possible with a five-year-old graphics card. My point isn't that PC performs better - which you would expect given how much more expensive they are - but what could or should have been done to improve the consoles. If console refreshes were released each year like phones and tablets then those buying one now would be receiving much better performance. Now that they're based on PC hardware there isn't anything preventing that.

 

Unfortunately, what we're seeing now is developers coming under pressure from Microsoft to hit 1080p. Blizzard had Diablo 3 running at 900p on XB1 to deliver a smooth experience, but Microsoft demanded that they increase it to 1080p, which resulted in framerate drops. Just because a title runs at 1080p on both platforms doesn't mean the performance is the same. Framerate drops really break immersion and ruin the experience for me. Microsoft doesn't want the bad press that comes from sub-1080p titles, but the alternative is more framerate drops, which are generally more noticeable to gamers. It's a strategy that might work against it.

 

And yet the gaming community refuses to accept sub-1080p titles, but on the flip side wants a perfect 60fps. There's no PR win here. Perhaps that's the publishers' and console makers' fault for making it appear that would be the standard this gen. But it's not.

 

We can throw around anecdotal videos (yes, that comparison is anecdotal) about how PCs from five years ago are more powerful than the PS4/X1 all day. But that doesn't really represent the architecture in the consoles or compare them fairly. I think the power expectations of the gaming community here were set way too high - expecting $1200 rig performance out of a $400 console.


As far as I know, the baseline has on more than one occasion been 900p+. Not sure where you're getting that 720p and 792p are the baseline/target.

 

 

 

And yet the gaming community refuses to accept sub-1080p titles, but on the flip side wants a perfect 60fps. There's no PR win here. Perhaps that's the publishers' and console makers' fault for making it appear that would be the standard this gen. But it's not.

 

We can throw around anecdotal videos (yes, that comparison is anecdotal) about how PCs from five years ago are more powerful than the PS4/X1 all day. But that doesn't really represent the architecture in the consoles or compare them fairly. I think the power expectations of the gaming community here were set way too high - expecting $1200 rig performance out of a $400 console.

 

I think the gaming community (and I share the same opinion) expected that, so many years after the 360 and PS3, we would jump from 720p/30fps to 1080p/60fps.

 

And no, sub-1080p resolutions aren't acceptable given that we're entering 4K resolutions. 900p is a fair trade for 60fps, but for me 900p and below is noticeable. It depends on the engine, but I can tell the difference between 1080p and 900p.

 

(Then again, that's why instead of buying an Xbox One or a PS4, I bought a PC. 1080p x 60 FTW.)

 

If this gen can't deliver 1080p x 60fps, then it has failed. But it's not surprising, considering the crappy core efficiency of the AMD APU.

 

And when Microsoft talked about 'balance', they were right. The PS4's GPU is bottlenecked; engines that need CPU power will show that, and I believe future engines will hit that wall really fast. It's an assumption, but I don't think it's unrealistic considering the current AMD APU offering.


I think the gaming community (and I share the same opinion) expected that, so many years after the 360 and PS3, we would jump from 720p/30fps to 1080p/60fps.

 

And no, sub-1080p resolutions aren't acceptable given that we're entering 4K resolutions. 900p is a fair trade for 60fps, but for me 900p and below is noticeable. It depends on the engine, but I can tell the difference between 1080p and 900p.

 

(Then again, that's why instead of buying an Xbox One or a PS4, I bought a PC. 1080p x 60 FTW.)

 

If this gen can't deliver 1080p x 60fps, then it has failed. But it's not surprising, considering the crappy core efficiency of the AMD APU.

 

And when Microsoft talked about 'balance', they were right. The PS4's GPU is bottlenecked; engines that need CPU power will show that, and I believe future engines will hit that wall really fast. It's an assumption, but I don't think it's unrealistic considering the current AMD APU offering.

 

I think I can agree here. But in terms of 4K, it won't be a standard in televisions for some time. Last gen was 720p because the majority of televisions at the time weren't higher res than 720p. It took more than half of last gen's lifespan for 1080p to become prevalent in homes. To say 4K is the new standard, when there are only a couple of TVs (which are thousands of dollars more expensive than their 1080p counterparts) and a couple of monitors (which are, IIRC, TN rather than IPS panels and more than double the price of similar lower-resolution monitors), is a huge stretch.


How many people are really going to game at 4K? I keep seeing it posted that we're at 4K gaming, but I don't see that as more than a niche segment of the PC crowd at this point - the guys who run SLI or spend $500+ on a video card.

 

For the casual, or rather in this case average, gamer (console players and those who don't tinker with their PC all the time) to get to 4K, a number of things have to happen. Lots of homeowners will have to get 4K TVs, and that isn't going to be the majority of TVs in homes anytime soon, IMO - for now 1080p will be the default HD resolution going forward. Second, more PC gamers will have to get 4K monitors too. My 24" is 1080p, I'm not looking to run at anything other than its native res, and I can't find well-priced 1440p monitors in my market yet, which is a shame.

 

On top of that, neither of the new consoles can do 4K gaming, so we'll have to wait for the next-next gen - PS5 and XB One+ or whatever they call them - before we can look at that as a possibility. These things are going to be around for at least five years, well, four since this first year is almost over now, so 2018 for new consoles at best. Maybe then we'll be looking at 4K, but again, will enough people have 4K TVs and monitors by then? I don't know if that will be the case.


I think the gaming community (and I share the same opinion) expected that, so many years after the 360 and PS3, we would jump from 720p/30fps to 1080p/60fps.

 

And no, sub-1080p resolutions aren't acceptable given that we're entering 4K resolutions. 900p is a fair trade for 60fps, but for me 900p and below is noticeable. It depends on the engine, but I can tell the difference between 1080p and 900p.

 

(Then again, that's why instead of buying an Xbox One or a PS4, I bought a PC. 1080p x 60 FTW.)

 

If this gen can't deliver 1080p x 60fps, then it has failed. But it's not surprising, considering the crappy core efficiency of the AMD APU.

 

And when Microsoft talked about 'balance', they were right. The PS4's GPU is bottlenecked; engines that need CPU power will show that, and I believe future engines will hit that wall really fast. It's an assumption, but I don't think it's unrealistic considering the current AMD APU offering.

I would hazard a guess and say that the majority of the games on 360/PS3 were not even 720p. Is there even a single 1080p/60fps game with AAA graphics on either console?


The idea of fixed-spec consoles that last seven years is incredibly outdated and I don't see a future in it, not at the rate that mobile phones and tablets are improving. Already you're starting to see Android machines moving into the living room, and Steam Machines are around the corner. Microsoft and Sony have a lot more competition from all directions this generation. They're struggling to hit 1080p while 4K gaming is starting to take off on PC.

If you start putting out more powerful versions of a console, you run the risk of splintering development. There is a clear cost tied to the graphical fidelity of a game, and that means not all developers will be able to properly optimize for every version of a console. Look at PC development: sometimes games are made to target a lower performance level to reach the widest audience. So while something like 4K is a thing on the PC, the majority of games are not developed around that target. That is unlikely to change for quite some time.

If MS had more powerful models, pricing would become an issue. People hesitate to buy a console past $500. Is there really a big enough market for a $600, $700, or $800 console even if the specs are higher? Why not just build a PC at that point? Anyone who is committed solely to the PC is unlikely to be tempted by it. I would guess it leads to few games truly taking advantage of the higher-end models because most people bought the cheapest one. Or even worse, customers don't buy any model because the cheapest feels like bad value and yet they can't afford the higher-priced models.

Steam Machines are very much a question mark. Their limited release isn't exactly earth-shattering, and it's still basically selling a PC.

I just think consoles have to stick to the current model due to consumer behavior. The end game may be streaming boxes where there is little local processing.


I think I can agree here. But in terms of 4K, it won't be a standard in televisions for some time. Last gen was 720p because the majority of televisions at the time weren't higher res than 720p. It took more than half of last gen's lifespan for 1080p to become prevalent in homes. To say 4K is the new standard, when there are only a couple of TVs (which are thousands of dollars more expensive than their 1080p counterparts) and a couple of monitors (which are, IIRC, TN rather than IPS panels and more than double the price of similar lower-resolution monitors), is a huge stretch.

 

I didn't say that; I said we're entering 4K resolutions. As in, the technology is being introduced.

 

Just to make it clear.


You know, for all the flak Ubisoft is getting, the PC requirements for Unity are on the beefy side. With requirements like that, I'm not surprised they decided to lock the console versions low. I doubt my PC will play the game at 1080p and 60fps with settings set to high. If I have to go in and turn down the graphics quality just to try to hit 60 frames or 1080p, I'd rather take lower frames and keep the game looking great.


Not a tech analysis, but from the Eurogamer review of Sunset Overdrive:

 

Comboing, on the other hand, is a trick Insomniac supports with some beautiful tech, rendering a city that feels solid and intricate, while handling dozens of swarming foes and millions of particle effects at 30 fps without skipping a frame.

http://www.eurogamer.net/articles/2014-10-27-sunset-overdrive-review


Not a tech analysis, but from the Eurogamer review of Sunset Overdrive:

 

http://www.eurogamer.net/articles/2014-10-27-sunset-overdrive-review

 

And yet do they ever say that 30fps is an issue? I doubt it. People like to debate this, sure, but I still feel that a solid 30fps, depending on the game type (not a first-person shooter, for example), works fine. And I'd really rather not drop graphics quality in order to hit 60fps.


You know, for all the flak Ubisoft is getting, the PC requirements for Unity are on the beefy side. With requirements like that, I'm not surprised they decided to lock the console versions low. I doubt my PC will play the game at 1080p and 60fps with settings set to high. If I have to go in and turn down the graphics quality just to try to hit 60 frames or 1080p, I'd rather take lower frames and keep the game looking great.

I can't be doing with sub-60fps and I'd much rather turn the detail down slightly than put up with poor performance. That said, I've typically found Ubisoft games to be well optimised for PC and to take advantage of platform-exclusive features (like DX10/11 back in the day). Far Cry 3 ran great at 2560x1600 @ 60fps, which was impressive given how much more demanding that is than the 720p @ 30fps that the X360 and PS3 run at (you're talking about 4.5x the resolution at twice the framerate).
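For what it's worth, here's a quick sanity check of that claim, assuming 1280x720 at 30fps for the console versions:

```python
# Rough comparison of Far Cry 3 at the PC settings mentioned above vs the
# assumed last-gen console target of 1280x720 at 30fps.
pc_pixels = 2560 * 1600       # 4,096,000 px per frame
console_pixels = 1280 * 720   #   921,600 px per frame

res_ratio = pc_pixels / console_pixels                       # ~4.4x the pixels
throughput_ratio = (pc_pixels * 60) / (console_pixels * 30)  # ~8.9x pixels/second

print(f"Resolution ratio:     {res_ratio:.1f}x")
print(f"Pixels/second ratio:  {throughput_ratio:.1f}x")
```

So roughly 4.4x the pixels per frame, or close to 9x the pixels per second once the doubled framerate is included.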

 

As much as I dislike Uplay, I have to say that Ubisoft is pretty good when it comes to PC support. The minimum specs for Assassin's Creed: Unity are a nearly five-year-old CPU, 8GB of RAM (which has been standard for years and is cheap) and a two-and-a-half-year-old GPU (admittedly top end) - they're pretty high due to the GPU requirements, but it depends what experience they'll deliver. Given that the recommended specs aren't much higher, perhaps we shouldn't read too much into it. If it performs poorly on both PC and next-gen consoles then one has to question the wisdom of pushing graphics beyond what current hardware is able to support. We'll soon see whether this is a case of Ubisoft pushing the graphics too much or of the next-gen consoles being underpowered.

