PS4 and Xbox One resolution / frame rate discussion


Recommended Posts

Not sure I'd really agree with that. My experience is that PC games tend to be CPU limited, the exceptions being the glorified tech demos like Crysis which try to cram in as many shaders as possible into a scene.

Just see for yourself: Bioshock Infinite, Far Cry 4, GRID: Autosport, Crysis 3, Alien Isolation, Battlefield 4, Dying Light, Evolve, Thief, Battlefield: Hardline, etc. The CPU can influence the performance of GPUs but it is typically not the limiting factor. The biggest difference tends to be between architectures rather than clock speed or cores. There's nothing to disagree about. Increasing the power of the CPU has little impact because it's the GPU that is the limiting factor.

 

DX12 will improve performance, as it introduces efficiencies across the board, but not to a substantial degree (i.e. one which would eliminate the gap to Sony) - it's games that are designed specifically around it that will benefit the most, particularly first-party XB1 titles. We're not going to see the outlandish improvements some here are hoping for.

Link to comment
Share on other sites

I'd trust Wardell over you (and GotBored) any day of the week.

 

http://www.dualshockers.com/2015/02/07/stardock-ceo-well-play-games-looking-like-the-lord-of-the-rings-on-xbox-one-and-ps4-by-generations-end/

http://wccftech.com/brad-wardell-talks-dx12-xbox/

 

 

 

If he is to be trusted (and why wouldn't he be?), not even MS is sure how much the benefit will be.

 

How is it that the two of you can say with absolute certainty that it won't be huge?

 

 

That article you posted quotes Wardell saying that the PS4 and X1 are both 'monsters' and that the games that have come out for them, and those on the way, aren't even remotely taxing the consoles. He also says that games will look like real-life movies, using the Lord of the Rings films as a comparison for what he expects of games released on the consoles in the future. The consoles are mid-range PC equivalents, and the games out now are taxing on both of them. Sure, developers can optimise them, make them less taxing and improve the graphics in return, but even then they're never going to output graphics that look like a Lord of the Rings movie... Not even top-of-the-range PCs are capable of that.

 

He also says that the PS4 will see the same massive improvements (with the Vulkan API, I assume), but you don't see anyone posting articles about that here, because the guy has no clue what he's talking about.

 

MS said it will be a minor improvement, and this is from the guys who said the 'cloud' and 'Kinect resources' were going to be huge improvements but ended up being unnoticeable, so I'd say it's safe to assume DirectX 12 isn't going to do much.

  • Like 1
Link to comment
Share on other sites

Just see for yourself: Bioshock Infinite, Far Cry 4, GRID: Autosport, Crysis 3, Alien Isolation, Battlefield 4, Dying Light, Evolve, Thief, Battlefield: Hardline, etc. The CPU can influence the performance of GPUs but it is typically not the limiting factor. The biggest difference tends to be between architectures rather than clock speed or cores. There's nothing to disagree about. Increasing the power of the CPU has little impact because it's the GPU that is the limiting factor.

 

DX12 will improve performance, as it introduces efficiencies across the board, but not to a substantial degree (i.e. one which would eliminate the gap to Sony) - it's games that are designed specifically around it that will benefit the most, particularly first-party XB1 titles. We're not going to see the outlandish improvements some here are hoping for.

Link to comment
Share on other sites

 

 

Microsoft isn't talking about it because they have nothing to show. As for developers, most have downplayed the role of DX12. The hype is mostly from Xbox fans, who have a vested interest in seeing their platform prosper.

 

"most"? Who are those developers you speak of? Please provide sources for your claims.

 

I'm not hyping anything. Only quoting developers and other people who actually know what they are talking about, unlike many people here.

 

 

There's nothing for me to back up. Wardell isn't responsible for any AAA next-gen console games and is primarily a PC developer. As for his character, there are numerous articles out there about it.

Link to comment
Share on other sites

That article you posted quotes Wardell saying that the PS4 and X1 are both 'monsters' and that the games that have come out for them, and those on the way, aren't even remotely taxing the consoles. He also says that games will look like real-life movies, using the Lord of the Rings films as a comparison for what he expects of games released on the consoles in the future. The consoles are mid-range PC equivalents, and the games out now are taxing on both of them. Sure, developers can optimise them, make them less taxing and improve the graphics in return, but even then they're never going to output graphics that look like a Lord of the Rings movie... Not even top-of-the-range PCs are capable of that.

 

Actually, many of the effects in LoTR aren't that amazing, as such. Of course he doesn't mean that games will look like a real-life movie; he's referring to the special effects.

 

If you don't know what I'm talking about, check the scene from the first movie where they are running from the Balrog inside the mines. Not exactly high-quality stuff.

 

 

MS said it will be a minor improvement, and this is from the guys who said the 'cloud' and 'Kinect resources' were going to be huge improvements but ended up being unnoticeable, so I'd say it's safe to assume DirectX 12 isn't going to do much.

 

No, they said it's not a "dramatic improvement". There's a big difference there.

 

The freed Kinect resources were absolutely noticeable. It should be common knowledge for anyone interested in the consoles. I'm also quite sure that you knew that.

Link to comment
Share on other sites

Picking a small list of GPU-bound games (two of which use the same engine, plus Crysis, as I mentioned previously) and linking to benchmarks from Techspot (couldn't you link to a more reliable source?) doesn't really refute my argument.

I listed as many AAA games as I could think of off the top of my head to support my position; in no way was my list cherry-picked. The games I listed support my point, though if you can refute my claim then please do. As for Techspot, please explain why you take issue with it and provide evidence to support that position.

 

You need to look at engines, not games. And you need a test a little more rigorous than "we stopped getting FPS gains from raising the clock rate".

The games I listed use a wide variety of different game engines and it is widely known that most games are GPU bound. I don't know of any game that is primarily CPU bound, though that's not to say they don't exist.

 

"most"? Who are those developers you speak of? Please provide sources for your claims.

DX12 to bring slight boost for XB1

XB1 unlikely to hit 1080p with DX12 / XB1 will struggle to hit 1080p, despite DX12

Titanfall developer claims XB1 doesn't need DX12

XB1 chief warns not to expect dramatic improvements from DX12

DX12 can result in performance improvements of up to 30% "under certain circumstances" (basically admitting the real-world difference won't be close to that)

Wardell claims PS4 hardware is substantially better than XB1 despite DX12 (yes, even Wardell claims that DX12 won't allow the XB1 to overtake the PS4)

 

That's far from an exhaustive list but it supports my point. I respect a credible AAA developer like CD Projekt RED (responsible for The Witcher) more than an indie developer with a loud mouth like Wardell.

  • Like 1
Link to comment
Share on other sites

The games I listed support my point, though if you can refute my claim then please do.

 

Not really, no. It only shows that the small selection of engines you provided don't scale with increasing clock speed at the single resolution and hardware configuration that Techspot tested.

 

As for Techspot, please explain why you take issue with it and provide evidence to support that position.

 

I've read past articles of theirs that drew rather dumb conclusions from data indicating the contrary, so I don't place great value on the quality of their reporting. But my personal distaste for Techspot isn't what's relevant here; you seem rather confident in your position, so finding some results from another source should be a trivial task for you.

 

The games I listed use a wide variety of different game engines...

 

You have barely a handful of engines: Frostbite 2, CryEngine 3 and a few in-house ones. In fact, now that I think about it, knock one off the engine diversity, as Evolve is also CryEngine 3. Maybe another half for BioShock and Thief being UE3-derived.

 

widely known that most games are GPU bound

 

Widely known by whom?

 

I don't know of any game that is primarily CPU bound, though that's not to say they don't exist.

 

Source 1, Guild Wars 2, Civ 5/BE and pretty much every current MMO out there to name a few. And that's just off the top of my head.

Link to comment
Share on other sites

DX12 reduces a number of CPU bottlenecks but you have to remember that most games are GPU limited. Therefore you aren't going to see major improvements unless developers have written their game specifically to take advantage of DX12 and increase their reliance on the CPU, which isn't going to happen due to the multiplatform nature of modern gaming. I have no doubt that some developers, particularly Microsoft's first-party developers, will extract some impressive results but you're talking about a minority of the overall gaming scene.
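
To put the bottleneck point in concrete terms, here's a toy sketch in Python; every number in it is invented purely for illustration:

```python
# Toy model: frame time is gated by whichever of CPU or GPU is slower,
# so shaving CPU overhead only helps when the CPU is the bottleneck.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound frame: a 30% CPU saving changes nothing.
print(fps(10, 25), fps(10 * 0.7, 25))   # 40.0 40.0

# CPU-bound frame: the same saving lifts the frame rate substantially.
print(fps(25, 10), fps(25 * 0.7, 10))   # 40.0 ~57.1
```

A CPU-side API improvement only moves the needle on frames where the CPU is the slower side, which is exactly why mostly GPU-bound games see modest gains.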

I think you need to read up on DX12/Mantle/Vulkan. I guess even the post that started this latest subthread is a good starting point.
Link to comment
Share on other sites

 

So up to 30% is nothing to talk about? Even a 15% speed increase is huge. Your own links contradict you.

 

You still haven't provided any links regarding Wardell's credibility. I'll assume you made that up, then.

Link to comment
Share on other sites

Not really, no. It only shows that the small selection of engines you provided don't scale with increasing clock speed at the single resolution and hardware configuration that Techspot tested.

CPU tests are always done at a fixed resolution - that's how they work. They also use the same hardware, aside from processors, in order to eliminate variables. You clearly don't understand how CPU tests work.

 

I've read past articles of theirs that drew rather dumb conclusions from data indicating the contrary, so I don't place great value on the quality of their reporting. But my personal distaste for Techspot isn't what's relevant here; you seem rather confident in your position, so finding some results from another source should be a trivial task for you.

I'm not going to waste my time just because you refuse to accept the evidence provided. If I provided other sources you'd just come up with arbitrary reasons to reject them.

 

You have barely a handful of engines: Frostbite 2, CryEngine 3 and a few in-house ones. In fact, now that I think about it, knock one off the engine diversity, as Evolve is also CryEngine 3. Maybe another half for BioShock and Thief being UE3-derived.

Engines I included: EGO, Frostbite, CryEngine, Alien: Isolation's in-house engine, Chrome Engine, Unreal Engine 3 and Dunia Engine. That's SEVEN different engines. You're simply not being reasonable.

 

Widely known by whom?

People who follow gaming.

 

Source 1, Guild Wars 2, Civ 5/BE and pretty much every current MMO out there to name a few. And that's just off the top of my head.

Which has nothing to do with the XB1, which is what we're discussing. I have substantiated my claim with countless benchmarks across many engines - you have listed one dated PC engine (which was GPU limited at launch), one MMO and a PC strategy game that's based primarily on AI calculations. Hardly a credible rebuttal.

 

So up to 30% is nothing to talk about? Even a 15% speed increase is huge. Your own links contradict you.

No, a 15% difference isn't huge when you consider that the PS4 has been managing more than 100% higher resolution in some games (1080p vs 720p). People don't seem to appreciate that 1080p is 44% more pixels than 900p, which is what so many games on the XB1 are running at. That's without even considering visual fidelity. A 15% performance improvement isn't going to allow the XB1 to catch up to the PS4. Further, the 30% is "under certain circumstances", which basically means real-world performance will be nothing like that.
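
The pixel arithmetic is easy to verify; a quick sketch of just the raw numbers:

```python
# Pixel counts for the render resolutions discussed above.
pixels = {name: w * h for name, (w, h) in
          {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}.items()}

print(pixels)                                 # 921600, 1440000, 2073600
print(pixels["1080p"] / pixels["720p"] - 1)   # 1.25 -> 125% more pixels than 720p
print(pixels["1080p"] / pixels["900p"] - 1)   # ~0.44 -> 44% more pixels than 900p
```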

 

Even Microsoft is downplaying the performance gains from DX12, and major developers like CD Projekt RED have stated it won't be enough to allow the console to hit 1080p. I really hope that DX12 does turn out to be as great as many here are claiming.

  • Like 3
Link to comment
Share on other sites

You guys really like restarting old arguments to no end. How about you wait and see what impact DX12 will bring to games when the time comes? As with anything of this nature, it's not a single XX% gain across the board; it'll vary by the type of game and how it's coded to work with the API and the hardware. Some games will see a bigger gain than others. We'll just have to wait and see as new DX12 games come out or older ones are patched to support it.

  • Like 3
Link to comment
Share on other sites

This is why I try to avoid this topic. Every time someone posts the resolution/fps of a new game, this argument/debate starts up again :p

 

It's difficult but the proof will come, either for or against DX12 and how much of a difference it'll make. 

Link to comment
Share on other sites

No, a 15% difference isn't huge when you consider that the PS4 has been managing more than 100% higher resolution in some games (1080p vs 720p). People don't seem to appreciate that 1080p is 44% more pixels than 900p, which is what so many games on the XB1 are running at. That's without even considering visual fidelity. A 15% performance improvement isn't going to allow the XB1 to catch up to the PS4. Further, the 30% is "under certain circumstances", which basically means real-world performance will be nothing like that.

 

Even Microsoft is downplaying the performance gains from DX12, and major developers like CD Projekt RED have stated it won't be enough to allow the console to hit 1080p. I really hope that DX12 does turn out to be as great as many here are claiming.

Link to comment
Share on other sites

Why are you bringing up the PS4 and talking about "catching up"? Everyone else is only discussing the benefits to the XB1. No mention has been made of the PS4, of comparing the two or claiming that it can "catch up", so stop trying to make a flame-war happen.

The performance difference between the two consoles is one of the main purposes of this topic:

  • Screenshots and videos of games highlighting either their performance or differences
  • Hardware discussion, to any degree or depth. If you don't have an understanding you may still take part to learn more
  • Articles and news released by the developers, platform holders or any other reputable source
  • Your personal opinion about the above

The XB1 can certainly hit 1080p, so stop claiming it can't. It's exactly the sort of nonsense that makes people tired.

I have never claimed it can't hit 1080p

  • Like 2
Link to comment
Share on other sites

I have never claimed it can't hit 1080p

 

Glad we agree that the XB1 can do 1080p if the developers decide to do so.

 

The performance difference between the two consoles is one of the main purposes of this topic

  • Hardware discussion, to any degree or depth. If you don't have an understanding you may still take part to learn more

 

But the conversation you decided to stick your oar into didn't revolve around the PS4. It was about the XB1 and the possible benefits of DX12. Why bring the PS4 into it? You clearly tried to start a flame-war, but with no luck.

Link to comment
Share on other sites

At higher resolutions, most games definitely are GPU bound nowadays.

 

Even at 1080p and 50+ fps, most of them are GPU bound. Few games are CPU bound, and most of those are strategy or simulation games (the type of games Stardock makes).

 

DirectX 12 will more than likely bring the same improvement as Mantle. For a very weak CPU coupled with a high-end GPU (like my PC), it's going to be around 15-30%. For a weak-to-mid-range CPU coupled with a mid-to-weak-range GPU (like the current-gen consoles), it's going to be around 5%.

 

Current-gen consoles are GPU bound (and CPU bound). There's no way they are struggling this much to output 1080p while being CPU bound only (i.e. with lots of room left on the GPU side). It would require an astonishingly bad CPU to drag down a good GPU that much at 1080p with current games. I have a six-year-old Core i5 750, a very weak CPU, yet I can do 1080p at 50+ fps without too much trouble because I have a 970. For most of my games (even the newer ones) I can set almost everything to max and still push 50 fps at 1080p. When I see current-gen consoles struggling to output 1080p with games that don't even look that great, I'm honestly baffled.

  • Like 1
Link to comment
Share on other sites

The CPU holding up the GPU is, in most cases, what affects your frame rate, not your resolution. If DX12 on the XB1 can help the CPU side like it's been shown to on the PC, then games that can do 1080p but can't hit a smooth frame rate (which we're seeing more and more of) will see boosts in the end. It's really not that hard to wrap your head around: all the current DX12 demos that talk about CPU performance gains show gains in the frame rate. If it's a matter of reaching 60 fps, or getting as close to a stable 60 as possible compared to the current dips, then I'd say it's a huge benefit.

 

It's like taking a current game that does 1080p but has a frame rate that jumps from 50-60 down to 35-40 and getting something that stays more in the 50-60 range. It will just depend on how the game is coded and what work the developers can do to benefit from the changes to the API.

 

Another thing to keep in mind is that all the PC-side DX12 talk and demos look at quad-core CPUs. We know the consoles have 8 cores - more like 7, since one is reserved for other tasks. We'll have to see how much better it'll scale with those additional 3 cores.
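
For what it's worth, here's a rough Amdahl-style sketch of what that core scaling could look like; the 40% "parallelisable share" and the 20 ms base frame time are assumptions for illustration, not measured figures:

```python
# Only part of the CPU frame (e.g. render submission) can be spread across
# cores by a DX12-style API; the rest stays serial and caps the gains.

def cpu_frame_ms(base_ms, parallel_share, cores):
    serial = base_ms * (1.0 - parallel_share)
    parallel = base_ms * parallel_share / cores
    return serial + parallel

base = 20.0  # hypothetical single-threaded CPU frame time (ms)
for cores in (1, 4, 7):
    print(cores, "cores:", round(cpu_frame_ms(base, 0.4, cores), 1), "ms")
# 1 cores: 20.0 ms, 4 cores: 14.0 ms, 7 cores: 13.1 ms
```

Under those assumptions the jump from 4 to 7 cores is much smaller than the jump from 1 to 4, so the extra console cores may matter less than hoped.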

Link to comment
Share on other sites

Microsoft has said DX12 won't have that much of an impact on performance, as have other developers in the know.

 

Actually, it is quite the opposite.

Link to comment
Share on other sites

CPU tests are always done at a fixed resolution - that's how they work. They also use the same hardware, aside from processors, in order to eliminate variables. You clearly don't understand how CPU tests work.

 

You clearly don't understand hardware and software engineering if you think merely raising the clock rate is hard proof of games being GPU bound. Did it ever occur to you that there may be other bottlenecks in play that Techspot's limited testing is not revealing?

 

I'm not going to waste my time just because you refuse to accept the evidence provided. If I provided other sources you'd just come up with arbitrary reasons to reject them.

 

A rather fine double standard: you demand evidence in that pretentious manner of yours, then act as if single-sourcing data is good scientific practice. I don't like Techspot; get over it. Is it really that insurmountable a request that you source from elsewhere, or is it that no other sources support your claim?

 

Engines I included: EGO, Frostbite, CryEngine, Alien: Isolation's in-house engine, Chrome Engine, Unreal Engine 3 and Dunia Engine. That's SEVEN different engines. You're simply not being reasonable.

 

No. When talking about "most games", I think it's perfectly reasonable to say that the engines of seven recent releases are not a sufficient sample size to support your claim.

 

Which has nothing to do with the XB1, which is what we're discussing. I have substantiated my claim with countless benchmarks across many engines - you have listed one dated PC engine (which was GPU limited at launch), one MMO and a PC strategy game that's based primarily on AI calculations. Hardly a credible rebuttal.

 

No, I think it's quite clear that when I said the following:

Not sure I'd really agree with that. My experience is that PC games tend to be CPU limited, the exceptions being the glorified tech demos like Crysis which try to cram in as many shaders as possible into a scene.

I was discussing PC games, not the Xbox - which doesn't stand to gain much from DirectX 12 beyond the new feature enablement, as consoles already have fine-tuned driver stacks that avoid a lot of the overhead that DirectX 12, Vulkan, and Mantle are designed to ameliorate. :rofl: @ saying Civ 5 is "based on AI calculations", by the way. I hate to break it to you, but the AI is only processed between turns. Additionally, it's not just one MMO; it's pretty much every MMO in recent history.
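
To illustrate why the overhead argument plays out differently on PC and console, a back-of-the-envelope sketch; the per-call costs are made-up numbers, not measurements:

```python
# Per-draw-call CPU overhead adds up under a "thick" PC driver; console
# drivers (and DX12/Vulkan/Mantle on PC) aim for the "thin" case.

draw_calls = 3000                                      # hypothetical calls/frame
cost_us = {"thick driver": 10.0, "thin driver": 2.0}   # made-up microseconds/call

for label, us in cost_us.items():
    print(label, "->", draw_calls * us / 1000.0, "ms of CPU time per frame")
# thick driver -> 30.0 ms, thin driver -> 6.0 ms
```

A console already sitting near the "thin" case simply has less of that overhead left for a new API to claw back.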

Link to comment
Share on other sites

You clearly don't understand hardware and software engineering if you think merely raising the clock rate is hard proof of games being GPU bound. Did it ever occur to you that there may be other bottlenecks in play that Techspot's limited testing is not revealing?

Is it perfect? No. Does it tell us what we need to know about performance? Absolutely.

 

A rather fine double standard: you demand evidence in that pretentious manner of yours, then act as if single-sourcing data is good scientific practice. I don't like Techspot; get over it. Is it really that insurmountable a request that you source from elsewhere, or is it that no other sources support your claim?

I provided evidence. You have done nothing to refute what I've posted; you simply stated you don't like the source. If Techspot's CPU performance reviews are so inaccurate then you should provide evidence to support that. Regardless, Anandtech supports my position, though I suppose you'll have a problem with that too.  :rolleyes:

 

No. When talking about "most games", I think it's perfectly reasonable to say that the engines of seven recent releases are not a sufficient sample size to support your claim.

 

I'm not here to write a PhD thesis. My point was that most games are GPU limited, and I provided a wide variety of games across different genres and engines to support my position. I never claimed it was an exhaustive list, but it was ten games across seven engines. How many engines do I need to list... twenty, fifty, a hundred? And how many different websites... five, ten, twenty? It's not as though there are more than a few gaming websites doing that sort of in-depth testing. You are being utterly unreasonable. I made a casual and uncontroversial point: that most games are GPU limited. When challenged, I provided a wealth of evidence, more than could reasonably be asked. Your response? To reject it out of hand and demand even more.

 

I was discussing PC games, not the Xbox - which doesn't stand to gain much from DirectX 12 beyond the new feature enablement, as consoles already have fine-tuned driver stacks that avoid a lot of the overhead that DirectX 12, Vulkan, and Mantle are designed to ameliorate. :rofl: @ saying Civ 5 is "based on AI calculations", by the way. I hate to break it to you, but the AI is only processed between turns. Additionally, it's not just one MMO; it's pretty much every MMO in recent history.

That has nothing to do with how DX12 will affect performance on the XB1. It doesn't matter that MMOs are CPU bound, as there are barely any on console. Civilization V is PC only, so it isn't relevant to this discussion; even on PC it's an outlier, much like Cities: Skylines. Please stick to the discussion at hand.

Link to comment
Share on other sites

 

I won't go into it any further, as a) it's off-topic, and b) he's a Founder on this website.

 

However, I think it's worth looking at what Wardell has actually said. On the one hand he talks about games going from 8 to 60 fps and accuses Intel, AMD, nVidia and Microsoft of downplaying DX12; on the other he claims that DX12 won't be a magic bullet for the XB1.

 

With regards to the first point, we know that certain tasks benefit substantially from DX12, but the idea that the four main companies behind DX12 would downplay it isn't credible - they have a financial incentive to hype it, and none of those companies can be accused of being modest when it comes to advertising. It's a rather sensational claim designed to grab headlines, and it isn't supported by what we've heard from more credible developers. As for the second point, that's in line with what we know about the XB1: Microsoft opted for slower, cheaper memory, and the ESRAM solution does not offset the difference. DX12 will offer improvements but cannot make up for the hardware limitations.
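
For context, the memory figures commonly reported at the time (treat these as approximate, and remember the ESRAM is only 32 MB):

```python
# Commonly cited peak bandwidth figures in GB/s; treat as approximate.
xb1_ddr3 = 68.3    # XB1: 8 GB DDR3 main memory
xb1_esram = 102.0  # XB1: 32 MB on-die ESRAM (higher combined read/write
                   # peaks were also quoted, but only for a tiny working set)
ps4_gddr5 = 176.0  # PS4: 8 GB unified GDDR5

print("PS4 vs XB1 main memory:", round(ps4_gddr5 / xb1_ddr3, 2), "x")  # ~2.58x
```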

 

That's right, even Wardell is downplaying DX12 for XB1. He claims it will have a big impact on PC (which remains to be seen) but that it can't get around the hardware limitations of the XB1.

 

If you don't want to talk about it, why did you bring it up? It's awful to claim that someone is a laughing stock if you are unwilling to back it up in any way whatsoever.

 

It doesn't seem unlikely that AMD, nVidia et al. would downplay (or not talk about) the advantages of DX12. If they came out and said "DX12 will be X times faster than DX11", GPU sales would grind to a halt for months. None of them can really afford that. MS doesn't want to upset either of them, and Intel probably doesn't really care.

 

Not many believe that DX12 will magically make the XB1 as fast as the PS4 or that it will overcome the hardware limitations, but it's obvious that there will be an improvement. Aaron Greenberg said as much:

 

[Embedded tweet from Aaron Greenberg not shown.]
Link to comment
Share on other sites

It doesn't seem unlikely that AMD, nVidia et al. would downplay (or not talk about) the advantages of DX12. If they came out and said "DX12 will be X times faster than DX11", GPU sales would grind to a halt for months.

What's your reasoning behind this? I don't see how any of them saying DX12 is amazeballs would slow down sales, since even cards two generations old support it.

  • Like 1
Link to comment
Share on other sites

Can't wait till all these arguments about DX12 are put to bed! Thankfully it's not going to be like the "who's going to win the console race" ones, where we have to wait 8-10 years. There's a finite amount of time to speculate, and then there's no need any more; we'll have actual games to see the results.

Link to comment
Share on other sites

This topic is now closed to further replies.