Call of Duty: Ghosts And Titanfall To Run At 720p On Xbox One?



Erm, it says it uses DX in the article I just gave you.  Heh.  :)  From what I've read I don't think they'll allow direct hardware access at first, but the DX build is optimized for a single platform so shouldn't have any notable performance issues.

Well yeah, I saw that, but I was thinking beyond the article itself. That pretty much settles the question.

DX being optimized for the hardware of the X1 must be what AMD was referring to when they said Mantle would allow console-like performance optimization.


I'm sure it does things differently from the other two, which is why you get better control and can take the CPU out of the equation more than with DX or OGL. Honestly, it sounds nice, but I'm just against the idea of having hardware-vendor-specific APIs today like we did 20+ years ago. What happens when Nvidia decides it wants its own specific API for its own GPUs? Is that what we really need?

 

I think OpenGL and DirectX have been doing a fine job, and with the hardware we have in PCs today, most people's CPUs sit idle doing nothing anyway, so why not use them? I don't remember the last time I maxed out my quad-core i7 920, and it's hardly new.

 

Mantle isn't AMD-specific; it's an open API developed by AMD along with developers who wanted something like this to harness more of the GPU's power. Nvidia could use it too if they wanted. It provides a way to talk to the metal of the GPU, bypassing the "translation" layers in the DX API, which saves a lot of time and overhead. DX has those layers because of all the different hardware it needs to run on, and MS doesn't seem interested in improving that (according to the devs). Once you can code to the optimised paths of a specific set of hardware, like the GCN architecture in Radeon GPUs, it becomes more efficient and doesn't need to "ask" the CPU to do stuff first, so it gets a lot faster. It also uses the DX High Level Shader Language, so that's good. I can see Mantle mainly being targeted by first-party titles and big third-party devs; there's no point making a Mantle path for indie games that won't need anywhere near that much power.

 

The consoles can do a lot more with less because MS and Sony could streamline their APIs and dump a lot of the translation cruft, since each API only ever has to work with one set of hardware components, so they aren't getting held back.
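
To put the "translation layer" point above in concrete terms, here's a toy Python sketch. It has nothing to do with Mantle's or Direct3D's actual APIs; the "validation" work is just a stand-in for whatever a driver does per call. The point is only that per-draw-call work scales with draw count, which is exactly the CPU overhead a thinner API tries to avoid:

```python
# Toy model of per-draw-call overhead vs. pre-built command submission.
# Purely illustrative -- not the Mantle or Direct3D API.
import time

NUM_DRAWS = 200_000
state = {"shader": "pbr", "blend": "opaque", "textures": ("albedo", "normal", "rough")}

def draw_thick_api(state):
    # A "thick" API re-validates and re-translates the full state on every call.
    for value in state.values():
        hash(value)

def draw_thin_api(baked_command):
    # A "thin" API just references a command that was validated once, up front.
    return baked_command

baked = hash(tuple(state.items()))  # one-time validation/translation

t0 = time.perf_counter()
for _ in range(NUM_DRAWS):
    draw_thick_api(state)
t1 = time.perf_counter()
for _ in range(NUM_DRAWS):
    draw_thin_api(baked)
t2 = time.perf_counter()

print(f"per-draw validation: {t1 - t0:.3f}s, pre-baked commands: {t2 - t1:.3f}s")
```

The absolute numbers are meaningless; what matters is that the per-call cost disappears once the work is done up front.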


I'm sure it does things differently from the other two, which is why you get better control and can take the CPU out of the equation more than with DX or OGL. Honestly, it sounds nice, but I'm just against the idea of having hardware-vendor-specific APIs today like we did 20+ years ago. What happens when Nvidia decides it wants its own specific API for its own GPUs? Is that what we really need?

 

I think OpenGL and DirectX have been doing a fine job, and with the hardware we have in PCs today, most people's CPUs sit idle doing nothing anyway, so why not use them? I don't remember the last time I maxed out my quad-core i7 920, and it's hardly new.

It might be vendor-specific at first, but AMD has been pretty clear it isn't vendor-locked like most of the crap NV has been doing. My next card will be AMD for sure.


It means consoles can do a lot more with less, a lot more than a PC could do with the same hardware in it. The DX API is also really inefficient, since it's designed to run on anything and everything, so there are no specific optimizations for anything. Mantle should close the gap considerably on what same-level GPUs can do or, as will be the case for high-end hardware, blow a hole in the consoles' capabilities... hopefully. Guess we'll find out in December with BF4 :D

 

Having said that, if they can't push 1080p at 60fps on the next set of games released, it's time to give up and go home, because the learning curve between PC and console architecture won't be much this time round, so devs should be able to leverage the power, unless the ESRAM in the X1 isn't all that good. There's no point having it if it doesn't work as well in games; sure, real code has been tested on it, but in a gaming environment, who knows.

Yes, I understand that a console can be optimized at a lower level for games, since it's not running a heavy OS with services in the background. But let's be realistic here: even with software optimization, you can only push the GPU in the XB1/PS4 physically so far, and that isn't far enough to run BF4, BF3 or Titanfall natively at an unscaled 1080p with constant 60fps output. It's just not going to happen, and that's the problem with folks over-exaggerating the performance of these consoles. It's a relatively slow, mid-range GPU from a few years ago paired with so-called 'ESRAM' to 'unify' the RAM pools. RAM doesn't solve the problem of the GPU being too weak to handle newer games at a decent frame rate, natively at 1080p and 60fps or more. Then again, that's why the consoles are pegged at those prices: they're doing 'value for money' gaming, not specialization gaming, where the latest hardware and the limits of gaming are being pushed as we see on the PC. GPU vendors can barely keep up with the pace of PC game development when it comes to MSAA and AA goodness and demanding DX11 games.
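
For a sense of scale, here's the raw pixel arithmetic behind that argument (output pixels only; it ignores overdraw, MSAA and post-processing, so it's a lower bound on the real work):

```python
# Output pixels per second at different resolutions and frame rates
# (raw pixel counts only -- ignores overdraw, MSAA and post-processing).
modes = {"720p30": (1280, 720, 30), "720p60": (1280, 720, 60),
         "1080p30": (1920, 1080, 30), "1080p60": (1920, 1080, 60)}

for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h:,} px/frame, {w * h * fps / 1e6:.0f} Mpx/s")

# 1080p has 2.25x the pixels of 720p, so 1080p60 needs 4.5x the raw pixel
# throughput of a last-gen-style 720p30 target -- a big jump for a mid-range GPU.
print(f"1080p/720p pixel ratio: {1920 * 1080 / (1280 * 720):.2f}x")
```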


I find it funny to call Mantle low-level compared to DX or OGL when it's just another API sitting between hardware and software like the other two. The difference here, and something I don't agree with one bit in today's world, is that it's written specifically for AMD's GPUs, so it's as custom as custom can get. That's all well and good if you're running an AMD GPU, but what if you aren't? Are we now going to have Nvidia coming out with their own API for their own cards? It's like we're suddenly taking not 2 but 20 steps back into the early 90s, when each GPU maker had their own APIs developers had to target.

 

The whole idea behind DX, and even more so OGL, is that they're not specific to a platform or hardware. If that means we lose a bit on the performance side, I could live with it rather than having the market split all over the place once again.

This brings us back to where graphics cards were before Vista, when MS clamped down on ATI and Nvidia making custom hardware and software functions that required games to be specifically coded for them, and required you to have the right graphics card. MS then decided that DX was their standard, and if the vendors wanted to be supported they had to follow the standards that were set and agreed on: no custom, non-standard extensions.

These "low" level extensions never increased performance they just added more or less useless graphics fluff for the vendor card. Mantle likely won't do anything different and likely won't provide any real performance benefit over DX.


I'm going to wait until the consoles actually launch to make a judgement. So much crap got said about the PS3 and Xbox 360 before they launched.

 

The PS3 was supposed to have 8 USB ports

The PS3 was supposed to be able to drive 2 x 1080p pictures simultaneously

 

When those consoles launched, the games looked only marginally better than the previous generation's; as time went on the games looked better and better, so I assume this will be the case again.


Mantle likely won't do anything different and likely won't provide any real performance benefit over DX.

DICE doesn't do things halfassed.  If there was no benefit, they wouldn't be bothering.

 

There's most likely something MS has fallen behind on (my suspicion is it's WDDM 2.0 related, but I'd still have to do some reading to confirm that.)


I can't believe this thread is still going.

For TVs, 720p is perfectly fine, especially if they're scaling, windowing or doing other algorithmic processing to enhance the detail of focus points or increase scene vitality and whatnot. Unless you *NEED* more absolute pixels, most people won't see the difference between 720p and 1080p.

 

Also, retail TVs are still largely crap: many consumer TVs simply accept a 1080p signal but don't have the pixel density for a 1:1 display of 1080p, and if they do have the pixel density, they often ship with quirky upscalers/downscalers/filters/processing that mess things up.

 

I know a ton of people running their consoles in a 1080p/24 cinema mode, which means the TV is dropping frames or re-processing/filtering. After schooling them to use a raw input, they then complain about soft colors and detail, because their TV has been messing with the colors for so long they've gotten used to "HDR" coloring on everything (but at least they can read text now...).

 

Lastly, I'm amazed at how many people still PC game at resolutions of less than 1080p and don't even know it.

 

BUT, with all that said, it's launch time: some games will blow our minds, some will fall flat on their faces and not meet expectations. Play a launch 360 title today and most of them are cringeworthy; same with PS3 launch titles.
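
A rough back-of-the-envelope check on the 720p vs 1080p claim above, assuming roughly 20/20 acuity (about 60 pixels per degree of visual angle) and an example setup of a 50-inch 16:9 TV viewed from 10 feet (those numbers are just illustrative):

```python
import math

def pixels_per_degree(horizontal_pixels, diagonal_inches, distance_feet, aspect=(16, 9)):
    """Pixels per degree of visual angle at the viewer's eye."""
    w, h = aspect
    width_in = diagonal_inches * w / math.hypot(w, h)      # screen width in inches
    distance_in = distance_feet * 12.0
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horizontal_pixels / fov_deg

# Example: 50" 16:9 TV viewed from 10 feet
for name, px in (("720p", 1280), ("1080p", 1920)):
    ppd = pixels_per_degree(px, 50, 10)
    status = "at or above" if ppd >= 60 else "below"
    print(f"{name}: {ppd:.0f} px/deg ({status} the ~60 px/deg 20/20 limit)")
```

On that setup, both resolutions sit at or above the nominal acuity limit, which is why the difference is hard to spot from the couch; sit closer or use a bigger screen and 1080p starts to pull ahead.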


DICE doesn't do things halfassed.  If there was no benefit, they wouldn't be bothering.

 

There's most likely something MS has fallen behind on (my suspicion is it's WDDM 2.0 related, but I'd still have to do some reading to confirm that.)

 

Why would WDDM be the cause?


DICE doesn't do things halfassed.  If there was no benefit, they wouldn't be bothering.

 

There's most likely something MS has fallen behind on (my suspicion is it's WDDM 2.0 related, but I'd still have to do some reading to confirm that.)

Yeah, they never do anything half-assed, like the fully destructible environments in BF4. ;)

And yeah, I still think the game is awesome, and the maps would suck pretty badly if they really were fully destructible.


Because IIRC AMD already supported 2.0, and there's no 2.0.

 

2.0 had some major scheduler changes (again, IIRC.)

 

Aren't they already using 2.1 or newer? You're talking about 7-year-old technology now.


Kinda negates the "next-gen" moniker then doesn't it?

Considering how painful it is to use the last gen for high end games, and how little PC games have been using current tech because they're tied to the last gen?

 

No.  They are next-gen, for consoles.


Kinda negates the "next-gen" moniker then doesn't it?

Not if the IQ is much better than a PC running 4K with low IQ.


Kinda negates the "next-gen" moniker then doesn't it?

 

Next gen was never about 720p vs 1080p for me.

 

Next gen for me is immersion, graphics, massive online multiplayer, Kinect interaction, voice interaction, SmartGlass integration.

 

When I'm 10 feet away from my TV, pixel density isn't as next gen as immersion is, and you can do a lot of immersion at 720p.


Next gen was never about 720p vs 1080p for me.

 

Next gen for me is immersion, graphics, massive online multiplayer, Kinect interaction, voice interaction, SmartGlass integration.

 

When I'm 10 feet away from my TV, pixel density isn't as next gen as immersion is, and you can do a lot of immersion at 720p.

Sounds a little like excuses.

 

The issue is that, if confirmed, it raises questions about the capability and power of the console, which eats into all the stuff you just mentioned.


Kinda negates the "next-gen" moniker then doesn't it?

Previous-gen games were often slightly sub-HD at 30 fps; this is still a big improvement.

 

30 vs 60 fps is quite a noticeable difference in how smooth and responsive a game looks and feels: at 30 fps each frame is on screen for about 33 ms, versus roughly 17 ms at 60 fps.


Yes, I understand that a console can be optimized at a lower level for games, since it's not running a heavy OS with services in the background. But let's be realistic here: even with software optimization, you can only push the GPU in the XB1/PS4 physically so far, and that isn't far enough to run BF4, BF3 or Titanfall natively at an unscaled 1080p with constant 60fps output. It's just not going to happen, and that's the problem with folks over-exaggerating the performance of these consoles. It's a relatively slow, mid-range GPU from a few years ago paired with so-called 'ESRAM' to 'unify' the RAM pools. RAM doesn't solve the problem of the GPU being too weak to handle newer games at a decent frame rate, natively at 1080p and 60fps or more. Then again, that's why the consoles are pegged at those prices: they're doing 'value for money' gaming, not specialization gaming, where the latest hardware and the limits of gaming are being pushed as we see on the PC. GPU vendors can barely keep up with the pace of PC game development when it comes to MSAA and AA goodness and demanding DX11 games.

 

Yeah, I know, but I'm saying imagine you stuck PC cards equivalent to the X1 and PS4 into computers, because let's be honest, they're both crap. A developer could still get better visuals, frame rates and resolution out of the consoles than they could out of that PC. I guess 1080p at 60fps is still a stretch for them in visually demanding games, but they can employ tricks like lowering the resolution on far-away stuff that you probably won't even notice anyway, to keep the visuals sharp up close, or dynamically changing the resolution; I've already seen Killzone doing it in an early build video from February. I don't know if they've bothered to implement things like that in PC versions of a game, but I guess it's a trick that can be used to get a "higher" noticeable resolution, if you know what I mean.
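
A minimal sketch of how that kind of dynamic resolution heuristic can work (this isn't Killzone's or any real engine's code; the frame budget and step sizes are made-up numbers):

```python
# Toy dynamic-resolution controller: shrink the render target when the GPU
# is over budget, grow it back when there's headroom. Values are illustrative.
TARGET_MS = 16.7          # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0
STEP = 0.05

def update_render_scale(scale, last_gpu_frame_ms):
    if last_gpu_frame_ms > TARGET_MS * 1.05:     # over budget -> drop resolution
        scale -= STEP
    elif last_gpu_frame_ms < TARGET_MS * 0.85:   # comfortable headroom -> raise it
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: native 1920x1080 output, render target rescaled each frame
scale = 1.0
for gpu_ms in [18.2, 19.0, 17.5, 16.0, 14.5, 13.9]:   # pretend GPU timings
    scale = update_render_scale(scale, gpu_ms)
    print(f"render at {int(1920 * scale)}x{int(1080 * scale)}, then upscale to 1080p")
```

The output is always presented at the display resolution, so the drops are mostly noticeable as temporary softness rather than a visible mode change.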


Sounds a little like excuses.

 

The issue is that, if confirmed, it raises questions about the capability and power of the console, which eats into all the stuff you just mentioned.

 

No it doesn't. I know a 720p game at 60fps can play and look beautiful - so much so that 99.9% of people in the world wouldn't be able to tell it apart from a 1080p game.

 

With superior texture fill rates, memory bandwidth and more cores to distribute the load, games will look great at launch, but where they'll really shine is how well they implement parallel processing, cloud processing, visual/audio processing and SmartGlass integration... Also, next gen is about filling the holes of the prior gen: slow apps and slow switching between apps and games were a major gotcha of the last platform, and those appear to be fully resolved judging by the limited leaks we've seen and public engineering discussions.

 

I fully expect games on both platforms to have limitations. The hardware is finite, and like many computational problems, you can often solve them better over time through software instead of assuming that throwing more hardware at something makes it faster.


Sounds a little like excuses.

 

The issue is that, if confirmed, it raises questions about the capability and power of the console, which eats into all the stuff you just mentioned.

 

Yeah, there's a hilarious amount of revisionism going on regarding what people claim they were expecting from next-gen consoles, now that we've seen how pathetic their performance is turning out to be.


Yeah, there's a hilarious amount of revisionism going on regarding what people claim they were expecting from next-gen consoles, now that we've seen how pathetic their performance is turning out to be.

 

It's only revisionism if your single metric is pixel density. If that's all next gen is for you, that's a sorry metric if you ask me.


Not if the IQ is much better than a PC running 4K with low IQ.

 

Didn't you know? Quake running at 4K is next gen, Crysis 3 running at 720p is last gen; resolution is the only thing that matters. Just like megahertz, these people need a number they can clearly understand, where higher is better, or they get confused.


Yeah, there's a hilarious amount of revisionism going on regarding what people claim they were expecting from next-gen consoles, now that we've seen how pathetic their performance is turning out to be.

I think it's safe to say that no one is claiming 720p is better than 1080p. What I, and I guess others in this thread, are saying is that resolution should not be the only metric for judging image quality.

 

Ryse at 900p looks next gen to me, and it would look even better at 1080p if these consoles could manage that. As it stands, neither next-gen console can, but anything more powerful and we'd be looking at $700-1000 consoles, which isn't practical.


Yeah, there's a hilarious amount of revisionism going on regarding what people claim they were expecting from next-gen consoles, now that we've seen how pathetic their performance is turning out to be.

I think it's the complete opposite: people are expecting CG quality everywhere at 1080/60.

 

I'd like to say "give me 1080/60 or GTFO", but Ryse totally invalidates that statement.

