Xbox One exclusive Ryse runs at 900p



Perhaps I don't find the Halo series as impressive because I own a capable gaming PC. Thus I've seen visuals that actually are impressive.

 

It's not that I "can't accept the power of the Xbox One". It's that I'm not a mindless drone who believes it's an unstoppable console with the power of a god, like people such as yourself. Anyway, I'm pretty confident that when the game (Forza 5) comes out and we get the usual tech analysis article at Eurogamer, we will indeed see some sacrifices.

If it looks great and runs fast, how are they sacrifices? Because it doesn't have some arbitrary graphics feature whose absence you wouldn't even notice?


Perhaps I don't find the Halo series as impressive because I own a capable gaming PC. Thus I've seen visuals that actually are impressive.

 

Oh, and Halo isn't visually impressive because PC games running on a rig that costs 4+ times as much have better graphics?

Seriously, that's not even a valid comparison, and it shows you're in this thread for one purpose only: to troll.


Perhaps I don't find the Halo series as impressive because I own a capable gaming PC. Thus I've seen visuals that actually are impressive.

 

It's not that I "can't accept the power of the Xbox One". It's that I'm not a mindless drone who believes it's an unstoppable console with the power of a god, like people such as yourself. Anyway, I'm pretty confident that when the game (Forza 5) comes out and we get the usual tech analysis article at Eurogamer, we will indeed see some sacrifices.

 

 

That's certainly possible. But let's not make the PS4 out to have the power of a god either. 1080p is not being used in most launch titles. Even Killzone was running at a lower res; I'm not sure whether they hit 1080 or not now.

 

Why is it that people are now taking launch titles and assuming that is the best a console can do? Again, both Mark Cerny and MS sources claim that we are unlikely to see either console used properly, hardware-wise, for at least a couple of years. It's as if you guys think launch titles are using 100% of a console's potential.

 

This is like every new generation of consoles. You start off sloppy, and then developers get more experience with the hardware and do more impressive things over time.


If it looks great and runs fast, how are they sacrifices? Because it doesn't have some arbitrary graphics feature whose absence you wouldn't even notice?

 

It doesn't matter how big or small the graphical feature is that was dropped or toned down. If they had to do it to hit their goal, it's called a sacrifice.

 

Oh, and Halo isn't visually impressive because PC games running on a rig that costs 4+ times as much have better graphics?

Seriously, that's not even a valid comparison, and it shows you're in this thread for one purpose only: to troll.

 

Oh, so any gaming platform that's more powerful than your Xbox is considered invalid for comparison purposes?

 

Well, I guess I'll humor you for a moment and compare with consoles, then. If we look at the last games coming from either platform, it's the PS3 that has the more visually impressive final-year games.

 

 

That's certainly possible. But let's not make the PS4 out to have the power of a god either. 1080p is not being used in most launch titles. Even Killzone was running at a lower res; I'm not sure whether they hit 1080 or not now.

 

Why is it that people are now taking launch titles and assuming that is the best a console can do? Again, both Mark Cerny and MS sources claim that we are unlikely to see either console used properly, hardware-wise, for at least a couple of years. It's as if you guys think launch titles are using 100% of a console's potential.

 

This is like every new generation of consoles. You start off sloppy, and then developers get more experience with the hardware and do more impressive things over time.

 

I asked that same question a few months or so ago with the Wii U. Why were people assuming its launch titles (which clearly were mostly rushed ports) were the best it could do? Why is that same kind of thinking only a problem now, when it wasn't back then?

 

Also, I never mentioned the PS4.


Oh, and Halo isn't visually impressive because PC games running on a rig that costs 4+ times as much have better graphics?

Seriously, that's not even a valid comparison, and it shows you're in this thread for one purpose only: to troll.

PCs may be a bit more expensive, but hardly 4x+. If anything, with another 100 dollars on top of the X1's base price, a PC would just slaughter it. AMD platforms, that is.


I asked that same question a few months or so ago with the Wii U. Why were people assuming its launch titles (which clearly were mostly rushed ports) were the best it could do? Why is that same kind of thinking only a problem now, when it wasn't back then?

 

Also, I never mentioned the PS4.

 

 

I don't know why people do that, but it has something to do with choosing to ignore history and the realities of new platform development.

 

Assuming that launch titles represent all a console can do is ridiculous. So while I get being a bit disappointed that more games are not running at 1080p on either console, let's all take a breath and remember how every console generation starts.

 

Regarding my PS4 mention: I didn't mean to imply that you did, but some in this thread have used this announcement as evidence that the X1 is inferior, when neither console is meeting the expectations some had at launch. I should have separated that comment from your quote, though.


It doesn't matter how big or small the graphical feature is that was dropped or toned down. If they had to do it to hit their goal, it's called a sacrifice.

No, what matters is how good it looks, not what features it does or doesn't use. There are many great-looking computer games that don't use every graphical feature around. In fact, pretty much no game does.

Oh, so any gaming platform that's more powerful than your Xbox is considered invalid for comparison purposes?

That's not even what I said. But you can't claim that Halo isn't visually impressive just because it's not as high in graphical fidelity as a machine ten times more powerful. The fact that it looks so good compared to them means it's damn visually impressive.

Well, I guess I'll humor you for a moment and compare with consoles, then. If we look at the last games coming from either platform, it's the PS3 that has the more visually impressive final-year games.

I believe you forgot to put "in my opinion" in there, since that's exactly what it is. For the record, I don't really agree. I'd say they're pretty damn equal, in fact.

On that note, I suppose it doesn't count as a "sacrifice" when it's the PS3 that does it, huh?


PCs may be a bit more expensive, but hardly 4x+. If anything, with another 100 dollars on top of the X1's base price, a PC would just slaughter it. AMD platforms, that is.

We were comparing with the Xbox 360 at this point, since the One isn't out yet. For the record, I'd like to see the computer that costs 100 more than a One and can do not just the same graphics, but "slaughter" it.

Also, you would have to add another 200 to the price, since that's the supposed value of the Kinect you would have to buy for the PC to compare.


For the record, while it was never said that the Wii U wouldn't get better over time, the Wii U should have been able to hit maximum performance pretty much right away, since it's merely a faster version of the same hardware as the Wii; there are no new magical tricks to learn.


We were comparing with the Xbox 360 at this point, since the One isn't out yet. For the record, I'd like to see the computer that costs 100 more than a One and can do not just the same graphics, but "slaughter" it.

Also, you would have to add another 200 to the price, since that's the supposed value of the Kinect you would have to buy for the PC to compare.

Well, a $600 PC might not "slaughter" the consoles when they're released, but it's very likely that it will by the end of next year. Hell, AMD might even release consumer APUs as powerful as, or more powerful than, what's inside the consoles.


Well, a $600 PC might not "slaughter" the consoles when they're released, but it's very likely that it will by the end of next year. Hell, AMD might even release consumer APUs as powerful as, or more powerful than, what's inside the consoles.

We all know that PCs will advance faster and leave consoles behind; that's not why I personally own and game on a console, though. I also game on a PC, but my HD 7870, which isn't even that new or top-end, cost me half the price of the XB1, and how long will I be able to play new PC games at 1080p/60fps with settings on high like I can today? The simple answer is: not very long. Yet when you buy a console, any of them, PS4/XB1, whatever, you know you're going to play games on it without issue for at least 5-6 years. On the PC, in around 3 years you'll be looking to upgrade that video card to keep up with PC games, unless you're fine with running them sub-par, which actually I am, because I'm tired of shelling out $200 or more every other year or so to keep up. I had an HD 5770 before and gamed just fine at 720p so I could get a smooth framerate, which to me is way more important to the experience of a game.


Well, a $600 PC might not "slaughter" the consoles when they're released, but it's very likely that it will by the end of next year. Hell, AMD might even release consumer APUs as powerful as, or more powerful than, what's inside the consoles.

 

Doesn't matter. A PC with the same specs as a console won't even come close to the same game performance.


For the record, while it was never said that the Wii U wouldn't get better over time, the Wii U should have been able to hit maximum performance pretty much right away, since it's merely a faster version of the same hardware as the Wii; there are no new magical tricks to learn.

 

That's not quite accurate. While the Wii was indeed an overclocked GameCube with extra RAM, the Wii U is not like that in relation to the Wii. The only similarity is the CPU core (and even then it's not 100% the same, due to the 3 cores and the awkward L2 cache arrangement). The GPU and eDRAM setup is entirely different.


Your info seems outdated. The XB1 has 218 GB/s ESRAM + 68 GB/s DDR3, which are independent buses that operate simultaneously, giving the RAM system a peak bandwidth of 286 GB/s. And if you still have doubts about the ESRAM's role: in the SDK diagrams, each of the four 8 MB chunks plugs directly into one of the four L2 cache blocks in the GPU. Since the GCN architecture works on small "tiles" of jobs (wavefronts) and is not an out-of-order architecture, the ESRAM prefetching works beautifully to get the required data from DDR3 RAM. And when the DDR3 is not busy helping the ESRAM, it is off doing other chores, like helping the audio system and so on, because it is independent. You're also forgetting the fact that the Xbox One has 3x the coherent bandwidth between CPU and GPU, giving GPGPU computations quite an edge.

 

For a guy who loves to talk about all this bandwidth, and the bandwidth of other cards, based on your paper-spec assumptions, I'm guessing that you probably didn't bother to read the GCN architecture docs, and I'm also guessing that you didn't read what the theoretical bandwidth capacity of 18 compute units is. It is something like 450 GB/s, whereas the GDDR5 RAM only has a theoretical ceiling of 176 GB/s. You can add all the compute blocks you want; if you don't have the bandwidth to feed them, then they are sitting idle a lot of the time.

The ESRAM is just a small 32 MB L3 cache/frame buffer; it hardly affects performance at all. I do believe that performance on the Xbox One would suck even worse without it, but it is not something that really does much of anything but help support the general operation of the system. The problem is that the ESRAM cache is so small: only 32 MB. That's about as much VRAM as my old NVIDIA TNT2 had in 2000. In comparison, modern GPUs were designed to run with 1-4 GB of VRAM at a full 264 GB/s all by itself. The tests that showed the PS4 was 50% faster were done with the Xbox One utilizing its ESRAM. It is not 218 GB/s; it is 109 GB/s, and possibly faster with compression, that is, if the data is compressible. That is not going to make up for the fact that the PS4 has 8 GB of GDDR5 memory clocked in at 176 GB/s across the full spectrum, direct to GPU and CPU. This literally blows the doors off the Xbox One, and it's necessary to drive high resolutions without performance issues, just like in PC graphics.

Also, I could not care less whether the Xbox One and GCN architecture have more "internal" bandwidth. If the GPU can't get data to work on, if it's bottlenecked by the main system memory, it isn't going to matter.
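[Editor's note] The main-memory numbers both posters keep citing follow from simple clock-times-bus-width arithmetic. A quick sketch, using the commonly reported clocks and bus widths (assumptions, not officially itemized specs):

```python
# Peak theoretical bandwidth = effective transfer rate x bus width in bytes.
# Clocks and bus widths below are the commonly reported figures, not official specs.

def peak_bandwidth_gb_s(transfers_per_sec, bus_bits):
    """Return peak bandwidth in GB/s for a given transfer rate and bus width."""
    return transfers_per_sec * (bus_bits / 8) / 1e9

# Xbox One main memory: DDR3-2133 on a 256-bit bus
xb1_ddr3 = peak_bandwidth_gb_s(2133e6, 256)    # ~68 GB/s

# PS4 main memory: 5500 MT/s GDDR5 on a 256-bit bus
ps4_gddr5 = peak_bandwidth_gb_s(5500e6, 256)   # ~176 GB/s

print(f"XB1 DDR3:  {xb1_ddr3:.1f} GB/s")
print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s")
```

The disputed 218 GB/s ESRAM figure is a different kind of number: it counts simultaneous reads and writes on the embedded SRAM, which is exactly what the thread is arguing about.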


We all know that PCs will advance faster and leave consoles behind; that's not why I personally own and game on a console, though. I also game on a PC, but my HD 7870, which isn't even that new or top-end, cost me half the price of the XB1, and how long will I be able to play new PC games at 1080p/60fps with settings on high like I can today? The simple answer is: not very long. Yet when you buy a console, any of them, PS4/XB1, whatever, you know you're going to play games on it without issue for at least 5-6 years. On the PC, in around 3 years you'll be looking to upgrade that video card to keep up with PC games, unless you're fine with running them sub-par, which actually I am, because I'm tired of shelling out $200 or more every other year or so to keep up. I had an HD 5770 before and gamed just fine at 720p so I could get a smooth framerate, which to me is way more important to the experience of a game.

 

Considering this is a topic about console titles already failing to achieve 1080p and/or 60fps, don't you think it's a tad unfair of you to make a case about PC hardware not being able to keep up with running future titles at 1080p60?


The ESRAM is just a small 32 MB L3 cache/frame buffer; it hardly affects performance at all. I do believe that performance on the Xbox One would suck even worse without it, but it is not something that really does much of anything but help support the general operation of the system. The problem is that the ESRAM cache is so small: only 32 MB. That's about as much VRAM as my old NVIDIA TNT2 had in 2000. In comparison, modern GPUs were designed to run with 1-4 GB of VRAM at a full 264 GB/s all by itself. The tests that showed the PS4 was 50% faster were done with the Xbox One utilizing its ESRAM. It is not 218 GB/s; it is 109 GB/s, and possibly faster with compression, that is, if the data is compressible. That is not going to make up for the fact that the PS4 has 8 GB of GDDR5 memory clocked in at 176 GB/s across the full spectrum, direct to GPU and CPU. This literally blows the doors off the Xbox One, and it's necessary to drive high resolutions without performance issues, just like in PC graphics.

Also, I could not care less whether the Xbox One and GCN architecture have more "internal" bandwidth. If the GPU can't get data to work on, if it's bottlenecked by the main system memory, it isn't going to matter.

 

 

The SemiAccurate articles I posted put the bandwidth at 109 GB/s at the minimum and 204 GB/s at the max, coming down to how well the game is coded. So it's likely we could see games using a varied amount of bandwidth, at least until devs are used to the architecture. They claim that MS sources told them that the first wave of games is hitting the 140-150 GB/s mark without much optimization. They also found that the DDR3 is running at 2133 MHz.

 

Also, they clearly claim that the ESRAM is in fact not cache and is certainly not being used in a traditional, PC-like manner.

 

Also, it's important to note that both the X1 and PS4 will be leaving 5 GB of RAM free for gaming, as both have an OS and other software that need dedicated resources.


The ESRAM is just a small 32 MB L3 cache/frame buffer; it hardly affects performance at all. I do believe that performance on the Xbox One would suck even worse without it, but it is not something that really does much of anything but help support the general operation of the system. The problem is that the ESRAM cache is so small: only 32 MB. That's about as much VRAM as my old NVIDIA TNT2 had in 2000. In comparison, modern GPUs were designed to run with 1-4 GB of VRAM at a full 264 GB/s all by itself. The tests that showed the PS4 was 50% faster were done with the Xbox One utilizing its ESRAM. It is not 218 GB/s; it is 109 GB/s, and possibly faster with compression, that is, if the data is compressible. That is not going to make up for the fact that the PS4 has 8 GB of GDDR5 memory clocked in at 176 GB/s across the full spectrum, direct to GPU and CPU. This literally blows the doors off the Xbox One, and it's necessary to drive high resolutions without performance issues, just like in PC graphics.

Also, I could not care less whether the Xbox One and GCN architecture have more "internal" bandwidth. If the GPU can't get data to work on, if it's bottlenecked by the main system memory, it isn't going to matter.

 

It doesn't matter that the ESRAM is small, because of how GCN works. Each compute unit can take in 32 bytes of data each GPU cycle. Considering that the DDR3 can top this 32 bytes of data x 12 back up in the ESRAM far quicker than the number of clock cycles wasted to GDDR5 latency, this effectively allows the ESRAM to help the system achieve such a great amount of bandwidth.

 

And the ESRAM bandwidth was confirmed by Microsoft at Hot Chips as 204 GB/s. With the upclock of the GPU, this brings it up to 218 GB/s.
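[Editor's note] For what it's worth, the upclock arithmetic is internally consistent: scaling the 204 GB/s Hot Chips figure (quoted at the original 800 MHz GPU clock) by the 853 MHz upclock gives roughly the 218 GB/s number, assuming bandwidth scales linearly with clock:

```python
# Scale the Hot Chips ESRAM figure by the GPU upclock (linear-scaling assumption).
hotchips_bw_gb_s = 204.0               # claimed at the original 800 MHz GPU clock
upclocked = hotchips_bw_gb_s * 853 / 800
print(f"{upclocked:.1f} GB/s")         # ~217.5 GB/s, commonly rounded up to 218
```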

 

 

Also, I could not care less whether the Xbox One and GCN architecture have more "internal" bandwidth. If the GPU can't get data to work on, if it's bottlenecked by the main system memory, it isn't going to matter.

 

That's exactly what I said. You can add more compute units and the bandwidth requirements will increase, but if you don't provide the bandwidth, then these extra compute units are wasted, so to speak; therefore the system memory bandwidth is the bottleneck, and the Xbox One has 55% more memory bandwidth. That doesn't mean the PS4's extra compute units are totally useless; it just means the cores will be less saturated, but the scheduler will be more efficient. That's why I've been saying paper specs are meaningless.


The SemiAccurate articles I posted put the bandwidth at 109 GB/s at the minimum and 204 GB/s at the max, coming down to how well the game is coded. So it's likely we could see games using a varied amount of bandwidth, at least until devs are used to the architecture. They claim that MS sources told them that the first wave of games is hitting the 140-150 GB/s mark without much optimization. They also found that the DDR3 is running at 2133 MHz.

 

Also, they clearly claim that the ESRAM is in fact not cache and is certainly not being used in a traditional, PC-like manner.

 

Also, it's important to note that both the X1 and PS4 will be leaving 5 GB of RAM free for gaming, as both have an OS and other software that need dedicated resources.

If you calculate the memory bandwidth of the ESRAM yourself, it is 109 GB/s: take the clock rate of the GPU core and multiply by the width of the interface. So the bandwidth it has is actually 109 GB/s; it is not going to be any faster than that. Some sources, though, are trying to claim that "with" compression of the data it is faster and can perform more work, but that requires the data to be compressible and made smaller. You could really do the same thing with the PS4's bandwidth, making that 176 GB/s more like 400 GB/s, apparently.
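[Editor's note] The 109 vs. 218 GB/s disagreement comes down to counting one direction versus both. Sketching the clock-times-width calculation described above, with the commonly reported 1024-bit interface and 853 MHz GPU clock as assumptions:

```python
# ESRAM bandwidth two ways, assuming a 1024-bit interface at the 853 MHz GPU clock.
clock_hz = 853e6
bytes_per_cycle = 1024 // 8            # 128 bytes moved per cycle, one direction

one_way = clock_hz * bytes_per_cycle / 1e9   # read OR write: ~109 GB/s
duplex = 2 * one_way                         # read AND write same cycle: ~218 GB/s

print(f"one-way: {one_way:.1f} GB/s, duplex peak: {duplex:.1f} GB/s")
```

Whether real workloads ever sustain the duplex figure is the unresolved question in this exchange.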

Also, upon further review, both the Xbox One and PS4 are based on the GCN/AMD Trinity architecture. They might be more advanced, based on something a bit newer, but I am sure the GPU and CPU are nearly identical to the desktop platform, which features a 4-core CPU with a built-in Radeon GCN part. The PS4/Xbox One have been upgraded to 8-core Trinity CPUs, which is probably going to be the main difference. The difference between the Xbox One and PS4 implementations is that the PS4 has more memory bandwidth, thanks to higher clocks and GDDR5 RAM, AND it has higher core clock rates. The Xbox One is clocked at 1.75 GHz with 8 cores, while the PS4 is clocked at a maximum of 2.75 GHz with 8 cores (probably with some type of "turbo mode" like the desktop chips, because the default clock rate on the PS4 has not yet been disclosed).

The Xbox One has 8 GB of RAM, but apparently only 5 GB is available for games. There is no such limitation for PS4 software; it has a full 8 GB of RAM, and because it's based on FreeBSD, if the OS remains loaded during gameplay at all, I am sure it's going to be much more memory-efficient. You should expect nearly the entire 8 GB of memory to be available for games on the PS4.


Considering this is a topic about console titles already failing to achieve 1080p and/or 60fps, don't you think it's a tad unfair of you to make a case about PC hardware not being able to keep up with running future titles at 1080p60?

No? I have no doubt they'll run games at 1080p@60fps in time; my expectations for launch titles are never high, as I've stated a few times already. The point, though, was the reason I console game as well as PC game. There's an advantage to knowing you'll get good gaming, as long as devs do their job, for at least 5 years, while on the PC, if you want to stay at a constant performance level, you'll need to upgrade again in around 3 or so years on average, or lower the games' settings. I already had to tweak the graphics settings in Metro: Last Light because I got frame drops on my HD 7870.


If you calculate the memory bandwidth of the ESRAM yourself, it is 109 GB/s: take the clock rate of the GPU core and multiply by the width of the interface. So the bandwidth it has is actually 109 GB/s; it is not going to be any faster than that. Some sources, though, are trying to claim that "with" compression of the data it is faster and can perform more work, but that requires the data to be compressible and made smaller. You could really do the same thing with the PS4's bandwidth, making that 176 GB/s more like 400 GB/s, apparently.

 

Maybe you're right, but again, read the SemiAccurate article for the detailed explanation. It clearly contradicts what you're trying to claim. Its conclusion is that the bandwidth numbers will end up very close, thanks to MS's custom work. That might mean that, in order to take advantage of that, more work is required on the developers' part, though. We will see.

 

 

Also, upon further review, both the Xbox One and PS4 are based on the GCN/AMD Trinity architecture. They might be more advanced, based on something a bit newer, but I am sure the GPU and CPU are nearly identical to the desktop platform, which features a 4-core CPU with a built-in Radeon GCN part. The PS4/Xbox One have been upgraded to 8-core Trinity CPUs, which is probably going to be the main difference. The difference between the Xbox One and PS4 implementations is that the PS4 has more memory bandwidth, thanks to higher clocks and GDDR5 RAM, AND it has higher core clock rates. The Xbox One is clocked at 1.75 GHz with 8 cores, while the PS4 is clocked at a maximum of 2.75 GHz with 8 cores (probably with some type of "turbo mode" like the desktop chips, because the default clock rate on the PS4 has not yet been disclosed).

 

Where did you hear that the PS4 CPU is clocked at 2.75 GHz with 8 cores? Even Sony's own spec sheet doesn't back you up on that number. Now, maybe the AMD CPU does allow for a turbo boost on, say, 4 or 2 cores when the other cores are not being used, but then that wouldn't be limited to the PS4, since the X1 CPU is the same core architecture.

 

Also, both consoles are using custom methods to add certain features that the AMD CPU alone was not capable of.

 

The Xbox One has 8 GB of RAM, but apparently only 5 GB is available for games. There is no such limitation for PS4 software; it has a full 8 GB of RAM, and because it's based on FreeBSD, if the OS remains loaded during gameplay at all, I am sure it's going to be much more memory-efficient. You should expect at least 7 GB of that memory to be free for games on the PS4.

 

Wait a sec, I'm pretty sure it was confirmed that the PS4 is also reserving about 3 GB of RAM for other system functions. You should probably go and confirm that. It would be a reversal if they changed it to all 8 GB being available. I think the 8 GB value was a rumor early on. I remember people using it as a way to bash the X1, since it was first announced for their console, but then it was learned later that the PS4 is also doing that.

 

The reason both consoles are reserving these resources is that they do so much more than just game. All of the apps, social features, media, etc. need to work fluidly with the OS. The reserved resources allow better multitasking.


That PS4 OS-reserve part is nonsense, as is a fair bit of your post.

 

They will be reserving a fair chunk, just like the Xbox One. Probably a very similar amount.

 

Yes, the PS4 is more powerful, end of discussion. The Xbox One may excel at the odd thing, but there's no magic that will overcome the PS4 overall.


Maybe you're right, but again, read the SemiAccurate article for the detailed explanation. It clearly contradicts what you're trying to claim. Its conclusion is that the bandwidth numbers will end up very close, thanks to MS's custom work. That might mean that, in order to take advantage of that, more work is required on the developers' part, though. We will see.

 

 

 

 

Where did you hear that the PS4 CPU is clocked at 2.75 GHz with 8 cores? Even Sony's own spec sheet doesn't back you up on that number. Now, maybe the AMD CPU does allow for a turbo boost on, say, 4 or 2 cores when the other cores are not being used, but then that wouldn't be limited to the PS4, since the X1 CPU is the same core architecture.

 

Also, both consoles are using custom methods to add certain features that the AMD CPU alone was not capable of.

 

 

 

 

Wait a sec, I'm pretty sure it was confirmed that the PS4 is also reserving about 3 GB of RAM for other system functions. You should probably go and confirm that. It would be a reversal if they changed it to all 8 GB being available. I think the 8 GB value was a rumor early on. I remember people using it as a way to bash the X1, since it was first announced for their console, but then it was learned later that the PS4 is also doing that.

 

The reason both consoles are reserving these resources is that they do so much more than just game. All of the apps, social features, media, etc. need to work fluidly with the OS. The reserved resources allow better multitasking.

I was reading this on the Wikipedia page for the PS4, which is usually pretty accurate. Here's the link: http://en.wikipedia.org/wiki/PlayStation_4

Notice how it doesn't say anything about reserving memory space for games or the OS. I am certain that the PS4 will be quite a bit more memory-efficient, but we cannot confirm what it does and doesn't use right now. The issue is, why does the Xbox One need 3 GB of RAM? That's like running an entire OS on the side at the same time as the games. In reality, loading the FreeBSD kernel on the PS4 could not possibly use more than a few MB to a few hundred MB of RAM. Just go look up the system requirements for FreeBSD: it's really streamlined and resource-efficient, and only requires 24 MB of RAM to run. When you are in-game, the RAM should be freed for game use, and when you switch back to the dashboard, that is the only time the OS interface should be loaded into memory. If this is the case, then the full 8 GB is going to be available to PS4 games.


I was reading this on the Wikipedia page for the PS4, which is usually pretty accurate. Here's the link: http://en.wikipedia.org/wiki/PlayStation_4

 

 

Just do a search for "PS4 reserves 3 GB" and you will find all the stories that revealed this. The number was actually put at 3.5 GB reserved, but developers might be able to make use of up to 1 GB of that in some fashion. That part has not been detailed.

 

So that Wikipedia article is out of date, unfortunately.

 

The PS4 will be reserving a chunk of memory similar to the X1's.


This topic is now closed to further replies.