Xbox One Architecture Finally Explained



snip

Your sentence structure makes no sense.

But I've only had one RROD across three consoles. It was on my original Xbox 360, which I bought one year after launch, and it red-ringed about a year later, pretty much on schedule, with normal amounts of use and about one shutdown and startup every day.

My Elite never had one, and my fiancée's newer slim Xbox never had one either. So it's been a few years since I last saw an RROD. And so what? MS fixed them all for free under the extended three-year warranty, which is longer than the two-year factory-fault warranty EU countries mandate for electronics anyway.

It's not like there aren't widely known issues of YLOD and dying lasers on the PS3 either. The difference is that Sony has had laser problems with every generation of PlayStation, and even if you're within warranty their response is "user fault, no repair".

No one said they could do no wrong. But you don't seem to realize the cause behind the RROD, or the fact that they cleaned up their mess and took care of their customers, which is worth a lot more.


snip

I haven't had an RROD for at least three years now, and that was on my first-generation Xbox 360, which I bought at Christmas 2006. My current Xbox 360 works without issue.


snip

I said that because I'm especially amazed at how they are transforming Windows to power all their platforms (phone, tablet, desktop, console). As a developer, that interests me.

I don't know enough about hardware to deep-dive into the details, but I've read about all the custom chips that were created for maximum efficiency. Move engines, sound engines, etc. all sound great from a developer's perspective.

The fact that my Xbox One delivers this level of graphics and this diversity of services at the price it's sold for (relative to the number of years it will last) is, I believe, pretty epic.

This is a console made specifically to support their 8-10 year long-term vision for the Xbox brand, so you can be pretty sure they know what they're doing, and that every hardware decision was made specifically to support everything they have planned for the future.

PS: the same goes for the PlayStation, obviously, but you're specifically attacking the Xbox here, so yeah...


snip

 

I agree. The build quality of Microsoft consoles has always been superior to the PlayStations'. PlayStations are all self-tapping screws in plastic; the boards are screwed straight into the plastic body. The Xboxes have a metal chassis to which everything is bolted with machine screws, giving a more secure metal-to-metal fixing and a ground point. The only self-tappers in an Xbox simply hold the outer body/top lid on. Xboxes also have tons more room for cooling, whereas the PS routes hot air through an already hot built-in PSU with hot voltage regulators inside.

 

The PS3's and PS4's build quality is essentially the same: cramped, tons of plastic, and a heat spreader over the entire board in a pathetic attempt at routing heat away. I know this because console repair is part of my livelihood :) I'm personally an Xbox fanboy and proud of it; I've never had an Xbox go wrong on me, but I've had loads of PS paperweights over the years. My latest PS3 Slim, which lasted just 12 hours of use over two years before its VRAM overheated, was the last straw :(


snip

And unlike the widely occurring known faults on the PS3, MS acknowledged their fault, fixed it, and took care of their customers.

The RROD fault is also the result of several thousand full power cycles. That means heating up and cooling down, which over time deformed the X-clamp, letting the GPU flex too far from the motherboard for the solder balls to maintain a connection.

It was an unforeseen side effect that didn't come up during testing because it required fairly specific circumstances. QA ran power-cycle tests that turned consoles on and off thousands and thousands of times, but not on long enough for them to warm up and not off long enough to cool down, so the flexing didn't occur. Consoles left on for weeks on end in extended-use testing wouldn't show the problem either, because a console that is never shut off never goes through the thermal cycling. The fewer power cycles and the longer you kept the console on, the longer the problem took to appear, so gamers who kept theirs on all the time wouldn't see the issue for 3-5 years.

The PlayStation issue, however, is simply bad components. It was known during development and QA that a certain number of them can and will fail, and Sony still refuses to accept it as a warranty issue for most customers.


snip

I'm not really biased about the Xbox in particular; I just felt disappointed overall and think they can do better than that.


Microsoft designed a box that has tons of wiggle room for the future, same as with a PC, as long as the resources are there.

Every app can be improved over and over again, newer apps can be created, and newer games can be created, all of them able to take advantage of everything the Xbox One can do.

What people aren't seeing is this: just about every Xbox One game looks more than good enough. I hate the AC series, but it looks good enough on the Xbox One. Ryse proves that beautiful games can be had on the Xbox One. And look at the way Twitch and Machinima were made to work flawlessly with the Xbox One's feature set.

The Xbox One is about as close to a PC as a console gets, I guess.


What people aren't seeing is this: just about every Xbox One game looks more than good enough.

 

That's subjective though.


That's subjective though.

Ryse is one of the best-looking next-gen games I've seen.

NBA 2K14 looks good

BF4 looks good

Forza looks amazing

Kinect Sports: Rivals looks good


Yep. Although the X1 has more memory, it still suffers in terms of performance in many ways due to the slower DDR3 it uses. I have no idea how they're going to convince developers that it's sufficient for games to target 1080p. Watch Dogs only runs at 900p max, which isn't surprising at all, while the PS4 won't have a problem making it 1080p/30fps.

Why is DDR3 insufficient for games to hit 1080p?

I'd really like to see the numbers laid out for GDDR3 + eDRAM vs DDR3 + eSRAM. If someone could chime in with the numbers, that would be great.

I did some quick digging, and about the best I could find for the 360 was that the memory bandwidth from the eDRAM to the GPU was 32GB/s, and from main RAM to the GPU 22.4GB/s. On the X1, the numbers are 133-192GB/s from the eSRAM to the GPU and 68.3GB/s from the RAM to the GPU.

I'm not sure how accurate the comparison is, but it seems like the X1 has a higher-performing RAM system.
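For what it's worth, the main-RAM figures match the published specs. Here's a quick sketch of the arithmetic in Python (the transfer rates and bus widths are from public spec sheets, so treat them as my assumptions rather than gospel):

# Peak theoretical bandwidth = transfer rate (MT/s) x bus width in bytes.
def peak_gb_per_s(mega_transfers_per_s, bus_bits):
    return mega_transfers_per_s * (bus_bits // 8) / 1000.0

print(peak_gb_per_s(2133, 256))  # X1 main RAM: DDR3-2133, 256-bit bus -> ~68.3 GB/s
print(peak_gb_per_s(1400, 128))  # 360 main RAM: GDDR3-1400, 128-bit bus -> ~22.4 GB/s

The eSRAM figure is quoted as a range because the peak depends on how much of the time reads and writes can be overlapped in the same cycle, which real workloads only manage part of the time.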


snip

We don't need numbers; we have actual examples. One of this generation's current best-looking games runs at 1080p60 on the XO.


snip

You're forgetting the X360 only contained 512MB of total RAM, compared to 8GB of RAM on the X1.


You're forgetting the X360 only contained 512MB of total RAM, compared to 8GB of RAM on the X1.

That is true of course, but it doesn't change the numbers regarding performance. You can't ignore the fact that the X1 has far more RAM in the system. If the end result is that the X1 has a faster memory system than the 360, does it really matter how it gets there?


Ryse is one of the best-looking next-gen games I've seen.

NBA 2K14 looks good

BF4 looks good

Forza looks amazing

Kinect Sports: Rivals looks good

 

Agree with you Showan to an extent, except there is no such thing as "good enough" for me with graphics :laugh:

That would be the Matrix; at that point they'll be able to say "good enough", as it won't be any different from reality. Until then we can always use improvement. While both the X1 and PS4 had the best launches in entertainment-electronics history in terms of software support (none of the previous generations had so much choice or such quality, whatever the negativity-fueled trolls keep saying), neither has shown us what it can really do. Ryse is a spectacular game and has good visuals, but it's clearly not a native X1 title, and it would have looked much nicer had it been built for the X1 rather than mostly for the 360 as Project Kingdoms.

 

As Hawk said, we haven't seen what can be done on these machines because developers have yet to bother trying. It will take some time. Same as my friend with the dual Titans who keeps complaining that his games look pretty much like they do on my PC with its solitary 7950. There's not a developer in the world that would bother working to a spec as high as two GTX Titans at this point in time.


snip

DDR3 doesn't directly stop games from being 1080p. When people try to discuss hardware specifics, they should research and look into things before having the discussion. DDR3's purpose in the X1, like most memory pools for games, is asset streaming. If the DDR3 were holding anything back on the box, you'd see textures not loading correctly and low-poly models before the full assets load in (remember the UE3 pop-in, like in GoW).

 

GDDR is good because you can place the frame buffers there, and frame buffers can be huge depending on the rendering and graphical techniques a game uses. Having a large, fast memory pool to store them in, unified with the rest of the assets, is a nice, easy way for developers to work with memory. The only negative is that, as in Infamous, you can end up eating 500MB of your unified memory pool, which leaves less room for assets.

 

On the X1, on the other hand, the purpose of the eSRAM is to hold the frame buffers, due to its high speed and its ability to read and write simultaneously. The only problem is that it's 32MB. To hold a 1080p frame with deferred rendering, you're looking at a lot more than that, a size that simply wouldn't fit in eSRAM. To counteract this, you use the DMA engines on the box to move separate render targets between the memory pools (which are unified under the same page tables) and render the frame in parts, moving data in from DDR3 as you go. The only problem is that this is a technique that wasn't really done, or even considered, until this box was designed. It isn't really incorporated into DX11, but with DX12 and bundles it's something that can be managed by the API and performed on the box without the developer really doing anything, thus producing a deferred-rendered, full native 1080p frame.
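To put rough numbers on why a deferred frame doesn't fit (a back-of-the-envelope sketch in Python; the four-render-target G-buffer layout is a typical assumption of mine, not the X1's actual configuration):

# Size of a deferred G-buffer at 1080p, assuming four 32-bit render targets
# plus a 32-bit depth buffer (real engines vary their layouts).
pixels = 1920 * 1080
bytes_per_pixel = 4 * 4 + 4
print(pixels * bytes_per_pixel / 2**20)  # ~39.6 MB, already over the 32MB of eSRAM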

 

Quite a nice little webpage describing each:

http://gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342

You can see how forward rendering produces one render target, whereas deferred rendering produces multiple render targets. Bundles and DX12 essentially let you split the frame into multiple groups of render targets, all of which can use deferred rendering and be stored in RAM. In the case of the X1, these would all be pulled from DDR3 and rendered in order through the eSRAM to build the frame.

 

As for Forza: it uses forward rendering with light maps so everything fits in eSRAM and it can hit 1080p at 60 frames. This is where the negativity and the 'downgrade' talk come in, as those sacrifices were made to fit the frame in eSRAM.
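Running the same back-of-the-envelope arithmetic for a forward renderer shows why that approach fits (again a sketch, assuming a single 32-bit colour target plus a 32-bit depth buffer):

# Forward rendering at 1080p: one colour target plus a depth buffer.
pixels = 1920 * 1080
print(pixels * (4 + 4) / 2**20)  # ~15.8 MB, which fits in the 32MB of eSRAM with room to spare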


snip

I can't imagine how Forza 5 would run on the PS4 if they ported it over and optimized it from top to bottom, because that would truly give us a clearer idea of the ultimate performance discrepancy before moving on to DX12.

 

To be honest, it really isn't surprising that they managed to target 1080p/60fps, since those games don't usually require nearly as much GI, DOF, shadow lighting and advanced particle effects as open-world games.

 

Nevertheless, the X1 definitely still has a lot of things to prove, and hopefully it won't be another disappointment down the road. The X1 has great, unique hardware features, and it will be interesting to see how well it can thrive in the coming years.


snip

The problem with everything that's going on is people thinking all these games are GPU-bound because of the drop in resolution, when they're simply not. If Forza 5 were ported to the PS4 it would run exactly the same; the point is that they left out things like better lighting, AA and some of the effects above simply because of their rendering technique and what they could fit in eSRAM this early on. I'm guessing that what you're referring to as optimisation is the inclusion of the effects you've mentioned; that wouldn't be 'optimisation' but rather a rewrite of the engine around the new rendering techniques they'd have to implement to support them. You've also got to remember that FM5 is the only locked-60fps first-party title of this generation; adding effects like DOF, motion blur and simply more dynamics in the engine would introduce uncertainty into the frame rate. I'm guessing Turn 10 are just more focused on providing a rock-solid experience than on prettiness.

 

Obviously there's a lot of benefit in incorporating DX12 and using it in their games, given that they ported their engine's renderer over to it for Build. It took them four man-months, and keep in mind they're primarily a console developer; obviously this will see commercial use in future projects (Forza 6, Horizon 2 maybe?). The features in DX12 will be a massive boost for the X1, primarily through what I've described above and through the better threading of the DX processes across the CPU cores, keeping in mind these are low-powered 8-core CPUs.


snip

Simply put, the X1 is harder for developers to target 1080p on, even though it's technically possible, because what you might get is a poorer frame rate compared to the competition, and it also requires learning some new techniques to get optimum performance. The PS4 doesn't have as much hassle, because developers don't need to mess with two different memory pools. And some fanboys, or just people in general, blatantly defend 720p or 900p as sufficient for the X1 :omg: which flat out freaks me out; why are such people willing to pay an extra $100 and not get the better product that's available on the market? It's like paying an Intel i7 price for i5 performance :laugh: http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates


snip

 

This competitive landscape isn't like the 7th console generation, when both machines ran pretty much the same resolutions despite the PS3 supposedly being far more powerful. This time it's easy to optimize games for both platforms, since the architectures are so similar, and the PS4 happens to be more of a giant leap than the X1; it's a no-brainer to make that version better since they don't have to deal with a myriad of issues like the Cell processor anymore.

 

The PS3 wasn't "far more powerful". The CPU was better than the 360's, but the GPU was inferior, and its system memory was split from its video memory, which most devs disliked.


snip

In the current scenario, yes, but with the release of DX12 handling multiple render targets, it's all done for you, so the development complexity goes away. That's what a middle-tier API is for. Just for the record, I never considered 720p or 900p acceptable, and I was vastly disappointed that my primary console in 2014 couldn't handle games at 1080p.

 

Regarding the price, it's a fair argument depending on the context of the box, which first-party IPs you prefer, and where your friends are. You could also argue for the superior online service and an OS that has shaped up since launch to be excellent. Oh, and the TV integration in the UK is also superb.


some fanboys, or just people in general, blatantly defend 720p or 900p as sufficient for the X1 :omg: which flat out freaks me out; why are such people willing to pay an extra $100 and not get the better product that's available on the market? It's like paying an Intel i7 price for i5 performance :laugh: http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates

I would point out that if someone has different priorities when it comes to buying a console, I can see why they would be OK with 900p in a game.

A lot of people don't follow the numbers like we do.

I agree that there are silly arguments made about it, but let's not turn it into a general statement that because some games run at 720p/900p the console isn't 'worth' the $100 difference. It's not worth it to you, or to anyone who puts 1080p resolution as a high priority above all else.

I'm in the camp where I want as high a resolution as I can get, so I have a PC to game on. For my console gaming, I'm not as concerned, as long as the game looks good to me in motion. Would 1080p be preferable in every game? Sure. I want my PS4 and X1 games running at 1080p whenever possible, but I'm not going to avoid games that don't hit that mark.


snip

 

Why do you think getting one means not getting the other? Forgive me for saying this, but that is "fanboy" thinking in the flesh. Sure, many don't want to spend money on multiple devices, but it's not like choosing one locks you in. Don't like Ryse in 900p? No problem; as trooper said, go and put together a $2000 PC so you can play Rome Total War in 4K on an $800 monitor. These are great times because we have more options than ever, and honestly none of those options are bad. I just used a friend's Fire TV, and Sev Zero is freakin' awesome on a big TV. I'm also playing Shadowrun Returns on an Android tablet, and it's only 800p; it looks the same as the 1080p version I saw on PC, to be frank. Keep things in perspective. The X1 could do much more, and hopefully it will, but it's nowhere near as bad as some might have you believe.

