
Microsoft Xbox One -> Info that may not be well known

Tags: xbox one, microsoft, kinect, xbox live

74 replies to this topic

#61 vetFourjays

Fourjays

    Neowinian Senior

  • Joined: 09-September 05
  • Location: Staffordshire, UK

Posted 11 July 2013 - 15:16

Look at it this way: the graphics engine would have to tick over twice as fast to hit 60fps from 30, which means they have to optimize it by 50%. A very big task.

Speaking from personal experience, just because it is locked at 30fps doesn't necessarily mean they would have 30fps of optimizations to find. It may be capable of running at 60fps mostly, with certain situations dipping it to 50fps or 40fps. As a locked rate has to divide evenly into the 60Hz refresh, the choice is 30fps or 60fps, and you'd obviously go with 30fps, as a lower constant frame rate is less noticeable than frame rate drops. The big question is whether Sony can find the optimizations they seem to require.
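To show how cheap the lock itself is (the hard part is the optimization, not the cap), here's a minimal sketch of a 30fps limiter, assuming a simple single-threaded loop; all names are mine, not from any shipping engine:

    #include <chrono>
    #include <thread>

    // Stub standing in for the real per-frame work (simulation + draw).
    static void update_and_render() { /* ... */ }

    // Minimal sketch of a 30fps lock: every frame gets a fixed 33.3ms budget.
    // Frames that finish early sleep out the remainder, so the player sees a
    // steady cadence instead of a 40-60fps wobble.
    void run_locked_loop() {
        using clock = std::chrono::steady_clock;
        constexpr std::chrono::microseconds kBudget(33333); // 1/30th of a second

        auto next_frame = clock::now() + kBudget;
        for (;;) { // runs until the game exits
            update_and_render();
            std::this_thread::sleep_until(next_frame); // no-op if we overran
            next_frame += kBudget;
        }
    }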
 

Yeah, I am not buying that entire PC thing at all. If that is true, then why are so many PS4 games at 30 frames per second? Killzone should be running at 60 easily. Nope, not buying that.

I did post a likely explanation earlier. As I understand it, from a developer standpoint the Xbox 360 and Xbox One are quite similar. DirectX is used for graphics, etc. The Xbox has a mature and very well-developed set of developer tools. So all developers have got to do is take advantage of the newly available power. With the PS4, Sony have gone from the weird and wonderful architecture of the PS3 to x86. They don't have access to age-old tools like DirectX (for obvious reasons), so they have had to write their own equivalents. Developers have got to learn these and have probably had to significantly rewrite (or write from scratch) their own engines and tools as well. I have no idea of the actual FPS involved, but I can see this kind of situation evidenced in the early PS3 games I've played - many have frame rate issues that their later equivalents do not (GTA IV > RDR, Oblivion > Skyrim).
 
No offense intended, but I think you are adding two and two and getting five. You've taken one part of an already unspecific quote from Turn 10, taken the respective frame rate differences and decided that the cloud is the reason for this, while simultaneously disregarding a whole load of far more likely scenarios. Architectural difficulties on the PS4 seem far more likely to me than the cloud being solely responsible for 60fps on the Xbox One. Given the supposed power advantage, I'd be far more surprised if the PS4 was physically incapable of 60fps (especially as none of the games shown for either console were that much of a jump visually). I'll be happy to be proved wrong when the Xbox One is released and we can see just how it all works.
 
We'll have to agree to disagree for now though and see how it ends up. I'm most interested in what the cloud can bring to gameplay (Ubisoft's line-up in particular is very impressive - was far and away the best presentation at E3 for me).




#62 George P

George P

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 11 July 2013 - 16:07

There is a difference between artificially locking a game's frame rate to 30fps and leaving it unlocked. Some games want to push a slower type of gameplay mechanic; if that's what the new KZ is going for, then limiting it to 30fps serves that goal. Now, if a game is supposed to run faster but can't because of poor coding, that's a whole different issue, and optimization is key to fixing it.

 

Case in point: I've been playing BioShock Infinite the past few days, and I'm pretty sure I saw an option in there that would lock the frame rate to a fixed number. I turned it off because I wanted the quicker movement, but I believe it was on by default. Anyway, this and things like v-sync aren't anything new. Some games don't need 60fps; some do. It's up to the developer, really.
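For what it's worth, exposing an option like that is trivial for a developer; a sketch of what the toggle amounts to, with hypothetical names (this is not Irrational's actual code):

    // Hypothetical options block; defaults are just choices the developer makes.
    struct DisplayOptions {
        bool vsync     = true; // present in step with the display refresh
        bool cap_30fps = true; // default-on lock, as Infinite appears to ship with
    };

    // In a loop like the limiter sketch a couple of posts up, the cap is
    // just a guarded sleep:
    //   if (options.cap_30fps) std::this_thread::sleep_until(next_frame);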



#63 BajiRav

BajiRav

    Neowinian Senior

  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 11 July 2013 - 17:32

They can't make a good software stack to complement their hardware. Running a box with a modified version of FreeBSD is not a good way to go.

 

The Xbox runs 3 hypervisor VMs which complement each other and are specifically designed to work in harmony with the hardware the software sits on. You have hardware compression/decompression to offload from the CPU the compressing of data sent to the cloud. You've got memory buses to inject data from the cloud straight into the RAM. Just things like that make a huge difference.

2 VMs and 1 hypervisor that hosts them.



#64 George P

George P

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 11 July 2013 - 17:38

2 VMs and 1 hypervisor that hosts them.

 

And one VM only runs when you're playing an Xbox game. I don't think the dashboard runs on the game OS anymore like on the 360, so we can finally switch over to the full dashboard without having to quit a game or app, thank god.



#65 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 11 July 2013 - 19:07

Speaking from personal experience, just because it is locked at 30fps doesn't necessarily mean they would have 30fps of optimizations to find. It may be capable of running at 60fps mostly, with certain situations dipping it to 50fps or 40fps. As a locked rate has to divide evenly into the 60Hz refresh, the choice is 30fps or 60fps, and you'd obviously go with 30fps, as a lower constant frame rate is less noticeable than frame rate drops. The big question is whether Sony can find the optimizations they seem to require.
 

I did post a likely explanation earlier. As I understand it, from a developer standpoint the Xbox 360 and Xbox One are quite similar. DirectX is used for graphics, etc. The Xbox has a mature and very well-developed set of developer tools. So all developers have got to do is take advantage of the newly available power. With the PS4, Sony have gone from the weird and wonderful architecture of the PS3 to x86. They don't have access to age-old tools like DirectX (for obvious reasons), so they have had to write their own equivalents. Developers have got to learn these and have probably had to significantly rewrite (or write from scratch) their own engines and tools as well. I have no idea of the actual FPS involved, but I can see this kind of situation evidenced in the early PS3 games I've played - many have frame rate issues that their later equivalents do not (GTA IV > RDR, Oblivion > Skyrim).
 
No offense intended, but I think you are adding two and two and getting five. You've taken one part of an already unspecific quote from Turn 10, taken the respective frame rate differences and decided that the cloud is the reason for this, while simultaneously disregarding a whole load of far more likely scenarios. Architectural difficulties on the PS4 seem far more likely to me than the cloud being solely responsible for 60fps on the Xbox One. Given the supposed power advantage, I'd be far more surprised if the PS4 was physically incapable of 60fps (especially as none of the games shown for either console were that much of a jump visually). I'll be happy to be proved wrong when the Xbox One is released and we can see just how it all works.
 
We'll have to agree to disagree for now though and see how it ends up. I'm most interested in what the cloud can bring to gameplay (Ubisoft's line-up in particular is very impressive - was far and away the best presentation at E3 for me).

 

The x86 platform is the most well-known platform of them all. I don't believe there is an easier platform to work with; it's like 40 years old, evolved over time. It has Direct3D from Microsoft, it has OpenGL, and the software stack should be very mature by now (for most OSes, including BSD). Sony has had plenty of time to optimize that software stack. I know that they are using their own libraries, but I seriously doubt this is the reason. Macs run on x86, Linux runs on x86, Windows of course runs on x86, even the PS4's BSD runs on x86. So, yeah. I am not buying it. This isn't the Cell; this is x86.

 

I have asked several times for someone to tell me how Halo 5 (or whatever it's being called) is running at 60 frames per second, and we know Battlefield is as well, and then we have that statement from Turn 10. Let me put this simply: when you take the processing load off of the console in a significant manner, how hard is it to get more frames per second? When you offload processing to a server, you gain what? Speed? It's not a hard concept. It's not 2+2=5; it's pretty common sense. Now, the gain can differ depending on what you are offloading and how much of it, but the common sense is still there.
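To make the idea concrete, here is a minimal sketch of the pattern I mean, assuming a hypothetical request_cloud_sim() call standing in for the real networking (this is not Microsoft's actual API, just the shape of the argument):

    #include <chrono>
    #include <future>

    struct DebrisState { /* positions, velocities, ... */ };

    // Stub standing in for a network call; a real one would take tens of
    // milliseconds, which is exactly why it must stay off the frame's
    // critical path.
    static DebrisState request_cloud_sim(DebrisState current) { return current; }

    static void update_player_and_camera() { /* latency-critical, stays local */ }
    static void render()                   { /* frame no longer pays for debris */ }

    void frame(DebrisState& debris) {
        // One request in flight at a time; re-armed when the last one lands.
        static std::future<DebrisState> pending =
            std::async(std::launch::async, request_cloud_sim, debris);

        update_player_and_camera();
        render();

        // Apply the remote result whenever it arrives, possibly frames later.
        if (pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            debris  = pending.get();
            pending = std::async(std::launch::async, request_cloud_sim, debris);
        }
    }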



#66 BajiRav

BajiRav

    Neowinian Senior

  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 11 July 2013 - 19:30

The x86 platform is the most well-known platform of them all. I don't believe there is an easier platform to work with; it's like 40 years old, evolved over time. It has Direct3D from Microsoft, it has OpenGL, and the software stack should be very mature by now (for most OSes, including BSD). Sony has had plenty of time to optimize that software stack. I know that they are using their own libraries, but I seriously doubt this is the reason. Macs run on x86, Linux runs on x86, Windows of course runs on x86, even the PS4's BSD runs on x86. So, yeah. I am not buying it. This isn't the Cell; this is x86.

 

I have asked several times for someone to tell me how Halo 5 (or whatever it's being called) is running at 60 frames per second, and we know Battlefield is as well, and then we have that statement from Turn 10. Let me put this simply: when you take the processing load off of the console in a significant manner, how hard is it to get more frames per second? When you offload processing to a server, you gain what? Speed? It's not a hard concept. It's not 2+2=5; it's pretty common sense. Now, the gain can differ depending on what you are offloading and how much of it, but the common sense is still there.

 

According to popular opinion (here and elsewhere), in order:

1. Microsoft doesn't care for gamers and is cheap, so they used cheap DDR3 RAM in the XB1 (and they hate indies)

2. Sony cares for gamers and indies and went with GDDR5

which led to

1. XB1 now has a complex SoC layout with eSRAM and Move Engines to compensate for "pathetic and slow" main RAM

2. PS4 has a simple, traditional x86-64 SoC layout

which led to

1. XB1 development will be complex, requiring more developer time and effort

2. PS4 has a simpler, less complex development cycle than the XB1

which led to

1. XB1 dev tools will be complex and a "headache" for developers ...and indies, don't forget the indies

2. PS4 has dev-friendly tools that give "direct access to the metal/hardware" = better performance, and is the more dev-friendly console

which led to

1. All/many* XB1 E3 demos ran at 1080p@60FPS

2. All/the majority* of PS4 E3 demos ran at 1080p@30FPS

Brilliant, isn't it?

*I don't know for sure, and those * were put there for obvious reasons. If you've read this far... congrats.

**I also didn't know where to put "XB1 was 6 months behind schedule and all launch games will suffer from the delays" in the timeline.



#67 francescob

francescob

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 04-November 08

Posted 11 July 2013 - 19:41

 

- Can recognize up to 6 skeletons at once

 

Good for Halloween!


 




#68 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 12 July 2013 - 07:10

According to popular opinion (here and elsewhere), in order:

1. Microsoft doesn't care for gamers and is cheap, so they used cheap DDR3 RAM in the XB1 (and they hate indies)

2. Sony cares for gamers and indies and went with GDDR5

which led to

1. XB1 now has a complex SoC layout with eSRAM and Move Engines to compensate for "pathetic and slow" main RAM

2. PS4 has a simple, traditional x86-64 SoC layout

which led to

1. XB1 development will be complex, requiring more developer time and effort

2. PS4 has a simpler, less complex development cycle than the XB1

which led to

1. XB1 dev tools will be complex and a "headache" for developers ...and indies, don't forget the indies

2. PS4 has dev-friendly tools that give "direct access to the metal/hardware" = better performance, and is the more dev-friendly console

which led to

1. All/many* XB1 E3 demos ran at 1080p@60FPS

2. All/the majority* of PS4 E3 demos ran at 1080p@30FPS

Brilliant, isn't it?

*I don't know for sure, and those * were put there for obvious reasons. If you've read this far... congrats.

**I also didn't know where to put "XB1 was 6 months behind schedule and all launch games will suffer from the delays" in the timeline.

 

You know what's even worse? A few days before E3 there was a huge rumor, which most people on gaming forums like NeoGAF believed, that Microsoft was "paying money" to third-party developers so they would not show the PS4 versions of their games, because they looked so much better than on the XB1.

 

LOL * 10,000!

 

Look, I am not claiming that the XB1 is the best console in history, but the hate is massively overblown. The DRM made everyone go insane, and rationality went out of the window.



#69 +D. FiB3R

D. FiB3R

    aka DARKFiB3R

  • Tech Issues Solved: 3
  • Joined: 06-November 02
  • Location: SE London
  • OS: Windows 8.1 Pro x64
  • Phone: Lumia 625

Posted 12 July 2013 - 08:55

And still you avoid answering my question, Yogurtmaster.

 

I can only conclude that the very first "fact" in the OP is, in fact, made-up ######.



#70 vetFourjays

Fourjays

    Neowinian Senior

  • Joined: 09-September 05
  • Location: Staffordshire, UK

Posted 12 July 2013 - 10:24

The x86 platform is the most well-known platform of them all. I don't believe there is an easier platform to work with; it's like 40 years old, evolved over time. It has Direct3D from Microsoft, it has OpenGL, and the software stack should be very mature by now (for most OSes, including BSD). Sony has had plenty of time to optimize that software stack. I know that they are using their own libraries, but I seriously doubt this is the reason. Macs run on x86, Linux runs on x86, Windows of course runs on x86, even the PS4's BSD runs on x86. So, yeah. I am not buying it. This isn't the Cell; this is x86.

 

I have asked several times for someone to tell me how Halo 5 (or whatever it's being called) is running at 60 frames per second, and we know Battlefield is as well, and then we have that statement from Turn 10. Let me put this simply: when you take the processing load off of the console in a significant manner, how hard is it to get more frames per second? When you offload processing to a server, you gain what? Speed? It's not a hard concept. It's not 2+2=5; it's pretty common sense. Now, the gain can differ depending on what you are offloading and how much of it, but the common sense is still there.

Battlefield has been confirmed to be running at 1080p @ 60fps on the PS4. That sort of proves my point that the hardware is capable of it and that the cloud on the Xbox isn't the sole reason for 60fps instead of 30fps. Of course DICE could be running the PS4 version with lower effects, but multiplatform developers don't usually do that (hence why multiplatform games are usually the best indicator of system flaws). There is also a lot of talk at the moment of developers having to create their own low-level APIs on the PS4, which could perhaps be more of a problem for the smaller Sony-exclusive developers than for the might of EA/DICE.

 

I've never disputed that offloading to the cloud can give more power for more frames (or more graphics or more features). I'm not sure why you keep insisting I don't get the concept. I've only disputed your claim, based on one selectively chosen half of a single out-of-context quote given to a biased source, that it can give another 30fps (I've still not seen any developer claim it directly, nor have I even seen Microsoft claim it). Think about what you can afford to offload to the cloud as a developer of your average game, and it is the minor details that you never directly interact with, like tree sway and flying debris (anything more major than that becomes problematic). Maybe I'm grossly underestimating how much processing is wasted on these details, but it doesn't seem like enough to gain another 30fps. We'd be talking about these kinds of offloadable details taking up half of all the processing time as games stand currently, which I find hard to believe from my brief foray into game development.
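For reference, the arithmetic behind that "half of all the processing time" figure, as a quick worked example:

    #include <cstdio>

    // 30fps -> 60fps means halving the frame time: 33.3ms down to 16.7ms.
    int main() {
        const double budget30 = 1000.0 / 30.0;       // 33.33 ms per frame
        const double budget60 = 1000.0 / 60.0;       // 16.67 ms per frame
        const double offload  = budget30 - budget60; // work the cloud must absorb
        std::printf("offload %.1f ms/frame = %.0f%% of the whole frame\n",
                    offload, 100.0 * offload / budget30);
        // prints: offload 16.7 ms/frame = 50% of the whole frame
    }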

 

I'm not sure why we're even trying to use fps as a measurement at this point. Until we can compare multiplatform titles side-by-side like-for-like on each platform, it doesn't really tell us anything. 
 
Also, whoever claimed the Xbox One would be hard to develop for because of its architecture is frankly an idiot. If there is one thing Microsoft does well, it is making architecturally sound software and hardware.


#71 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 12 July 2013 - 10:29

 

Battlefield has been confirmed to be running at 1080p @ 60fps on the PS4. That sort of proves my point that the hardware is capable of it and that the cloud on the Xbox isn't the sole reason for 60fps instead of 30fps. Of course DICE could be running the PS4 version with lower effects, but multiplatform developers don't usually do that (hence why multiplatform games are usually the best indicator of system flaws). There is also a lot of talk at the moment of developers having to create their own low-level APIs on the PS4, which could perhaps be more of a problem for the smaller Sony-exclusive developers than for the might of EA/DICE.

 

I've never disputed that offloading to the cloud can give more power for more frames (or more graphics or more features). I'm not sure why you keep insisting I don't get the concept. I've only disputed your claim, based on one selectively chosen half of a single out-of-context quote given to a biased source, that it can give another 30fps (I've still not seen any developer claim it directly, nor have I even seen Microsoft claim it). Think about what you can afford to offload to the cloud as a developer of your average game, and it is the minor details that you never directly interact with, like tree sway and flying debris (anything more major than that becomes problematic). Maybe I'm grossly underestimating how much processing is wasted on these details, but it doesn't seem like enough to gain another 30fps. We'd be talking about these kinds of offloadable details taking up half of all the processing time as games stand currently, which I find hard to believe from my brief foray into game development.

 

I'm not sure why we're even trying to use fps as a measurement at this point. Until we can compare multiplatform titles side-by-side like-for-like on each platform, it doesn't really tell us anything. 
 
Also, whoever claimed the Xbox One would be hard to develop for because of its architecture is frankly an idiot. If there is one thing Microsoft does well, it is making architecturally sound software and hardware.

 

 

Thanks for the link; that is the first I have heard about Battlefield, so I will update my documentation. As for why Sony would have developers make their own low-level APIs when they already have their own: I forget the name for it, but it isn't OpenGL; it interfaces with OpenGL, but it's called something else. Is this supposed to be something that allows "close to the metal" coding?



#72 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 12 July 2013 - 12:46

Thanks for the link; that is the first I have heard about Battlefield, so I will update my documentation. As for why Sony would have developers make their own low-level APIs when they already have their own: I forget the name for it, but it isn't OpenGL; it interfaces with OpenGL, but it's called something else. Is this supposed to be something that allows "close to the metal" coding?

It's a bit up in the air, because they've never confirmed what they're using.

 

Sony have given a very misleading description of what it's running, stating that the CPU was designed around DirectX instruction sets. That makes me think they're not actually using OpenGL and have created a CPU which can understand instruction sets based around DirectX. It also makes me think that they're not actually running anything between the application and the hardware, and are encouraging developers to code on the metal, with some familiarity because it's similar to how DirectX interacts.

 

This would make optimization and development very hard. Hence why we would be seeing sub-30fps across some games.

 

PS: Sorry if I'm not accurate; I've been knocking out false info all over here, ha.



#73 BajiRav

BajiRav

    Neowinian Senior

  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 12 July 2013 - 12:54

It's a bit up in the air, because they've never confirmed what they're using.
 
Sony have given a very misleading description of what it's running, stating that the CPU was designed around DirectX instruction sets. That makes me think they're not actually using OpenGL and have created a CPU which can understand instruction sets based around DirectX. It also makes me think that they're not actually running anything between the application and the hardware, and are encouraging developers to code on the metal, with some familiarity because it's similar to how DirectX interacts.
 
This would make optimization and development very hard. Hence why we would be seeing sub-30fps across some games.
 
PS: Sorry if I'm not accurate; I've been knocking out false info all over here, ha.

An OpenGL wrapper that looks like the DX feature set (or whatever Sony called it). It's a given that they will use either OpenGL or DX; inventing their own stuff won't make it easy for anyone.
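Schematically, the difference between a wrapper and coding to the metal amounts to something like this; all names are hypothetical, since Sony hasn't published its API:

    #include <cstdint>
    #include <vector>

    // The contrast: a validating wrapper (GL/D3D style) versus the engine
    // writing GPU command packets itself.
    struct Packet { std::uint32_t op, arg; };

    struct CommandBuffer {
        std::vector<Packet> packets;
        void write(Packet p) { packets.push_back(p); } // raw, no validation
    };

    namespace wrapper {
        // GL/D3D-style entry point: the runtime checks state, tracks
        // hazards, then emits the same packet the engine could have
        // written directly.
        void draw_indexed(CommandBuffer& cb, std::uint32_t index_count) {
            // ...validation and hazard tracking would live here...
            cb.write({1u, index_count});
        }
    }

    int main() {
        CommandBuffer cb;
        wrapper::draw_indexed(cb, 36); // convenient, costs CPU per call
        cb.write({1u, 36});            // "to the metal": fast, engine owns correctness
    }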

#74 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 12 July 2013 - 13:04

An OpenGL wrapper that looks like the DX feature set (or whatever Sony called it). It's a given that they will use either OpenGL or DX; inventing their own stuff won't make it easy for anyone.

I've done a bit of digging and found the best post I've seen on the topic. It seems to be from a developer who's used both.

http://www.rage3d.co...93&postcount=10



#75 ahhell

ahhell

    Neowinian Senior

  • Joined: 30-June 03
  • Location: Winnipeg - coldest place on Earth - yeah

Posted 12 July 2013 - 13:15

I've done a bit of digging and found the best post I've seen on the topic. It seems to be from a developer who's used both.

http://www.rage3d.co...93&postcount=10

 

Wow, that was excellent.




