Microsoft Xbox One -> Info that may not be well known



You seem to be disagreeing with me agreeing with you.  :wacko: As I said, and as devs have said, and as you have now said, the cloud is going to be used for latency-insensitive operations (I have never disagreed with this). All I'm saying is that this isn't going to result in doubled frame rates. It will improve frame rates and graphics as more local power becomes available, but to get that much out of it would require offloading latency-sensitive operations. Of course, all this depends on the game in question.

 

I'm not entirely convinced by game streaming either (at least for the near future). It needs heavy server-side optimization, excellent compression/decompression at either end, and much more reliable Internet connections than most people probably have. However, the bandwidth requirements shouldn't be much more than those for streaming an HD video. My point is that by the time you can start to offload latency-sensitive operations to the cloud, you'd be in a position to just stream whole games. If anything, that would be more desirable: instead of uploading and downloading many individual snippets of game data every frame, you only upload input and download video. I'm sure when the time comes Microsoft will be all over it.

 

Just to prove that it can speed up frame rates, I am going to give you a source and a quote. This is why I also think that the next Halo will run at 60fps instead of 30, which of course is double the frame rate. Battlefield 4 was also mentioned to be 64-player, which sounds like Microsoft's form of "dedicated servers". I could be wrong on this, but they also said 60fps for the Xbox One and did not mention the PS4 (other people assumed it, but that is still an assumption; as far as I can tell, it hasn't been confirmed for the PS4).

 

Here is the source and the quote that you will want to read, which proves my point. Keep in mind that Dan Greenawalt is the head of Turn 10 Studios. Notice what "offloading" means: usually when you offload, you gain back some speed.

 

Source: http://www.oxm.co.uk/56323/turn-10-explains-forza-5s-xbox-one-cloud-processing-actual-file-transfers-are-small/

 

"Forza's known for a very solid 60. and so having more power on the box obviously, but also offloading power to the cloud allows us to do that 1080p and 60 frames at a level that most games would just be considering for 30."

 

My own analysis of losing your connection (which would rarely happen): the game would then default to what is on the disc or drive. So the graphics would scale down, or the speed would scale down. Something scales down if the Internet connection is lost.
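To make that concrete, here is a minimal sketch of the kind of graceful degradation being described. The CloudClient class and the payload fields are hypothetical illustration, not anything from an actual SDK:

class CloudClient:
    # Hypothetical stand-in for a cloud compute session.
    def __init__(self, online=True):
        self.online = online

    def fetch_extras(self):
        # Pretend server-computed detail; raises when the link is down.
        if not self.online:
            raise ConnectionError("no connection to the cloud service")
        return {"debris_particles": 500, "tree_sway": True}

def simulate_frame(cloud):
    # One frame of latency-insensitive work, degrading gracefully.
    try:
        extras = cloud.fetch_extras()  # cloud-enhanced detail
    except ConnectionError:
        extras = {"debris_particles": 50, "tree_sway": False}  # on-disc fallback
    return extras

print(simulate_frame(CloudClient(online=True)))   # full detail
print(simulate_frame(CloudClient(online=False)))  # scaled-down detail

Nothing crashes when the link drops; the world just gets a little less detailed, which is exactly the "something scales down" behavior.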

 

As for streaming HD video, you are rarely streaming 1080p for more than an hour and a half. Usually when people play games, they play much longer, and those same people are also going to be using things like Netflix. All of this adds up.

 

The higher the video quality, the more bandwidth is needed, and thus the bigger the Internet connection required. And what about frame rates of 60 frames per second? How is that going to be handled on something like this?

 

The Microsoft method is far superior: it won't take up nearly as much bandwidth, and the worlds can be a lot more dynamic as well.

 

This is what I found about the requirements before Sony bought them: 30 frames per second at 720p.

 

"Gaikai recommends an Internet connection of 5 Mbit/s or faster, and a 3 Mbit/s connection meets the minimum system requirements." 

 

I have a 6 Megabit connection, and if I used this at 1080p, nobody else in the house could do anything online.
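For some rough numbers (every bitrate below except Gaikai's quoted 5 Mbit/s recommendation is my own assumption):

connection_mbps = 6.0       # the 6 Megabit line mentioned above
gaikai_720p30_mbps = 5.0    # Gaikai's recommended rate, per the quote
stream_1080p60_mbps = 10.0  # assumed: roughly double the 720p30 rate
netflix_hd_mbps = 5.0       # assumed: a typical HD video stream

scenarios = [
    ("720p30 game stream alone", gaikai_720p30_mbps),
    ("1080p60 game stream alone", stream_1080p60_mbps),
    ("720p30 game stream + Netflix", gaikai_720p30_mbps + netflix_hd_mbps),
]
for label, needed in scenarios:
    print(f"{label}: needs {needed:.0f} Mbit/s, "
          f"headroom {connection_mbps - needed:+.0f} Mbit/s")

On those assumptions, the 6 Mbit line barely fits the 720p30 stream with nothing to spare, which is exactly the household-contention problem described above.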


Here is the source and the quote that you will want to read, which proves my point. Keep in mind that Dan Greenawalt is the head of Turn 10 Studios. Notice what "offloading" means: usually when you offload, you gain back some speed.

I know what offloading is. I have even stated you get more power available for other things as a result.

 

Source: http://www.oxm.co.uk/56323/turn-10-explains-forza-5s-xbox-one-cloud-processing-actual-file-transfers-are-small/

 

"Forza's known for a very solid 60. and so having more power on the box obviously, but also offloading power to the cloud allows us to do that 1080p and 60 frames at a level that most games would just be considering for 30."[/size]

You are really basing the idea of doubling frame rates on one fairly unspecific quote from the Turn 10 Studios head talking to the Official Xbox Magazine? I find the information provided by Respawn and Ubisoft far more telling of what the cloud can do than one remark to a (naturally) biased media outlet.

 

I'm not sure the quote even says quite what you seem to think it does:

 

Forza's known for a very solid 60. and so having more power on the box obviously, but also offloading power to the cloud allows us to do that 1080p and 60 frames at a level that most games would just be considering for 30.

It is open to interpretation I guess, but to me it says the combined power of the Xbox One's increased power and the cloud allow them to do a solid 60fps instead of 30fps. Not that the cloud specifically allows them to reach that and the Xbox would chug along at 30fps without it. The whole quote reads like it is about the difference from this-gen to next-gen.

 

I'm not going to continue arguing with you though as it seems like an exercise in futility. I'll believe the cloud can double frame rates when Forza/Halo played with an Internet connection runs at 60fps and without one it runs at 30fps.


I know what offloading is. I have even stated you get more power available for other things as a result.

 

You are really basing the idea of doubling frame rates on one fairly unspecific quote from the Turn 10 Studios head talking to the Official Xbox Magazine? I find the information provided by Respawn and Ubisoft far more telling of what the cloud can do than one remark to a (naturally) biased media outlet.

 

I'm not sure the quote even says quite what you seem to think it does:

 

It is open to interpretation I guess, but to me it says the combined power of the Xbox One's increased power and the cloud allow them to do a solid 60fps instead of 30fps. Not that the cloud specifically allows them to reach that and the Xbox would chug along at 30fps without it. The whole quote reads like it is about the difference from this-gen to next-gen.

 

I'm not going to continue arguing with you though as it seems like an exercise in futility. I'll believe the cloud can double frame rates when Forza/Halo played with an Internet connection runs at 60fps and without one it runs at 30fps.

 


 

"At a level that most games would just be considering for 30".  In other words I see it as that adding in the cloud offloads processes so that they can do things that most games couldn't do (I.E running at 60 frames per second).  

 

If you look at Halo for Xbox One, why would they do 60 frames per second for the first time? I don't see many shooters on consoles doing 60 frames per second. I know Killzone isn't 60 frames per second; they are doing 30.


Sigh.

 

"At a level that most games would just be considering for 30".  In other words I see it as that adding in the cloud offloads processes so that they can do things that most games couldn't do (I.E running at 60 frames per second).  

 

If you look at Halo for Xbox One, why would they do 60 frames per second for the first time? I don't see many shooters on consoles doing 60 frames per second. I know Killzone isn't 60 frames per second; they are doing 30.

You are picking up on the second part of the quote and not the first. The "but" joins the two parts, and the "also" makes the comment on the cloud connected to the first about the increased power of the Xbox One. Hence why it reads like a comment about Xbox One's capabilities as a whole over this/last-gen.

 

Probably because the Xbox One now has a whole boatload more graphical power? The Xbox One now has the same kind of graphics capabilities as a semi-decent PC, which have been capable of 60fps for some time. Hell, I've got a 4+ year old video card in my PC that could probably run at 60fps and wipe the floor with anything the PS3 or Xbox 360 could do. And the graphics going into the One/PS4 are much more powerful than that, so it isn't hard to see where the 60fps is coming from.


Sigh.

 

You are picking up on the second part of the quote and not the first. The "but" joins the two parts, and the "also" makes the comment on the cloud connected to the first about the increased power of the Xbox One. Hence why it reads like a comment about Xbox One's capabilities as a whole over this/last-gen.

 

Probably because the Xbox One now has a whole boatload more graphical power? The Xbox One now has the same kind of graphics capabilities as a semi-decent PC, which have been capable of 60fps for some time. Hell, I've got a 4+ year old video card in my PC that could probably run at 60fps and wipe the floor with anything the PS3 or Xbox 360 could do. And the graphics going into the One/PS4 are much more powerful than that, so it isn't hard to see where the 60fps is coming from.

You do know comparing consoles to PCs is like comparing chalk and cheese? Even though this gen is running x86.

 

Consoles have a fixed architecture on a fixed platform. Developers only have to worry about how to use all the power they have in front of them. On PCs, they have to worry about how to support all kinds of architectures with different specs and OS overheads. So even though consoles don't have the raw power of PCs, the fixed hardware and the development process around it mean they produce games which, at launch, look just as good as those on the high-end PCs of the time. On a side note, I wish I had the spare money these days for a high-end PC; I miss PC gaming :(

 

Regarding the FPS argument with the cloud, I think he's referring to both as a combined platform. The thing is, offloading AI into the cloud really does free up local power, which helps with optimizing and hitting the 60fps mark.
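As a sketch of why that frees up local time (purely illustrative; cloud_ai_plan stands in for whatever server-side job a real engine would run):

from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=1)  # stands in for the cloud link

def cloud_ai_plan(world_snapshot):
    # Stand-in for an expensive, latency-insensitive AI job run server-side.
    return [("advance", unit_id) for unit_id in world_snapshot]

pending = None
latest_plan = []  # last plan received; it may be a frame or two stale

def frame(world_snapshot):
    # The local frame only applies AI results; it never computes them.
    global pending, latest_plan
    if pending is not None and pending.done():
        latest_plan = pending.result()  # collect the cloud's answer
        pending = None
    if pending is None:
        pending = pool.submit(cloud_ai_plan, world_snapshot)
    # ...the freed-up frame budget goes to rendering instead...
    return latest_plan

print(frame([1, 2, 3]))  # first frame: no plan back yet -> []

The catch, as discussed above, is that this only works for work that tolerates being a few frames stale.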


Sigh.

 

You are picking up on the second part of the quote and not the first. The "but" joins the two parts, and the "also" makes the comment on the cloud connected to the first about the increased power of the Xbox One. Hence why it reads like a comment about Xbox One's capabilities as a whole over this/last-gen.

 

Probably because the Xbox One now has a whole boatload more graphical power? The Xbox One now has the same kind of graphics capabilities as a semi-decent PC, which have been capable of 60fps for some time. Hell, I've got a 4+ year old video card in my PC that could probably run at 60fps and wipe the floor with anything the PS3 or Xbox 360 could do. And the graphics going into the One/PS4 are much more powerful than that, so it isn't hard to see where the 60fps is coming from.

 

Yeah, I am not buying that entire PC thing at all. If that is true, then why are so many PS4 games at 30 frames per second? Killzone should be running 60 easily. Nope, not buying that.


Yeah, I am not buying that entire PC thing at all. If that is true, then why are so many PS4 games at 30 frames per second? Killzone should be running 60 easily. Nope, not buying that.

They can't make a good software stack to complement their hardware. Running a box with a modified version of FreeBSD is not a good way to go.

 

The Xbox runs 3 HyperVMs, which complement each other and are specifically designed to work exactly in harmony with the hardware the software sits on. You have hardware compression/decompression to take the compressing of cloud-bound data off the CPU. You've got memory buses to inject data from the cloud straight into RAM. Just things like that make a huge difference.


Yeah, I am not buying that entire PC thing at all. If that is true, then why are so many PS4 games at 30 frames per second? Killzone should be running 60 easily. Nope, not buying that.

WTF?

KZ running at 30FPS was a decision made by the dev. The KZ series had slower gameplay anyway. Focusing on 30fps for the first generation of games isn't that big of a deal (unfamiliar hardware). If they are still doing 30fps 3 years down the road, THAT is a problem.


WTF?

KZ running at 30FPS was a decision made by the dev. The KZ series had slower gameplay anyway. Focusing on 30fps for the first generation of games isn't that big of a deal (unfamiliar hardware). If they are still doing 30fps 3 years down the road, THAT is a problem.

Unfortunately, it's usually graphical fidelity that improves over the life span, not the fps. If devs target 30fps, it'll usually stay at that, and they'll target more fidelity in the game once they get more familiar.

 

Look at it this way: the graphics engine would have to tick over twice as fast to hit 60fps from 30, which means cutting its per-frame cost by 50%. A very big task.

 

If every game is hitting 60fps on the X1 with the same graphic fidelity, then the difference between consoles will be insane and they'll have to do something about it.


Look at it this way: the graphics engine would have to tick over twice as fast to hit 60fps from 30, which means cutting its per-frame cost by 50%. A very big task.

Speaking from personal experience, just because it is locked at 30fps doesn't necessarily mean they would have 30fps of optimizations to find. It may be capable of running at 60fps mostly, with certain situations dipping it to 50fps or 40fps. As it has to be 30fps or 60fps, you'd obviously have to go with 30fps as a lower constant frame rate is less noticeable than frame rate drops. The big question is whether Sony can find the optimizations they seem to require.
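A quick illustration of that 30-or-60 choice, with made-up frame times (on a 60 Hz display with vsync, those are the only two stable locks):

def choose_lock(frame_times_ms):
    # The lock is dictated by the worst frame, not the average.
    worst = max(frame_times_ms)
    return 60 if worst <= 1000 / 60 else 30

print(choose_lock([14.0, 15.5, 16.0]))        # every frame fits 16.7 ms -> 60
print(choose_lock([14.0, 15.5, 16.0, 22.0]))  # one 22 ms dip forces -> 30

A game that mostly runs well above 60fps but has a single situation that dips still has to ship locked at 30, exactly as described.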

 

Yeah, I am not buying that entire PC thing at all. If that is true, then why are so many PS4 games at 30 frames per second? Killzone should be running 60 easily. Nope, not buying that.

I did post a likely explanation earlier. As I understand it, from a developer standpoint both the Xbox 360 and Xbox One are quite similar. DirectX is used for graphics, etc. The Xbox has a mature and very well developed system of developer tools, so all developers have got to do is take advantage of the newly available power. With the PS4, Sony have gone from the weird and wonderful architecture of the PS3 to x86. They don't have access to age-old tools like DirectX (for obvious reasons), so have had to write their own equivalents. Developers have got to learn these and have probably had to significantly rewrite (or write from scratch) their own engines and tools as well. No idea of the actual FPS involved, but I can see this kind of situation evidenced in the early PS3 games I've played - many seem to have frame rate issues that their later equivalents do not (GTA IV > RDR, Oblivion > Skyrim).

 

No offense intended, but I think you are adding two and two and getting five. You've taken one part of an already unspecific quote from Turn 10, taken the respective frame rate differences and decided that the cloud is the reason for this, while simultaneously disregarding a whole load of far more likely scenarios. Architectural difficulties on the PS4 seem far more likely to me than the cloud being solely responsible for 60fps on the Xbox One. Given the supposed power advantage, I'd be far more surprised if the PS4 was physically incapable of 60fps (especially as none of the games shown for either console were that much of a jump visually). I'll be happy to be proved wrong when the Xbox One is released and we can see just how it all works.

 

We'll have to agree to disagree for now though and see how it ends up. I'm most interested in what the cloud can bring to gameplay (Ubisoft's line-up in particular is very impressive - was far and away the best presentation at E3 for me).


There is a difference between artificially locking a game's frame rate to 30fps and leaving it unlocked. Some games want to push a slower type of gameplay mechanic; if that's what the new KZ is going for, then limiting it to 30 frames targets that goal. Now, if a game that's supposed to run faster can't because of poor coding at the time, that's a whole different issue, and optimization is key to fixing it.

 

Case in point: I've been playing BioShock Infinite the past few days, and I'm pretty sure I saw an option in there that would lock the frame rate to a fixed number. I turned it off because I wanted the quicker movement, but I do believe it was on by default? Anyways, this and things like v-sync aren't anything new. Some games don't need 60fps, some do. It's up to the developer really.
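The kind of cap such an option toggles is easy to sketch (a generic limiter, not BioShock's actual code):

import time

def run_capped(update, target_fps=30, frames=5):
    # Sleep away whatever is left of each frame's time slice.
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        update()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # remove this cap for uncapped play

run_capped(lambda: None)  # ticks at roughly 33.3 ms per frame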


They can't make a good software stack to complement their hardware. Running a box with a modified version of FreeBSD is not a good way to go.

 

The Xbox runs 3 HyperVMs, which complement each other and are specifically designed to work exactly in harmony with the hardware the software sits on. You have hardware compression/decompression to take the compressing of cloud-bound data off the CPU. You've got memory buses to inject data from the cloud straight into RAM. Just things like that make a huge difference.

2 VMs and 1 hypervisor that hosts them.


2 VMs and 1 hypervisor that hosts them.

 

And one VM only runs when you're playing an Xbox game. I don't think the dashboard runs on the Xbox OS anymore like it did on the 360, so we can finally switch over to the full dashboard without having to quit a game or app. Thank god.


Speaking from personal experience, just because it is locked at 30fps doesn't necessarily mean they would have 30fps of optimizations to find. It may be capable of running at 60fps mostly, with certain situations dipping it to 50fps or 40fps. As it has to be 30fps or 60fps, you'd obviously have to go with 30fps as a lower constant frame rate is less noticeable than frame rate drops. The big question is whether Sony can find the optimizations they seem to require.

 

I did post a likely explanation earlier. As I understand it, from a developer standpoint both the Xbox 360 and Xbox One are quite similar. DirectX is used for graphics, etc. The Xbox has a mature and very well developed system of developer tools, so all developers have got to do is take advantage of the newly available power. With the PS4, Sony have gone from the weird and wonderful architecture of the PS3 to x86. They don't have access to age-old tools like DirectX (for obvious reasons), so have had to write their own equivalents. Developers have got to learn these and have probably had to significantly rewrite (or write from scratch) their own engines and tools as well. No idea of the actual FPS involved, but I can see this kind of situation evidenced in the early PS3 games I've played - many seem to have frame rate issues that their later equivalents do not (GTA IV > RDR, Oblivion > Skyrim).

 

No offense intended, but I think you are adding two and two and getting five. You've taken one part of an already unspecific quote from Turn 10, taken the respective frame rate differences and decided that the cloud is the reason for this, while simultaneously disregarding a whole load of far more likely scenarios. Architectural difficulties on the PS4 seem far more likely to me than the cloud being solely responsible for 60fps on the Xbox One. Given the supposed power advantage, I'd be far more surprised if the PS4 was physically incapable of 60fps (especially as none of the games shown for either console were that much of a jump visually). I'll be happy to be proved wrong when the Xbox One is released and we can see just how it all works.

 

We'll have to agree to disagree for now though and see how it ends up. I'm most interested in what the cloud can bring to gameplay (Ubisoft's line-up in particular is very impressive - was far and away the best presentation at E3 for me).

 

The x86 platform is the best known platform of them all. I don't believe there is an easier platform to work with; it's about 40 years old and has evolved over time. It has Direct3D from Microsoft, it has OpenGL, and the software stack should be very mature by now (for most OSes, including BSD). Sony has had plenty of time to optimize that software stack. I know they are using their own libraries, but I seriously doubt this is the reason. Macs run on x86, Linux runs on x86, Windows of course runs on x86, even the PS4's BSD runs on x86. So, yeah, I am not buying it. This isn't the Cell; this is x86.

 

I have asked several times how Halo 5 (or whatever it's being called) is running at 60 frames per second; we know Battlefield is as well, and then we have that statement from Turn 10. Let me say this plainly: when you take the processing load off of the console in a significant manner, how hard is it to get more frames per second? When you offload processing to a server, you gain what? Speed? It's not a hard concept. It's not 2+2=5; it's pretty common sense. Now, the gain can differ depending on what you are offloading and how much of it, but the common sense is still there.


The x86 platform is the best known platform of them all. I don't believe there is an easier platform to work with; it's about 40 years old and has evolved over time. It has Direct3D from Microsoft, it has OpenGL, and the software stack should be very mature by now (for most OSes, including BSD). Sony has had plenty of time to optimize that software stack. I know they are using their own libraries, but I seriously doubt this is the reason. Macs run on x86, Linux runs on x86, Windows of course runs on x86, even the PS4's BSD runs on x86. So, yeah, I am not buying it. This isn't the Cell; this is x86.

 

I have asked several times how Halo 5 (or whatever it's being called) is running at 60 frames per second; we know Battlefield is as well, and then we have that statement from Turn 10. Let me say this plainly: when you take the processing load off of the console in a significant manner, how hard is it to get more frames per second? When you offload processing to a server, you gain what? Speed? It's not a hard concept. It's not 2+2=5; it's pretty common sense. Now, the gain can differ depending on what you are offloading and how much of it, but the common sense is still there.

 

According to popular opinion (here and elsewhere) in order,

 

1. Microsoft doesn't care for gamers and is cheap so they used cheap DDR3 RAM in XB1 (and they hate indies)

2. Sony cares for gamers and indies and went with GDDR5

 

which led to

 

1. XB1 now has a complex SoC layout with esRAM and Move Engines to compensate for "pathetic and slow" main RAM

2. PS4 has a simple traditional x86-64 layout SoC

 

which led to

 

1. XB1 development will be complex requiring more developer time and efforts

2. PS4 has simple and less complex development cycle compared to XB1

 

which led to

 

1. XB1 dev. tools will be complex and "headache" for developers ...and indies don't forget the indies

2. PS4 has dev. friendly tools that gives "direct access to metal/hardware" = better performance and is more dev-friendly console

 

which led to

 

1. All/many* XB1 E3 demos ran at 1080p@60FPS

2. All/majority* PS4 E3 demos ran at 1080p@30FPS

 

Brilliant, isn't it?

 

 

*I don't know for sure, and those * were put there for obvious reasons. If you reached this far... congrats.

**I also didn't know where to put "XB1 was 6 months behind schedule and all launch games will suffer from the delays" in the timeline.



 

You know what's even worse? A few days before E3 there was a huge rumor, which most people on gaming forums like NeoGAF believed, that Microsoft was "paying money" to third-party developers so they would not display the PS4 versions of their games, because those looked so much better than on the XB1.

 

LOL * 10,000!

 

Look, I am not claiming that the XB1 is the best console in history, but the hate is wildly overblown. The DRM made everyone go insane, and rationality went out of the window.


And still you avoid answering my question, Yogurtmaster.

 

I can only conclude that the very first "fact" in the OP is, in fact, made-up ######.


The x86 platform is the best known platform of them all. I don't believe there is an easier platform to work with; it's about 40 years old and has evolved over time. It has Direct3D from Microsoft, it has OpenGL, and the software stack should be very mature by now (for most OSes, including BSD). Sony has had plenty of time to optimize that software stack. I know they are using their own libraries, but I seriously doubt this is the reason. Macs run on x86, Linux runs on x86, Windows of course runs on x86, even the PS4's BSD runs on x86. So, yeah, I am not buying it. This isn't the Cell; this is x86.

 

I have asked several times how Halo 5 (or whatever it's being called) is running at 60 frames per second; we know Battlefield is as well, and then we have that statement from Turn 10. Let me say this plainly: when you take the processing load off of the console in a significant manner, how hard is it to get more frames per second? When you offload processing to a server, you gain what? Speed? It's not a hard concept. It's not 2+2=5; it's pretty common sense. Now, the gain can differ depending on what you are offloading and how much of it, but the common sense is still there.

Battlefield has been confirmed to be running at 1080p @ 60fps on the PS4. Sort of proves my point that the hardware is capable of it and that the cloud on the Xbox isn't the sole reason for 60fps instead of 30fps. Of course Dice could be running the PS4 version with lower effects, but multiplatform developers don't usually do that (hence why multiplatform games are usually the best indicator of system flaws). There is also a lot of talk at the moment of developers having to create their own low-level APIs on the PS4, which could perhaps be more of a problem to the smaller Sony exclusive developers than the might of EA/Dice.

 

I've never disputed that offloading to the cloud can give more power for more frames (or more graphics or more features). I'm not sure why you keep insisting I don't get the concept. I've only disputed your claim, based on a selective one-half of a single out-of-context quote to a biased source, that it can give another 30fps (I've still not seen any developer claim it directly, nor have I even seen Microsoft claim it). Think about what you can afford to offload to the cloud as a developer of your average game and it is the minor details that you never directly interact with, like tree sway and flying debris (anything more major than that becomes problematic). Maybe I'm grossly underestimating how much processing is wasted on these details, but it doesn't seem like enough to gain another 30fps. We'd be talking about these kind of offload-able details taking up half of all the processing time as games stand currently, which I find hard to believe from my brief foray into game development.
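The arithmetic behind that doubt, for what it's worth:

frame_30_ms = 1000 / 30  # 33.3 ms budget per frame at 30fps
frame_60_ms = 1000 / 60  # 16.7 ms budget per frame at 60fps
must_free = frame_30_ms - frame_60_ms

print(f"going 30 -> 60 fps means freeing {must_free:.1f} ms "
      f"of a {frame_30_ms:.1f} ms frame ({must_free / frame_30_ms:.0%})")

Tree sway and debris would have to be eating half of every frame today for the cloud alone to account for that jump.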

 

I'm not sure why we're even trying to use fps as a measurement at this point. Until we can compare multiplatform titles side-by-side like-for-like on each platform, it doesn't really tell us anything. 
 
Also, whoever claimed the Xbox One would be hard to develop for because of its architecture is frankly an idiot. If there is one thing Microsoft does well, it is making architecturally sound software and hardware.

 

Battlefield has been confirmed to be running at 1080p @ 60fps on the PS4. Sort of proves my point that the hardware is capable of it and that the cloud on the Xbox isn't the sole reason for 60fps instead of 30fps. Of course Dice could be running the PS4 version with lower effects, but multiplatform developers don't usually do that (hence why multiplatform games are usually the best indicator of system flaws). There is also a lot of talk at the moment of developers having to create their own low-level APIs on the PS4, which could perhaps be more of a problem to the smaller Sony exclusive developers than the might of EA/Dice.

 

I've never disputed that offloading to the cloud can give more power for more frames (or more graphics or more features). I'm not sure why you keep insisting I don't get the concept. I've only disputed your claim, based on a selective one-half of a single out-of-context quote to a biased source, that it can give another 30fps (I've still not seen any developer claim it directly, nor have I even seen Microsoft claim it). Think about what you can afford to offload to the cloud as a developer of your average game and it is the minor details that you never directly interact with, like tree sway and flying debris (anything more major than that becomes problematic). Maybe I'm grossly underestimating how much processing is wasted on these details, but it doesn't seem like enough to gain another 30fps. We'd be talking about these kind of offload-able details taking up half of all the processing time as games stand currently, which I find hard to believe from my brief foray into game development.

 

I'm not sure why we're even trying to use fps as a measurement at this point. Until we can compare multiplatform titles side-by-side like-for-like on each platform, it doesn't really tell us anything. 
 
Also, whoever claimed the Xbox One would be hard to develop for because of its architecture is frankly an idiot. If there is one thing Microsoft does well, it is making architecturally sound software and hardware.

 

 

Thanks for the link; that is the first I have heard about Battlefield, and I will update my documentation. As for why Sony would have developers make their own low-level APIs when they have their own: I forget the name for it, but it isn't OpenGL. It interfaces with OpenGL, but it's called something else. Is this supposed to be something that allows "close to the metal" coding?


Thanks for the link; that is the first I have heard about Battlefield, and I will update my documentation. As for why Sony would have developers make their own low-level APIs when they have their own: I forget the name for it, but it isn't OpenGL. It interfaces with OpenGL, but it's called something else. Is this supposed to be something that allows "close to the metal" coding?

It's a bit up in the air, because they've never confirmed what they're using.

 

Sony have given a very misleading description of what it's running, stating that the CPU was designed around DirectX instruction sets. This makes me think they're not actually using OpenGL and have created a CPU which can understand instruction sets based around DirectX. It also makes me think they're not running anything between the application and the hardware, and are encouraging developers to code on the metal with some familiarity, due to its similarity to how DirectX interacts.

 

This would make for very hard optimization and development, hence why we would be seeing sub-30fps across some games.

 

PS: Sorry if I'm not accurate, I've been knocking out false info all over here ha.


It's a bit up in the air, because they've never confirmed what they're using.

 

Sony have given a very misleading description of what it's running, stating that the CPU was designed around DirectX instruction sets. This makes me think they're not actually using OpenGL and have created a CPU which can understand instruction sets based around DirectX. It also makes me think they're not running anything between the application and the hardware, and are encouraging developers to code on the metal with some familiarity, due to its similarity to how DirectX interacts.

 

This would make for very hard optimization and development, hence why we would be seeing sub-30fps across some games.

 

PS: Sorry if I'm not accurate, I've been knocking out false info all over here ha.

An OpenGL wrapper that looks like the DX feature set (or whatever Sony called it). It is a given that they will be using either OpenGL or DX; inventing their own stuff won't make it easy for anyone.

An OpenGL wrapper that looks like the DX feature set (or whatever Sony called it). It is a given that they will be using either OpenGL or DX; inventing their own stuff won't make it easy for anyone.

Done a bit of digging and found the best post I've seen on the topic. Seems to be from a developer who's used both.

http://www.rage3d.com/board/showpost.php?s=69860d72dcd8dcbd914bf6f44b16b5e2&p=1337271493&postcount=10

