Microsoft Xbox One -> Info that may not be well known



Good post.
 

Many people dismissing "the cloud's" abilities are going to be really surprised over the next five years as this evolves; many games that were expected to run at only 30 frames per second will run at 60. That is one thing Microsoft has over Sony at the moment. This is why I believe Halo for Xbox One will be 1080p and 60 frames per second: the processing is offloaded to the server, which allows the game to run at a faster speed. If Microsoft can get third parties on board with this, Sony could have a hard time with the same game running at 30 frames per second.

I still find these kinds of claims about the cloud usage on the Xbox One extremely dubious. I'm not even sure Microsoft has claimed them directly. Do you have a source with detailed information on just what can be offloaded and how/why it would increase frame rates so much?

 

Maybe I am misunderstanding something, but the reason I find them dubious is because it doesn't make any sense when talking of things like doubling frame rates (it does make sense with dedicated server multiplayer and persistent worlds). Let's say you start offloading all your physics calculations to get more processing for "more frame rates". First you are assuming the user has Internet, but let's say they do (it's the most likely scenario). You fire the gun, and the game asks the cloud to calculate the trajectory. It sends the data to the cloud and awaits a response. At a frame rate of 60 fps, the response would need to be returned within 16ms (1 sec / 60 frames) to be used for the next frame. You could skip a frame, but it would still need to be back within 32ms. So the first problem is you need very good Internet to get the data back quickly enough. In some areas this kind of response time may be possible (to be honest I have no idea). What about those who don't have it? Or is the game fibre only? Anyway, let's assume the user's Internet is super awesome.
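
As a rough sketch of the frame-budget arithmetic behind this argument (the latency and compute numbers below are assumptions for illustration, not measurements of any real network or server):

```python
# Illustrative frame-budget arithmetic; all latency figures are assumptions.

def frame_budget_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

print(frame_budget_ms(60))  # ~16.7 ms at 60 fps
print(frame_budget_ms(30))  # ~33.3 ms at 30 fps

# A hypothetical round trip to a nearby data centre:
# client -> server latency, server compute, server -> client latency.
one_way_latency_ms = 20.0   # assumption; varies wildly by ISP and distance
server_compute_ms = 2.0     # assumption
round_trip_ms = 2 * one_way_latency_ms + server_compute_ms

print(round_trip_ms)                         # 42 ms in this example
print(round_trip_ms <= frame_budget_ms(60))  # False: misses the 16.7 ms budget
print(round_trip_ms <= frame_budget_ms(30))  # False: misses even the skipped-frame 33 ms budget
```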

 

Oh no! Your wife/girlfriend/sister/mother/aunt/dog/cat starts watching a YouTube video and it is taking longer for the data to travel. And/or the network is struggling to handle the sudden increase in traffic. So the game, expecting the data back within 16/32ms, is instead waiting much longer. Now what does it do? Calculate it itself? Why would the game go to all this effort, sending data across the Internet to get a result it had the power to calculate itself in a fraction of the time? Or do the graphics/framerate randomly decrease while it does the calculations itself? Or does it just keep on waiting for a response, leaving your bullet in limbo? And what happens if, while waiting for the response, a tank drives in front of your perfectly aligned sniper shot? It would need to ask the cloud for a new calculation, further increasing the time it needs to wait before it actually knows where the bullet is going. The way I understand it, at the very least you get all the current problems of multiplayer gaming (lag, hit detection, etc.) and more (AFAIK existing multiplayer games do only what's necessary to keep things synced).
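
One plausible way a developer might handle a late or missing cloud response is a deadline with a local fallback. This is only a sketch of the dilemma being described above (the function names and timings are invented), not how any actual engine does it:

```python
import concurrent.futures
import random
import time

pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def cloud_physics(data):
    """Stand-in for a cloud call; the random sleep simulates variable network latency."""
    time.sleep(random.uniform(0.005, 0.100))  # 5-100 ms round trip, purely illustrative
    return f"cloud result for {data}"

def local_physics(data):
    """Cheaper local approximation the console can always finish within the frame."""
    return f"local result for {data}"

def resolve(data, budget_s=0.016):
    """Ask the cloud, but fall back to the local answer if the reply misses the frame budget."""
    future = pool.submit(cloud_physics, data)
    try:
        return future.result(timeout=budget_s)
    except concurrent.futures.TimeoutError:
        return local_physics(data)  # graceful degradation instead of leaving the bullet in limbo

print(resolve("bullet trajectory"))
```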

 

As I said, I find these kinds of claims about the cloud extremely dubious, and I haven't seen them substantiated either.

 

The reason Sony are struggling to get 30fps is likely down to software. Microsoft came from a PC world, and their consoles are based on the same age-old DirectX APIs to my knowledge. Likewise, Xbox devs will be fully familiar with it. Meanwhile this is Sony's first foray into an x86 console, and the PS3 was on an entirely different weird and wonderful architecture. The API may have bugs in it, or may simply need to mature. Likewise, the developers are probably ironing out issues from going from PS3 to PS4 in their own software. I'd be surprised if it wasn't fixed by the time it launches. It may very well have been fixed by the time E3 was being shown - I doubt either company brought the most "up-to-date" hardware and software.


Yeah, I see what you're talking about and I agree. Thing is, though, we're talking dynamically changing worlds, so things such as enemy positions, mission structure, item locations and day/night cycles won't work without an internet connection. Obviously this is a requirement if it's an MMO like Destiny, but I'm thinking that open world games like MGSV would miss out on a ton of content if you don't have an internet connection. This is where my concerns lie.

I always play single player games offline so no one bothers me while I play them (Skyrim, for example). I like having the choice, and I don't want to have most of the game cut off because of the choice I make.


Single player works offline like it does on Xbox 360/PS3/PS4, but if you are online, then things like Drivatars in Forza 5 are possible. If you don't have a connection, opponents are controlled by traditional AI, but if you are online, opponents are "Drivatars".

Is it clear now?


Good post.

 

I still find these kinds of claims about the cloud usage on the Xbox One extremely dubious. I'm not even sure Microsoft has claimed them directly. Do you have a source with detailed information on just what can be offloaded and how/why it would increase frame rates so much?

 

Maybe I am misunderstanding something, but the reason I find them dubious is because it doesn't make any sense when talking of things like doubling frame rates (it does make sense with dedicated server multiplayer and persistent worlds). Let's say you start offloading all your physics calculations to get more processing for "more frame rates". First you are assuming the user has Internet, but let's say they do (it's the most likely scenario). You fire the gun, and the game asks the cloud to calculate the trajectory. It sends the data to the cloud and awaits a response. At a frame rate of 60 fps, the response would need to be returned within 16ms (1 sec / 60 frames) to be used for the next frame. You could skip a frame, but it would still need to be back within 32ms. So the first problem is you need very good Internet to get the data back quickly enough. In some areas this kind of response time may be possible (to be honest I have no idea). What about those who don't have it? Or is the game fibre only? Anyway, let's assume the user's Internet is super awesome.

 

Oh no! Your wife/girlfriend/sister/mother/aunt/dog/cat starts watching a YouTube video and it is taking longer for the data to travel. And/or the network is struggling to handle the sudden increase in traffic. So the game, expecting the data back within 16/32ms, is instead waiting much longer. Now what does it do? Calculate it itself? Why would the game go to all this effort, sending data across the Internet to get a result it had the power to calculate itself in a fraction of the time? Or do the graphics/framerate randomly decrease while it does the calculations itself? Or does it just keep on waiting for a response, leaving your bullet in limbo? And what happens if, while waiting for the response, a tank drives in front of your perfectly aligned sniper shot? It would need to ask the cloud for a new calculation, further increasing the time it needs to wait before it actually knows where the bullet is going. The way I understand it, at the very least you get all the current problems of multiplayer gaming (lag, hit detection, etc.) and more (AFAIK existing multiplayer games do only what's necessary to keep things synced).

 

As I said, I find these kinds of claims about the cloud extremely dubious, and I haven't seen them substantiated either.

 

The reason Sony are struggling to get 30fps is likely down to software. Microsoft came from a PC world, and their consoles are based on the same age-old DirectX APIs to my knowledge. Likewise, Xbox devs will be fully familiar with it. Meanwhile this is Sony's first foray into an x86 console, and the PS3 was on an entirely different weird and wonderful architecture. The API may have bugs in it, or may simply need to mature. Likewise, the developers are probably ironing out issues from going from PS3 to PS4 in their own software. I'd be surprised if it wasn't fixed by the time it launches. It may very well have been fixed by the time E3 was being shown - I doubt either company brought the most "up-to-date" hardware and software.

Right on the money. The way it was described in the OP is quite stupid.

 

There are quite a lot of elements in games which aren't latency sensitive and could be sent out to the cloud, but it's probably not going to give an FPS boost like the one mentioned in the OP. I think we're going to see more gameplay elements than graphical ones.


I also fail to see how the cloud is going to make things SOOOOO much better.  For cases like you mentioned, if somebody wants to watch Netflix or YouTube, what happens with the cloud calculations?  How can a single player game be better with internet for use with calculations?  Ever play an MMO or FPS when somebody else on the network is watching a movie from Netflix or YouTube?

 

Also, I do not trust that family sharing feature. Okay, so now what needs to be done to share a 360 game? Lend them the disc, right? How many people can play your game at once? One, right? They are really fighting for a feature where not one but TWO people can play at the same time? I fail to see how game companies will allow this to happen. Forget the supposed "10 member" limit; just having the ability to have two people playing at the same time does not seem right to me. I thought there were all of these horrible stories about used game sales, and MS made it EVEN EASIER to do this? If people do not have to pay to play, they will not. Even if it is a game I am DYING to play, I would rather split it with a friend and we can both play single player at the same time. If I did not have that option, yes, I would have to spend the full $60.

 

Can anybody explain how game companies would allow this? I am not a cheapskate (I just spent $150 on PSN in the last 3 days), but if I could split the cost with a friend and still have the full single player game available, that is a benefit. Not to mention you can share it with 9 other people too. How would this feature be allowed without some restrictions? If they do not allow this, they would get $120 from us instead of $60. I do not buy used.

 

Oh you know what will happen then?  No more single player.  Or at least single player with 10% of the usual budget that would only take you 3 hours to beat.  THANKS!  I do not care about multi-player.  I enjoy the single player.  But if sales are divided by 10, you know where the budget will go right?


My argument is valid: you still need an internet connection to play your games properly. That shows the Xbox One is not all up to scratch. Why should certain features (aside from online multiplayer) be disabled if people don't have an internet connection?

 

Think of Forza 5. When you're online, you can use a simulation of other real players as the AI in single player. However, when you're offline, I guess you're stuck with a stored copy of those simulations or the same old local AI that we have today. This is a SINGLE PLAYER feature that would be disabled in offline mode, and that's because the devs chose to do it that way, not because it's an Xbox One game...
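
A minimal sketch of the kind of feature toggle being described here (the names and the cached-profile idea are hypothetical, not Forza's actual code):

```python
def choose_opponent_ai(online: bool, cached_profiles=None):
    """Pick an AI source the way the post describes: cloud-trained profiles when online,
    a stored copy or traditional AI otherwise. Purely illustrative."""
    if online:
        return "cloud-trained driver profiles"    # the 'Drivatar'-style option
    if cached_profiles:
        return "last downloaded driver profiles"  # stored copy, as speculated above
    return "traditional local AI"                 # same single-player fallback as today

print(choose_opponent_ai(online=True))
print(choose_opponent_ai(online=False, cached_profiles=["ghost1"]))
print(choose_opponent_ai(online=False))
```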

 

Yeah, I see what you're talking about and I agree. Thing is, though, we're talking dynamically changing worlds, so things such as enemy positions, mission structure, item locations and day/night cycles won't work without an internet connection. Obviously this is a requirement if it's an MMO like Destiny, but I'm thinking that open world games like MGSV would miss out on a ton of content if you don't have an internet connection. This is where my concerns lie.

I always play single player games offline so no one bothers me while I play them (Skyrim, for example). I like having the choice, and I don't want to have most of the game cut off because of the choice I make.


 

Some games may require an internet connection; others will simply prefer it.

 

This (what I highlighted) is what I meant by disabling certain features. Dynamic worlds don't have to require an internet connection, but the games could be better with internet.

 

For example: your game has a dynamic world (enemies moving, etc.) that is offloaded to the cloud when you have internet. This frees up resources locally, so you could get, let's say, better graphics. When you're disconnected, the graphics get a little worse and the dynamic world is processed locally with, for example, fewer people in the city (a limited dynamic world).

 

How a game uses the cloud to improve the gameplay, and if and how it works without internet, is a decision for the developers. That's why I said your argument wasn't valid: you pretty much said that the Xbox One cloud offering (for devs) would limit the games to requiring internet, which is wrong. It's the devs' choice how they handle all scenarios and whether or not the game works offline. :-)


Good post.

 

I still find these kinds of claims about the cloud usage on the Xbox One extremely dubious. I'm not even sure Microsoft has claimed them directly. Do you have a source with detailed information on just what can be offloaded and how/why it would increase frame rates so much?

 

Maybe I am misunderstanding something, but the reason I find them dubious is because it doesn't make any sense when talking of things like doubling frame rates (it does make sense with dedicated server multiplayer and persistent worlds). Let's say you start offloading all your physics calculations to get more processing for "more frame rates". First you are assuming the user has Internet, but let's say they do (it's the most likely scenario). You fire the gun, and the game asks the cloud to calculate the trajectory. It sends the data to the cloud and awaits a response. At a frame rate of 60 fps, the response would need to be returned within 16ms (1 sec / 60 frames) to be used for the next frame. You could skip a frame, but it would still need to be back within 32ms. So the first problem is you need very good Internet to get the data back quickly enough. In some areas this kind of response time may be possible (to be honest I have no idea). What about those who don't have it? Or is the game fibre only? Anyway, let's assume the user's Internet is super awesome.

 

Oh no! Your wife/girlfriend/sister/mother/aunt/dog/cat starts watching a YouTube video and it is taking longer for the data to travel. And/or the network is struggling to handle the sudden increase in traffic. So the game, expecting the data back within 16/32ms, is instead waiting much longer. Now what does it do? Calculate it itself? Why would the game go to all this effort, sending data across the Internet to get a result it had the power to calculate itself in a fraction of the time? Or do the graphics/framerate randomly decrease while it does the calculations itself? Or does it just keep on waiting for a response, leaving your bullet in limbo? And what happens if, while waiting for the response, a tank drives in front of your perfectly aligned sniper shot? It would need to ask the cloud for a new calculation, further increasing the time it needs to wait before it actually knows where the bullet is going. The way I understand it, at the very least you get all the current problems of multiplayer gaming (lag, hit detection, etc.) and more (AFAIK existing multiplayer games do only what's necessary to keep things synced).

 

As I said, I find these kinds of claims about the cloud extremely dubious, and I haven't seen them substantiated either.

 

The reason Sony are struggling to get 30fps is likely down to software. Microsoft came from a PC world, and their consoles are based on the same age-old DirectX APIs to my knowledge. Likewise, Xbox devs will be fully familiar with it. Meanwhile this is Sony's first foray into an x86 console, and the PS3 was on an entirely different weird and wonderful architecture. The API may have bugs in it, or may simply need to mature. Likewise, the developers are probably ironing out issues from going from PS3 to PS4 in their own software. I'd be surprised if it wasn't fixed by the time it launches. It may very well have been fixed by the time E3 was being shown - I doubt either company brought the most "up-to-date" hardware and software.

 

There is sourced information from the Xbox team in the document talking about this. The fallback is up to the developers to put in. For example, the developers might default to the graphics on the disc, or in the game itself, if the Internet connection fails.

 

The good news is that a lot of work has been put in so that an Internet connection with low bandwidth can still be dealt with. You don't need that much bandwidth because data can be compressed and decompressed on the fly by hardware in the background; that, along with buffering, injecting data directly into the GPU's own memory, and a very small number of hops, should help.

 

In theory, about a 1.5 megabit connection should be all that is needed. If you are using 700k DSL, see if the cable company in your area is higher quality; if you are using AT&T U-verse, which is going to be slower, check out cable in your area too. Cable is usually higher quality than AT&T U-verse or DSL.

 

I live in a rural area in Northern California. I currently have a 6 megabit connection, but Comcast can give me up to 25 megabits. My town is small; you can walk it in one hour one way, two hours both ways (3 miles for the entire town).

 

    Comcast Cable = 25 Megabits for me

    DSL (normally gets up to 6 Megabits in the city with the best of phone lines) = 100k for me (Dial-up is around 50k)

 

    My ONLY option is Cable to be honest.  DSL is more like dial-up here. 

 

NOTE: When they say they are offloading processes, they are taking a look at the game and finding out what code has to run on the local Xbox One (collision, for example, is latency sensitive and can ONLY run on the console itself, but other things that don't deal with collision can run on a server).

 

So you examine a game: where there is latency-sensitive code, you run it on the Xbox One console itself, and where you could invest in server parts, like lighting and such, you run that on the server. This frees up the console to do other things, or can free up the game to run at a faster speed. You are augmenting the console with server help, as the document says. So you split the game up between local and server so the load is balanced: anything that is sensitive to latency you put on the Xbox One box, and everything else that can run on a server you put on a server. They put hardware in the box to help this (the Z77 Move engines compress/decompress data on the fly in the background using dedicated hardware, which means it doesn't take any processing away from the game), and with the low number of hops, that helps with latency even more. Even the new controller has less latency to help deal with all of this; it all comes together.
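
A toy sketch of the "split each frame's work between console and server" idea described above. The task list and timings are invented purely for illustration, and it assumes the offloaded results actually arrive in time; real engines obviously don't schedule work with a dictionary like this:

```python
# Hypothetical per-frame workload, in milliseconds of local processing time.
# (duration_ms, latency_sensitive) -- latency-sensitive work must stay on the console.
tasks = {
    "input + collision": (4.0, True),
    "rendering":         (9.0, True),
    "audio mixing":      (1.0, True),
    "ambient AI":        (2.0, False),   # latency tolerant -> candidate for the cloud
    "global lighting":   (3.0, False),
    "world simulation":  (2.5, False),
}

all_local = sum(ms for ms, _ in tasks.values())
local_after_offload = sum(ms for ms, sensitive in tasks.values() if sensitive)

print(f"everything local:      {all_local:.1f} ms/frame (~{1000 / all_local:.0f} fps)")
print(f"insensitive offloaded: {local_after_offload:.1f} ms/frame (~{1000 / local_after_offload:.0f} fps)")
```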


Forgive me if this is a completely stupid idea - my knowledge about online game infrastructure is rather limited.

 

In regards to increased graphics prettiness when using the cloud, would something along the lines of Gaikai/OnLive not be possible, where video elements are streamed into the game world, pre-computed by servers?

 

What I think could happen is that a game engine like CryEngine would live on those servers and pre-render various elements and apply physics calculations as necessary and would stream the resulting video to the console.

This would be for things like very distant scenery, extra effects, and permutations on object models - like the appearance of a destroyed vehicle after the smoke has cleared,  (i.e. effects where latency is not the biggest issue)

 

This in my mind would let the console work on making things in the area immediately around the player prettier while maintaining a pretty larger environment.  


Forgive me if this is a completely stupid idea - my knowledge about online game infrastructure is rather limited.

 

In regards to increased graphics prettiness when using the cloud, would something along the lines of Gaikai/OnLive not be possible, where video elements are streamed into the game world, pre-computed by servers?

 

What I think could happen is that a game engine like CryEngine would live on those servers and pre-render various elements and apply physics calculations as necessary and would stream the resulting video to the console.

This would be for things like very distant scenery, extra effects, and permutations on object models - like the appearance of a destroyed vehicle after the smoke has cleared,  (i.e. effects where latency is not the biggest issue)

 

This in my mind would let the console work on making things in the area immediately around the player prettier while maintaining a pretty larger environment.  

 

Here is my understanding of what Sony is doing. Sony bought Gaikai, and they let the servers play the entire game instead of playing it on the console locally. Because of this, it requires a very fast codec to encode the game as video, and as it streams down to the PS4, the PS4 has dedicated hardware to decode the video and display it on the screen. The program on the PS4 then takes your commands from the game controller and sends them back to the server so that the position or action of the player can be played through on the server. This means there is very little room for errors or latency, and the bandwidth is going to be huge because it has to stream at 720p or 1080p; the higher the pixel count and the longer you play, the more bandwidth you will use up.

 

So to make it simple, Sony is playing the entire game on a server located near you and streaming it down as a movie to your PS4: all of the graphics and everything are completely rendered on the server and just streamed down to your PS4. You move your thumbstick, that movement is sent to the server really quickly, it is played out on the server, and the result is streamed back down.
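
Some rough arithmetic on what full game streaming costs in bandwidth, since that is the crux of the comparison here. The bitrates are ballpark figures I'm assuming for real-time-encoded H.264-class video, not Gaikai's actual numbers:

```python
# Assumed sustained video bitrates (Mbps) for a streamed game; illustrative only.
bitrate_mbps = {"720p30": 3.0, "720p60": 5.0, "1080p30": 6.0, "1080p60": 10.0}

for quality, mbps in bitrate_mbps.items():
    gb_per_hour = mbps / 8 * 3600 / 1024  # Mbps -> MB/s -> MB per hour -> GB per hour
    print(f"{quality}: ~{mbps} Mbps sustained, ~{gb_per_hour:.1f} GB per hour of play")
```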

 

So, the positives for Sony are....

 

    1) The game graphics can evolve over time.  So as they spend money, they can add in more graphics hardware.

    2) The games can be played on almost any platform, as long as it has a hardware decoder chip or hardware fast enough to decode the video in real time.

 

The negatives for Sony are that....

 

    1) They need a lot more servers because the games are going to use more resources.  Do they have the money for this even with PS+?  They have to scale well and have to be present all over the world. 

    2) The platforms need to have game pad controls because not every platform has the power or game pad to be useful.

    3) Because of the latency, the worlds can't evolve and can't be as dynamic as Microsoft's in real-time.

    4) This method is going to require a ton more bandwidth over time for 1080p. You will need a 3 megabit connection at the very least, and the higher the resolution, the more bandwidth you are going to need.

 

   So, yes I believe what you said can be done, but it requires massive hardware on the server front and the worlds probably won't be as dynamic as what Microsoft is pushing.  With Microsoft they are splitting the load between the local box and the servers.  Sony will be putting all of the load on servers, even with virtual machines that is going to take a hefty toll on the servers if you have so many playing the PS4 like in Los Angeles or New York where there are huge populations of people playing your games. 

 

Also, bandwidth is going to be a high requirement for individual users, because if they want 1080p, a lot of bandwidth is going to be needed.


There is sourced information from the Xbox team in the document talking about this. The fallback is up to the developers to put in. For example, the developers might default to the graphics on the disc, or in the game itself, if the Internet connection fails.

What document? Where has Microsoft claimed it will double frame rates, or even improve them that significantly?

The fallback is up to the developer, but what fallback is there available? If the console is capable of doing the calculation in the first place, why would you not just do it on the console? And how much of a sacrifice to graphics, drop in framerate, etc would users put up with when their latency takes a dive? "Oh dear, your network is overloaded have Xbox 360 graphics and play at 15fps instead?" (Extreme example, but you get the point.)

 

In theory, about a 1.5 megabit connection should be all that is needed. If you are using 700k DSL, see if the cable company in your area is higher quality; if you are using AT&T U-verse, which is going to be slower, check out cable in your area too. Cable is usually higher quality than AT&T U-verse or DSL.

1.5Mbps is never going to give the kind of latency required for processing the time sensitive data that would double frame rates, as you claim.

 

So you examine a game: where there is latency-sensitive code, you run it on the Xbox One console itself, and where you could invest in server parts, like lighting and such, you run that on the server. This frees up the console to do other things, or can free up the game to run at a faster speed. You are augmenting the console with server help, as the document says. So you split the game up between local and server so the load is balanced: anything that is sensitive to latency you put on the Xbox One box, and everything else that can run on a server you put on a server. They put hardware in the box to help this (the Z77 Move engines compress/decompress data on the fly in the background using dedicated hardware, which means it doesn't take any processing away from the game), and with the low number of hops, that helps with latency even more. Even the new controller has less latency to help deal with all of this; it all comes together.

That kind of offloading isn't going to double frame rates. The only things that could double frame rates would be all the latency sensitive operations such as physics, graphics processing, sound and (some) AI. I'm not saying the cloud won't help at all, but you are limited on what can be offloaded, which does limit how much can be gained. I think you are grossly under-estimating how many things are time sensitive in your average game.
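
The "limited on what can be offloaded" point can be put in Amdahl's-law terms: even with an infinitely fast cloud, the frame time can only shrink to the part that must stay on the console. The offloadable fractions below are made up purely to show the shape of the curve:

```python
def max_speedup(offloadable_fraction: float) -> float:
    """Amdahl's law: best-case speedup when the offloaded work costs nothing locally."""
    return 1.0 / (1.0 - offloadable_fraction)

for frac in (0.1, 0.25, 0.5, 0.75):
    print(f"offload {frac:.0%} of each frame's work -> at best {max_speedup(frac):.2f}x frame rate")

# Doubling the frame rate needs at least half of every frame's work to be offloadable
# AND returned in time, which is exactly what the latency argument above disputes.
```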

 

  1) They need a lot more servers because the games are going to use more resources.  Do they have the money for this even with PS+?  They have to scale well and have to be present all over the world. 

    2) The platforms need to have game pad controls because not every platform has the power or game pad to be useful.

    3) Because of the latency, the worlds can't evolve and can't be as dynamic as Microsoft's in real-time.

    4) This method is going to require a ton more bandwidth over time for 1080p.  You will need at least a 3 Megabit connection at the very least.  The higher the resolution, the more bandwidth you are going to need.

1) True. I believe Sony's plan with Gaikai for now is to use it for getting PS3 games on the PS4 though. Still requires a heck of a lot of power and would need good server optimization to get as many users per server as possible.

2) Eh?

3) Yes they can. If anything they could be more dynamic. It would be easier to implement as you wouldn't need to worry about a fallback situation. The whole game is in sync with itself, only bit you need to worry about is the console to server input and the speed the video is streamed back. By the time the Internet is capable of doubling frame rates by doing time sensitive calculations, it would be preferable to put everything on the server itself.

4) If you can double frame rates on 1.5Mbps, you can stream video.

The cloud is great and can do a lot of things. It can even free up resources for the time sensitive calculations, as you say. And that is probably enough to make up the graphics power difference with the PS4 on exclusives, but it won't result in outright better graphics or double frame rates at the same graphics level.


It's amazing how much of the info in the leaked PDF turned out to be true including these two:

- PS4 will be < $399 (so Microsoft guessed correctly when they decided to go with $499 target)

- Cloud support (i.e. it was not a desperate or last minute counter measure for PS4's beefier GPU and/or faster RAM as many around here like to think)


What document? Where has Microsoft claimed it will double frame rates, or even improve them that significantly?

The fallback is up to the developer, but what fallback is there available? If the console is capable of doing the calculation in the first place, why would you not just do it on the console? And how much of a sacrifice to graphics, drop in framerate, etc would users put up with when their latency takes a dive? "Oh dear, your network is overloaded have Xbox 360 graphics and play at 15fps instead?" (Extreme example, but you get the point.)

 

1.5Mbps is never going to give the kind of latency required for processing the time sensitive data that would double frame rates, as you claim.

 

That kind of offloading isn't going to double frame rates. The only things that could double frame rates would be all the latency sensitive operations such as physics, graphics processing, sound and (some) AI. I'm not saying the cloud won't help at all, but you are limited on what can be offloaded, which does limit how much can be gained. I think you are grossly under-estimating how many things are time sensitive in your average game.

 

1) True. I believe Sony's plan with Gaikai for now is to use it for getting PS3 games on the PS4 though. Still requires a heck of a lot of power and would need good server optimization to get as many users per server as possible.

2) Eh?

3) Yes they can. If anything they could be more dynamic. It would be easier to implement as you wouldn't need to worry about a fallback situation. The whole game is in sync with itself, only bit you need to worry about is the console to server input and the speed the video is streamed back. By the time the Internet is capable of doubling frame rates by doing time sensitive calculations, it would be preferable to put everything on the server itself.

4) If you can double frame rates on 1.5Mbps, you can stream video.

The cloud is great and can do a lot of things. It can even free up resources for the time sensitive calculations, as you say. And that is probably enough to make up the graphics power difference with the PS4 on exclusives, but it won't result in outright better graphics or double frame rates at the same graphics level.

Dude, bandwidth <> latency. 

 

Do not ever compare bandwidth to latency; they're completely different. You're never going to see cloud computations which directly affect frame rate, because the cloud isn't designed to assist with per-frame rendering; it's for calculations which aren't done every frame, and other logistics. You're just going to see prettier games because the Xbox doesn't have to handle some things itself. For example, Forza 5 looks stunning. The AI is done up in the cloud.

Game data takes hardly any bandwidth compared to video streams, especially given that game data is compressed/decompressed as well. It won't be using much, probably around 40-50 kbps, which a 1 Mbps connection could easily handle even with other people accessing the internet at the same time.
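
A quick sanity check of that 40-50 kbps figure, using an invented payload size just to show the arithmetic:

```python
# Assume each cloud update carries ~500 bytes of compressed game state (a made-up figure)
# and latency-tolerant systems exchange 10 updates per second.
payload_bytes = 500
updates_per_second = 10

kbps = payload_bytes * 8 * updates_per_second / 1000
print(f"~{kbps:.0f} kbps")                             # ~40 kbps, in the ballpark mentioned above
print(f"share of a 1 Mbps line: {kbps / 1000:.0%}")    # only a few percent of the link
```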


Single player works offline like it does on Xbox 360/PS3/PS4, but if you are online, then things like Drivatars in Forza 5 are possible. If you don't have a connection, opponents are controlled by traditional AI, but if you are online, opponents are "Drivatars".

Is it clear now?

 

No, because the advantage of Drivatars has yet to be proven. It is still a buzzword until people try it. When the guy demoed Drivatars, we saw an AI-controlled car initiate contact with another car for no reason. That's not good AI as far as I'm concerned.

 

95% of people playing games are bad at them. Taking the data of those bad players to create a powerful AI is kind of a stupid idea to me. It's not like you couldn't, while developing the game, ask 10 very good players to do a lot of races, gather the data and create a powerful offline AI using it. That would actually probably be more effective than doing it online with poor data gathered from bad players.


I know how much some want to bang on about specs this and speed that, but it's clear that MS already thought about the bandwidth downside when picking DDR3 and not GDDR5 like Sony. I'm not even talking about using the cloud or anything; all you have to do is look at the small demo for DirectX 11.2 to see that they've covered the slower speed of the RAM with the new tiled resources feature.


No, because the advantage of Drivatars has yet to be proven. It is still a buzzword until people try it. When the guy demoed Drivatars, we saw an AI-controlled car initiate contact with another car for no reason. That's not good AI as far as I'm concerned.

 

95% of people playing games are bad at them. Taking the data of those bad players to create a powerful AI is kind of a stupid idea to me. It's not like you couldn't, while developing the game, ask 10 very good players to do a lot of races, gather the data and create a powerful offline AI using it. That would actually probably be more effective than doing it online with poor data gathered from bad players.

Uh... what? Way to derail the conversation. I am just countering the point by showing that cloud AI won't result in dumb games when the console is offline.

I know how much some want to bang on about specs this and speed that, but it's clear that MS already thought about the bandwidth downside when picking DDR3 and not GDDR5 like Sony. I'm not even talking about using the cloud or anything; all you have to do is look at the small demo for DirectX 11.2 to see that they've covered the slower speed of the RAM with the new tiled resources feature.

Which demo? Do you have a link? I think I missed it.


OK, here's how I see it and how I think it will work, from what I've read in this thread: as long as you have an internet connection, I think the Xbox graphics will get better and better over time, and here's why...

 

Everyone's banging on about how the PS4 has 50% more power than the XB1 and has 1.8 TFLOPS of compute power. Whichever way you swing it, it's a mid-range graphics card: a 7870 has 1280 stream processors and a 7850 has 1024, so it's in between them. A high-end 7970 has over 2000 and 3.5 TFLOPS of power. The reason you need more powerful graphics hardware to get better graphics and FPS is that you start turning on anti-aliasing and whacking the effects up to the max: particles, stuff with specular lighting (I think), tessellation is a killer, and so is the draw distance (how much can be displayed the further away stuff is, i.e. instead of seeing just a valley you see the valley and the mountain); for all of that you need increasingly powerful hardware. So you could offload the physics of things like waves crashing against a rock (tessellation: the more complex, the more graphics power you need), or offload the drawing of complex scenes in the distance, anything that isn't game critical. It's not going to offload the car you're driving while you're shooting stuff up to the cloud, BUT it could offload, say, a volcano in the distance erupting in its magnificence, leveraging the compute power of the cloud to draw an incredibly complex scene and leaving the GPU to concentrate on rendering the game-critical bits.

 

I personally think it'll get better because these consoles are supposed to last 10 years, and to be honest the graphics cards out at the moment will beat the consoles into the ground before they're even released. So by offloading complex physics, and even the AI of random NPCs (in Grand Theft Auto, for example, the people walking around could have their physics pre-calculated), you reduce the amount of local power needed, artificially allowing games to be a LOT better than the hardware in the box. I'm pretty sure that's what MS is going for, and to be honest... it's bloody clever! Thinking about it, would they be able to add in DX 12/13/14 features if they can be processed in the cloud, since the hardware obviously won't be able to support them? For anyone without an internet connection for the cloud to do this stuff, I'm sure the game could scale things down, or it just won't be "enhanced". Food for thought: if this is how MS are going about it, this is real forward thinking and is why MS is still around leading the way!


I know how much some want to bang on about specs this and speed that, but it's clear that MS already thought about the bandwidth downside when picking DDR3 and not GDDR5 like Sony. I'm not even talking about using the cloud or anything; all you have to do is look at the small demo for DirectX 11.2 to see that they've covered the slower speed of the RAM with the new tiled resources feature.

Never thought about that; very, very true. It's also a MASSIVE feature.


Uh... what? Way to derail the conversation. I am just countering the point by showing that cloud AI won't result in dumb games when the console is offline.

Which demo? Do you have a link? I think I missed it.

 

There was a small demo at the keynote, and there's a Channel 9 video on D3D 11.2; they don't show a demo, but they give you a good idea of what it's for and how you can use it. Basically, it will allow developers to pull in textures as needed by the game, depending on game logic (like where the camera is looking, or what area of the map is coming up next on screen, etc.), which means you don't have to load a whole texture that takes up lots of memory for something you may barely see. It's the loading of all the HD textures, even when not actually needed, that until now has driven the use of faster GDDR for graphics, since it can fill and push out data faster. But when you can actually manage the textures and show just the parts you need (the example they gave was a large mountain texture: until now you'd have to load the whole thing into memory, whereas now you can just pull in the parts (tiles) of it that are actually going to be visible on screen, so you don't have to load the whole thing into memory anymore), that frees up resources and so on.

 

The way I see it, if you don't have to worry about pushing in as many high-res textures as you can, because you can now pull in just the parts you need, then even with lower-bandwidth system memory you won't actually notice a difference in performance.
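
A back-of-the-envelope look at why tiled resources (the D3D11.2 feature discussed above) save memory. D3D's tiles are 64 KB, but the texture size and the "visible fraction" below are just example numbers:

```python
# A 16384x16384 texture at 4 bytes per texel: fully resident vs. only the visible tiles.
tex_w = tex_h = 16384
bytes_per_texel = 4
tile_bytes = 64 * 1024                 # D3D11.2 tiled resources use 64 KB tiles

full_mb = tex_w * tex_h * bytes_per_texel / 2**20
total_tiles = tex_w * tex_h * bytes_per_texel // tile_bytes
visible_fraction = 0.05                # assume ~5% of the texture is actually on screen
resident_mb = total_tiles * visible_fraction * tile_bytes / 2**20

print(f"fully resident:     {full_mb:.0f} MB")       # 1024 MB
print(f"only visible tiles: {resident_mb:.0f} MB")   # ~51 MB in this example
```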


Yeah, I keep telling people that your decision on which console to get should come down to which one has the games you are going to play and nothing else, because graphically they will be exactly the same. One game will probably look a little better on the PS4 and another will look a little better on the Xbone. For me, I'm getting both on launch day; however, I plan on getting any games I can on the Xbone and just getting the PS4 for its exclusives.

 

We went through this last time with the almighty Cell processor and how much better the PS3 graphics were going to be, and it was pretty even. And the other thing that annoys me is people comparing these consoles to PCs like they are going to be inferior because they are not using the very top-of-the-line PC video card. What you should be looking at is what they are doing graphics-wise with the PS3 and Xbox 360 and how much faster the new consoles are compared to that. Also, PC gamers need to realize that new consoles mean you will finally get better looking games on your PC too, because developers will finally start using things like DirectX 11.2 since consoles can do it.


Yeah, I keep telling people that your decision on which console to get should come down to which one has the games you are going to play and nothing else, because graphically they will be exactly the same. One game will probably look a little better on the PS4 and another will look a little better on the Xbone. For me, I'm getting both on launch day; however, I plan on getting any games I can on the Xbone and just getting the PS4 for its exclusives.

 

We went through this last time with the almighty Cell processor and how much better the PS3 graphics were going to be, and it was pretty even. And the other thing that annoys me is people comparing these consoles to PCs like they are going to be inferior because they are not using the very top-of-the-line PC video card. What you should be looking at is what they are doing graphics-wise with the PS3 and Xbox 360 and how much faster the new consoles are compared to that. Also, PC gamers need to realize that new consoles mean you will finally get better looking games on your PC too, because developers will finally start using things like DirectX 11.2 since consoles can do it.

 

 

I'm going to go and get both as well, though last time I was 360-only. Still, not right away; I don't have enough spare money to drop $800, so they'll have to wait for late 2013 (Xmas for one, probably) and late Jan/Feb for the other, at best. Bills come first.


The key to all of this is to remember that the X1's offline gaming experience will be the same as the PS4's regarding features, etc.

 

Also, keep in mind that developers are creating 'single player' experiences that leverage an internet connection. This has nothing to do with the X1 or PS4; it's just the trend in gaming. So those that are upset about requiring an online connection to play are in for a shock as game developers push more and more titles that way. Look at Watch Dogs, for example.


What document? Where has Microsoft claimed it will double frame rates, or even improve them that significantly?

The fallback is up to the developer, but what fallback is there available? If the console is capable of doing the calculation in the first place, why would you not just do it on the console? And how much of a sacrifice to graphics, drop in framerate, etc would users put up with when their latency takes a dive? "Oh dear, your network is overloaded have Xbox 360 graphics and play at 15fps instead?" (Extreme example, but you get the point.)

 

1.5Mbps is never going to give the kind of latency required for processing the time sensitive data that would double frame rates, as you claim.

 

That kind of offloading isn't going to double frame rates. The only things that could double frame rates would be all the latency sensitive operations such as physics, graphics processing, sound and (some) AI. I'm not saying the cloud won't help at all, but you are limited on what can be offloaded, which does limit how much can be gained. I think you are grossly under-estimating how many things are time sensitive in your average game.

 

1) True. I believe Sony's plan with Gaikai for now is to use it for getting PS3 games on the PS4 though. Still requires a heck of a lot of power and would need good server optimization to get as many users per server as possible.

2) Eh?

3) Yes they can. If anything they could be more dynamic. It would be easier to implement as you wouldn't need to worry about a fallback situation. The whole game is in sync with itself, only bit you need to worry about is the console to server input and the speed the video is streamed back. By the time the Internet is capable of doubling frame rates by doing time sensitive calculations, it would be preferable to put everything on the server itself.

4) If you can double frame rates on 1.5Mbps, you can stream video.

The cloud is great and can do a lot of things. It can even free up resources for the time sensitive calculations, as you say. And that is probably enough to make up the graphics power difference with the PS4 on exclusives, but it won't result in outright better graphics or double frame rates at the same graphics level.

 

   You will see.  It's already happening.... See this thread here....

 

    This is just a very, very, very small example and it's not using all of the features, just very minor ones.

    https://www.neowin.net/forum/topic/1163608-watch-dogs-will-have-more-of-a-dynamic-city-on-the-xbox-one/#entry595807814

 

This is ONLY the beginning, and it's a very small sample to boot. There is a lot more than this. Wait until Halo next year; that will be a game that really breaks things wide open. It can allow for better graphics for sure, no doubt about that, and yes, better frame rates. That is why I said a lot of people who think the PS4 is more powerful are going to get a huge shock.

 

Sure, the PS4's own local GPU is better, that is a given, but when you add in server processing, which Microsoft can expand over time, the Xbox is going to be better. Can Sony do this as well? The answer is yes and no. They are set up for streaming, but not for being as dynamic as this. Microsoft also has that dedicated hardware that allows compression/decompression of assets, which can then be injected right into the GPU's memory. Sony doesn't have that feature in hardware, and Microsoft is going to have a superior server platform, software stack, and method of delivery.

 

Microsoft's method is going to take a lot less bandwidth to deliver a superior experience. Both ways of doing this are hard, not easy, but the easier of the two is the streaming method. This is very innovative for a console, because for the first time the console hardware can be augmented in real time, and it keeps the Internet bandwidth down.

 

This is why it says "Cloud Powered"; the document that I provided showed the four different things that can be done with the cloud. It is not just marketing; there is actually a good rationale behind it. They are really pushing servers because servers can be a "game" changer (pun intended) for real.

 

Can everything be done by the server? Not unless you do it the way Sony is doing it. Microsoft can offload a lot to the server, but not everything, and that is okay, because the Xbox One is powerful. It's a lot more powerful than people think. People think that if the GPU doesn't have as many FLOPS (floating-point operations per second) as Sony's then it's not very powerful, but it's actually more powerful when you augment it with servers or distributed computing.

 

As you said, though, Sony is going to give the PS4 backwards compatibility via the servers, and it is going to be a long time before they can even upgrade their servers to be more powerful. I wouldn't count on that for a long, long time, probably 4-5 years or so, because it's going to take enormous server power to run full next-next-generation games around the world. That is going to be a ton of servers around the world, and Sony isn't currently prepared for it.

 

Can Sony afford it?  


   You will see.  It's already happening.... See this thread here....

 

    This is just a very, very, very small example and it's not using all of the features, just very minor ones.

    https://www.neowin.net/forum/topic/1163608-watch-dogs-will-have-more-of-a-dynamic-city-on-the-xbox-one/#entry595807814

Was just reading that. This is the type of thing the cloud is going to be used for, and it confirms my point that it can only be used for latency-insensitive operations (they actually say tree physics don't need to be fully synced). My issue with the cloud is specifically the outlandish claims of double frame rates, etc. I won't say they are impossible, just not possible yet. By the time the Internet can handle offloading those kinds of calculations to the cloud, we'll be streaming games anyway.


Was just reading that. This is the type of thing the cloud is going to be used for, and it confirms my point that it can only be used for latency-insensitive operations (they actually say tree physics don't need to be fully synced). My issue with the cloud is specifically the outlandish claims of double frame rates, etc. I won't say they are impossible, just not possible yet. By the time the Internet can handle offloading those kinds of calculations to the cloud, we'll be streaming games anyway.

 

Nope, already happening. You won't want to stream the entire game; that takes a huge amount of bandwidth and a huge amount of money (see the link below), and it is going to require them to rent a lot of server space around the world at the very least. Maybe Sony can talk to Google or Microsoft.

 

This talks a lot about Microsoft Azure Data centers for servers (Clouds)

http://www.youtube.com/watch?v=JJ44hEr5DFE

 

I posted a lot of this in that document that I posted. Latency-sensitive game code is going to run on the local machine (which makes a lot of sense), while latency-insensitive game code can run on a server that is nearby. Game worlds can change dynamically, and bandwidth is going to be saved by using the Z77 hardware compression/decompression Move engines (this gives you speed and also saves a lot of bandwidth on the game content going to and from the servers), with the data then injected directly into the GPU's memory.

 

You don't want the servers to run the entire game; that is a complete waste of bandwidth and server power.

If Sony goes beyond the PS3 into streaming the generation beyond the PS4, it's going to be 1080p or even 4K, and that is going to need massive, massive servers and take more energy on the server side (this kind of processing isn't free). I honestly would not expect 4K for a very long time, because even with the latest codecs it's going to take a lot of bandwidth, and those bandwidth limits would be used up quickly because games are usually played for longer than an hour-and-a-half movie.

 

That is why I think Microsoft's idea makes a lot more sense. They share the load between local and server, and we should see a game like Halo next year offer some of the best cloud-based experiences that Microsoft has for this new generation. Microsoft keeps the load down on the servers, which uses a lot less electricity and a lot less bandwidth as well. It's a win all over.

 

Sony won't have their PS3 streaming servers started until 2014, and ONLY in the USA. So they are a long way from providing servers like what Microsoft has for the launch of the Xbox One; it's going to take them years, and then they would have to upgrade over time. Microsoft is already delivering the benefits of its servers starting at the launch of the Xbox One; that is one reason why they haven't launched in all countries yet.

 

That video that I posted is huge. Microsoft is building out new data centers in 9 months' time for the 4th and 5th generation.


Nope, already happening. You won't want to stream the entire game; that takes a huge amount of bandwidth and a huge amount of money (see the link below), and it is going to require them to rent a lot of server space around the world at the very least. Maybe Sony can talk to Google or Microsoft.

 

This talks a lot about Microsoft Azure Data centers for servers (Clouds)

http://www.youtube.com/watch?v=JJ44hEr5DFE

 

I posted a lot of this in that document that I posted. Latency-sensitive game code is going to run on the local machine (which makes a lot of sense), while latency-insensitive game code can run on a server that is nearby. Game worlds can change dynamically, and bandwidth is going to be saved by using the Z77 hardware compression/decompression Move engines (this gives you speed and also saves a lot of bandwidth on the game content going to and from the servers), with the data then injected directly into the GPU's memory.

You seem to be disagreeing with me agreeing with you.  :wacko: As I said, and as devs have said, and as you have now said, the cloud is going to be used for latency insensitive operations (I have never disagreed with this). All I'm saying is this isn't going to result in double frame rates. It will improve frame rates and graphics as more local power is available, but to get that much out of it would require latency sensitive operations to be offloaded. Of course all this does depend on the game in question.

 

You don't want the servers to run the entire game; that is a complete waste of bandwidth and server power.

If Sony goes beyond the PS3 into streaming the generation beyond the PS4, it's going to be 1080p or even 4K, and that is going to need massive, massive servers and take more energy on the server side (this kind of processing isn't free). I honestly would not expect 4K for a very long time, because even with the latest codecs it's going to take a lot of bandwidth, and those bandwidth limits would be used up quickly because games are usually played for longer than an hour-and-a-half movie.

 

That is why I think Microsoft's idea makes a lot more sense. They share the load between local and server, and we should see a game like Halo next year offer some of the best cloud-based experiences that Microsoft has for this new generation. Microsoft keeps the load down on the servers, which uses a lot less electricity and a lot less bandwidth as well. It's a win all over.

I'm not entirely convinced on game streaming either (at least for the near future). It needs heavy server-side optimization, excellent compression/decompression at either end and much more reliable Internet connections than most people probably have. However, the bandwidth requirements shouldn't be much more than those required for streaming an HD video. My point is that by the time you can start to offload latency-sensitive operations to the cloud, you'd be in a position to just stream whole games. If anything it would be more desirable, as instead of uploading and downloading many different individual snippets of game data once a frame, you only upload input and download video. I'm sure when the time comes Microsoft will be all over it.

