Build 2014 Xbox One Discussion



Hate to come in as a skeptic, but I'm curious: why are they running different demos for the PC and the cloud?

 

The PC one doesn't have anything in the environment but the one building, while the cloud one has a full environment.


Hate to come in as a skeptic, but I'm curious: why are they running different demos for the PC and the cloud?

 

The PC one doesn't have anything in the environment but the one building, while the cloud one has a full environment.

 

To show the difference in processing ability? If you take it as is, then the first demo (local physics processing on the PC), with just the main building and nothing else, crawls, while the second (physics offloaded to Azure), with even more objects in the world, performs smoothly. I think it's pretty straightforward, really.


To be honest, if they can achieve half of what they demonstrated once you take latency etc. into account, it will still make quite a big difference. While "cloud computing" is nothing new, I think what MS are trying to do is quite unique in the world of "cloud", so this is probably still quite young in its development, but there looks to be plenty of potential.

 

Maybe the XB1 might start hitting those 1080p/60fps numbers a lot more frequently than the one or two games that have managed it so far, once developers truly get a handle on things.

It usually isn't the tech itself, but how it can be leveraged, that makes the biggest difference.  (Remember, the microwave oven was an "oops" discovery - a side effect of radar research!)

 

The same applies to Azure - it's not a new technology, but a vastly different, and more easily-leveraged, way of using existing technology.

 

It's not inventing the wheel, but improving it.


I somehow think that demo was rigged, but it is nice to see that they are going to be moving that content from Xbox One-only to full PC support.

 

No reason given, just that you think it's rigged. Well said.

 

I do as well. Even if you put the physics on the cloud, it's not going to magically make the increased rendering load the physics brings disappear and keep the framerate from falling.

Because rendering is not the bottleneck; the physics processing is. The fps drop is because the CPU/GPU cannot cope with the massive amount of physics calculation, NOT with rendering graphics.

 

Go study physics processing or AI processing. These two things can use up much more computational power than graphics processing if you let them...


I'm not going to get into a circular debate about it being real or fixed or whatever. I'm only going to think about where it can go from here. It wouldn't be hard for a developer to write their game to check for a connection to the cloud (in this case Azure/XBL); if the connection is there and fast enough, they can do this type of offloading of physics and probably other things that add to the game. If there isn't one, they fall back to local and do less heavy processing. The game still looks good because the graphics assets are the same, just less "lively", for lack of a better word.
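A minimal sketch of that kind of connection check and fallback (the host name, port, and latency threshold are all my own illustration, not any real XBL/Azure API):

```python
import socket
import time

LATENCY_BUDGET_MS = 80  # hypothetical threshold a developer might pick

def cloud_available(host="compute.example.net", port=443,
                    budget_ms=LATENCY_BUDGET_MS):
    """Probe the compute service; True only if it answers fast enough."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=budget_ms / 1000):
            pass
    except OSError:
        return False  # unreachable or too slow: treat as no cloud
    return (time.monotonic() - start) * 1000 <= budget_ms

def pick_physics_backend():
    # Offload the heavy simulation when the link is good; otherwise run a
    # lighter local simulation with the same graphics assets.
    return "cloud" if cloud_available() else "local"
```

Either way the game keeps running; the connected path just gets the heavier simulation.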

 

Only time will tell how it's used, but more and more people are playing while connected to the internet and to some service, be it XBL or Steam and so on, and their speeds are getting better and better.

I'd say the best place to apply this type of enhancement right now is MMORPGs.


I think what's really interesting is watching the "chunk count": the second demo, using the cloud, actually catches up and then passes the first because it's processing faster, and they are able to shoot extra objects while the main building is exploding. Also, that YouTube video has a glitch where the Silverlight video stream buffers and it looks like both demos freeze and then jump forward, which didn't happen when I watched the live stream.


Well, the talk about the X1 just ended, and it had some interesting tidbits in it.

 

They didn't have time to go into the really in-depth stuff, but they did touch on how they came to the choices they made when designing the console from a hardware and software perspective. It was compared against the previous consoles to lay out where they decided to evolve the experience. The speaker took time to focus on Kinect, how they have demoed it to developers, and what their thinking has been around it. Kinect is mentioned as a built-in feature, reflecting the clear position that Kinect is now just part of the platform, not a bundled add-on.

 

He talked about the OS setup in a bit more depth than I had seen before. There is a single host OS which manages a 'shared' partition and an 'exclusive' partition. These aren't considered Hyper-V environments anymore because everything except what is needed for each partition has been stripped out. Both partitions are actually Windows 8-based, so the 'Xbox OS' term that was thrown around is actually not correct. Basically, the shared partition manages things like shared apps, the Xbox system (or OS if you want), audio, networking, and any other bits that are not directly part of running a game. The exclusive partition only manages running the game and passing things like DirectX commands to the host OS, which handles them. Both partitions contain all the expected API support for Win32 or WinRT, but the exclusive partition uses an even more stripped-down version of 8 than the shared partition. He restated that this is why games and apps will be able to move from X1 to PC or phone, and vice versa, with much less effort for the developer.

 

He had a very short Q&A where he was asked if they were looking at emulating the 360 on the X1. Surprisingly enough, he said that yes, they are trying to do that. He did follow that up with the fact that it has not been easy to pull off and he has nothing to announce yet, but they want to do it.


I'd say the best place to apply this type of enhancement right now is MMORPGs.

 

 

That would be the best starting point, I agree. I still think that local SP games can benefit as well; something like GTA or Watch Dogs could have things in the city and NPC characters that act more "real", or whatever you'd like to call it, when paired with Azure. Otherwise they'd default to a base the developer has chosen. So, for example, maybe you'd see more NPCs walking in the streets, or more cars, when connected, and without it you'd go back to a default. Those types of things don't affect your game and aren't exactly latency-restricted, unless I'm wrong about it.


Here's a thought: could the resulting effects actually be from pre-calculated (cached) data on the cloud compute platform, making the demonstration not a true representation of a dynamic physics calculation?

If this really is real-time, truly dynamic calculation, then you would need to achieve near-native I/O performance.


That would be the best starting point, I agree. I still think that local SP games can benefit as well; something like GTA or Watch Dogs could have things in the city and NPC characters that act more "real", or whatever you'd like to call it, when paired with Azure. Otherwise they'd default to a base the developer has chosen. So, for example, maybe you'd see more NPCs walking in the streets, or more cars, when connected, and without it you'd go back to a default. Those types of things don't affect your game and aren't exactly latency-restricted, unless I'm wrong about it.

 

In GTA they could fix things like the train magically appearing and disappearing...


Here's a thought: could the resulting effects actually be from pre-calculated (cached) data on the cloud compute platform, making the demonstration not a true representation of a dynamic physics calculation?

If this really is real-time, truly dynamic calculation, then you would need to achieve near-native I/O performance.

 

 

Anything is possible.


I predict a thread where half the people still don't believe it and half say "I told you so".

 

 

 

Won't happen once it's a real game you can buy.


Won't happen once it's a real game you can buy.

 

 

Which is MS's challenge: convincing developers to invest the time to experiment with ways of using it. It's not shocking that it could take a little time before you see developers really pushing the boundaries of what servers can do for a game. MS seems committed to doing everything they can to tempt them.


Here's a thought: could the resulting effects actually be from pre-calculated (cached) data on the cloud compute platform, making the demonstration not a true representation of a dynamic physics calculation?

All the storage in the world combined wouldn't be even a quintillionth of a percent of the storage needed to store every possible combination.


All the storage in the world combined wouldn't be even a quintillionth of a percent of the storage needed to store every possible combination.

 

I was thinking within an error ratio, e.g. hit-box collision detection and duplication of object instances. You wouldn't need that many samples for the structures they demonstrated.


All they've done is show one thing slowed down and one thing running at normal speed.

 

There was nothing in that demo that couldn't run even on a last-gen 360. Graphics were awful and physics average.

 

They just slapped a low number on the slow-mo one and all of a sudden people are lapping it up.


Collision detection is usually done server-side today. Servers even store snapshots, a full history of where you travel over a period of time, so that when you fire a gun at someone, the server actually turns back time to properly calculate the hit or miss. I'm doing a terrible job of explaining it, but it's all to prevent people abusing lag to dodge shots or hide behind buildings.
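That "turn back time" trick is usually called lag compensation; here is a toy sketch of the idea (simplified to 2D points and nearest-snapshot lookup, with invented names; real engines interpolate between snapshots and use full hitboxes):

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    timestamp: float   # server time when the snapshot was taken
    positions: dict    # player id -> (x, y) at that time

class LagCompensator:
    def __init__(self, history_seconds=1.0):
        self.history_seconds = history_seconds
        self.snapshots = []  # ordered oldest -> newest

    def record(self, snap):
        # Keep a rolling window of recent world states.
        self.snapshots.append(snap)
        cutoff = snap.timestamp - self.history_seconds
        self.snapshots = [s for s in self.snapshots if s.timestamp >= cutoff]

    def rewind(self, fire_time):
        """Return the stored snapshot closest to when the shot was fired."""
        return min(self.snapshots, key=lambda s: abs(s.timestamp - fire_time))

    def check_hit(self, fire_time, target_id, aim_pos, radius=0.5):
        # Validate the shot against where the target WAS, not where it is now.
        past = self.rewind(fire_time)
        tx, ty = past.positions[target_id]
        ax, ay = aim_pos
        return (tx - ax) ** 2 + (ty - ay) ** 2 <= radius ** 2
```

So a laggy client who fired at an on-screen position that is "stale" on the server still gets a fair hit check.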


I was thinking within an error ratio, e.g. hit-box collision detection and duplication of object instances. You wouldn't need that many samples for the structures they demonstrated.

But then it becomes something totally different.


That was an absolute horsesh*t demo.

 

All they've done is show one thing slowed down and one thing running at normal speed.

 

There was nothing in that demo that couldn't run even on a last-gen 360. Graphics were awful and physics average.

 

They just slapped a low number on the slow-mo one and all of a sudden people are lapping it up.

 

 

So it didn't have any value?


I'm not going to get into a circular debate about it being real or fixed or whatever. I'm only going to think about where it can go from here. It wouldn't be hard for a developer to write their game to check for a connection to the cloud (in this case Azure/XBL); if the connection is there and fast enough, they can do this type of offloading of physics and probably other things that add to the game. If there isn't one, they fall back to local and do less heavy processing. The game still looks good because the graphics assets are the same, just less "lively", for lack of a better word.

 

Only time will tell how it's used, but more and more people are playing while connected to the internet and to some service, be it XBL or Steam and so on, and their speeds are getting better and better.

 

There's something interesting about this part... the calculations are done on the remote machine. All that's transferred over your connection is the variables sent out and the results sent back. Transferring small bits of data is far easier than streaming what's being rendered. Sure, in some cases latency may affect things, but I doubt it will hit that hard. Even websites already do this with your phone in many apps. Streaming live data between server and client is the norm for software.
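To put rough, illustrative numbers on the "small bits of data" point (the request shape is made up; the comparison is just serialized object states versus one raw video frame):

```python
import json

# A hypothetical physics request: 100 object states out, results back.
request = [{"id": i, "pos": [1.0, 2.0, 3.0], "vel": [0.1, 0.0, -0.2]}
           for i in range(100)]
payload_bytes = len(json.dumps(request).encode())

# One uncompressed 1080p frame at 3 bytes per pixel, for comparison.
frame_bytes = 1920 * 1080 * 3  # 6,220,800 bytes

print(payload_bytes, frame_bytes)  # the frame is orders of magnitude larger
```

A few kilobytes of state per tick is a very different bandwidth story from streaming rendered frames, which is why offloading computation is more forgiving than cloud rendering.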

 

 

Here's a thought: could the resulting effects actually be from pre-calculated (cached) data on the cloud compute platform, making the demonstration not a true representation of a dynamic physics calculation?

If this really is real-time, truly dynamic calculation, then you would need to achieve near-native I/O performance.

It would only make sense for things to be cached if they can be. Certain things happen more than once, and if a specific series of physics calculations is done commonly, then why not cache them? And even then, this isn't about the need for it to be real-time, because any lag on the server's part in processing calculations can be made up by the local GPU/processor (which now has leeway to do that if necessary).


There's something interesting about this part... the calculations are done on the remote machine. All that's transferred over your connection is the variables sent out and the results sent back. Transferring small bits of data is far easier than streaming what's being rendered. Sure, in some cases latency may affect things, but I doubt it will hit that hard. Even websites already do this with your phone in many apps. Streaming live data between server and client is the norm for software.

 

 

I'm not sure what all that has to do with the post you replied to. He never mentioned cloud rendering, just that freeing up all the GPU resources used for physics gives you a lot more power to do graphics with.

Imagine a GPU resource pool: 25% or more of it can be tied up in some form of physics processing, or has to be reserved to do it when required. Free that up and you suddenly have 25% more GPU power for graphics.


I'm not sure what all that has to do with the post you replied to. He never mentioned cloud rendering, just that freeing up all the GPU resources used for physics gives you a lot more power to do graphics with.

Imagine a GPU resource pool: 25% or more of it can be tied up in some form of physics processing, or has to be reserved to do it when required. Free that up and you suddenly have 25% more GPU power for graphics.

Quoted the wrong post, lol


This topic is now closed to further replies.