Microsoft uses Azure to destroy things in BUILD 2014 gaming demo

Developer Respawn Entertainment used Microsoft Azure in its Xbox One shooter Titanfall to handle things like AI and game physics. During its second-day keynote address at BUILD 2014, however, Microsoft showed how Azure could also be used in PC gaming.

The gaming demo starts at the 2:21:11 mark in the Channel 9 video replay of the event. The prototype uses what the demo team called a "high end" PC gaming rig from Maingear and involves a simulation of a building that can be destroyed with rocket launchers or explosives.

On the PC running the demo without a connection to the Azure cloud service, the building simulation could be destroyed, but at the cost of a greatly reduced frame rate, down to just 2 FPS. This was due to the number of randomly generated particles and debris showing up on screen and hurting the PC's GPU performance.

However, on the PC rig that ran the same demo with the Azure service handling all of the physics computations, the number of particles didn't affect the frame rate at all; it stayed around 32 FPS. There's no word on when Microsoft will begin offering this kind of cloud-based service to PC game developers, but it certainly shows the potential of this kind of feature, especially in first-person shooter games.

Source: Channel 9 | Image via Microsoft

48 Comments

This is the future; a rough start, but they are on the right track.
Physics like PhysX is too complex for today's GPUs; the cloud should nail it down in the coming years.
This fight between DirectX and OpenGL is in our favour.

Depends on the game. The truth is it's a very weak CPU in both consoles. We are likely to see very big gains in the PC space compared to the console market in the next 10 years.

I've seen demos from the demoscene do the exact same thing without all the cloud palaver, and I didn't notice any slowdown like that, back in 2007

farbrausch @ breakpoint 2007 fr-041: d e b r i s

check it out

Athlonite said,
I've seen demos from the demoscene do the exact same thing without all the cloud palaver, and I didn't notice any slowdown like that, back in 2007

farbrausch @ breakpoint 2007 fr-041: d e b r i s

check it out

The demoscene is famous for precalculated animation and clever shortcuts. This tech demo is designed to be real-time and object-accurate. No game would do this today because it's so processor-intensive. The idea is that accurate modelling WITHOUT precalculation is becoming possible with cloud compute offloading. It's a proof of concept. What developers will end up doing is using portions of this in concert with the kinds of rendering cheats they currently use. Where accuracy would be perceptible to the player, these kinds of techniques could be used; otherwise, precalc'd stuff would be used as usual.

A more important use of this will be lighting scenes that have changed. The cloud can calculate the normals and other lighting details of a changing scene, then send the calculated results to the client. As far as the client is concerned, it's just rendering as usual -- precalculated lighting. But on the back end, the scene has changed dynamically based on events in the game.

Where a building stood before, now is a burning inferno -- and all the buildings around it are now lit accordingly.

It's real tech with real applications. It is already allowing developers to do things they could never do before.
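The split described in this comment -- the server bakes lighting for a changed scene, and the client consumes the result as if it were ordinary precalculated lighting -- can be sketched roughly as follows. Everything here (the toy lighting model, the class and function names) is invented for illustration and is not any real engine or Azure API:

```python
# Hypothetical sketch: a client that treats cloud-computed lighting as if it
# were precalculated. All names and the "lighting model" are invented.

def bake_lighting_server_side(scene_changes):
    """Stand-in for the expensive server-side step: recompute lighting for
    only the surfaces of the scene that changed."""
    lightmap = {}
    for surface, new_height in scene_changes.items():
        # Toy "lighting" model: brightness falls off with debris height.
        lightmap[surface] = round(1.0 / (1.0 + new_height), 3)
    return lightmap

class Client:
    def __init__(self):
        # Rendered exactly like ordinary precalculated lighting.
        self.lightmap = {}

    def receive(self, baked):
        # The client just swaps in the new server-side results.
        self.lightmap.update(baked)

changes = {"wall_03": 0.0, "roof_01": 4.0}   # building collapsed, roof fell
client = Client()
client.receive(bake_lighting_server_side(changes))
print(client.lightmap)
```

The point of the sketch is the division of labour: the client's render path never changes, only the data it is handed does.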

So we hate Sim City for always on mode and love this idea?

Not saying it's a bad idea. Just saying that I missed some memos and I'm not sure what the internet is supposed to be hating at the moment.

This kind of technology will likely be used to enhance a game when the service is available and the internet connection is fast enough.

Because Xbox One games have to work offline (except for MP only games, of course) this is not the same thing as the SimCity fiasco.

AmazingRando said,
So we hate Sim City for always on mode and love this idea?

Not saying it's a bad idea. Just saying that I missed some memos and I'm not sure what the internet is supposed to be hating at the moment.


MEMO: Internet is supposed to be hating anything by MS and praise the same/inferior thing by Valve, Google or Apple.

Crimson Rain said,

MEMO: Internet is supposed to be hating anything by MS and praise the same/inferior thing by Valve, Google or Apple.

I think you have the wrong memo bud, this is NeoWIN. :rofl:

Never mind that SimCity was hated (before it even launched) for being always-on, while the same thing in Titanfall is ignored or called a non-factor. Never mind that both games are from the same publisher (EA), or that both launched without single-player by design; with SimCity it was hated, but with Titanfall the same design "feature" is basically seen as a non-factor. (For the record, I like both games and play SimCity on PC, mostly online at that, though offline IS available, and has been since Update 10.) For those like me who are playing SimCity today, now that offline mode is available, how important is it really to your gameplay?

People buy games like Titanfall and Call of Duty specifically to play online, SimCity fans don't. That's why it was a non-issue for Titanfall.

The always-connected multiplayer aspect wasn't popular before release, but the hatred didn't start until the city size was known and all the technical issues became apparent.

I find it quite funny that the same people who claim streaming video game services like Gaikai/PlayStation Now are the future keep telling us cloud computing on Xbox One is impossible because of latency...

That's exactly why I said these arguments are bogus. People will just as easily call Gaikai/PlayStation Now a waste of time, yet claim the cloud is the way forward whenever Microsoft utilizes it. People switch their arguments completely whenever it suits the company they want to reference.

bjorndori said,
I find it quite funny that the same people who claim streaming video game services like Gaikai/PlayStation Now are the future keep telling us cloud computing on Xbox One is impossible because of latency...

Sadly these topics are too often conflated.

There is a difference between sending rendered video back from a server, and sending rendering calculation results back from the server.

As I note above, if the user is firing the missile in this example, the server can do all the deconstruction physics before the missile hits the building and send that back to the client.

With full video rendering, the drawing of the missile going to the building will appear to fire a short time after the user presses the button. This is where latency exists and is a problem.

Using the cloud just for calculations can work with no perceived latency even on fairly latent networks, and is very light on the network.

They are just two different concepts and two different arguments.
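The latency argument in this comment boils down to a time budget: the round trip to the cloud only has to complete before the projectile lands. A back-of-the-envelope sketch of that check (the function name and all the numbers are made up for illustration):

```python
# Hypothetical check of the "compute during flight" idea: the network round
# trip plus server compute time must fit inside the projectile's flight time.

def cloud_offload_feasible(distance_m, missile_speed_mps,
                           network_rtt_ms, server_compute_ms):
    flight_time_ms = distance_m / missile_speed_mps * 1000.0
    # Results must arrive before impact to be invisible to the player.
    return network_rtt_ms + server_compute_ms <= flight_time_ms

# A 200 m shot at 50 m/s gives a 4000 ms budget -- roomy even on a slow link.
print(cloud_offload_feasible(200, 50, network_rtt_ms=120,
                             server_compute_ms=500))  # True
```

By contrast, a streamed-video service pays the round trip on every frame, which is why the two cases behave so differently under latency.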

Mobius Enigma said,

Sadly these topics are too often conflated.

There is a difference between sending rendered video back from a server, and sending rendering calculation results back from the server.

As I note above, if the user is firing the missile in this example, the server can do all the deconstruction physics before the missile hits the building and send that back to the client.

With full video rendering, the drawing of the missile going to the building will appear to fire a short time after the user presses the button. This is where latency exists and is a problem.

Using the cloud just for calculations can work with no perceived latency even on fairly latent networks, and is very light on the network.

They are just two different concepts and two different arguments.

I'm not sure. If the streaming service installed the basic game engine and then streamed all of the textures and such for each level or instance you were moving through... Then uninstalled the engine when you were done playing...

You might see some latency issues between levels and booting up the game, but honestly you shouldn't see any absurd latency during gameplay.

That is how quite a few of the PC MMOs I play do their initial install anyway. They load the game engine, then backfill the rest of the world. Gets you playing in minutes instead of hours most of the time.

It really shouldn't be much of a stretch to make a streaming service for games do it that way entirely.

NastySasquatch said,

I'm not sure. If the streaming service installed the basic game engine and then streamed all of the textures and such for each level or instance you were moving through... Then uninstalled the engine when you were done playing...

You might see some latency issues between levels and booting up the game, but honestly you shouldn't see any absurd latency during gameplay.

That is how quite a few of the PC MMOs I play do their initial install anyway. They load the game engine, then backfill the rest of the world. Gets you playing in minutes instead of hours most of the time.

It really shouldn't be much of a stretch to make a streaming service for games do it that way entirely.

This is about an installed smart client using server-side calculations.

Installation of the game is irrelevant.

Calculate debris chunks and motion paths as soon as the RPG (or whatever) is fired, and send back all the results to animate the chunks by the time the RPG strikes.

Martin Sundhaug said,
How would that work with latency in a 60+FPS environment?

If designed properly, latency would not be an issue. Since the coming effects are not yet visible on the screen, they can be calculated several seconds before the client needs to render them.

In this example, the whole building explosion can be calculated in the cloud before the first chunk of debris is seen falling on the screen.

Or for example, when they fire the missile into the building, as soon as the missile is fired, the server is doing all the deconstruction physics before the user sees the missile hit the building.

^ What he said.

It's walking a tightrope between user input and scripted events. If it's scripted, then it just happens. If it's user input then it can calculate as soon as the input is made. Even if, say, you're holding a detonator and standing a few hundred feet from a building it still works. The input is there, the math is done and is sent back so you can watch a more realistic demolition.

It's like rendering video vs watching video. If you have a video with lots of complex effects, like after effects type stuff, it can take forever to render but it might only be a 2-3 minute video.

If a series of supercomputers can calculate the math for me, then by all means! I think it's still a little ways off, but cloud computing with gaming will be where it's at.

Osiris said,
Great example of how the cloud can work to enhance the gaming experience, but I'm not entirely convinced that first test machine was high end :p

here's the video if you don't want to search through the full length channel 9 vid:
https://www.youtube.com/watch?...DhOMyw&feature=youtu.be


Just as you can make a super high-end graphics workstation crawl by using raytracing, high-end computers can start to crawl if you start simulating physics with great precision.

Osiris said,
Great example of how the cloud can work to enhance the gaming experience, but I'm not entirely convinced that first test machine was high end :p

here's the video if you don't want to search through the full length channel 9 vid:
https://www.youtube.com/watch?...DhOMyw&feature=youtu.be

I imagine MS can afford the highest end gaming systems for testing. :)

Also, think about it like this.

Do some insanely complex physics/math/whatever that would take hours on the fastest PC. Now run those same computations on a server technology that can scale across 1,000 hardware servers.

So 1000 servers versus 1 PC. Which is going to be faster?

(This is what Azure does, as it is a dynamically scaling service, and one reason it is rather brilliant: it doesn't have to tangibly exist on any one piece of hardware, and can run on top of several servers anywhere and scale out to a ton of servers as demand requires.)
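The scale-out idea in this comment can be illustrated with a toy scatter/gather: split one big physics step into independent chunks that could each run on a separate worker. The sketch below runs the chunks on a local thread pool purely to show the partitioning; the chunk function and all the numbers are invented and have nothing to do with real Azure APIs:

```python
# Toy sketch of the scale-out idea: partition one big debris simulation into
# independent chunks, as if each chunk ran on its own cloud worker.

from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(chunk):
    """Stand-in for one worker's share of the physics step."""
    return [(x, x * 0.5) for x in chunk]      # toy: position -> (pos, velocity)

def scatter_gather(particles, workers):
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(simulate_chunk, chunks)  # map preserves order
    return [p for chunk in results for p in chunk]  # gather in order

particles = list(range(10_000))
out = scatter_gather(particles, workers=8)
print(len(out))   # 10000 -- same result, but the work divides across machines
```

The technique only pays off when the chunks are genuinely independent, which is why rigid-body debris (each piece on its own ballistic path) is such a natural fit for it.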

This = random demonstration of tech in ideal conditions that will never actually get used in games because of the inherent problems that come with it.

Also, OpenGL 4.4 brings the DirectX12 style features to Sony so don't worry, they'll be just fine.

Edited by CuddleVendor, Apr 6 2014, 12:54am

CuddleVendor said,
This = random demonstration of tech in ideal conditions that will never actually get used in games because of the inherent problems that come with it.

It's already been done. Microsoft just said at BUILD that all of the logic of Titanfall is computed in the cloud, not on local hardware.

Jose_49 said,
This + DirectX 12.... Sorry Sony.

I think this will show Sony fans, who have been criticizing the Xbox One for having roughly 30% less graphics horsepower, that what really matters is the power of the cloud, not the console itself. Next year, both the PS4 and Xbox One will have ridiculously less graphics horsepower than next year's PCs anyway, but the cloud will keep expanding regardless of console power.

CuddleVendor said,
This = random demonstration of tech in ideal conditions that will never actually get used in games because of the inherent problems that come with it.

Also, OpenGL 4.4 brings the DirectX12 style features to Sony so don't worry, they'll be just fine.

Check out the other Build sessions to see a real code / demo.

CuddleVendor said,
This = random demonstration of tech in ideal conditions that will never actually get used in games because of the inherent problems that come with it.

Also, OpenGL 4.4 brings the DirectX12 style features to Sony so don't worry, they'll be just fine.


Says a random uneducated person on the Internet...

You guys are making a bigger deal out of this than necessary in my opinion. Personally, I find these whole cloud arguments for and against to be bogus, on the basis that people switch their argument based upon the company supporting the feature(s). (N)

dead.cell said,
You guys are making a bigger deal out of this than necessary in my opinion. Personally, I find these whole cloud arguments for and against to be bogus, on the basis that people switch their argument based upon the company supporting the feature(s). (N)

The point is everyone knows the PS4 has a better graphics card than the XB1, but everyone also knows Xbox Live is much better than the PlayStation Network. So if you can get this much processing power from the cloud, not only does it give you tremendous graphics power for a scene like this that requires lots of processing, it also opens the door to a whole world of game genres with open-world, real-time multiplayer, and a game such as Titanfall is one application of it. This is not only fun and imagination-expanding, it promises a truly bright future for console gaming as well.

trojan_market said,

The point is everyone knows the PS4 has a better graphics card than the XB1, but everyone also knows Xbox Live is much better than the PlayStation Network. So if you can get this much processing power from the cloud, not only does it give you tremendous graphics power for a scene like this that requires lots of processing, it also opens the door to a whole world of game genres with open-world, real-time multiplayer, and a game such as Titanfall is one application of it. This is not only fun and imagination-expanding, it promises a truly bright future for console gaming as well.

Agreed. And with Sony fans who purport to be "experts", it's important to remind them of things like cloud compute, which they conveniently choose to forget or ignore and which Sony physically and technically would not be able to compete with...

trojan_market said,

The point is everyone knows the PS4 has a better graphics card than the XB1, but everyone also knows Xbox Live is much better than the PlayStation Network. So if you can get this much processing power from the cloud, not only does it give you tremendous graphics power for a scene like this that requires lots of processing, it also opens the door to a whole world of game genres with open-world, real-time multiplayer, and a game such as Titanfall is one application of it. This is not only fun and imagination-expanding, it promises a truly bright future for console gaming as well.

My point is the cloud can be amazing/awful depending on which company flag the person opts to fly. That's why I'm saying it's bogus: some people have an agenda of promoting one company, whether it's Sony or Microsoft, and it shows in their arguments.

dead.cell said,

My point is the cloud can be amazing/awful depending on which company flag the person opts to fly. That's why I'm saying it's bogus: some people have an agenda of promoting one company, whether it's Sony or Microsoft, and it shows in their arguments.

We are talking about facts here, not opinions.

Crimson Rain said,
We are talking about facts here, not opinions.

So what CuddleVendor posted was a fact? Or that Sony is going to fail because of this?

Not sure what topic you entered, but those are far from facts...

dead.cell said,

So what CuddleVendor posted was a fact? Or that Sony is going to fail because of this?

Not sure what topic you entered, but those are far from facts...


>My point is the cloud can be amazing/awful depending on which company flag the person opts to fly. That's why I'm saying it's bogus: some people have an agenda of promoting one company,

This is ########. It doesn't matter which company you support or don't support; the fact is that cloud computing can augment your local processing power and increase the performance and/or quality of the task at hand.

Contrary to what uneducated "ZOMG GRAPHICS!!" idiots on the Internet think, physics/AI processing can eat up much more computational power if you let it. Since resources are limited, and you can make physics/AI appear good enough by using various tricks, games spend only a tiny fraction on those and spend most resources on graphics.

Once you have distributed computing at your disposal, those complex physics/AI computations become a reality, and the system can spend even more resources on graphics.

These are irrelevant to any company or whatever. These are facts.

Crimson Rain said,
<snip>

I'm not arguing against that. I'm arguing against the way people seem to dismiss Microsoft or Sony's efforts, simply because they favor one or the other. If people want to remove their bias and have a legitimate conversation about the advantages of cloud computing, I'm all ears. Unfortunately, people seem to be rather contradictory in how they look at it all, as this thread is filled with extremes from both sides.

Again, neither the OP's doom and gloom for Sony nor the rebuttal dismissing Microsoft's implementation was a "fact" here. Both sides are at fault for oversimplifying this down to "my dad is better than your dad" bickering.

I have no qualms with cloud computing, Microsoft's implementation, or anything of the sort. Not sure what you're going on about.

Edited by dead.cell, Apr 6 2014, 10:46pm

CuddleVendor said,
This = random demonstration of tech in ideal conditions that will never actually get used in games because of the inherent problems that come with it.

Also, OpenGL 4.4 brings the DirectX12 style features to Sony so don't worry, they'll be just fine.

This is just not true about OpenGL...

1) If you listen to the MS DirectX team, they themselves do not yet know all the features of DX12, so there is NO WAY anyone can say OpenGL's 4.4 features will be anything like DX12.

2) From what MS has revealed about DX12, it goes much further than anything in OpenGL 4.4. OpenGL 4.4 would be equivalent to the middle performance layer of DX12, not the high-end performance layer. There is a difference between extending high performance from a slower managed framework and allowing the developer full control of the hardware without having to deal with the slower framework.

For coders, use this analogy with regard to hardware access...
OpenGL 4.4 will at best be C++ with inline Assembly that will be tied to very specific hardware configurations.
DX12 is closer to full Assembly (Virtually no framework overhead), yet is still hardware agnostic. (Best of both worlds in terms of portability and performance.)

OpenGL 4.4 will help performance for Sony and dedicated devices like the PS4, but DX12 will meet that performance, offer a bit more performance, and run on any combination of hardware.

___________________

As for the demo, this has NOTHING to do with DX12; yes, the OP is wrong here.

This is about using a scaling dynamic cloud service to handle the physics calculations for not only speed, but to offer shared render effects with permanence.

Imagine a huge MMO or multiplayer game where the physics and rendering in the world are shared. If a player walks into town where a building is crumbling, they will see the same pieces of steel and glass falling in the same places and in the same way as all the other users, as this computation is being done server-side.

Days or weeks later those same pieces of glass and steel that fell to the ground will still be in the same spots for all users that walk by that building.

Games will be able to pull off physics that 20 PCs could not calculate, and offer shared procedural environments.

This is brilliant stuff, as even if it is a single player game, end users do not have access to this level of computing power. A game can be made with server side (cloud) calculations that are dynamic/procedural and run in real time on a XB1 that would literally take hours to render each frame on the XB1 alone.


The other misconception is that this is somehow an XB1 vs PS4 argument. It isn't. These same MS cloud services could be used by a game developer on the PS4 as well. The brilliance is in the cloud tie-in and the dynamic scaling Azure offers, and has nothing to do with DX12 or the XB1.
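The permanence idea in this comment -- server-authoritative debris that every player sees identically, even days later -- could be sketched like this. The class, the seed-based toy "simulation", and all names are hypothetical, not real Azure or engine APIs:

```python
# Hypothetical sketch of server-authoritative, persistent debris: every
# client that asks gets the same pieces in the same places, whenever it asks.

import random

class WorldStateServer:
    def __init__(self):
        self.debris = {}                       # building_id -> list of pieces

    def destroy(self, building_id, seed):
        rng = random.Random(seed)              # deterministic server-side sim
        pieces = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(5)]
        self.debris[building_id] = pieces      # persisted: same for everyone

    def snapshot(self, building_id):
        return self.debris.get(building_id, [])

server = WorldStateServer()
server.destroy("bank_01", seed=42)
player_a = server.snapshot("bank_01")
player_b = server.snapshot("bank_01")          # days later, another player
print(player_a == player_b)   # True -- shared, permanent rubble
```

Because the simulation runs once on the server and only its results are stored, clients never have to agree on physics; they just draw the same snapshot.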

Mobius Enigma said,


1) If you listen to the MS DirectX team, they themselves do not yet know all the features of DX12, so there is NO WAY anyone can say OpenGL's 4.4 features will be anything like DX12.


Agreed with all you said except this... somewhat. OpenGL has been copying DX for a long time, so it won't be long before they copy these, lol.

Crimson Rain said,

Agreed with all you said except this... somewhat. OpenGL has been copying DX for a long time, so it won't be long before they copy these, lol.

The problem is that in copying many of these things, they would have to break the tenets of OpenGL, and I don't think they would be willing to go that far.

Although I could be wrong as they are already making the mistake of breaking portability, which is already hurting OpenGL.

Also, Microsoft could play the OpenGL 'copying game' and offer DX12 for other platforms, and OpenGL would be done except for a few niche developers.

dead.cell said,

I'm not arguing against that. I'm arguing against the way people seem to dismiss Microsoft or Sony's efforts, simply because they favor one or the other. If people want to remove their bias and have a legitimate conversation about the advantages of cloud computing, I'm all ears. Unfortunately, people seem to be rather contradictory in how they look at it all, as this thread is filled with extremes from both sides.

Again, neither the OP's doom and gloom for Sony, nor was the rebuttal dismissing Microsoft's implementation a "fact" here. Both sides are at fault in oversimplifying this down to a "my dad is better than your dad" sort of bickering.

I have no qualms with cloud computing, Microsoft's implementation, or anything of the sort. Not sure what you're going on about.


I completely agree with you; what I am saying is different, though. When the XB1 came out, every single gamer was complaining that the console is less computationally powerful than the PlayStation and that it has copy restrictions. After all the buzz, they finally managed to kill several XB1 features I liked: the digital game library, game sharing, digital distribution, and the regular online check (the same thing Steam does, but people are OK with it). Anyway, I like Xbox, but I am not biased at all. Yes, the PS4 is a prettier console, but what I care about is that I keep hearing from MS about cloud gaming, while what I hear from Sony is bringing back the same game we played last year to the next-gen PlayStation. I like both consoles; in fact I own a PS3, an XB360 and an XB1, and once the supply limitation goes away and some cool games come to the PS4 (not The Last of Us, which I have played many times), I will get one as well. But I am also a multiplayer gaming fan, and multiplayer on Xbox compared to PS is night and day.

Mobius Enigma said,

/snip

Completely agree, Mobius. The shared nature of this tech allows for a completely different approach. Imagine online FPS games where the maps are truly dynamic, based on what was previously done on them.

It would take a different approach; maybe we'd have to deal with instances where the server keeps track of a group of players as they rotate through maps, and when the same map is played it loads up the map as it was left at the end of the last battle. There would have to be some way of capping the number of revisits so you don't just end up playing in the complete ruins of a town/city, etc., but the potential here seems amazing to me.

duddit2 said,

Completely agree, Mobius. The shared nature of this tech allows for a completely different approach. Imagine online FPS games where the maps are truly dynamic, based on what was previously done on them.

It would take a different approach; maybe we'd have to deal with instances where the server keeps track of a group of players as they rotate through maps, and when the same map is played it loads up the map as it was left at the end of the last battle. There would have to be some way of capping the number of revisits so you don't just end up playing in the complete ruins of a town/city, etc., but the potential here seems amazing to me.

It is exciting technology, as it even goes further than just effects on environment/map state. Gaming models can move in new directions that were uncommon or processing heavy in the past and virtually impossible in large multiplayer games.

Imagine worlds with ongoing massive effect calculations, for example a huge pool of water where players are interacting and affecting the water collectively. It isn't just about storing the outcome of the calculations, but keeping them active in a world that continues to be dynamic.

Another example would be an endless world that is procedurally generated, and expands as new players explore beyond known regions.

The physics calculation is just a quick and dirty example of the real-time effects this offers.