Microsoft uses Azure to destroy things in BUILD 2014 gaming demo

Developer Respawn Entertainment used Microsoft Azure in its Xbox One shooter Titanfall to handle things like AI and game physics. During its second-day keynote address at BUILD 2014, Microsoft showed how Azure could also be used in PC gaming.

The gaming demo starts at the 2:21:11 mark in the Channel 9 video replay of the event. The prototype runs on what the demo team called a "high end" PC gaming rig from Maingear, and involves a simulation of a building that can be destroyed with rocket launchers or explosives.

On the PC running the demo without a connection to the Azure cloud service, the building could still be destroyed, but at the cost of a greatly reduced frame rate, down to just 2 FPS. This was due to the number of randomly generated particles and debris pieces on screen overwhelming the PC's GPU.

However, on the rig running the same demo with the Azure service handling all of the physics computations, the number of particles didn't affect the frame rate at all; it stayed around 32 FPS. There's no word on when Microsoft will begin offering this kind of cloud-based service to PC game developers, but the demo certainly shows the potential of this kind of feature, especially in first-person shooters.
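For a rough sense of the arithmetic, here is a minimal sketch in Python; the per-stage costs are invented (Microsoft didn't publish any), but they show why removing the physics work from the frame loop recovers the frame rate:

# Illustrative frame-time budget; all numbers below are assumptions.
def frame_time_ms(render_ms: float, physics_ms: float, offloaded: bool) -> float:
    """Per-frame cost when physics runs locally vs. in the cloud. When
    offloaded, the client only pays for rendering; the physics results
    arrive asynchronously and are simply played back."""
    return render_ms if offloaded else render_ms + physics_ms

# Hypothetical costs: 30 ms to draw the debris, 470 ms to simulate it locally.
local = frame_time_ms(render_ms=30.0, physics_ms=470.0, offloaded=False)
cloud = frame_time_ms(render_ms=30.0, physics_ms=470.0, offloaded=True)
print(f"local physics: {1000 / local:.0f} FPS")  # ~2 FPS
print(f"cloud physics: {1000 / cloud:.0f} FPS")  # ~33 FPS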

Source: Channel 9 | Image via Microsoft


48 Comments


Osiris said,
Great example of how the cloud can work to enhance the gaming experience, but I'm not entirely convinced that first test machine was high end :p

Here's the video if you don't want to search through the full-length Channel 9 vid:
https://www.youtube.com/watch?...DhOMyw&feature=youtu.be


Just like you can make a super-high-end graphics workstation crawl by using raytracing, high-end computers can start crawling if you start simulating physics with great precision.

Osiris said,
Great example of how the cloud can work to enhance the gaming experience, but I'm not entirely convinced that first test machine was high end :p

Here's the video if you don't want to search through the full-length Channel 9 vid:
https://www.youtube.com/watch?...DhOMyw&feature=youtu.be

I imagine MS can afford the highest end gaming systems for testing. :)

Also, think about it like this.

Do some insanely complex physics/math/whatever that would take hours on the fastest PC. Now run those same computations on server technology that can scale across 1,000 hardware servers.

So 1000 servers versus 1 PC. Which is going to be faster?

(This is what Azure does: it is a dynamically scaling service, and one reason it is rather brilliant is that it doesn't have to tangibly exist on any one piece of hardware. It can run on top of several servers anywhere and scale out to a ton of servers as demand requires.)
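To make the scale-out point concrete, here's a toy sketch in Python; the chunk workload, the worker count, and the idea that Azure maps workers to servers are all simplifying assumptions:

from multiprocessing import Pool

def simulate_chunk(chunk_id: int) -> float:
    # Stand-in for one independent slice of a physics job (e.g. one debris cluster).
    return float(sum(i * i for i in range(100_000)))

if __name__ == "__main__":
    chunks = list(range(1_000))        # 1,000 independent slices of work
    with Pool(processes=8) as pool:    # on Azure, think servers, not processes
        results = pool.map(simulate_chunk, chunks)
    print(f"simulated {len(results)} chunks in parallel")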

Calculate debris chunks and motion paths as soon as the RPG (or whatever) is fired, and send back all the results to animate the chunks by the time the RPG strikes.

Martin Sundhaug said,
How would that work with latency in a 60+ FPS environment?

If designed properly, latency would not be an issue. Since the upcoming effects are not yet visible on the screen, they can be calculated several seconds before the client needs to render them.

In this example, the whole building explosion can be calculated in the cloud before the first chunk of debris is seen falling on the screen.

Or, for example, when the player fires the missile into the building: as soon as the missile is fired, the server is doing all the deconstruction physics, before the user ever sees the missile hit the building.
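A minimal sketch of that timing budget, with invented numbers for flight time, round trip, and server compute; the only point is that the projectile's travel time gives the cloud its window:

def results_arrive_before_impact(flight_time_ms: float,
                                 round_trip_ms: float,
                                 server_compute_ms: float) -> bool:
    # True if the precomputed debris paths reach the client before the hit.
    return round_trip_ms + server_compute_ms < flight_time_ms

# Hypothetical: the rocket flies for 1.5 s, the network round trip is 80 ms,
# and the cloud needs 600 ms to simulate the collapse.
print(results_arrive_before_impact(1500, 80, 600))  # True: no visible lag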

^ What he said.

It's walking a tightrope between user input and scripted events. If it's scripted, then it just happens. If it's user input, then it can calculate as soon as the input is made. Even if, say, you're holding a detonator and standing a few hundred feet from a building, it still works. The input is there, the math is done and is sent back so you can watch a more realistic demolition.

It's like rendering video vs watching video. If you have a video with lots of complex effects, like After Effects-type stuff, it can take forever to render, but it might only be a 2-3 minute video.

If a series of supercomputers can calculate the math for me, then by all means! I think it's still a little ways off, but cloud computing with gaming will be where it's at.

I find it quite funny that the same people who claim streaming video game services like Gaikai/PlayStation Now are the future keep telling us cloud computing on Xbox One is impossible because of latency...

That's exactly why I said these arguments are bogus. People will just as easily call Gaikai/PlayStation Now a waste of time, yet claim the cloud is the way forward whenever Microsoft utilizes it. People just switch their arguments completely whenever it suits the company they want to reference.

bjorndori said,
I find it quite funny that the same people who claim streaming video game services like Gaikai/PlayStation Now are the future keep telling us cloud computing on Xbox One is impossible because of latency...

Sadly these topics are too often conflated.

There is a difference between sending rendered video back from a server and sending calculation results back from the server.

As I note above, if the user is firing the missile in this example, the server can do all the deconstruction physics before the missile hits the building and send that back to the client.

With full video rendering, the missile will appear to fire a short time after the user presses the button. This is where latency exists and is a problem.

Using the cloud just for calculations can work with no perceptible latency, even on fairly high-latency networks, and is very light on the network.

They are just two different concepts and two different arguments.
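A back-of-the-envelope comparison makes that concrete; every figure below is an illustrative assumption, not a measurement:

# Game streaming: every displayed frame crosses the network as video.
video_bitrate_mbps = 10          # a plausible bitrate for a 1080p stream

# Physics offload: only simulation results cross the network.
chunks = 5_000                   # debris pieces in one building collapse
floats_per_chunk = 7             # position (3) + orientation quaternion (4)
keyframes = 30                   # sampled poses; the client interpolates
bytes_total = chunks * floats_per_chunk * keyframes * 4   # 4-byte floats

print(f"collapse as physics data: {bytes_total / 1e6:.1f} MB, sent once")
print(f"video stream: {video_bitrate_mbps / 8:.2f} MB every second, forever")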

Mobius Enigma said,

Sadly these topics are too often conflated. [...]

I'm not sure. If the streaming service installed the basic game engine and then streamed all of the textures and such for each level or instance you were moving through... Then uninstalled the engine when you were done playing...

You might see some latency issues between levels and when booting up the game, but honestly you shouldn't see any absurd latency during gameplay.

That is how quite a few of the PC MMOs I play do their initial install anyway. They load the game engine, then backfill the rest of the world. Gets you playing in minutes instead of hours most of the time.

It really shouldn't be much of a stretch to make a streaming service for games do it that way entirely.
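That backfill pattern is simple to sketch; the asset names and the fetch function below are hypothetical stand-ins for a real CDN download:

import queue
import threading

def fetch_asset(name: str) -> bytes:
    # Stand-in for a download; a real client would fetch over HTTP.
    return b"..."

def backfill(pending: queue.Queue, cache: dict) -> None:
    # Background thread: pull level assets while the player is already in-game.
    while True:
        name = pending.get()
        if name is None:        # sentinel: nothing left to stream
            return
        cache[name] = fetch_asset(name)

pending = queue.Queue()
cache = {}
worker = threading.Thread(target=backfill, args=(pending, cache))
worker.start()

# The engine is installed and running; the next zone's textures trickle in
# behind the gameplay instead of blocking the player up front.
for asset in ["zone2/terrain.tex", "zone2/buildings.tex"]:
    pending.put(asset)
pending.put(None)
worker.join()
print(f"streamed {len(cache)} assets in the background")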

NastySasquatch said,

I'm not sure. If the streaming service installed the basic game engine and then streamed all of the textures and such for each level or instance you were moving through... [...]

This is about an installed smart client using server-side calculations.

Installation of the game is irrelevant.

So we hate SimCity for always-on mode and love this idea?

Not saying it's a bad idea. Just saying that I missed some memos and I'm not sure what the internet is supposed to be hating at the moment.

This kind of technology will likely be used to enhance a game when the service is available and the internet connection is fast enough.

Because Xbox One games have to work offline (except for MP-only games, of course), this is not the same thing as the SimCity fiasco.

AmazingRando said,
So we hate SimCity for always-on mode and love this idea?

Not saying it's a bad idea. Just saying that I missed some memos and I'm not sure what the internet is supposed to be hating at the moment.


MEMO: The internet is supposed to hate anything by MS and praise the same/inferior thing by Valve, Google or Apple.

Crimson Rain said,

MEMO: The internet is supposed to hate anything by MS and praise the same/inferior thing by Valve, Google or Apple.

I think you have the wrong memo bud, this is NeoWIN. :rofl:

Never mind that SimCity was hated (before it even launched) because of always-on, while the same thing in Titanfall is ignored or called a non-factor. Never mind that both games are from the same publisher (EA), or that both lack single-player by design; with SimCity it was hated, but with Titanfall the same design "feature" is basically seen as a non-factor. (For the record, I like both games, and play SimCity on PC, mostly online at that - though offline IS available, and has been since Update 10.) For those - like me - who are playing SimCity today, now that offline mode is available, how important is it really to your gameplay?

People buy games like Titanfall and Call of Duty specifically to play online; SimCity fans don't. That's why it was a non-issue for Titanfall.

The always-connected multiplayer aspect wasn't popular before release, but the hatred didn't start until the city size was known and all the technical issues became apparent.

I've seen demos from the demoscene do the exact same thing without all the cloud palaver, and I didn't notice any slowdown like that back in 2007.

farbrausch @ breakpoint 2007 fr-041: d e b r i s

check it out

Athlonite said,
I've seen demos from the demoscene do the exact same thing without all the cloud palaver, and I didn't notice any slowdown like that back in 2007.

farbrausch @ breakpoint 2007 fr-041: d e b r i s

check it out

The demoscene is famous for precalculated animation and clever shortcuts. This tech demo is designed to be real-time and object-accurate. No game would do this because it's so processor-intensive. The idea is that accurate modelling WITHOUT precalculating is becoming possible with cloud compute offloading. It's a proof of concept. What developers will end up doing is using portions of this in concert with the kinds of rendering cheats they currently use. Where accuracy would be perceptible to the player, these kinds of techniques could be used. Otherwise, precalc'd stuff would be used as usual.

A more important use of this will be for lighting scenes that have changed. The cloud can calculate the normals and other lighting details of changing scenes, then send the calculated results to the client. As far as the client is concerned, it's just rendering as usual -- precalculated lighting. But on the back end, the scene has changed dynamically based on events in the game.

Where a building stood before, there is now a burning inferno -- and all the buildings around it are lit accordingly.

It's real tech with real applications. It is already allowing developers to do things they could never do before.
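A minimal sketch of that lighting idea; the names and payloads are hypothetical, and the point is only that the client consumes the cloud result exactly as it would baked lighting:

def cloud_bake_lightmap(scene_state: dict) -> list:
    # Server side: stand-in for an expensive global-illumination pass on the
    # changed scene, returning one brightness value per surface patch.
    return [0.8 if scene_state.get("building_on_fire") else 0.2] * 1024

class ClientRenderer:
    def __init__(self) -> None:
        self.lightmap = [0.2] * 1024   # shipped, statically baked lighting

    def apply_lightmap(self, lightmap: list) -> None:
        # From the renderer's point of view this is still "precalculated"
        # lighting; it just happens to have been calculated moments ago.
        self.lightmap = lightmap

renderer = ClientRenderer()
renderer.apply_lightmap(cloud_bake_lightmap({"building_on_fire": True}))
print(f"patch 0 brightness: {renderer.lightmap[0]}")  # lit by the inferno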

Depends on the game. The truth is the CPU in both consoles is very weak. We are likely to see much bigger gains in the PC space compared to the console market over the next 10 years.

This is the future; a rough start, but they are on the right track.
Physics like PhysX is too complex for today's GPUs; the cloud should nail it down in the coming years.
This fight between DirectX and OpenGL is in our favour.
