Microsoft Research may have cracked the nut of low-latency cloud-based gaming

The idea of cloud-based gaming has been around for a long time, but nobody seems to have nailed it down because of one major issue: latency. Microsoft has a vested interest in both gaming and cloud-based services, so finding a way to resolve the latency issue could be a big win for the company.

And that's exactly what Microsoft Research has done with a project called DeLorean. You can read the full paper at the source link below, but in short, they have created a way to render frames before an event occurs in a game; based on your inputs, the platform then delivers the correct set of frames to your device. The paper says this method can mask up to 250ms of latency, which it achieves by combining future input prediction, state space subsampling and time shifting, misprediction compensation, and bandwidth compression. Microsoft refers to the combination of all of these techniques as DeLorean.
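In plain terms, instead of waiting for an input to travel to the datacenter and the resulting frame to travel back, the server renders frames for the inputs it considers likely and ships them ahead of time; the client then shows whichever one matches what the player actually did. The toy sketch below illustrates only that general idea; the game logic, function names, and input set are hypothetical and are not code or APIs from the paper.

```python
# A minimal, self-contained toy of the speculative-delivery idea described above.
# Illustration only, not code from the paper: the game is just a position on a
# line, and a "frame" is a string standing in for a rendered image.

RTT_MS = 250                                           # round-trip latency the server wants to mask
POSSIBLE_INPUTS = ["left", "right", "shoot", "idle"]   # hypothetical input set

def advance(state, player_input):
    """Toy game logic: move the player or fire a shot."""
    new_state = dict(state)
    if player_input == "left":
        new_state["x"] -= 1
    elif player_input == "right":
        new_state["x"] += 1
    new_state["fired"] = (player_input == "shoot")
    return new_state

def render(state):
    """Stand-in for the GPU renderer: produce a 'frame' for a game state."""
    return f"frame(x={state['x']}, fired={state['fired']})"

def server_speculate(state):
    """Render one frame per plausible input roughly RTT_MS ahead and ship them all."""
    return {inp: render(advance(state, inp)) for inp in POSSIBLE_INPUTS}

def client_display(speculative_frames, actual_input):
    """Pick the frame matching what the player actually did; no extra round trip needed."""
    return speculative_frames.get(actual_input, speculative_frames["idle"])

state = {"x": 0, "fired": False}
frames = server_speculate(state)          # shipped before the input even happens
print(client_display(frames, "right"))    # -> frame(x=1, fired=False)
```

The paper's other components (input prediction, misprediction compensation, bandwidth compression) layer on top of this basic speculate-and-select loop.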

If you are thinking that cloud-based gaming is only good for RPGs and other slower-paced games, think again. While Microsoft did test the service with Fable, they also used Doom 3 as a benchmark. The end result was that users preferred the DeLorean setup for cloud-based gaming over the traditional cloud-based gaming technologies that are currently available.

The implication of this technology, if it works as well in the real world as it does in a controlled environment, is a way to deliver Xbox-style gaming on any screen with a solid Internet connection. It's easy to see why Microsoft would want this capability, as it could be a huge addition to its Xbox Live Gold product offerings.

Source: Microsoft | Thanks for the tip h0x0d

33 Comments

Every time I see negative nancys around here that obviously lack the technical background to truly comprehend the elegance of these sorts of research projects, I'm reminded of the Kids in the Hall movie: Brain Candy.

Particularly the Bruce McCulloch scientist: "It's a Pill that gives ex-girlfriends worms"
CEO: Will it work on ex-boyfriends too?
BM: You don't get it.... it's a PILL that gives EX-GIRLFRIENDS worms, sheesh!

John Nemesh said,
You mean, kinda like Sony has already done? Wow...good job, MS.

No, this reduces the input lag that services like PlayStation Now suffer from. This is not like Sony's OnLive imitation.

I'm more interested in whether they can develop that idea, bake it into Windows, and stream the game from a PC to a console anywhere in the world with just over 5Mbps of Internet speed while keeping latency low. If it does become the norm, then we will be able to play on a laptop without worrying that the GPU isn't good enough.

Meh, cloud gaming will cost money, right? As in a subscription service, what, £10 a month, maybe more? Just save that £10 a month and every five years buy a new PC: sell the old one, get a new one.

Or, as parts age, say after a year, sell them and replace them. Personally, I buy a good PC, then after about four years sell it for nearly half what I paid and buy a new one.

Over that timeframe, unless the streaming subscription is free, you will spend about the same amount, but you'll get a much better experience with your own local gaming and your shiny new PC.

"The implications of this technology, if it works in the real-world as well as it does in a controlled environment, will be a way to deliver Xbox style gaming on any screen that has a solid Internet connection."

Does this mean we'll be able to play AAA Xbox games that are currently exclusive to the console on our Windows PCs soon?

This is bandwidth on the server side, i.e. the bandwidth of the company providing the service. And this is not really about cloud gaming; it is a general technique that can be used for everything from faster webpages to cloud compute for AI and physics in games.

This technique does have two drawbacks (take a look at the PDF):
* it increases bandwidth requirements (1.5x - 4.5x) - so until superfast broadband gets a better rollout this isn't going to work
* games have to be modified to support this - so they need to get developers signed on

Very interesting concept and valuable research, but it's far from a commercial launch.

Are they saying they can stream 1080p 60fps gaming over the average sub-5Mbps broadband connection?

If not, I think most will stick with their console.

I wonder if there is something going on with BTTF references and software devs this week, because just three days ago there was a DeLorean JavaScript lib (flux/event.observe related) release. Are we just running out of names? Is this the DeLorean from the future? I'm confused.

I personally don't think it's worth the effort for Microsoft to make 360 games playable on the One. 360s have gotten really cheap and they are still supported anyway.

(Not a fanboy) but Sony buys live streaming tech, Apple buys assistant tech, and MS builds both from scratch on top of its own money-losing tech (Bing, at least).

This is where MS can really push forward; hopefully the marketing dept won't kill it all.

Yep, MS often develops the technology itself; other, less innovative companies like Sony and Apple will just buy it.

And Bing no longer loses MS money.

Microsoft's problem has always been execution though. They can come up with a dozen and one ideas, but it's about getting those ideas out too.

Furthermore, Microsoft buys up companies for their user base and tech just as often. Wear the rose-tinted glasses if you will, though. /shrug

The licensing fees from this technology will earn the company a boatload of money for years to come, if it works in the real world.

Nicely done, people.

holy crap, I wonder what the naysayers are going to say now?
anyway. looking forward to some really great games in the future xbox ftw

rocksturdy said,
holy crap, I wonder what the naysayers are going to say now?
anyway. looking forward to some really great games in the future xbox ftw

they will say you'll probably reach your data cap if you have one in a few hours. I remember doing onlive a while ago just to test it out. would go through about 2 GB in an hour.

anothercookie said,

they will say you'll probably reach your data cap if you have one in a few hours. I remember doing onlive a while ago just to test it out. would go through about 2 GB in an hour.

Well, at least the hard part is out of the way.

anothercookie said,

they will say you'll probably reach your data cap if you have one in a few hours. I remember doing onlive a while ago just to test it out. would go through about 2 GB in an hour.


I don't think that's the issue here, as long as they figure out how to make cloud gaming a viable option.

This also requires support from the game: it needs to be able to perform the predicted events and roll the game state back if the user doesn't actually do them (or keep two separate states, one for if the prediction was correct and one for if it wasn't).

Say the system thinks you're going to take a sniper shot, so it performs the shot, calculates the damage, and pre-renders the frames. But just before the user is able to take the shot, they move slightly and unexpectedly, so their shot misses and the NPC lives to fight another day. The game then needs to roll back the shot and not apply the damage to the NPC, etc.
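A tiny, purely illustrative sketch of that checkpoint-and-rollback pattern (the action names, damage numbers, and helper functions are made up, not taken from the paper or any engine): the game applies the predicted action speculatively but keeps the authoritative state around, so a misprediction just means discarding the speculative result and re-applying what actually happened.

```python
# Toy illustration of misprediction compensation via checkpoint-and-rollback.
# All names and numbers here are hypothetical; this is not code from the paper.

import copy

def apply_action(state, action):
    """Toy game rule: a landed sniper shot removes 40 HP from the targeted NPC."""
    new_state = copy.deepcopy(state)
    if action == "snipe_hit":
        new_state["npc_hp"] -= 40
    return new_state

def speculate(state, predicted_action):
    """Return (checkpoint, speculative_state) so the prediction can be undone."""
    return state, apply_action(state, predicted_action)

def resolve(checkpoint, speculative_state, predicted_action, actual_action):
    """Commit the speculative result only if the prediction matched reality."""
    if actual_action == predicted_action:
        return speculative_state                       # prediction was right: keep it
    return apply_action(checkpoint, actual_action)     # mispredicted: roll back and redo

state = {"npc_hp": 100}
checkpoint, spec = speculate(state, "snipe_hit")                  # game guesses the shot lands
final = resolve(checkpoint, spec, "snipe_hit", "snipe_missed")    # player moved; it actually missed
print(final["npc_hp"])                                            # -> 100: the damage is never applied
```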

rocksturdy said,
holy crap, I wonder what the naysayers are going to say now?
anyway.

We're going to quote the actual article...

"the user preferred to use the DeLorean setup for cloud based gaming over that of traditional cloud based gaming technologies that are currently available."

And point out that the summary is "this approach is definitely better than all the other crappy cloud-based prediction systems", and then note that they are NOT comparing it to a direct client-to-server model, which is what FPS/twitch games like Doom 3 need for real-time combat.

So, while this definitely sounds like an improvement for games like Diablo 3, Path of Exile, etc., the jury is still out on the real manly lag-dependent games. ;)

Nice, don't even read the article...
The idea is that there are only a certain number of "upcoming" frames (like one if you move forward, one if you move backward, one if you take a shot...).
If the system calculates those frames (which is server-based, with support from the graphics engine) and sends them out to the client, they will most likely be on their way (or already delivered) before you actually move your controller.
Only the right frame is then shown on screen.
The idea is to deliver a video stream, but for every frame, 2-10 candidate frames are delivered and only one is displayed.
The funny thing about your comments is that graphics engines have done this locally for a long time, especially to compensate for lag in multiplayer games over the Internet.
The graphics engine's renderer calculates what it can "prerender" (whatever does not change much) and then, at the last instant, adds the more "unpredictable" things, like an enemy or a sudden change of movement.
Even when you play locally there's "lag" from input to output because of rendering and prediction. Try it for yourself: play (any) COD on PS/Xbox. Start running forward (and notice the lag from standing still to moving), then continue running (and notice the reduced input lag), then stop (and notice the lag again).
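To put rough numbers on the "many frames delivered, one displayed" trade-off: shipping several candidate frames multiplies the raw bandwidth unless the encoder exploits how similar those candidates are, which is presumably part of what the paper's bandwidth compression component addresses. Every figure in the sketch below is made up for illustration; only the 1.5x - 4.5x overhead range comes from the paper (via the comment further up).

```python
# Back-of-the-envelope look at the bandwidth cost of sending N candidate frames
# per displayed frame. Every number below is hypothetical, not from the paper.

BASE_STREAM_MBPS = 5.0       # assumed bitrate of a conventional single-stream setup
candidates_per_frame = 4     # e.g. forward / backward / shoot / idle
similarity_discount = 0.5    # assumed saving from encoding candidates as deltas
                             # off a shared base frame, since they differ only slightly

naive_cost = BASE_STREAM_MBPS * candidates_per_frame
delta_cost = BASE_STREAM_MBPS * (1 + (candidates_per_frame - 1) * similarity_discount)

print(f"naive:         {naive_cost:.1f} Mbps ({naive_cost / BASE_STREAM_MBPS:.1f}x)")
print(f"delta-encoded: {delta_cost:.1f} Mbps ({delta_cost / BASE_STREAM_MBPS:.1f}x)")
# With these made-up numbers: 20.0 Mbps (4.0x) naive vs 12.5 Mbps (2.5x) delta-encoded,
# i.e. in the same ballpark as the 1.5x - 4.5x overhead mentioned in the comments above.
```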

@excalpius you did not read the paper, did you? It was compared to a beastly HP machine with an Intel i7 and an Nvidia 680, and it registered a MOS of 4-4.5, where 5 is the maximum.

I played an online game created over a decade ago that used at least 3 of these types of technologies (thankfully the lead developer liked to play the game and chat up its users from time to time -- even if he also did sometimes [admittedly] cheat while doing so, but in a fun non-game-ruining way).

The upside is that everything appears smooth as if you are playing locally.
The downside is that if you think you shot/killed/[insert-action-here] an opponent in your game, you may have completely missed them in their game's instance. The end result is that "lag" or latency would still be a major factor in actual playability, unless of course the other technologies that my game did not use solve that problem better...though I don't know how (ex: "heat seeking" bullets/items/actions?).

anothercookie said,
would go through about 2 GB in an hour.

lol.

I go through 2GB in 30 minutes when I stream at 1080p60.

Higher quality compression technology would be welcome.