The Cloud is very real. //build/ demo is from Crackdown 3



The cloud is not "a server", but you already knew that. Cloud computing is real, buzzword or not.

 

Of course it's not 1 server, but you got my point ;)


Which is, in fact, the whole point behind it.

 

That the tech is old isn't the point - it's how it's leveraged that is the point.

 

Look at alternating current - it actually predates Edison's lightbulb by a few decades.  (Edison didn't even invent the method.)

 

However, it was Westinghouse's company - not Edison's, which famously backed direct current - that leveraged alternating current to build the first wide-scale electrical grids, which made electricity sellable much further away from the generating facility.


Oh, and the "Cloud" strikes again.

 

The sooner this gimmick name for a server dies, the better.

 

First video:

 

"This is on an high-end machine"

 

Doing 12 FPS? What year was that machine high-end? Even my laptop can do better in RF:G, where there's MUCH MORE destruction.

Y2K!


Of course it's not 1 server, but you got my point ;)

I still don't understand how it is a gimmick as you say.


Oh, and the "Cloud" strikes again.

 

The sooner this gimmick name for a server dies, the better.

The cloud is more a name for server infrastructure with on-the-fly virtualization.


You know what, I just calculated the data rate for the Crackdown demo shown at Build. Obviously there are a couple more variables involved - for example, how the building breaks and the shape of the chunks. Would those derive from the local box and then get sent up to Azure? Along with that, a server application would hold the collision meshes of the map so it can sync up with the local box, then receive the variables around the explosion - size, direction, radius, etc.

 

Position: 3 × 32-bit floats = 96 bits
Rotation: 3 × 9-bit integers = 27 bits
Total bits per chunk: 123

Chunks: 10,000
Total bits for chunks: 1,230,000
Compression ratio: 85%
Total compressed: 184,500 bits

Typical Ethernet MTU = 1,500 bytes = 12,000 bits
Data frames per screen refresh: 184,500 / 12,000 = 15.375, rounded up to 16 frames

Typical UDP/IP overhead = 28 bytes per datagram
Total overhead per screen refresh: 16 × 28 bytes = 3,584 bits

Total bits per screen refresh: 184,500 + 3,584 = 188,084 (≈ 188 kbit)

Throughput needed for 16 FPS: ≈ 3.0 Mbps
Throughput needed for 32 FPS (as in the demo): ≈ 6.0 Mbps
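
For anyone who wants to check or tweak the numbers, here's a quick sketch that reproduces the arithmetic above. Everything in it - chunk count, compression ratio, MTU, header size - is an assumption from this post, not a confirmed figure.

```python
# Back-of-envelope bandwidth estimate for the Build demo.
# All inputs are assumptions from the post above, not confirmed figures.

POSITION_BITS = 32 * 3        # three 32-bit floats (X, Y, Z)
ROTATION_BITS = 9 * 3         # three 9-bit integers (rotation)
CHUNKS = 10_000               # assumed number of debris chunks
COMPRESSION = 0.85            # assumed 85% size reduction
MTU_BITS = 1500 * 8           # typical Ethernet MTU
HEADER_BITS = 28 * 8          # 20-byte IP + 8-byte UDP header per datagram

bits_per_chunk = POSITION_BITS + ROTATION_BITS            # 123
compressed = bits_per_chunk * CHUNKS * (1 - COMPRESSION)  # 184,500
packets = -(-int(compressed) // MTU_BITS)                 # ceiling -> 16
total = compressed + packets * HEADER_BITS                # 188,084 bits

for fps in (16, 32):
    print(f"{fps} FPS needs {total * fps / 1e6:.1f} Mbps")
```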

For the data, I've used float values for the X, Y, Z coordinates on the map and assigned 9-bit integers for the X, Y, Z rotation values. 9 bits gives 512 values, more than enough for whole-degree rotations (0-359).
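
To make the record layout concrete, here's a minimal sketch of packing one chunk. The field layout is my own illustration of the 123-bit record above, not anything from the demo; note that byte alignment rounds the 27 rotation bits up to 4 bytes, so a naive encoding is 16 bytes (128 bits) per chunk unless you bit-pack across chunk boundaries.

```python
import struct

def pack_chunk(x, y, z, rx, ry, rz):
    """Pack one debris chunk: three 32-bit floats plus three 9-bit angles."""
    pos = struct.pack("<fff", x, y, z)          # 96 bits of position
    rot = (rx & 0x1FF) | ((ry & 0x1FF) << 9) | ((rz & 0x1FF) << 18)
    return pos + rot.to_bytes(4, "little")      # 27 bits, byte-aligned to 4

record = pack_chunk(12.5, 3.0, -7.25, 90, 180, 359)
print(len(record), "bytes per chunk")           # 16
```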

 

Compression is a given in this scenario: with the payload consisting purely of floats and small integers, the compression ratio can be very high, somewhere in the 80s, which is what I've substituted in.

 

To compare this to services used daily: Netflix uses about 7 Mbps for a Super HD stream, which is pretty much standard these days, and both current and previous-gen consoles support Super HD.

 

Regarding Latency:

Average RTT (round-trip time) to Azure: 40 ms

Calculation time at server: 32 ms (one frame at 32 FPS)

Total delay = 40 ms + 32 ms = 72 ms
In seconds = 0.072 seconds

That means it takes 0.072 seconds from the start of the explosion for the result to come back and begin playing out on your screen. Once that first round trip has happened, you only wait 32 ms per frame, which is the normal refresh interval for something running at 32 FPS. This is because the initial information has already been sent and the server application keeps calculating responses without any further input from the client.
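
Here's a tiny sketch of that pipelining, using the assumed 40 ms RTT and 32 ms server step from above: only the first frame pays the round trip, and everything after streams at the server's simulation rate.

```python
RTT_MS = 40          # assumed round trip to Azure
SERVER_STEP_MS = 32  # one simulation step at 32 FPS

# Only frame 0 pays the round trip; later frames stream every 32 ms.
first_arrival = RTT_MS + SERVER_STEP_MS   # 72 ms after the explosion starts
for n in range(4):
    print(f"frame {n} arrives {first_arrival + n * SERVER_STEP_MS} ms in")
```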

 

To be honest, the overall data rate, especially at 32 FPS, was higher than I thought it would be. That said, this is without taking into account any optimisation that could shrink the amount of data needed per chunk, or algorithms over the data sets that reduce the number of chunks that have to be sent while still reconstructing the full chunk count.

 

Regarding packet loss: in 2014 you simply don't see much of it. ISPs these days tend to be Tier 2 or Tier 3, often peering directly with the large services that make up most of the bandwidth - Google, Facebook, Twitter, Netflix, etc. Honestly, unless you have a poor wireless signal inside your house, which often causes slight packet loss, you're not going to see any. Even if you drop a couple of packets, you'd lose a fraction of the chunks for that one frame (roughly 625 of the 10,000 per lost packet, since they're spread over 16 packets), and in terms of gameplay it's not really going to be noticeable.

 

Obviously there are people in the world who can't maintain a ~6 Mbps stream, and to counteract that, the application would probably set the FPS dynamically based on each player's download speed.
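
A minimal sketch of what that rate selection could look like, assuming the ~188 kbit-per-refresh figure from the estimate above; the tiers and the fallback are my own invention, not anything Microsoft has announced.

```python
def pick_update_rate(measured_mbps, bits_per_refresh=188_084):
    """Return the highest cloud update rate the measured bandwidth can carry.

    bits_per_refresh comes from the back-of-envelope estimate above
    (an assumption, not a confirmed figure).
    """
    for fps in (32, 24, 16, 8):
        if measured_mbps * 1_000_000 >= bits_per_refresh * fps:
            return fps
    return 0  # fall back to purely local simulation

print(pick_update_rate(6.5))  # 32
print(pick_update_rate(3.2))  # 16
```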

 

If anyone's got suggestions on how to improve the accuracy, or anything else, let me know. I'll probably post this on Reddit and cause a ruckus.

 

TL;DR: Cloud computing is definitely feasible on normal ISP connections.


You know, I've never got why people hate the word/phrase (whatever you want to call it) "the cloud". The term wasn't created to deceive us; we all know it's not something new. It's just a term to describe an infrastructure, nothing more.


You know, I've never got why people hate the word/phrase (whatever you want to call it) "the cloud". The term wasn't created to deceive us; we all know it's not something new. It's just a term to describe an infrastructure, nothing more.

 

But that's the problem: people on this site and those with some tech background will know it's not a new thing. However, loads of companies are spinning the Cloud as something new that has never been done before, and it's that that people have an issue with.


But that's the problem: people on this site and those with some tech background will know it's not a new thing. However, loads of companies are spinning the Cloud as something new that has never been done before, and it's that that people have an issue with.

 

Well, there is potential for that to be the case in some circumstances. However, sticking on topic, and please correct me if I am wrong: Microsoft has never offloaded parts of a game to their XBL infrastructure in order to help local console performance. So while the cloud is nothing new, what they are trying to do here is very new to the Xbox console.

 

Please understand, I'm not trying to say the process itself is new - I'm sure there are PC games that do this - but to my knowledge it is new to the Xbox console, so they are quite within their rights to promote the cloud as new technology in this area.

