Xbox One: Microsoft Claims that Cloud Computing Can Provide Power of 3 Xbox Ones, 32 Xbox 360s



The PS4 does bugger all in terms of media, really.

Maybe a little harsh to say it's not a games machine, but I think you know where I was coming from. Putting an entry-level GPU in a games console isn't a wise move, really.

Yeah, the high-end comment was more in the context of last gen. The difference is very much like a high-end GPU :-)

The PS4 doesn't offer anything in terms of media? I happen to use the media apps there. Plus, Sony is clearly going to add more media features via updates and new apps, based on what they have said. Things like DLNA support or a music player all matter when it comes to media use. I have little doubt that the PS4 will match the PS3 feature-wise. Then you have the TV shows Sony is working on bringing to PSN.

Also, if you meant in comparison to last gen, then that is much different than comparing it to current PC hardware. That would put the X1 GPU at mid-to-high end if the PS4 is high end. Remember, the X1 is several times more powerful than the 360 based on raw specs alone.


It certainly is a strange decision.

Tell me about it. I'm guessing MS was banking on DDR4 being commercially ready earlier, judging by the leaked roadmaps from 2012/13. Without that, they had to decide what else to do, hence the eSRAM/DDR3 decision. Sony got really lucky with the availability of GDDR5 and being able to get 8GB of it in the box; they very nearly launched with 4GB, based on leaked pre-announcement specifications.


I'd love to know why MS decided to put the eSRAM on the same silicon rather than on a separate chip. If they hadn't, and that space had been used for GPU, you'd be looking at a box with a GPU at Titan specs, given the amount of silicon they have.

There must be an advantage to having it on silicon versus a separate chip. Could bandwidth be much higher that way?

I also wonder if using DDR3 + eSRAM resulted in costs that prohibited them from using a more powerful GPU. Sony really lucked out with the late-cycle price drop on GDDR5. I do kind of doubt that MS or Sony could have afforded the cost of a Titan-like GPU, though :laugh:

Beyond your question, I think they made a mistake with the amount of eSRAM they used. They should have tried to include more.


There must be an advantage to having it on silicon versus a separate chip. Could bandwidth be much higher that way?

I also wonder if using DDR3 + eSRAM resulted in costs that prohibited them from using a more powerful GPU. Sony really lucked out with the late-cycle price drop on GDDR5. I do kind of doubt that MS or Sony could have afforded the cost of a Titan-like GPU, though :laugh:

Beyond your question, I think they made a mistake with the amount of eSRAM they used. They should have tried to include more.

That's what I'm thinking. If they had a larger GPU but gave up the considerable bandwidth the eSRAM provides, the whole 'resolutiongate' scenario would be even worse. If there's one thing I know about silicon, it's that cost isn't determined by what's on the die so much as by the size of the die. The larger the silicon, the harder it is to produce, the more manufacturing defects occur, and so on. Incorporating a larger GPU, given they've already licensed GCN, would probably have incurred no extra cost, because the die already exists at that size.

 

If they had included more eSRAM, it would have used more of the die area, which would have meant even less GPU.
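
As a rough illustration of why die size itself is the cost driver (made-up numbers, not figures for either console's APU), a classic first-order yield model says the fraction of good dies falls off exponentially with area:

```python
# Rough first-order yield model: yield ~= exp(-defect_density * die_area).
# The defect density and areas below are assumptions for illustration only.
import math

defects_per_cm2 = 0.4
for area_cm2 in (2.5, 3.5):  # e.g. a die as-is vs. a hypothetical larger one
    yield_est = math.exp(-defects_per_cm2 * area_cm2)
    print(f"{area_cm2} cm^2 -> ~{yield_est:.0%} of dies usable")
```

Fewer candidate dies per wafer plus a lower yield on each is what makes a bigger chip disproportionately expensive, regardless of whether the extra area is eSRAM or GPU.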


In Titanfall, the AI, all the wildlife (monsters and such), and ships jumping in and out of orbit are all done server-side, not locally.

And yet it has the exact same limitations as non-cloud-powered games, that is, the number of objects that can be rendered simultaneously before the CPU and GPU bottleneck.

I think SimCity is the perfect example of why companies are jumping on the "cloud-powered" bandwagon. It's all about DRM, anti-piracy, and control.

I can't think of any single thing I've seen thus far in Titanfall that couldn't be done just as well locally. Correct me if I'm wrong.


And yet it has the exact same limitations as non-cloud-powered games, that is, the number of objects that can be rendered simultaneously before the CPU and GPU bottleneck.

I think SimCity is the perfect example of why companies are jumping on the "cloud-powered" bandwagon. It's all about DRM, anti-piracy, and control.

I can't think of any single thing I've seen thus far in Titanfall that couldn't be done just as well locally. Correct me if I'm wrong.

Honestly, I don't know exactly what Titanfall is using the servers for, but it's clearly not related to visuals.

If it is using the servers for something tied to calculations, then that could be impossible to pick out, since the resulting game experience is indistinguishable from a game running completely locally.

I am not sure that Titanfall should be the be-all and end-all when it comes to how servers are being used, just as Forza's usage does not show everything you can do.


That's simply not true. People are criticising the way Microsoft is representing the cloud functionality, as it has virtually no impact upon the gaming experience. We wouldn't be having this conversation if games on the XB1 were using the cloud to create innovative new experiences simply not attainable on the PS4. As it stands, even the most fervent advocates are unable to illustrate how cloud computing improves the gaming experience.

 

It's all hype, no substance.

 

 

Titanfall has had issues with server reliability too and it isn't available in some countries due to the lack of server infrastructure, so what we've seen is the cloud functionality actually limiting the gaming experience.

 

My question is: why does it matter if it's "not attainable on the PS4"? It's a tool, a resource for developers to use. Yet apparently unless it's some miracle that makes it head over heels better than what the PS4 has to offer, Microsoft is just lying to you. I've not seen anything unbelievable from them. Not even the demo was that amazing, if you think about it, since all it did was handle physics calculations. You're transferring numeric data over the network. Small, small data back and forth.

 

This is what websites do all the time, and they are all about managing input/output for many people. This practice has been around for many years, yet people act as if it's never been done before. The only thing you'll really have to worry about is sub-average connections or distance from the server (with Microsoft that shouldn't be a problem, since they have data centers all over the world). And by sub-average I mean < 1 or 2 Mb/s.

This is a sound and legitimate concept, and anyone who's comparing it to something like a game-streaming service (such as Gaikai) doesn't understand the technology.
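
To put a rough number on the "small, small data" point, here's a minimal sketch (purely illustrative; the packet layout is an assumption, not anything Microsoft has published about its service) of how little data a batch of physics state actually is:

```python
# Hypothetical example: packing position + velocity for a batch of rigid bodies.
# 6 floats * 4 bytes each = 24 bytes per body.
import struct

def pack_bodies(bodies):
    """Pack (x, y, z, vx, vy, vz) for each body into a compact binary blob."""
    return b"".join(struct.pack("<6f", *b) for b in bodies)

bodies = [(0.0, 1.0, 2.0, 0.1, 0.0, -9.8)] * 200   # 200 pieces of debris
payload = pack_bodies(bodies)
print(len(payload), "bytes")   # 4800 bytes, i.e. a few KB per update
```

Even at ten updates a second that's under 50 KB/s, which is why raw throughput isn't the limiting factor; latency is the thing to design around.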


Also, let me take this a little further.

 

Say you have a piece of hardware that can do 50 calculations per second, and it runs calculations for a minute but at some points gets bogged down.

 

Snapshot every 10s w/ calculations required:

 

10 - 30 - 25 - 43 - 60
 

That 60 is what we're worried about. In their demo, what this would do is create a processing queue, since there's now a backlog on a processor thread. If this continues, the line gets longer and longer. This results in a framerate drop. Why? Because the system has to wait to get the numbers back before it can render.

Now, if you instead opt for a system that can perform 150 calculations per second (3x that of your original), you will theoretically be able to handle the same number of calculations in 1/3 the time. And, any time spent waiting for the data to transfer to/from the cloud is now negligible because (theoretically) the amount of time it takes for the cloud server to crunch your numbers is far less than for the local system, and any time lost in transferring data is made up for in processing speed.

Sure, there's the possibility of hiccups and latency. But the game can easily be developed to use the cloud as an overflow instead of a primary processing point. Or the reverse, even. That way the system isn't dependent on the cloud, but has it as a way to take the heat off the main system. I think this is exactly what Microsoft was demonstrating. Games aren't going to be getting worse without the cloud. They'll be getting better with it.
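
Here's a minimal sketch of that "cloud as overflow" idea. The run_local/run_remote calls are hypothetical placeholders; in a real engine the remote batch would be dispatched asynchronously a frame or two ahead so nothing blocks on the network:

```python
# Sketch: run what fits in the local per-tick budget, push the overflow to a
# server when one is reachable, otherwise fall back to doing it all locally.
LOCAL_BUDGET = 50  # calculations the local box can finish per tick (assumed)

def run_tick(tasks, run_local, run_remote, online):
    local_slice = tasks[:LOCAL_BUDGET]
    overflow = tasks[LOCAL_BUDGET:]

    results = [run_local(t) for t in local_slice]
    if overflow:
        if online:
            results += run_remote(overflow)      # batched request to the server
        else:
            results += [run_local(t) for t in overflow]  # slower, but still playable
    return results
```

The point of structuring it that way is the one made above: the game never depends on the server being there, it just gets headroom when it is.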

Microsoft is targeting consumers with slightly above-average internet speeds; their entire console feature set is already based on it. People keep crying about bad internet, but perhaps Microsoft isn't targeting those with bad internet as potential customers. Maybe they want to work with the current technology available, as has been the trend in the tech industry these last five years. Everyone is trying to bring tech to some semblance of a standard, moving to forced updates and shrinking legacy support. The tech industry is moving on. You can't blame Microsoft for doing the same.


Here's what most are missing. Consoles stay the same for 5-8 years. Cell phones, PCs, tablets, etc. change like the wind.

 

Developers are allowed to take advantage of something now that was designed to be of more significant use later.


My question is: why does it matter if it's "not attainable on the PS4"? It's a tool, a resource for developers to use.

Because if the PS4 is able to do the same without the cloud then it shows what little impact it has.

 

Why do you feel that it's not a major feature? You are confident in that point, and yet you follow it up by saying you have no idea whether it will change.

It's been hyped as a major feature, yet it has done nothing to meaningfully improve the gaming experience. In some cases it has actually caused issues, like with Forza 5. It's possible developers will suddenly flock to use it and do so in a meaningful way, but we simply haven't seen any evidence to support that. If it were a major feature, something that differentiated the XB1 from the PS4, you would expect to see it being used more.

 

For now it's nothing more than a gimmick. That might change.

 

Ultimately, the question is: is the idea sound? Can things be done with dedicated servers that improve the gaming experience?

That remains to be seen. We saw a lot of hype surrounding streaming games (OnLive, Gaikai, etc.) and yet the actual experience was a complete letdown.

 

I don't know how anybody can say this is just PR after many discussions and even a demo which showed how it can benefit games. Dedicating servers to offloading CPU tasks from the box will allow for games with better graphics, and having the theoretical power of three X1s behind each console is massive. I'm not sure how they can dedicate that much.

Because it's all theoretical - we haven't seen any worthwhile real-world implementations. Anyway, I think I've made my point and I don't just want to keep repeating the same thing. It is a technology that has potential and could theoretically lead to worthwhile improvements to the gaming experience (I don't think that is in dispute), but for now it simply doesn't do that. Even if it works, there are problems associated with the approach, like latency spikes and connection interruptions that could negatively impact the user experience.


Because if the PS4 is able to do the same without the cloud then it shows what little impact it has.

 

I don't really get what that means. Honestly, I'd at least expect it to make 60fps more feasible with 1080p games, something both consoles are obviously struggling with.

 

It's been hyped as a major feature, yet it has done nothing to meaningfully improve the gaming experience. In some cases it has actually caused issues, like with Forza 5. It's possible developers will suddenly flock to use it and do so in a meaningful way, but we simply haven't seen any evidence to support that. If it were a major feature, something that differentiated the XB1 from the PS4, you would expect to see it being used more.

 

For now it's nothing more than a gimmick. That might change.

 

I don't think developers have even had a real chance to work with it, so I guess anything is a gimmick before it's actually used. You're making definitive judgements based on something that's only barely been available. Just like everyone here who seems ready to call the console war based on the first few months, whereas last gen it seems everyone was helping the PS3 limp along until they were praising it in the last two years of a nine-year-long generation. I'm so tired of this double standard.

 

That remains to be seen. We saw a lot of hype surrounding streaming games (OnLive, Gaikai, etc.) and yet the actual experience was a complete letdown.

 

Because it's all theoretical - we haven't seen any worthwhile real-world implementations. Anyway, I think I've made my point and I don't just want to keep repeating the same thing. It is a technology that has potential and could theoretically lead to worthwhile improvements to the gaming experience (I don't think that is in dispute), but for now it simply doesn't do that. Even if it works, there are problems associated with the approach, like latency spikes and connection interruptions that could negatively impact the user experience.

 

First of all, this isn't OnLive or Gaikai. Not even close. Again, I will point to my post above: such a comparison only demonstrates your lack of understanding of the technology involved.

 

And no, sending data to the cloud is not theoretical. You're already doing it on this website. Many games already make use of it, and there are hundreds of technologies out there built explicitly for managing such environments in as fast a way as possible. Facebook, Twitter, Steam, any multiplayer game that's peer-to-peer, etc. All these things send data in real time back and forth. So to assume doing the same with some physics calculations is theoretical is ignorant at best.


I think we are at a point in this discussion where we can summarize it a little.

Basically, I think most people here agree that server hardware CAN improve the gaming experience.

Past that, there is a split between those who want to see more real-world games using it in a meaningful way before saying it is a feature with value, and those who give the feature value now based on real-world demos and techniques that have a positive track record.

So I think that shows that while MS may have issues with PR, they are not completely crazy to pursue this feature. There is enough basis in fact and enough real-world data to suggest that something very useful can come out of leveraging servers. It's up to MS to produce games and encourage third-party developers to try it out in order to give everyone a reason to support the feature. At least MS have done a lot to encourage other developers to try using it.


We're only 5 months or so into the life of both systems, and people expect developers to magically start using these new options fully from day 1? Developers need time: time to learn the hardware, time for the tools to get better (as is always the case), time to optimize code more, and so on.

 

Developers using the "cloud" in ways they've used it before is natural; they've already done it and it's something they know how to do right now. When we're talking about offloading more than just AI for "bots" to the server, then we're talking a bit more work. Regardless, the whole argument that it can't happen is false. Again, we're not talking large chunks of data; we're talking the kind of bits that go back and forth between the CPU and the GPU, KBs at a time. There are a number of things in a game that aren't latency-dependent, and so on. Lots of things can be offloaded either fully or partially, and really, I don't see why a game can't, for example, tone down things like its physics or other "dynamic" environmental effects when offline and then ramp them up a bit when there's an internet connection.
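
A tiny sketch of that "tone it down offline, ramp it up online" idea; the tiers and numbers are invented for illustration, not taken from any real engine:

```python
# Pick simulation detail based on whether a compute server is reachable and
# how far away it is. Purely cosmetic/ambient work is a good fit because a
# late or dropped result degrades visuals, not correctness.
def simulation_settings(server_reachable, round_trip_ms):
    if server_reachable and round_trip_ms < 150:
        return {"debris_pieces": 2000, "ambient_ai_agents": 64, "cloth_iterations": 8}
    return {"debris_pieces": 400, "ambient_ai_agents": 16, "cloth_iterations": 2}
```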

 

Besides, it's not like MS is the only one talking about cloud compute; as someone else already pointed out, other companies are as well.


 

Because it's all theoretical - we haven't seen any worthwhile real-world implementations. Anyway, I think I've made my point and I don't just want to keep repeating the same thing. It is a technology that has potential and could theoretically lead to worthwhile improvements to the gaming experience (I don't think that is in dispute), but for now it simply doesn't do that. Even if it works, there are problems associated with the approach, like latency spikes and connection interruptions that could negatively impact the user experience.

 

So because it hasn't been commercially implemented in games, it's all lies and PR? That's what you're saying. The attitude here is that it won't improve graphics, which it can; it's just a question of when it's implemented.


A weird idea strikes me:

if it were possible for two Xbox Ones to 'talk' to each other over USB3, just like a multi-GPU configuration but via USB3,

wouldn't that provide better computing power than some distant, lagging cloud server?


A weird idea strikes me:

if it were possible for two Xbox Ones to 'talk' to each other over USB3, just like a multi-GPU configuration but via USB3,

wouldn't that provide better computing power than some distant, lagging cloud server?

 

They could make an HDMI joiner and have two Xbox Ones output to a single HDMI output, with each Xbox One rendering one half of the screen, and could even write system code to make them communicate over WiFi so you wouldn't need a USB cable. But that's not going to happen. It's too much effort for something no one is going to use; no one is going to buy two Xbox Ones to compete with a competitor's already cheaper system.


That's simply not true. People are criticising the way Microsoft is representing the cloud functionality, as it has virtually no impact upon the gaming experience. We wouldn't be having this conversation if games on the XB1 were using the cloud to create innovative new experiences simply not attainable on the PS4. As it stands, even the most fervent advocates are unable to illustrate how cloud computing improves the gaming experience.

 

It's all hype, no substance.

 

 

Titanfall has had issues with server reliability too and it isn't available in some countries due to the lack of server infrastructure, so what we've seen is the cloud functionality actually limiting the gaming experience.

 

By that logic, why even buy a PS4 then? Sony has failed to show innovative experiences simply not attainable on the PS3. There is simply no game today on the PS4 that couldn't be done on the PS3 (besides graphics, obviously). All it offers is just more pixels.

Heck, they spent more air time talking about indies, which typically don't need more GPU muscle, than about AAAs and such.


They could make an HDMI joiner and have two Xbox Ones output to a single HDMI output, with each Xbox One rendering one half of the screen, and could even write system code to make them communicate over WiFi so you wouldn't need a USB cable. But that's not going to happen. It's too much effort for something no one is going to use; no one is going to buy two Xbox Ones to compete with a competitor's already cheaper system.

So... you say the cloud can't enhance games graphically and then state this would be possible? Ok then.


So... you say the cloud can't enhance games graphically and then state this would be possible? Ok then.

 

An HDMI cable has 18 Gbps of bandwidth; your internet does not.

 

Regardless, my point wasn't that servers (the cloud) can't improve games; my point is that they already do, on PC, PS4, PS3 and Xbox 360. The server hosts data/content and makes calculations which are done in the 'cloud' rather than on the local machine. It's nothing new or specific to the Xbox One.

 

What can an Xbox One do with servers (the cloud) that can't be done on any other machine? I'm sure if you answered that and gave sources to support your claims, fewer people would think Microsoft was trying to hype up the Xbox One to compensate for the fact that it's a weaker system than the PS4. I personally think a toaster with an LCD monitor could utilize the cloud as much as an Xbox One; prove me wrong.


An HDMI cable has 18 Gbps of bandwidth; your internet does not.

I'm not even going to discuss why what you've described is wildly unfeasible. Throughput is not really important for this kind of cloud compute; PS Now/OnLive use considerably more bandwidth than computational offloading to Azure would.

 

Why can't people understand that using the cloud is a feasible way to improve aspects of graphics in games?
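
Rough, assumed numbers to show the gap between streaming a rendered image (OnLive/PS Now style) and shipping simulation state to a compute server:

```python
# Video streaming vs. compute offload: back-of-the-envelope bandwidth estimate.
stream_kbps = 5000              # a 720p/1080p game stream is on the order of 5 Mb/s
state_bytes_per_tick = 5000     # a few KB of packed simulation state per update
ticks_per_second = 10

offload_kbps = state_bytes_per_tick * ticks_per_second * 8 / 1000
print(offload_kbps, "kb/s for offload vs", stream_kbps, "kb/s for streaming")
# -> 400 kb/s vs 5000 kb/s, roughly an order of magnitude apart
```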


So because it hasn't been commercially implemented in games, it's all lies and PR? That's what you're saying. The attitude here is that it won't improve graphics, which it can; it's just a question of when it's implemented.

 

And for good reason. Rendering (for real-time apps) and high latency do not mix, and the internet has high latency to anything outside the same building. Sure, they could theoretically enhance the visuals with the cloud, but that would come at the cost of your game running at framerates around 10-15fps, because it has to wait ages before it can finish a frame due to the high latency of the cloud.
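
To put rough numbers on that (assuming the renderer actually blocks on a round trip every frame, and a fairly typical 70 ms internet RTT):

```python
# Why synchronous, per-frame round trips kill framerate: the RTT dwarfs the
# frame budget. Numbers are assumptions for illustration.
frame_budget_ms = 1000 / 60            # ~16.7 ms per frame at 60 fps
round_trip_ms = 70                     # typical internet round trip
blocked_fps = 1000 / (frame_budget_ms + round_trip_ms)
print(round(blocked_fps, 1), "fps")    # ~11.5 fps if every frame waits on the server
```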

 

TBH, the sooner people get off Microsoft's cloud hype train and come back to reality, the better.


I find it hilarious that a year on from MS talking about "the cloud" we are still discussing whether it can or cannot make a difference. I seriously think some people need to just agree to disagree right about now.

 

Will the cloud make a difference? Yes. Will the cloud make the XB1 multiple times more powerful? That remains to be seen for now.


And for good reason. Rendering (for real-time apps) and high latency do not mix, and the internet has high latency to anything outside the same building. Sure, they could theoretically enhance the visuals with the cloud, but that would come at the cost of your game running at framerates around 10-15fps, because it has to wait ages before it can finish a frame due to the high latency of the cloud.

 

TBH, the sooner people get off Microsoft's cloud hype train and come back to reality, the better.

There's no point explaining anything technically again, because obviously you and various other people don't listen. You're completely wrong.

 

Your framerate comment makes me laugh, though; did you even see the Build demo?

