
#106 simplezz

simplezz

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 01-February 12

Posted 28 April 2014 - 18:27

In Titanfall, the AI, all the wildlife (monsters and such), and the ships jumping in and out of orbit are all done server side, not locally.

And yet it has the exact same limitations as non-cloud-powered games, that is, the number of objects that can be rendered simultaneously before the CPU and GPU bottleneck.

I think SimCity is the perfect example of why companies are jumping on the "cloud-powered" bandwagon. It's all about DRM, anti-piracy, and control.

I can't think of any single thing I've seen thus far in Titanfall that couldn't be done just as well locally. Correct me if I'm wrong.


#107 trooper11

trooper11

    Neowinian Senior

  • Tech Issues Solved: 5
  • Joined: 21-November 12

Posted 28 April 2014 - 18:41

And yet it has the exact same limitations as non-cloud-powered games, that is, the number of objects that can be rendered simultaneously before the CPU and GPU bottleneck.

I think SimCity is the perfect example of why companies are jumping on the "cloud-powered" bandwagon. It's all about DRM, anti-piracy, and control.

I can't think of any single thing I've seen thus far in Titanfall that couldn't be done just as well locally. Correct me if I'm wrong.


Honestly, I don't know exactly what Titanfall is using the servers for, but it's clearly not visually related.

If it is using the servers for something tied to calculations, then that could be impossible to pick out, since the resulting game experience is indistinguishable from a game running completely locally.

I am not sure that Titanfall should be the end-all, be-all when it comes to how servers are being used, just as Forza's usage does not show everything you can do.

#108 Emn1ty

Emn1ty

    Web Programmer

  • Joined: 09-April 06
  • Location: Irvine, CA
  • OS: Windows 8.1, OSX Mavericks
  • Phone: Droid Razr

Posted 28 April 2014 - 18:53

That's simply not true. People are criticising the way Microsoft is representing the cloud functionality, as it has virtually no impact upon the gaming experience. We wouldn't be having this conversation if games on the XB1 were using the cloud to create innovative new experiences simply not attainable on the PS4. As it stands, even the most fervent advocates are unable to illustrate how cloud computing improves the gaming experience.

 

It's all hype, no substance.

 

 

Titanfall has had issues with server reliability too and it isn't available in some countries due to the lack of server infrastructure, so what we've seen is the cloud functionality actually limiting the gaming experience.

 

My question is: why does it matter if it's "not attainable on the PS4"? It's a tool, a resource for developers to use. Yet apparently unless it's some miracle that makes it head over heels better than what the PS4 has to offer, Microsoft is just lying to you. I've not seen anything unbelievable from them. Not even the demo was that amazing if you think about it, since all it did was handle physics calculations. You're transferring numeric data over the network. Small, small data back and forth.

 

This is what websites do all the time, and they are all about managing input/output with many people. This practice has been around for many years, yet people act as if it's never been done before. The only thing you'll really have to worry about is sub-average connections or distance from the server (with Microsoft that shouldn't be a problem, since they have data centers all over the world). And by sub-average I mean < 1 or 2 Mb/s.

This is a sound and legitimate concept, and anyone who's comparing it to something like a game streaming service (such as Gaikai) doesn't understand the technology.
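To put the "small, small data" point in perspective, here's a rough back-of-the-envelope sketch in Python. The scene size (200 bodies) and the packing layout are my own assumptions, not anything from Microsoft's demo; the point is simply that per-tick physics state is a few kilobytes, not megabytes.

import struct

NUM_BODIES = 200  # hypothetical object count, just for illustration

def pack_state(bodies):
    """Pack (id, position xyz, velocity xyz) for each body as binary floats."""
    buf = bytearray()
    for body_id, pos, vel in bodies:
        buf += struct.pack("<I6f", body_id, *pos, *vel)  # 28 bytes per body
    return bytes(buf)

bodies = [(i, (0.0, 1.0, 2.0), (0.1, 0.0, -0.3)) for i in range(NUM_BODIES)]
payload = pack_state(bodies)
print(len(payload), "bytes per update")  # 200 * 28 = 5,600 bytes

Even at ten updates a second that's on the order of 56 KB/s, which is why comparing this to full game-video streaming doesn't hold up.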



#109 Andrew G.

Andrew G.

    Once More 'Round the Sun

  • Tech Issues Solved: 9
  • Joined: 14-September 03

Posted 28 April 2014 - 19:44

Topic cleaned



#110 Emn1ty

Emn1ty

    Web Programmer

  • Joined: 09-April 06
  • Location: Irvine, CA
  • OS: Windows 8.1, OSX Mavericks
  • Phone: Droid Razr

Posted 28 April 2014 - 20:30

Also, let me further this a little more.

 

Say you have a piece of hardware that can do 50 calculations per second, and it runs calculations for a minute but at some instances gets bogged down.

 

Snapshot every 10 seconds, with the calculations required:

 

10 - 30 - 25 - 43 - 60
 

That 60 is what we're worrying about. In their demo, what this would do is create a processing queue, since there's now a backlog on a processor thread. If this continues, the line gets longer and longer, which results in a framerate drop. Why? Because the system has to wait to get the numbers back before it can render.

Now, if you instead opt for a system that can perform 150 calculations per second (3x that of your original), you will theoretically be able to handle the same number of calculations in 1/3 the time. And, any time spent waiting for the data to transfer to/from the cloud is now negligible because (theoretically) the amount of time it takes for the cloud server to crunch your numbers is far less than for the local system, and any time lost in transferring data is made up for in processing speed.

Sure, there's the possibility of hiccups and latency. But the game can easily be developed to use the cloud as an overflow instead of a primary processing point. Or the reverse, even. That way the system isn't dependent on the cloud, but has it as a way to take the heat off the main system. I think this is exactly what Microsoft was demonstrating. Games aren't going to be getting worse without the cloud. They'll be getting better with it.
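For what it's worth, the "cloud as overflow" idea can be sketched in a few lines of Python. The budget number and the names here are hypothetical, just to make the scheduling decision concrete; nothing in this sketch is Microsoft's actual API.

LOCAL_BUDGET_PER_FRAME = 50  # calculations the local CPU can finish in time (hypothetical)

def schedule(tasks, local_budget=LOCAL_BUDGET_PER_FRAME):
    """Split a frame's tasks: run what fits locally, offload the remainder."""
    local_batch = tasks[:local_budget]
    overflow = tasks[local_budget:]  # would be queued for the remote servers
    return local_batch, overflow

frame_tasks = ["calc_%d" % i for i in range(60)]  # the '60 calculations' frame above
local, remote = schedule(frame_tasks)
print(len(local), "run locally,", len(remote), "offloaded")  # 50 local, 10 offloaded

The useful property is that when there's no connection, the overflow batch can simply fall back into the local queue, so the game degrades gracefully instead of breaking.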

Microsoft is targeting consumers with slightly above-average internet speeds. Their entire console feature set is already based on it. People keep crying about bad internet, but perhaps Microsoft isn't targeting those with bad internet as potential customers? Maybe they want to work with the current technology available, as has been the trend in the tech industry these last 5 years. Everyone is trying to bring tech to some semblance of a standard, moving to forced updates and shrinking their legacy support. The tech industry is moving on. You can't blame Microsoft for doing so, too.



#111 Showan

Showan

    Neowinian Senior

  • Joined: 28-November 12
  • Location: Amurrika
  • OS: W7, W8
  • Phone: Lumia 521

Posted 28 April 2014 - 20:31

Here's what most are missing. Consoles stay the same for 5-8 years. Cell phones, PCs, tablets, etc. change like the wind.

 

Developers are allowed to take advantage of something now that was designed to be of more significant use later.



#112 theyarecomingforyou

theyarecomingforyou

    Tiger Trainer

  • Joined: 07-August 03
  • Location: Terra Prime
  • Profession: Jaded Sceptic
  • OS: Windows 8.1
  • Phone: Galaxy Note 3 with Galaxy Gear

Posted 28 April 2014 - 20:37

My question is: why does it matter if it's "not attainable on the PS4"? It's a tool, a resource for developers to use.

Because if the PS4 is able to do the same without the cloud then it shows what little impact it has.

 

Why do you feel that it's not a major feature? You are confident in that point, and yet you follow it up by saying you have no idea if it will change.

It's been hyped as a major feature yet it has done nothing to meaningfully improve the gaming experience. In some cases it has actually caused issues, like with Forza 5. It's possible developers will suddenly flock to use it and do so in a meaningful way but we simply haven't seen any evidence to support that. If it was a major feature, something that differentiated the XB1 from the PS4, you would expect to see it being used more.

 

For now it's nothing more than a gimmick. That might change.

 

Ultimately, the question is: is the idea sound? Can things be done with dedicated servers that improve the gaming experience?

That remains to be seen. We saw a lot of hype surrounding streaming games (OnLive, Gaikai, etc) and yet the actual experience was a complete letdown.

 

I don't know how anybody can say this is just PR after many discussions and even a demo which showed how it can benefit games. Dedicating servers to off-loading CPU tasks from the box will allow for better-looking games, and having 3x the theoretical power of an X1 available to each console is massive. I'm not sure how they can dedicate that much.

Because it's all theoretical - we haven't seen any worthwhile real world implementations. Anyway, I think I've made my point and I don't just want to keep repeating the same thing. It is a technology that has potential and could theoretically lead to worthwhile improvements to the gaming experience—I don't think that is in dispute—but for now it simply doesn't do that. Even if it works there are problems associated with the approach, like latency spikes and connection interruptions that could negatively impact the user experience.



#113 Emn1ty

Emn1ty

    Web Programmer

  • Joined: 09-April 06
  • Location: Irvine, CA
  • OS: Windows 8.1, OSX Mavericks
  • Phone: Droid Razr

Posted 28 April 2014 - 20:55

Because if the PS4 is able to do the same without the cloud then it shows what little impact it has.

 

I don't really get what that means. Honestly, I'd at least expect it to make 60fps more feasible with 1080p games, something that both consoles are obviously struggling with.

 

It's been hyped as a major feature yet it has done nothing to meaningfully improve the gaming experience. In some cases it has actually caused issues, like with Forza 5. It's possible developers will suddenly flock to use it and do so in a meaningful way but we simply haven't seen any evidence to support that. If it was a major feature, something that differentiated the XB1 from the PS4, you would expect to see it being used more.

 

For now it's nothing more than a gimmick. That might change.

 

I don't think developers have even had a real chance to work with it. So I guess anything is a gimmick before it's actually used. You're making definitive judgements based on something that's only barely been available. Just like everyone here who seems ready to call the console war based on the first few months, when last gen it seemed everyone was helping the PS3 limp along until they were praising it in the last two years of a nine-year-long generation. I'm so tired of this double standard.

 

That remains to be seen. We saw a lot of hype surrounding streaming games (OnLive, Gaikai, etc) and yet the actual experience was a complete letdown.

 

Because it's all theoretical - we haven't seen any worthwhile real world implementations. Anyway, I think I've made my point and I don't just want to keep repeating the same thing. It is a technology that has potential and could theoretically lead to worthwhile improvements to the gaming experience—I don't think that is in dispute—but for now it simply doesn't do that. Even if it works there are problems associated with the approach, like latency spikes and connection interruptions that could negatively impact the user experience.

 

First of all, this isn't OnLive or Gaikai. Not even close. Again, I will point to my above post that such a comparison only demonstrates your lack of understanding of the technology involved.

 

And no, sending data to the cloud is not theoretical. You're already doing it on this website. Many games already make use of it, and there are hundreds of technologies out there built explicitly for managing such environments as fast as possible. Facebook, Twitter, Steam, any multiplayer game that's peer-to-peer, etc. All these things send data back and forth in real time. So to assume doing the same with some physics calculations is theoretical is ignorant at best.



#114 trooper11

trooper11

    Neowinian Senior

  • Tech Issues Solved: 5
  • Joined: 21-November 12

Posted 29 April 2014 - 00:34

I think we are at a point in this discussion that we can summarize it a little.

Basically, I think most people here agree that server hardware CAN improve the gaming experience.

Past that, there is a split between those who want to see more real-world games using it in a meaningful way before saying it is a feature with value, and those who give the feature value now based on real-world demos and techniques that have a positive track record.

So I think that shows that while MS may have issues with PR, they are not completely crazy to want to pursue this feature. There is enough basis in fact and enough real-world data to suggest that something very useful can come out of leveraging servers. It's up to MS to produce games and encourage third-party developers to try it out in order to give everyone a reason to support the feature. At least MS has done a lot to encourage other developers to try using it.

#115 George P

George P

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 29 April 2014 - 06:59

We're only 5 months or so into the life of both systems, and people expect developers to magically start using these new options fully from day 1? Developers need time: time to learn the hardware, time for the tools to get better (as is always the case), time to optimize code more, and so on.

 

Developers using the "cloud" in ways they've used before is natural; they've already done it and it's something they know how to do right now. When talking about offloading more than just AI for "bots" to the server, we're talking a bit more work. Regardless, the whole argument that it can't happen is false. Again, we're not talking large chunks of data; we're talking the bits that go back and forth between the CPU and the GPU, KBs at a time. There are a number of things in a game that aren't latency-dependent, and so on. Lots of things can be offloaded either fully or partially, and really I don't see why a game can't, for example, tone down things like its physics or other "dynamic" environmental effects when offline and then ramp them up a bit when there's an internet connection.
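As a rough illustration of that last point, here's a minimal Python sketch of scaling effects with connectivity. The preset names and values are made up for illustration; no engine exposes exactly this.

# Hypothetical effect presets: richer dynamic effects when servers can absorb the extra work.
OFFLINE_PRESET = {"debris_particles": 200, "cloth_sim": False, "destruction_detail": 1}
ONLINE_PRESET = {"debris_particles": 2000, "cloth_sim": True, "destruction_detail": 3}

def pick_physics_preset(connected_to_cloud):
    """Ramp dynamic environmental effects up only when remote compute is available."""
    return ONLINE_PRESET if connected_to_cloud else OFFLINE_PRESET

print(pick_physics_preset(connected_to_cloud=True))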

 

Besides, it's not like MS is the only one talking about cloud compute; as someone else already pointed out, other companies are as well.



#116 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 08:15

 

Because it's all theoretical - we haven't seen any worthwhile real world implementations. Anyway, I think I've made my point and I don't just want to keep repeating the same thing. It is a technology that has potential and could theoretically lead to worthwhile improvements to the gaming experience—I don't think that is in dispute—but for now it simply doesn't do that. Even if it works there are problems associated with the approach, like latency spikes and connection interruptions that could negatively impact the user experience.

 

So because it hasn't been commercially implemented in games, it's all lies and PR? That's what you're saying. The attitude here is that it won't improve graphics, which it can; it's just a question of when it's implemented.



#117 Torolol

Torolol

  • Joined: 24-November 12

Posted 29 April 2014 - 08:32

A weird idea just struck me:

if it's possible for two Xbox Ones to 'talk' to each other over USB 3.0, just like a multi-graphics-card configuration but via USB,

wouldn't that provide better computing power than some distant, lagging cloud server?



#118 GotBored

GotBored

    Brain Trust

  • Tech Issues Solved: 3
  • Joined: 24-June 13
  • OS: Windows 8.1
  • Phone: iPhone 5

Posted 29 April 2014 - 09:15

A weird idea just struck me:

if it's possible for two Xbox Ones to 'talk' to each other over USB 3.0, just like a multi-graphics-card configuration but via USB,

wouldn't that provide better computing power than some distant, lagging cloud server?

 

They could make an HDMI joiner and have two Xbox Ones outputting to a single HDMI output, with each Xbox One rendering one half of the screen, and could even create code in the system to make them communicate over Wi-Fi so you wouldn't need a USB cable. But that's not going to happen. It's too much effort for something no one is going to use; no one is going to buy two Xbox Ones to compete with the competitor's already cheaper system.



#119 BajiRav

BajiRav

    Neowinian Senior

  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 29 April 2014 - 09:19

That's simply not true. People are criticising the way Microsoft is representing the cloud functionality, as it has virtually no impact upon the gaming experience. We wouldn't be having this conversation if games on the XB1 were using the cloud to create innovative new experiences simply not attainable on the PS4. As it stands, even the most fervent advocates are unable to illustrate how cloud computing improves the gaming experience.

 

It's all hype, no substance.

 

 

Titanfall has had issues with server reliability too and it isn't available in some countries due to the lack of server infrastructure, so what we've seen is the cloud functionality actually limiting the gaming experience.

 

With that logic, why even buy a PS4 then? Sony has failed to show innovative experiences simply not attainable on the PS3. There is simply no game today on the PS4 that couldn't be done on the PS3 (besides graphics, obviously). All it does is push more pixels.

Heck, they spent more airtime talking about indies, which typically don't need more GPU muscle, than about AAA titles and such.



#120 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 09:21

They could make an HDMI joiner and have two Xbox Ones outputting to a single HDMI output, with each Xbox One rendering one half of the screen, and could even create code in the system to make them communicate over Wi-Fi so you wouldn't need a USB cable. But that's not going to happen. It's too much effort for something no one is going to use; no one is going to buy two Xbox Ones to compete with the competitor's already cheaper system.

So... you say the cloud can't enhance games graphically and then state this would be possible? Ok then.




