xbox one microsoft cloud computing xbox 360 xbox live xbl

146 replies to this topic

#136 Emn1ty

Emn1ty

    Web Programmer

  • Joined: 09-April 06
  • Location: Irvine, CA
  • OS: Windows 8.1, OSX Mavericks
  • Phone: Droid Razr

Posted 29 April 2014 - 15:46

That wasn't my point. The issue is whether doing so can improve the gaming experience in a significant manner (i.e. if it allows the XB1 to do things the PS4 can't do). It's all very well offloading physics data to a server but if it can be handled locally with minimal visual or performance difference then it doesn't achieve much, plus there are issues with latency spikes and connection interruptions. Things like physics and AI are very difficult to quantify - if an AI routine is three times as demanding you won't necessarily see much difference in-game.

 

If it can be handled with minimal visual or performance difference, then they won't need to use the cloud. You're acting as if developers will use it just because, and not for a good reason. The whole point of it is to reduce processing time and load on the local system, so if you're not using it for that reason, you're using it incorrectly. I don't even get why you'd make such a statement.




#137 trooper11

trooper11

    Neowinian Senior

  • Tech Issues Solved: 5
  • Joined: 21-November 12

Posted 29 April 2014 - 15:56

Regardless, my point wasn't that servers (cloud) can't improve games; my point is they already do on PC, PS4, PS3, and Xbox 360. The server hosts data/content and makes calculations which are done in the 'cloud' rather than on the local machine. It's nothing new or specific to the Xbox One.
 
What can an Xbox One do with the servers (cloud) that can't be done on any other machine? I'm sure if you answered that and gave sources to support your claims, fewer people would think Microsoft was trying to hype up the Xbox One to compensate for the fact that it's a weaker system than the PS4. I personally think a toaster with an LCD monitor could utilize the cloud as much as an Xbox One; prove me wrong.



Do you guys not follow other posts in this thread?

What MS is doing differently is twofold:

1. Offering access to server hardware for free.

2. Building the X1 in a way to maximize usage of the cloud (they went over that early on, I can point to articles if you need them)


ANY device can connect to a cloud server. When will we get past the false impressions people have? As much as people say MS lies about this stuff, at least point out that they have never claimed that the cloud itself is different from any other cloud. It's all just a collection of servers. MS might claim that Azure has an advantage in the tools developers can use, but again, Azure is not specific to the X1 and can work with any device.



 

TBH, the sooner people get off Microsoft's cloud hype train and come back to reality, the better.



You will not be rendering visuals via a server. It's too latency-sensitive. The only way servers could improve visuals is if they offload enough other work to allow the GPU to do more than it otherwise would for visuals. I have no idea if that would amount to anything.

As far as people being on the hype train, what exactly do you mean by that? For one thing, it seems like most people around here are very much against MS' investment in servers, so you're already in the majority opinion. Secondly, I haven't seen anyone around here make the crazy claims about the cloud that so many have focused on. The only people left who aren't opposed to it are focusing on what it can do, you know, the reality. That's where I'm at anyway.

 
 

Yeah, a controlled demo, unplayable, and not representative of any real-world games we are currently seeing on these consoles. Remember when Sony chucked hundreds of ducks on a screen to show how the PS2 could handle graphics and physics (and then laughingly mocked that with a PS3 version)? Remember when MS demoed Milo to show us what Kinect 1 was apparently going to be capable of? Pretty much no one takes controlled demos seriously from any company unless they're playable and somehow represent an actual game, not a carefully constructed one-off scenario to try and push an agenda/point. Skeptics can eat crow afterwards if needed; to say we should be eating crow right now is not how "eating crow" works. None of the doubts has been disproven by any stretch of the imagination.


I think the difference here is that the demo MS showed is not completely unheard of or new. The others you mentioned were completely new things that were only being created to market something. The principle behind the MS demo is not unverifiable. Literally, the techniques being used can be tested and verified elsewhere.

This is the part I don't get. Using servers to offload number crunching is not new, and it was not controversial until now. Now that MS has used it as a feature of its platform, suddenly it's all in question. If MS had not promoted this feature at all, there would be very little blowback.

The part I agree with you about is that in order for it to be a clear advantage for gamers, more games have to come to demonstrate its usefulness.



As for that demo, you mean the questionable one from a few weeks or so ago? Yeah, I've seen it. Doesn't mean I believe it's completely legit, though. Mainly because even with something else calculating the physics, the increase in things being rendered on the client (hint: that's your Xbox) from the enhanced physics will not come for free like they want you to believe. In fact, I'm pretty sure that's why they capped the framerate in that demo: to hide the fact that it still wouldn't be 'free'.



What evidence is there that it was a lie? It's easy to just throw that out there and dismiss something, but I would love to see the evidence that the demo was a fake.

 

Yes, but Microsoft is hyping the technology now and has been doing so since before launch.


Probably because they invested so much into it.

I suspect that they knew there would not be many games making use of the servers beyond the basics for a while, but they also felt they needed to get the word out there to push it as a feature. It's one of those risks to take.

More graphics power allows developers to produce better and new experiences. Also, being a DX11 chipset means it's capable of tessellation and the APUs are better equipped for processing physics.


So would you say that the X1 migrating to DX12 would be considered an innovation since it allows developers to do more?

#138 Showan

Showan

    Neowinian Senior

  • Joined: 28-November 12
  • Location: Amurrika
  • OS: W8
  • Phone: Lumia Icon

Posted 29 April 2014 - 21:01

Yes, but Microsoft is hyping the technology now and has been doing so since before launch.

 

 

More graphics power allows developers to produce better and new experiences. Also, being a DX11 chipset means it's capable of tessellation and the APUs are better equipped for processing physics.    <-----  This is called upgrading.  NOT INNOVATION



#139 George P

George P

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 29 April 2014 - 21:17

People need to move off of the fixed thinking that this is going to be a boost to graphics visuals. It's not going to work that way, and MS never said it was either. The idea is cloud compute: it does what the name says, crunching numbers and returning the finished values to the system so the CPU isn't swamped and there's no queue build-up (which is what slows down framerates). It doesn't impact graphics quality the way some seem to think, though.

 

This whole sticking point about latency is also another thing you have to get off of; not everything going on in a game is impacted by latency or dependent on it. I don't know how much simpler it can be said: lots of things have wiggle room, and at the end of the day we're talking about sending KBs of data back and forth. It's no different than hitting a website: lots of parts of a website are small, KBs' worth, while others are bigger, like images and videos. All the little bits, like the JS, don't take up much bandwidth but are important parts that take up CPU (or, depending on the browser, GPU now).

 

It's the same thinking here: there are countless things you can offload and have the cloud, the server, number-crunch for you so your CPU isn't bogged down and can keep feeding the GPU a nice steady flow, keeping performance smooth. There's AI, there's physics, there's lighting and weather. There are other things that don't depend on player interaction and are less "real-time" that can be offloaded; scenes can be a bit more dynamic, and so on. It's all possible, it just takes time for developers to use more of it. It's only been a few months in, and people expected every developer to be using this right away? Let's be a bit realistic here: it's not a switch you can just flip on and use. There's a server-side part that needs to be made and tested and so on. Time is always a factor here, so why not give them some and see where it goes before everyone gets out the pitchforks or calls this BS.
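The "keep the CPU feeding the GPU" idea boils down to sending the offloaded work to the server asynchronously and never blocking the frame loop on the reply. A minimal sketch of that pattern (all names are hypothetical; `simulate_on_server` stands in for a real network call to a compute server):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a remote call; in a real game this would be
# a network request to the compute server.
def simulate_on_server(world_state):
    return {"debris": [p + 1 for p in world_state["debris"]]}

executor = ThreadPoolExecutor(max_workers=1)
world = {"debris": [0, 5, 9]}
pending = None

for frame in range(3):
    if pending is None:
        pending = executor.submit(simulate_on_server, world)  # fire and forget
    elif pending.done():
        world["debris"] = pending.result()["debris"]          # fold the reply in
        pending = None
    # ...render the frame with whatever `world` currently holds;
    # the frame loop never waits on the network.

executor.shutdown(wait=True)
```

The key design point is that a late or dropped reply only means the game keeps using slightly stale results, rather than stalling the frame.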



#140 BajiRav

BajiRav

    Neowinian Senior

  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 30 April 2014 - 01:58

Yes, but Microsoft is hyping the technology now and has been doing so since before launch.

 

 

More graphics power allows developers to produce better and new experiences. Also, being a DX11 chipset means it's capable of tessellation and the APUs are better equipped for processing physics.

Again...how is throwing more hardware at something innovative?

 

More cloud power allows developers to produce better and new experiences. Also, being an Azure cloud means it's capable of scaling, and the games are better equipped for processing physics.



#141 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 30 April 2014 - 09:07

People need to move off of the fixed thinking that this is going to be a boost to graphics visuals. It's not going to work that way, and MS never said it was either. The idea is cloud compute: it does what the name says, crunching numbers and returning the finished values to the system so the CPU isn't swamped and there's no queue build-up (which is what slows down framerates). It doesn't impact graphics quality the way some seem to think, though.

Good post, but I've got to say I don't agree with this part. If you remove that much work from the CPU, the time the CPU used to execute that code can be given to the GPU to make the game prettier. It just depends on the engine.

 

For example, to reach 60 FPS you need to render each frame in 16.6 ms. Say you had 8.8 ms for the CPU and 7.8 ms for the GPU, and you remove the physics calculations, which take 3 ms: you could give that 3 ms to the GPU. It completely depends on what runs in parallel and on the dependencies. Interesting concept though.
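The arithmetic above can be sketched directly. The millisecond figures are the ones from this post; the assumption that freed CPU time transfers one-for-one to the GPU only holds if the pipeline is fully serial, which real engines usually are not:

```python
FRAME_BUDGET_MS = 16.6           # one frame at 60 FPS

cpu_ms, gpu_ms = 8.8, 7.8        # hypothetical split from the post
physics_ms = 3.0                 # CPU work offloaded to a server

cpu_after = cpu_ms - physics_ms  # 5.8 ms of CPU work remains
# In a fully serial pipeline, the GPU could absorb the freed time:
gpu_budget = FRAME_BUDGET_MS - cpu_after
print(round(gpu_budget, 1))      # 10.8 ms for the GPU instead of 7.8
```

If the CPU and GPU halves already overlap, the freed 3 ms buys much less, which is the "depends what runs in parallel" caveat.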



#142 trooper11

trooper11

    Neowinian Senior

  • Tech Issues Solved: 5
  • Joined: 21-November 12

Posted 30 April 2014 - 14:41

Good post, but I've got to say I don't agree with this part. If you remove that much work from the CPU, the time the CPU used to execute that code can be given to the GPU to make the game prettier. It just depends on the engine.
 
For example, to reach 60 FPS you need to render each frame in 16.6 ms. Say you had 8.8 ms for the CPU and 7.8 ms for the GPU, and you remove the physics calculations, which take 3 ms: you could give that 3 ms to the GPU. It completely depends on what runs in parallel and on the dependencies. Interesting concept though.


That certainly sounds possible, but I just think it's important to stress that MS is not claiming that the cloud does anything that affects the visuals directly, as in doing any rendering via the servers.

What it can do could result in better visuals, but that is going to vary a lot and is not guaranteed.

#143 HawkMan

HawkMan

    Neowinian Senior

  • Tech Issues Solved: 4
  • Joined: 31-August 04
  • Location: Norway
  • Phone: Nokia Lumia 1020

Posted 30 April 2014 - 15:10

Here's why I personally do not get the claim that Forza 5 would not be the same without the power of the cloud. Someone can correct me if I am wrong, but the way I understand the integration with Forza 5 specifically, yes, it does the calculation for the AI, but if you then play the game offline, you will play with whatever Drivatar calculation information you downloaded the last time you played connected to the cloud.

So in that respect, is it really the cloud that makes it that much better of an experience, and an experience that can only be had thanks to the cloud? Or does the cloud just make things more convenient?
What I mean is, it seems as if these calculations could also be done without the cloud itself; it would just mean extended load times, etc. But if you can download the Drivatar information for use when the game is offline, then it is not really that the cloud has to be used to get that Drivatar information; it just makes it easier to access and quicker to get.

So it is not really that the cloud is solely responsible for making it that much better of a game, just that it makes for a smoother experience overall.

 

The actual calculation of how the Drivatar profile is made is done in the cloud. The Xbox could theoretically also calculate all of these many thousands of variables during a race to create a profile, but then you'd need a few extra Xboxes to do it :)

 

It's the actual creation of the Drivatar profile that is done in the cloud, live while you play, since it records thousands of variables for every second you drive and uses them to create and modify the profile. Without the cloud you would only have simple, dumb Drivatars like in the previous games.



#144 HawkMan

HawkMan

    Neowinian Senior

  • Tech Issues Solved: 4
  • Joined: 31-August 04
  • Location: Norway
  • Phone: Nokia Lumia 1020

Posted 30 April 2014 - 15:16

They could make an HDMI joiner and have two Xbox Ones outputting to a single HDMI output, with each Xbox One rendering one half of the screen; they could even create code in the system to make them communicate over WiFi so you wouldn't need a USB cable. But that's not going to happen. It's too much effort for something no one is going to use; no one is going to buy two Xbox Ones to compete with the competitor's already cheaper system.

 

Incidentally, for previous Forza games you could link up three or four 360s for panoramic output. And people used it.

 

so... 



#145 HawkMan

HawkMan

    Neowinian Senior

  • Tech Issues Solved: 4
  • Joined: 31-August 04
  • Location: Norway
  • Phone: Nokia Lumia 1020

Posted 30 April 2014 - 15:23

And for good reason. Rendering (for real-time apps) and high latency do not mix, and the internet has high latency to anything outside the same building. Sure, they could theoretically enhance the visuals with the cloud, but that would come at the cost of your game running at framerates around 10-15 fps, because it has to wait ages before it can finish the frame due to the high latency of the cloud.

 

TBH, the sooner people get off Microsoft's cloud hype train and come back to reality, the better.

 

 

You guys seem to have very short memories. 

 

You can increase graphics fidelity with the cloud by moving non-graphics GPGPU calculations from the GPU to the cloud, freeing up to 50% of the GPU resources in some physics-heavy games. Are you saying 50% more available GPU power won't have an effect on graphics output?

 

Even cloud rendering could be used in conjunction with local rendering to increase fidelity. Imagine this: the server renders the background of the scene, say one of those huge Avatar vistas with trees, animals, and all kinds of things moving in the background. Everything more than 500 meters away would be rendered in the cloud and streamed to you. Now you're going to say, "Ah, but it'll look weird because it'll lag behind and won't be synced with your movements." True, UNLESS you account for that: say you render 10% bigger than the screen; then the console locally moves this oversized background matte around to account for the lag. So even cloud-assisted rendering is feasible. Even though it's not something MS has said they will use or suggested doing, it's quite possible. 
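The "10% bigger" margin in this idea is really a function of how fast the camera can turn, the round-trip latency, and the field of view. A back-of-the-envelope sketch (all numbers illustrative, not from any real implementation):

```python
# Fraction of extra width a cloud-rendered background would need so the
# console can pan it locally while waiting for the next streamed frame.
def overscan_fraction(turn_rate_deg_s, latency_ms, fov_deg):
    drift_deg = turn_rate_deg_s * latency_ms / 1000.0  # camera drift per round trip
    return drift_deg / fov_deg                         # extra width vs. the view

# A 90-degree FOV, a 150 ms round trip, and a camera panning at 60 deg/s:
print(overscan_fraction(60, 150, 90))   # 0.1 -> the "10% bigger" figure above
```

Fast camera motion or worse latency pushes the margin up quickly, which is why this only seems plausible for slow-moving distant scenery.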



#146 Kvally

Kvally

    Neowin Veteran

  • Joined: 18-June 03
  • Location: USA
  • OS: Mac OSX
  • Phone: iPhone 5S

Posted 12 May 2014 - 13:43

That is awesome... love how the Xbox One is embracing the cloud. It's a win-win for gamers. Another reason why I am really enjoying this new gen and the Xbox. I thought last gen was awesome on the 360, but this new gen will be so much better.



#147 Vigilant

Vigilant

    Neowinian

  • Joined: 24-January 11

Posted 07 June 2014 - 03:55

Three Xbox Ones' worth of power in the cloud is a pretty bold claim, but if you are talking about bursting for a specific CPU workload, it is believable. Not something that anyone should consider sustainable, though.