
#121 GotBored

GotBored

    Brain Trust

  • Tech Issues Solved: 3
  • Joined: 24-June 13
  • OS: Windows 8.1
  • Phone: iPhone 5

Posted 29 April 2014 - 09:33

So... you say the cloud can't enhance games graphically and then state this would be possible? Ok then.

 

An HDMI cable has 18 Gbps of bandwidth; your internet connection does not.

 

Regardless, my point wasn't that servers (the cloud) can't improve games; my point is they already do on PC, PS4, PS3 and Xbox 360. A server hosts data/content and makes calculations which are done in the 'cloud' rather than on the local machine. It's nothing new or specific to Xbox One.

 

What can an Xbox One do with the servers (cloud) that can't be done on any other machine? I'm sure if you answered that and gave sources to support your claims, fewer people would think Microsoft was trying to hype up the Xbox One to compensate for the fact it's a weaker system than the PS4. I personally think a toaster with an LCD monitor could utilize the cloud as much as an Xbox One. Prove me wrong.




#122 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 09:41

An HDMI cable has 18 Gbps of bandwidth; your internet connection does not.

I'm not even going to get into why that comparison is wildly off the mark. Throughput is not really important for cloud compute. PS Now/OnLive uses considerably more bandwidth than computational offloading to Azure would.
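To put rough numbers on the throughput point: streaming a rendered video feed (the PS Now/OnLive model) and shipping back small computed results are different bandwidth classes entirely. A back-of-envelope sketch in Python; the 5 Mbps stream and 2 KB payload figures are illustrative assumptions, not measurements from this thread:

```python
# Compare the bandwidth of full game streaming vs. computational offloading.
# Streaming sends encoded video frames; offloading sends only small results.

def mbps(bytes_per_second):
    """Convert a bytes-per-second rate to megabits per second."""
    return bytes_per_second * 8 / 1_000_000

# Assumed: 720p/30fps H.264 game streaming targets roughly 5 Mbps.
streaming_mbps = 5.0

# Assumed: the server returns ~2 KB of computed state 30 times a second.
offload_mbps = mbps(2_000 * 30)

print(f"streaming: {streaming_mbps:.2f} Mbps, offload: {offload_mbps:.2f} Mbps")
```

Even at 60 updates a second the offload figure stays under 1 Mbps, which is why raw throughput is the wrong axis to argue about; latency is the real constraint.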

 

Why can't people understand that using the cloud is a feasible way to improve aspects of graphics in games?



#123 Blackhearted

Blackhearted

    .....

  • Joined: 26-February 04
  • Location: Ohio
  • Phone: Samsung Galaxy S2 (VM)

Posted 29 April 2014 - 09:46

So because it hasn't been commercially implemented in games, it's all lies and PR? That's what you're saying. The attitude here is that it won't improve graphics, which it can; it's just a question of when it's implemented.

 

And for good reason. Rendering (for real-time apps) and high latency do not mix, and the internet has high latency to anything outside the same building. Sure, they could theoretically enhance the visuals with the cloud, but that would come at the cost of your game running at framerates around 10-15fps, because it has to wait ages before it can finish the frame due to the high latency of the cloud.
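The 10-15fps figure can be sanity-checked with simple arithmetic: if finishing a frame blocks on one server round trip, the round-trip time bounds the frame rate. A sketch with illustrative numbers (the 60 ms RTT and 5 ms local frame cost are assumptions, not measurements):

```python
# If each frame must wait for a network round trip before it can present,
# the round-trip time puts a hard ceiling on the frame rate.

def max_fps_if_blocking(rtt_ms, local_frame_ms=5.0):
    """Upper bound on fps when every frame blocks on one round trip."""
    return 1000.0 / (rtt_ms + local_frame_ms)

print(max_fps_if_blocking(60))  # ~15 fps with a typical 60 ms internet RTT
print(max_fps_if_blocking(1))   # LAN-like latency barely matters
```

This is exactly why any workable scheme has to keep per-frame work off the network entirely; only work that can arrive a few frames late is a candidate for offloading.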

 

TBH, the sooner people get off Microsoft's cloud hype train and come back to reality, the better.



#124 Skiver

Skiver

    Neowinian Senior

  • Tech Issues Solved: 2
  • Joined: 10-October 05
  • Location: UK, Reading

Posted 29 April 2014 - 09:46

I find it hilarious that a year on from MS talking about "the cloud", we are still discussing whether it can or cannot make a difference. I seriously think some people need to just agree to disagree right about now.

 

Will the cloud make a difference? Yes. Will the cloud make the XB1 multiple times more powerful? That remains to be seen for now.



#125 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 09:55

And for good reason. Rendering (for real-time apps) and high latency do not mix, and the internet has high latency to anything outside the same building. Sure, they could theoretically enhance the visuals with the cloud, but that would come at the cost of your game running at framerates around 10-15fps, because it has to wait ages before it can finish the frame due to the high latency of the cloud.

 

TBH, the sooner people get off Microsoft's cloud hype train and come back to reality, the better.

There's no point technically explaining anything again because obviously you and various people don't listen to it. You're completely wrong.

 

Your framerate comment makes me laugh though; did you even see the Build demo?



#126 OP Audioboxer

Audioboxer

    Hermit Arcana

  • Joined: 01-December 03
  • Location: UK, Scotland

Posted 29 April 2014 - 10:11

There's no point technically explaining anything again because obviously you and various people don't listen to it. You're completely wrong.

 

Your framerate comment makes me laugh though; did you even see the Build demo?

 

Yeah, a controlled demo, unplayable, and not representative of any real-world games we are currently seeing on these consoles. Remember when Sony chucked hundreds of ducks on a screen to show how the PS2 could handle graphics and physics (and then laughingly mocked that with a PS3 version)? Remember when MS demoed Milo to show us what Kinect 1 was apparently going to be capable of? Pretty much no one takes controlled demos seriously from any company unless they're playable and represent an actual game, not a carefully constructed one-off scenario to push an agenda. Skeptics can eat crow afterwards if needed; to say we should be eating crow right now is not how "eating crow" works. None of the doubts have been disproven by any stretch of the imagination.



#127 George P

George P

    Neowinian Senior

  • Tech Issues Solved: 1
  • Joined: 04-February 07
  • Location: Greece
  • OS: Windows 8.1 Pro 64bit
  • Phone: HTC Windows Phone 8X

Posted 29 April 2014 - 10:15

People really are throwing the same arguments back and forth as to why it can and why they think it can't work, yet if you break it down, it can. I think some people have the wrong idea about what cloud compute and "offloading" really mean in this case. No one's talking about rendering graphics in the cloud and streaming it; this isn't a Gaikai or OnLive type service, and there aren't massive amounts of data going back and forth between you and the server. Offloading in this case means having the cloud servers, with their more powerful hardware, do calculations and send the final answer back to your box. We're talking small amounts of data: not GBs, and in most cases not even MBs.

 

Though I'm not technical enough to break it down into its pieces, the fact is the CPU works on its tasks in blocks of data that aren't big at all: KBs of data flying back and forth between the CPU and the GPU. Even below-average 2 Mbps connections can send and receive KB chunks without any fuss. Add to that the fact that, as has been stated, not everything going on in a game, or in this case the render pipeline, is dependent on latency, and that work can be offloaded without issue.
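As a rough illustration of the "KB chunks over a 2 Mbps link" point (the 1 KB payload is an assumed example; this counts transmission delay only and ignores round-trip latency, which is the genuinely hard part):

```python
# Transmission delay for a small result payload on a slow link.
# Ignores propagation and queueing latency, which dominate in practice.

def transfer_ms(payload_bytes, link_mbps):
    """Milliseconds to push payload_bytes through a link_mbps pipe."""
    return payload_bytes * 8 / (link_mbps * 1_000_000) * 1000

# A 1 KB physics result on a 2 Mbps connection takes ~4 ms to transmit,
# a small slice of the 33 ms frame budget at 30 fps.
print(f"{transfer_ms(1024, 2.0):.1f} ms")
```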



#128 Blackhearted

Blackhearted

    .....

  • Joined: 26-February 04
  • Location: Ohio
  • Phone: Samsung Galaxy S2 (VM)

Posted 29 April 2014 - 10:18

There's no point technically explaining anything again because obviously you and various people don't listen to it. You're completely wrong.

 

Your framerate comment makes me laugh though, did you even see the build demo?

 

So you really believe there's a method to offload a portion of the rendering to the cloud without either cutting your framerate or displaying incomplete frames half the time?

 

As for that demo, you mean the questionable one from a few weeks or so ago? Yeah, I've seen it. That doesn't mean I believe it's completely legit, though. Mainly because even with something else calculating the physics, the increase in things being rendered on the client (hint: that's your Xbox) from the enhanced physics will not come for free like they want you to believe. In fact, I'm pretty sure that's why they capped the framerate in that demo: to hide the fact it still wouldn't be 'free'.



#129 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 10:18

Yeah, a controlled demo, unplayable, and not representative of any real-world games we are currently seeing on these consoles. Remember when Sony chucked hundreds of ducks on a screen to show how the PS2 could handle graphics and physics (and then laughingly mocked that with a PS3 version)? Remember when MS demoed Milo to show us what Kinect 1 was apparently going to be capable of? Pretty much no one takes controlled demos seriously from any company unless they're playable and represent an actual game, not a carefully constructed one-off scenario to push an agenda. Skeptics can eat crow afterwards if needed; to say we should be eating crow right now is not how "eating crow" works. None of the doubts have been disproven by any stretch of the imagination.

Playable by them, just not by the general public. A controlled demo which expanded on the idea many disbelieved and showed how it's possible; something feasible in real games. For example, an online match of Halo 5 where everything is destructible but calculated on the server rather than locally.

 

All the demos you listed were possible and true; I don't get what you're saying. It doesn't invalidate the whole argument, it's completely different.



#130 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 10:22

So you really believe there's a method to offload a portion of the rendering to the cloud without either cutting your framerate or displaying incomplete frames half the time?

 

As for that demo, you mean the questionable one from a few weeks or so ago? Yeah, I've seen it. That doesn't mean I believe it's completely legit, though. Mainly because even with something else calculating the physics, the increase in things being rendered on the client (hint: that's your Xbox) from the enhanced physics will not come for free like they want you to believe. In fact, I'm pretty sure that's why they locked the framerate in that demo: to hide the fact it still wouldn't be 'free'.

I don't believe, I know. There's a fundamental difference.

 

Having 3x the CPU power assigned to each X1 (according to MS) means that heavy CPU tasks could be offloaded into the cloud (physics, particle effects, AI, etc.). With those heavy CPU tasks out of the way, the engine has much more time to work with in a frame. That time could be assigned to the GPU for prettier surroundings or given to the CPU to do more work locally. Although, as the BUILD demo shows, you could simply offload tasks like destruction physics for better gameplay while keeping the frame rate stable. It's quite a technical idea, but a very simple one which has been used in scientific and mathematical computing for years. You just have to keep in mind which tasks can be offloaded without being affected by latency. You're not going to move collision detection to the cloud, are you?

 

The problem isn't rendering the chunks; draw calls have hardly been an issue since the 360 was released. It's the calculations of how the chunks react and move in the environment that take the time in an engine.
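The split described above (latency-tolerant work offloaded, latency-sensitive work kept local) can be sketched as a fire-and-forget pattern: the frame loop never blocks on the network, and remote results are folded in whenever they arrive. Everything here is hypothetical scaffolding to show the pattern, not a real Xbox or Azure API:

```python
# Sketch: latency-sensitive work runs locally every frame; latency-tolerant
# work is requested asynchronously and merged in whenever it completes.
from concurrent.futures import ThreadPoolExecutor

def local_collision_step(state):
    # Must run every frame; cannot tolerate network latency.
    return state + 1

def remote_destruction_solve(state):
    # Stand-in for a server-side solve whose result may arrive frames late.
    return state * 2

executor = ThreadPoolExecutor(max_workers=1)
state, pending = 0, None

for frame in range(5):
    state = local_collision_step(state)   # never waits on the network
    if pending is None:
        pending = executor.submit(remote_destruction_solve, state)
    elif pending.done():
        # Fold the remote result in now that it has arrived; no frame
        # ever blocked waiting for it.
        state = max(state, pending.result())
        pending = None

executor.shutdown(wait=True)
print(state)
```

The key design point is the `elif pending.done()` check: a slow or dropped response degrades quality (the remote result is simply late or absent), never the frame rate.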



#131 theyarecomingforyou

theyarecomingforyou

    Tiger Trainer

  • Joined: 07-August 03
  • Location: Terra Prime
  • Profession: Jaded Sceptic
  • OS: Windows 8.1
  • Phone: Galaxy Note 3 with Galaxy Gear

Posted 29 April 2014 - 11:49

 

And no, sending data to the cloud is not theoretical. You're already doing it on this website. Many games already make use of it, and there are hundreds of technologies out there built explicitly for managing such environments in as fast a way as possible. Facebook, Twitter, Steam, any multiplayer game that's peer-to-peer, etc. All these things send data in real time back and forth. So to assume doing the same with some physics calculations is theoretical is ignorant at best.

That wasn't my point. The issue is whether doing so can improve the gaming experience in a significant manner (i.e. whether it allows the XB1 to do things the PS4 can't). It's all very well offloading physics data to a server, but if it can be handled locally with minimal visual or performance difference then it doesn't achieve much, plus there are issues with latency spikes and connection interruptions. Things like physics and AI are very difficult to quantify: if an AI routine is three times as demanding, you won't necessarily see much difference in-game.

 

So because it hasn't been commercially implemented in games, it's all lies and PR? That's what you're saying.

Basically. It's been touted as a major feature, but we haven't seen the developer support or implementations to justify that claim. It just seems to be Microsoft's way to distract people from the performance issues affecting most XB1 games. It's the same with Kinect: Microsoft has hyped it up and bundled it with the XB1, and yet few games make use of it, and those that do are generally pretty gimmicky. The technology is impressive, much more so than the cloud, but it has very little impact upon the gaming experience and reduces immersion.

 

I don't want people to assume that I have something against Microsoft in particular, as that simply isn't true. I think the touch input on the PS4 is gimmicky and immersion-breaking; the Leap Motion looked great in the videos but I have found it to be impractical in day-to-day usage; the Wii U is gimmicky and underpowered. I'm interested in the gaming experience, and right now Sony is delivering the better experience and its work on VR looks very promising - it's still way behind where PC gaming is, but it's the best of the "next-gen" consoles.

 

With that logic, why even buy PS4 then? Sony has failed to show innovative experiences simply not attainable on the PS3.

As you yourself pointed out the PS4 has much better visuals than the PS3, which is an innovation. But for what it's worth I wouldn't buy a PS4, as I consider the PC to be where the innovation is occurring.

 



#132 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 11:58

Basically. It's been touted as a major feature, but we haven't seen the developer support or implementations to justify that claim. It just seems to be Microsoft's way to distract people from the performance issues affecting most XB1 games. It's the same with Kinect: Microsoft has hyped it up and bundled it with the XB1, and yet few games make use of it, and those that do are generally pretty gimmicky. The technology is impressive, much more so than the cloud, but it has very little impact upon the gaming experience and reduces immersion.

 

I don't want people to assume that I have something against Microsoft in particular, as that simply isn't true. I think the touch input on the PS4 is gimmicky and immersion-breaking; the Leap Motion looked great in the videos but I have found it to be impractical in day-to-day usage; the Wii U is gimmicky and underpowered. I'm interested in the gaming experience, and right now Sony is delivering the better experience and its work on VR looks very promising - it's still way behind where PC gaming is, but it's the best of the "next-gen" consoles.

There wasn't any substantial API support to help with this until the updated Azure SDK shown during Build. Incorporating these changes into a game engine takes a lot of time, and it's something that simply isn't feasible in this time-frame for the current consoles. That doesn't mean it doesn't work, just that it hasn't been implemented yet. We're 5 months into these consoles; you must understand that these things take time.

 

VR for me is definitely interesting; just count me out for playing games with a heavy box on my head constantly. I'd definitely use it, but given how much I would and how expensive these devices will be, I couldn't see myself picking one up, let alone the average Joe.



#133 BajiRav

BajiRav

    Neowinian Senior

  • Joined: 15-July 04
  • Location: Xbox, where am I?
  • OS: Windows 8.1, Windows 8
  • Phone: Lumia 920

Posted 29 April 2014 - 12:10

As you yourself pointed out the PS4 has much better visuals than the PS3, which is an innovation. But for what it's worth I wouldn't buy a PS4, as I consider the PC to be where the innovation is occurring.

How is adding more GPU power innovation when it's been done a million times already?



#134 theyarecomingforyou

theyarecomingforyou

    Tiger Trainer

  • Joined: 07-August 03
  • Location: Terra Prime
  • Profession: Jaded Sceptic
  • OS: Windows 8.1
  • Phone: Galaxy Note 3 with Galaxy Gear

Posted 29 April 2014 - 13:27

There wasn't any substantial API support to help with this until the updated Azure SDK shown during Build. Incorporating these changes into a game engine takes a lot of time, and it's something that simply isn't feasible in this time-frame for the current consoles. That doesn't mean it doesn't work, just that it hasn't been implemented yet. We're 5 months into these consoles; you must understand that these things take time.

Yes, but Microsoft is hyping the technology now and has been doing so since before launch.

 

How is adding more GPU power innovation when it's been done a million times already?

More graphics power allows developers to produce better and new experiences. Also, being a DX11 chipset means it's capable of tessellation, and the APUs are better equipped for processing physics.



#135 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 29 April 2014 - 14:26

Yes, but Microsoft is hyping the technology now and has been doing so since before launch.

That doesn't mean they're lying and that the technology won't help as claimed, which is what you're saying.


