Microsoft Xbox One -> Info that may not be well known

xbox one microsoft kinect xbox live

74 replies to this topic

#46 trooper11

trooper11

    Neowinian Senior

  • Tech Issues Solved: 5
  • Joined: 21-November 12

Posted 10 July 2013 - 01:05

The key to all of this is to remember that the X1's offline gaming experience will be the same as the PS4's regarding features, etc.

 

Also, keep in mind that developers are creating 'single player' experiences that leverage an internet connection. This has nothing to do with the X1 or PS4; it's just the trend in gaming. So those who are upset about requiring an online connection to play are in for a shock as game developers push more and more titles that way. Look at Watch Dogs, for example.




#47 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 10 July 2013 - 06:34

What document? Where has Microsoft claimed it will double frame rates, or even improve them that significantly?

The fallback is up to the developer, but what fallback is available? If the console is capable of doing the calculation in the first place, why would you not just do it on the console? And how much of a sacrifice in graphics, frame rate, etc. would users put up with when their latency takes a dive? "Oh dear, your network is overloaded; have Xbox 360 graphics and play at 15fps instead?" (Extreme example, but you get the point.)
 

1.5Mbps is never going to give the kind of latency required for processing the time sensitive data that would double frame rates, as you claim.
 

That kind of offloading isn't going to double frame rates. The only things that could double frame rates would be all the latency-sensitive operations such as physics, graphics processing, sound and (some) AI. I'm not saying the cloud won't help at all, but you are limited in what can be offloaded, which limits how much can be gained. I think you are grossly underestimating how many things are time-sensitive in your average game.
 

1) True. I believe Sony's plan with Gaikai for now is to use it for getting PS3 games onto the PS4, though. It still requires a heck of a lot of power and would need good server optimization to get as many users per server as possible.
2) Eh?
3) Yes they can. If anything, they could be more dynamic. It would be easier to implement, as you wouldn't need to worry about a fallback situation. The whole game is in sync with itself; the only bits you need to worry about are the console-to-server input and the speed at which the video is streamed back. By the time the Internet is capable of doubling frame rates by doing time-sensitive calculations remotely, it would be preferable to put everything on the server itself.
4) If you can double frame rates on 1.5Mbps, you can stream video.

The cloud is great and can do a lot of things. It can even free up resources for the time sensitive calculations, as you say. And that is probably enough to make up the graphics power difference with the PS4 on exclusives, but it won't result in outright better graphics or double frame rates at the same graphics level.
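The latency objection above can be put into numbers with some back-of-the-envelope arithmetic. In this sketch the ping and server-compute figures are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope frame-budget arithmetic. The ping and server
# compute figures are illustrative assumptions, not measurements.

def frame_budget_ms(fps: float) -> float:
    """Time available to produce one frame, in milliseconds."""
    return 1000.0 / fps

ping_ms = 40.0           # assumed one-way network latency
server_compute_ms = 5.0  # assumed time the server spends on the work
round_trip_ms = 2 * ping_ms + server_compute_ms  # 85 ms total

for fps in (30, 60):
    budget = frame_budget_ms(fps)
    verdict = "fits" if round_trip_ms <= budget else "misses the frame"
    print(f"{fps} fps: budget {budget:.1f} ms vs round trip "
          f"{round_trip_ms:.1f} ms -> {verdict}")
```

Even with an optimistic 40 ms ping, a per-frame round trip overshoots both the ~33 ms budget at 30fps and the ~17 ms budget at 60fps, which is why per-frame, latency-sensitive work has to stay on the console.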

 

   You will see.  It's already happening.... See this thread here....

 

    This is just a very, very, very small example and it's not using all of the features, just very minor ones.

    http://www.neowin.ne...#entry595807814

 

    This is ONLY the beginning, and it's a very small sample to boot.  There is a lot more than this.  Wait until Halo next year; that will be a game that really breaks things wide open.  It can certainly allow for better graphics, no doubt about that, and yes, better frame rates.  That is why I said a lot of people who think the PS4 is more powerful are going to get a huge shock.

 

     Sure, the PS4's own local GPU is better, that is a given, but when you add in server processing, which Microsoft can expand over time, the Xbox is going to be better.  Can Sony do this as well? The answer is yes and no.  They are set up for streaming, but not for being as dynamic as this.  Microsoft also has dedicated hardware that compresses/decompresses assets and injects them right into the GPU's memory.  Sony doesn't have that feature in hardware, and Microsoft is going to have a superior server platform, software stack, and method of delivery.

 

     Microsoft's method is going to take a lot less bandwidth to deliver a superior experience.  Both approaches are hard, but of the two, streaming is the easier one.  This is very innovative for a console, because for the first time the console hardware can be augmented in real time, and it keeps the Internet bandwidth down.

 

This is why it says "Cloud Powered"; the document that I provided showed the four different things that can be done with the cloud.  It is not just marketing; there is actually a good rationale behind it.  They are really pushing servers because servers can be a "game" changer (pun intended) for real.

 

Can everything be done by the server? Not unless you do it the way Sony is doing it.  Microsoft can offload a lot to the server, but not everything, and that is okay, because the Xbox One is powerful.  It's a lot more powerful than people think.  People think that if the GPU doesn't have as many FLOPS (floating-point operations per second) as Sony's then it's not very powerful, but it's actually more powerful when you augment it with servers or distributed computing.

 

  As you said, though, Sony is going to give the PS4 backwards compatibility via servers, and it is going to be a long time before they can even upgrade their servers to be more powerful.  I wouldn't count on that for a long, long time, probably 4-5 years or so, because it's going to take enormous server power to run full next-next-generation games around the world.  That is going to be a ton of servers around the world, and Sony isn't currently prepared for it.

 

Can Sony afford it?  



#48 vetFourjays

Fourjays

    Neowinian Senior

  • Joined: 09-September 05
  • Location: Staffordshire, UK

Posted 10 July 2013 - 09:03

   You will see.  It's already happening.... See this thread here....

 

    This is just a very, very, very small example and it's not using all of the features, just very minor ones.

    http://www.neowin.ne...#entry595807814

Was just reading that. This is the type of thing the cloud is going to be used for and confirms my point that it can only be used for latency insensitive operations (they actually say tree physics don't need to be fully synced). My issue with the cloud is specifically the outlandish claims of double frame rates, etc. I won't say they are impossible, just not possible yet. By the time the Internet can handle offloading those kind of calculations to the cloud, we'll be streaming games anyway.
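The "latency-insensitive" pattern described in that thread amounts to fire-and-forget offloading: the cloud result is applied whenever it arrives, and a cheap local approximation covers the gap. A minimal sketch, with all names and timings invented for illustration:

```python
# Minimal sketch of latency-insensitive offloading: the cloud request is
# fired asynchronously and its result applied whenever it arrives, while
# a cheap local approximation runs in the meantime. Names are invented.

import queue
import threading
import time

results = queue.Queue()

def cloud_simulate(tree_ids):
    # Stand-in for the cloud call: slow, detailed physics.
    time.sleep(0.05)  # pretend network + server time
    results.put({t: "detailed-sway" for t in tree_ids})

def local_approximation(tree_ids):
    # Cheap fallback the console runs every frame regardless.
    return {t: "simple-sway" for t in tree_ids}

trees = [1, 2, 3]
threading.Thread(target=cloud_simulate, args=(trees,), daemon=True).start()

state = local_approximation(trees)  # this frame uses the cheap version
time.sleep(0.2)                     # a few frames later...
try:
    state = results.get_nowait()    # upgrade once the cloud answers
except queue.Empty:
    pass                            # no answer yet: keep the local result

print(state)
```

Because nothing waits on the cloud inside a frame, a late (or missing) answer only means the scene keeps its simpler local simulation, which is exactly why this class of work tolerates Internet latency.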



#49 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 11 July 2013 - 07:01

Was just reading that. This is the type of thing the cloud is going to be used for and confirms my point that it can only be used for latency insensitive operations (they actually say tree physics don't need to be fully synced). My issue with the cloud is specifically the outlandish claims of double frame rates, etc. I won't say they are impossible, just not possible yet. By the time the Internet can handle offloading those kind of calculations to the cloud, we'll be streaming games anyway.

 

  Nope, it's already happening.  You won't want to stream the entire game; that takes up a huge amount of bandwidth and a huge amount of money (see the link below), and it would require them to rent a lot of server space around the world.  Maybe Sony can talk to Google or Microsoft.

 

This talks a lot about Microsoft's Azure data centers (the servers behind the "cloud"):

http://www.youtube.c...h?v=JJ44hEr5DFE

 

 I posted a lot of this in that document that I posted.  Latency-sensitive game code is going to run on the local machine (which makes a lot of sense), while latency-insensitive game code can run on a nearby server.  Game worlds can change dynamically, and bandwidth is saved by the Move Engines' hardware compression/decompression (this gives you speed and saves a lot of bandwidth on the content going to and from the servers); the data can then be injected directly into the GPU's memory.

 

You don't want the servers to run the entire game, that is a complete waste of bandwidth and server power.  

If Sony goes beyond the PS3 into the next generation beyond the PS4, it's going to be 1080p or even 4K, and that is going to need massive servers and take up more energy on the server side (this kind of processing isn't free).  I honestly would not expect 4K for a very long time, because even with the latest codecs it's going to take a lot of bandwidth, and those bandwidth limits would be used up quickly, since games are usually played for longer than an hour-and-a-half movie.
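The session-length point is easy to put numbers on. A rough sketch; the 5 Mbit/s bitrate and session lengths below are assumed figures for illustration, not measured ones:

```python
# Rough data-usage arithmetic: streamed gameplay at a movie-like bitrate
# consumes far more data simply because sessions run longer. The bitrate
# and durations are illustrative assumptions.

def gigabytes(bitrate_mbps: float, hours: float) -> float:
    """Data transferred at a constant bitrate over a period, in GB."""
    return bitrate_mbps * hours * 3600 / 8 / 1000  # Mbit/s -> GB

movie = gigabytes(5.0, 1.5)    # 90-minute film at ~5 Mbit/s
session = gigabytes(5.0, 4.0)  # 4-hour gaming session, same bitrate

print(f"movie:   {movie:.2f} GB")    # 3.38 GB
print(f"session: {session:.2f} GB")  # 9.00 GB
```

At the same bitrate a single long play session costs nearly three movies' worth of data, before any other household streaming is counted.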

 

That is why I think Microsoft's idea makes a lot more sense.  They share the load between local and server, and we should see a game like Halo next year offer some of the best cloud-based experiences Microsoft has for this new generation.  Microsoft keeps the load down on the servers, which uses a lot less electricity and a lot less bandwidth as well.  It's a win all over.

 

Sony won't have their PS3 server streaming started until 2014, and ONLY in the USA.  So they are a long way off providing servers like what Microsoft has for the launch of the Xbox One; it's going to take them years, and then they would have to upgrade over time.  Microsoft is already delivering the benefits of its servers at the launch of the Xbox One; that is one reason why they haven't launched in all countries yet.

 

That video that I posted is huge.  Microsoft is building out new data centers in nine months' time for the 4th and 5th generations.



#50 vetFourjays

Fourjays

    Neowinian Senior

  • Joined: 09-September 05
  • Location: Staffordshire, UK

Posted 11 July 2013 - 09:13

Nope, it's already happening.  You won't want to stream the entire game; that takes up a huge amount of bandwidth and a huge amount of money (see the link below), and it would require them to rent a lot of server space around the world.  Maybe Sony can talk to Google or Microsoft.
 
This talks a lot about Microsoft Azure Data centers for servers (Clouds)
http://www.youtube.c...h?v=JJ44hEr5DFE
 
 I posted a lot of this in that document that I posted.  Latency-sensitive game code is going to run on the local machine (which makes a lot of sense), while latency-insensitive game code can run on a nearby server.  Game worlds can change dynamically, and bandwidth is saved by the Move Engines' hardware compression/decompression (this gives you speed and saves a lot of bandwidth on the content going to and from the servers); the data can then be injected directly into the GPU's memory.

You seem to be disagreeing with me agreeing with you.  :wacko: As I said, and as devs have said, and as you have now said, the cloud is going to be used for latency insensitive operations (I have never disagreed with this). All I'm saying is this isn't going to result in double frame rates. It will improve frame rates and graphics as more local power is available, but to get that much out of it would require latency sensitive operations to be offloaded. Of course all this does depend on the game in question.
 

You don't want the servers to run the entire game, that is a complete waste of bandwidth and server power.  
If Sony goes beyond the PS3 into the next generation beyond the PS4, it's going to be 1080p or even 4K, and that is going to need massive servers and take up more energy on the server side (this kind of processing isn't free).  I honestly would not expect 4K for a very long time, because even with the latest codecs it's going to take a lot of bandwidth, and those bandwidth limits would be used up quickly, since games are usually played for longer than an hour-and-a-half movie.
 
That is why I think Microsoft's idea makes a lot more sense.   They share the load between local and server and we should see a game like Halo next year offer some of the best cloud based experiences that Microsoft has for this new generation.   Microsoft keeps the load down on the servers which uses a lot less electricity and uses a lot less bandwidth as well.  It's a win all over.

I'm not entirely convinced on game streaming either (at least for the near future). It needs heavy server-side optimization, excellent compression/decompression at either end, and much more reliable Internet connections than most people probably have. However, the bandwidth requirements shouldn't be much more than those required for streaming an HD video. My point is that by the time you can start to offload latency-sensitive operations to the cloud, you'd be in a position to just stream whole games. If anything it would be more desirable, as instead of uploading and downloading many different individual snippets of game data once a frame, you only upload input and download video. I'm sure when the time comes Microsoft will be all over it.



#51 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 11 July 2013 - 09:27

You seem to be disagreeing with me agreeing with you.  :wacko: As I said, and as devs have said, and as you have now said, the cloud is going to be used for latency insensitive operations (I have never disagreed with this). All I'm saying is this isn't going to result in double frame rates. It will improve frame rates and graphics as more local power is available, but to get that much out of it would require latency sensitive operations to be offloaded. Of course all this does depend on the game in question.
 

I'm not entirely convinced on game streaming either (at least for the near future). It needs heavy server-side optimization, excellent compression/decompression at either end, and much more reliable Internet connections than most people probably have. However, the bandwidth requirements shouldn't be much more than those required for streaming an HD video. My point is that by the time you can start to offload latency-sensitive operations to the cloud, you'd be in a position to just stream whole games. If anything it would be more desirable, as instead of uploading and downloading many different individual snippets of game data once a frame, you only upload input and download video. I'm sure when the time comes Microsoft will be all over it.

 

Just to prove that it can speed up frame rates, I am going to give you a source and a quote.  This is why I also think the next Halo will run at 60 instead of 30, which of course is double the frame rate.  Battlefield 4 was also mentioned to be 64-player, which sounds like Microsoft's form of "Dedicated Servers" (I could be wrong on this), and they also said 60fps, but for the Xbox One; they did not mention the PS4 (other people assumed it, but as far as I can tell it is still an assumption; I haven't found it confirmed for the PS4).

 

Here is the source and the quote that you will want to read that proves my point.  Keep in mind that Dan Greenwalt is the head of Turn 10 Studios. Notice what "offloading" means.  Usually when you "offload" you can gain back some speed. 

 

Source: http://www.oxm.co.uk...fers-are-small/

 

"Forza's known for a very solid 60. and so having more power on the box obviously, but also offloading power to the cloud allows us to do that 1080p and 60 frames at a level that most games would just be considering for 30."

 

 My own analysis: if you lose your connection (which would rarely happen), the game would default to what is on the disc or drive.  So the graphics would scale down, or the speed would scale down.  Something scales down if the Internet connection is lost.
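That fallback amounts to picking a settings profile based on connectivity. A minimal sketch, with all profile and setting names invented for illustration:

```python
# Minimal sketch of connection-loss fallback: the game selects a settings
# profile based on whether the cloud is reachable. All names are invented.

def pick_settings(cloud_available: bool) -> dict:
    if cloud_available:
        # Cloud-enhanced profile: extra work is offloaded to servers.
        return {"physics": "cloud-enhanced", "world_detail": "high"}
    # Offline profile: everything must fit on local hardware,
    # so something scales down.
    return {"physics": "local", "world_detail": "reduced"}

print(pick_settings(True))
print(pick_settings(False))
```

The design point is that the offline profile must always be playable on its own, which is why the cloud can only ever add on top of a complete local baseline.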

 

  As for streaming HD video, you are rarely streaming 1080p for more than an hour and a half.  Usually when people play games, they play much longer, and those same people are also going to be using things like Netflix as well.  All of this adds up.

 

The higher the video quality, the more bandwidth is needed, and thus the bigger the Internet connection.  And what about frame rates of 60 frames per second; how is that going to be handled on something like this?

 

The Microsoft method is far superior, it won't take up nearly as much bandwidth at all and the worlds can be a lot more dynamic as well.

 

This is what I found about the requirements before Sony bought them.... 30 frames per second at 720p....

 

"Gaikai recommends an Internet connection of 5 Mbit/s or faster, and a 3 Mbit/s connection meets the minimum system requirements." 

 

I have a 6 Megabit connection, and if I used this at 1080p, nobody else in the house could do anything online.



#52 +FiB3R

FiB3R

    "the sun is definitely rising on a new dawn!"

  • Tech Issues Solved: 6
  • Joined: 06-November 02
  • Location: SE London
  • OS: Windows 8.1 Enterprise
  • Phone: Lumia 930

Posted 11 July 2013 - 09:49

You seem to have ignored my post on page 2

 

http://www.neowin.ne...ost&p=595805578

 

Care to clarify?



#53 vetFourjays

Fourjays

    Neowinian Senior

  • Joined: 09-September 05
  • Location: Staffordshire, UK

Posted 11 July 2013 - 09:58

Here is the source and the quote that you will want to read that proves my point.  Keep in mind that Dan Greenwalt is the head of Turn 10 Studios. Notice what "offloading" means.  Usually when you "offload" you can gain back some speed.

I know what offloading is. I have even stated you get more power available for other things as a result.
 

Source: http://www.oxm.co.uk...fers-are-small/
 
"Forza's known for a very solid 60. and so having more power on the box obviously, but also offloading power to the cloud allows us to do that 1080p and 60 frames at a level that most games would just be considering for 30."

You are really basing the idea of doubling frame rates on one fairly unspecific quote from the Turn 10 Studios head talking to the Official Xbox Magazine? I find the information provided by Respawn and Ubisoft far more telling of what the cloud can do than one remark to a (naturally) biased media outlet.
 
I'm not sure the quote even says quite what you seem to think it does:
 

Forza's known for a very solid 60. and so having more power on the box obviously, but also offloading power to the cloud allows us to do that 1080p and 60 frames at a level that most games would just be considering for 30.

It is open to interpretation I guess, but to me it says the combined power of the Xbox One's increased power and the cloud allow them to do a solid 60fps instead of 30fps. Not that the cloud specifically allows them to reach that and the Xbox would chug along at 30fps without it. The whole quote reads like it is about the difference from this-gen to next-gen.

 

I'm not going to continue arguing with you though as it seems like an exercise in futility. I'll believe the cloud can double frame rates when Forza/Halo played with an Internet connection runs at 60fps and without one it runs at 30fps.



#54 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 11 July 2013 - 10:17

I know what offloading is. I have even stated you get more power available for other things as a result.
 

You are really basing the idea of doubling frame rates on one fairly unspecific quote from the Turn 10 Studios head talking to the Official Xbox Magazine? I find the information provided by Respawn and Ubisoft far more telling of what the cloud can do than one remark to a (naturally) biased media outlet.
 
I'm not sure the quote even says quite what you seem to think it does:
 

It is open to interpretation I guess, but to me it says the combined power of the Xbox One's increased power and the cloud allow them to do a solid 60fps instead of 30fps. Not that the cloud specifically allows them to reach that and the Xbox would chug along at 30fps without it. The whole quote reads like it is about the difference from this-gen to next-gen.

 

I'm not going to continue arguing with you though as it seems like an exercise in futility. I'll believe the cloud can double frame rates when Forza/Halo played with an Internet connection runs at 60fps and without one it runs at 30fps.

 


"At a level that most games would just be considering for 30."  In other words, I see it as: adding in the cloud offloads processes so that they can do things that most games couldn't do (i.e. running at 60 frames per second).

 

If you look at Halo for Xbox One, why would they do 60 frames per second for the first time?  I don't see many shooters on consoles doing 60 frames per second.  I know Killzone isn't 60 frames per second; they are doing 30.



#55 vetFourjays

Fourjays

    Neowinian Senior

  • Joined: 09-September 05
  • Location: Staffordshire, UK

Posted 11 July 2013 - 10:57

Sigh.

 

"At a level that most games would just be considering for 30."  In other words, I see it as: adding in the cloud offloads processes so that they can do things that most games couldn't do (i.e. running at 60 frames per second).

 

If you look at Halo for Xbox One, why would they do 60 frames per second for the first time?  I don't see many shooters on consoles doing 60 frames per second.  I know Killzone isn't 60 frames per second; they are doing 30.

You are picking up on the second part of the quote and not the first. The "but" joins the two parts, and the "also" makes the comment on the cloud connected to the first about the increased power of the Xbox One. Hence why it reads like a comment about Xbox One's capabilities as a whole over this/last-gen.

 

Probably because the Xbox One now has a whole boatload more graphical power? The Xbox One now has the same kind of graphics capabilities as a semi-decent PC, which have been capable of 60fps for some time. Hell, I've got a 4+ year old video card in my PC that could probably run at 60fps and wipe the floor with anything the PS3 or Xbox 360 could do. And the graphics going into the One/PS4 are much more powerful than that, so it isn't hard to see where the 60fps is coming from.



#56 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 11 July 2013 - 11:01

Sigh.

 

You are picking up on the second part of the quote and not the first. The "but" joins the two parts, and the "also" makes the comment on the cloud connected to the first about the increased power of the Xbox One. Hence why it reads like a comment about Xbox One's capabilities as a whole over this/last-gen.

 

Probably because the Xbox One now has a whole boatload more graphical power? The Xbox One now has the same kind of graphics capabilities as a semi-decent PC, which have been capable of 60fps for some time. Hell, I've got a 4+ year old video card in my PC that could probably run at 60fps and wipe the floor with anything the PS3 or Xbox 360 could do. And the graphics going into the One/PS4 are much more powerful than that, so it isn't hard to see where the 60fps is coming from.

You do know comparing consoles to PCs is like comparing chalk and cheese? Even though this gen is running x86.

 

Consoles have a fixed architecture and a fixed platform. Regarding development, developers only have to worry about how to use all the power in front of them. On PCs, they have to worry about supporting all kinds of architectures with different specs and OS overheads. So even though consoles don't have the raw power of PCs, the way they are built and developed for means they produce games that look just as good as high-end PCs of the time, at least to start off with. On a side note, I wish I had the spare money these days for a high-end PC; I miss PC gaming :(

 

Regarding the FPS argument with the cloud, I think he's referring to both as a combined platform. The thing is, offloading AI into the cloud is going to free up some real local power, which does help with optimizing and hitting the 60fps mark.



#57 OP Yogurtmaster

Yogurtmaster

    Neowinian

  • Tech Issues Solved: 1
  • Joined: 18-February 12

Posted 11 July 2013 - 11:33

Sigh.

 

You are picking up on the second part of the quote and not the first. The "but" joins the two parts, and the "also" makes the comment on the cloud connected to the first about the increased power of the Xbox One. Hence why it reads like a comment about Xbox One's capabilities as a whole over this/last-gen.

 

Probably because the Xbox One now has a whole boatload more graphical power? The Xbox One now has the same kind of graphics capabilities as a semi-decent PC, which have been capable of 60fps for some time. Hell, I've got a 4+ year old video card in my PC that could probably run at 60fps and wipe the floor with anything the PS3 or Xbox 360 could do. And the graphics going into the One/PS4 are much more powerful than that, so it isn't hard to see where the 60fps is coming from.

 

   Yeah, I am not buying that entire PC thing at all.  If that were true, then why are so many PS4 games at 30 frames per second?  Killzone should be running 60 easily.  Nope, not buying that.



#58 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 11 July 2013 - 12:51

   Yeah, I am not buying that entire PC thing at all.  If that is true then why are so many PS4 games at 30 Frames per second.  Killzone should be running 60 easily.  Nope not buying that.   

They can't make a good software stack to complement their hardware. Running a box with a modified version of FreeBSD is not a good way to go.

 

The Xbox One runs three VMs under a hypervisor which complement each other and are designed to work exactly in harmony with the hardware the software sits on. You have hardware compression/decompression to offload data compression (for data sent to the cloud) from the CPU. You've got memory buses to inject data from the cloud straight into the RAM. Just things like that make a huge difference.



#59 ahhell

ahhell

    Neowinian Senior

  • Joined: 30-June 03
  • Location: Winnipeg - coldest place on Earth - yeah

Posted 11 July 2013 - 13:09

   Yeah, I am not buying that entire PC thing at all.  If that is true then why are so many PS4 games at 30 Frames per second.  Killzone should be running 60 easily.  Nope not buying that.   

WTF?

KZ running at 30FPS was a decision made by the dev.  The KZ series had slower gameplay anyway.  Focusing on 30FPS for the first generation of games isn't that big a deal (unfamiliar hardware).  If they are still doing 30fps three years down the road, THAT is a problem.



#60 JonnyLH

JonnyLH

    I say things.

  • Joined: 15-February 13
  • Location: UK
  • OS: W8, W7, WP8, iOS, Ubuntu
  • Phone: Nokia Lumia 920

Posted 11 July 2013 - 13:39

WTF?

KZ running at 30FPS was a decision made by the dev.  The KZ series had slower game play anyway.  Focusing on 30FPS for the first gen of games isn't that big of deal (unfamiliar hardware).  If they are still doing 30fps, 3 years down the road THAT is a problem.

Unfortunately it's usually graphical fidelity that improves over the lifespan, not the fps. If devs target 30fps, it'll usually stay at that, and they'll target more fidelity in the game once they get more familiar.

 

Look at it this way: the graphics engine would have to tick over twice as fast to hit 60fps from 30, which means cutting the per-frame work by 50%. A very big task.
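That arithmetic, spelled out: doubling the frame rate halves the per-frame time budget, so per-frame work has to shrink by half.

```python
# Doubling the frame rate halves the per-frame time budget, so the
# per-frame workload must shrink by 50%.

budget_30 = 1000 / 30  # ~33.3 ms per frame at 30 fps
budget_60 = 1000 / 60  # ~16.7 ms per frame at 60 fps
reduction = 1 - budget_60 / budget_30

print(f"30 fps budget: {budget_30:.1f} ms")
print(f"60 fps budget: {budget_60:.1f} ms")
print(f"required cut in per-frame work: {reduction:.0%}")  # 50%
```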

 

If every game is hitting 60fps on the X1 with the same graphical fidelity, then the difference between the consoles will be insane and they'll have to do something about it.