Many people dismissing "the cloud's" capabilities are going to be really surprised over the next five years. It will evolve over time, and many games that were expected to run at only 30 frames per second will do 60. That is one thing Microsoft has over Sony at the moment. This is why I believe Halo for Xbox One will be 1080p and 60 frames per second: the processing is offloaded to the server, which allows the game to run faster. If Microsoft can get third parties on board with this, Sony could have a hard time with the same game running at 30 frames per second.
I still find these kinds of claims about cloud usage on the Xbox One extremely dubious. I'm not even sure Microsoft has made them directly. Do you have a source with detailed information on exactly what can be offloaded and how/why it would increase frame rates so much?
Maybe I am misunderstanding something, but the reason I find them dubious is that they don't make any sense for things like doubling frame rates (they do make sense for dedicated-server multiplayer and persistent worlds). Let's say you start offloading all your physics calculations to free up processing for a higher frame rate. First you are assuming the user has Internet, but let's say they do (it's the most likely scenario). You fire the gun, and the game asks the cloud to calculate the trajectory. It sends the data to the cloud and awaits a response. At 60 fps, the response would need to arrive within 16 ms (1 second / 60 frames) to make the next frame. You could skip a frame, but the response would still need to be back within 32 ms. So the first problem is that you need a very good Internet connection to get the data back quickly enough; that 32 ms has to cover the trip to the server, the calculation itself, and the trip back. In some areas this kind of response time may be possible (to be honest, I have no idea). What about those who don't have it? Or is the game fibre-only? Anyway, let's assume the user's Internet is super awesome.
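To put rough numbers on it, here's a back-of-the-envelope sketch; the round-trip figures are illustrative guesses on my part, not measurements:

```python
# Back-of-the-envelope: per-frame time budget vs. network round-trip time.
# The round-trip figures below are illustrative guesses, not measurements.

FPS = 60
frame_budget_ms = 1000 / FPS                # ~16.7 ms per frame at 60 fps
two_frame_budget_ms = 2 * frame_budget_ms   # ~33.3 ms if skipping one frame is tolerable

# Hypothetical round trips (console -> server -> console), ignoring the
# time the server spends actually doing the physics calculation:
round_trips_ms = {
    "same-city fibre": 10,
    "typical broadband": 40,
    "congested home network": 150,
}

for link, rtt in round_trips_ms.items():
    verdict = "fits" if rtt <= two_frame_budget_ms else "misses"
    print(f"{link}: {rtt} ms -> {verdict} the {two_frame_budget_ms:.0f} ms two-frame window")
```

Even the generous case leaves almost nothing in the budget for the server to actually compute anything.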
Oh no! Your wife/girlfriend/sister/mother/aunt/dog/cat starts watching a YouTube video, so the data takes longer to travel and/or the network struggles with the sudden increase in traffic. The game, expecting the data back within 16-32 ms, is instead waiting much longer. Now what does it do? Calculate the result itself? Then why would it go to all the effort of sending data across the Internet for an answer it had the power to produce locally in a fraction of the time? Do the graphics/frame rate randomly drop while it does the calculation itself? Or does it just keep waiting for a response, leaving your bullet in limbo? And what happens if, while it waits, a tank drives in front of your perfectly aligned sniper shot? It would need to ask the cloud for a new calculation, further increasing the time before it actually knows where the bullet is going. The way I understand it, at the very least you get all the current problems of multiplayer gaming (lag, hit detection, etc.) and more (as far as I know, existing multiplayer games only sync what's strictly necessary).
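Whatever the answer, the engine would need some kind of deadline-and-fallback logic, which is exactly what makes the scheme look pointless: if a local fallback has to exist anyway, the cloud path only ever adds latency. A minimal sketch of that dilemma (every name and timing here is a hypothetical placeholder, not any real console or server API):

```python
# Sketch of the deadline-and-fallback dilemma described above. Everything
# here (cloud_compute_trajectory, local_compute_trajectory, the 150 ms
# delay) is a hypothetical placeholder, not any real console or server API.
import concurrent.futures
import time

pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def cloud_compute_trajectory(shot):
    """Stand-in for a round trip to a remote server."""
    time.sleep(0.150)  # pretend the network is congested: 150 ms round trip
    return f"cloud-computed trajectory for {shot}"

def local_compute_trajectory(shot):
    """Stand-in for doing the same maths on the console itself."""
    return f"locally computed trajectory for {shot}"

def resolve_shot(shot, budget_s=0.032):  # two-frame budget at 60 fps
    future = pool.submit(cloud_compute_trajectory, shot)
    try:
        # Wait for the cloud, but only as long as the frame budget allows.
        return future.result(timeout=budget_s)
    except concurrent.futures.TimeoutError:
        # The response missed the window, so compute locally after all --
        # which begs the question of why the request was sent at all.
        return local_compute_trajectory(shot)

print(resolve_shot("sniper shot"))  # the slow "cloud" loses; the local result is used
```

And that's the simple case: it doesn't even cover the tank driving into the shot, which would invalidate the in-flight request entirely.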
As I said, I find these kinds of claims about the cloud extremely dubious, and I haven't seen them substantiated either.
The reason Sony are struggling to get 30 fps is likely down to software. Microsoft came from the PC world, and their consoles are, to my knowledge, based on the same age-old DirectX APIs, so Xbox devs will be fully familiar with them. Meanwhile, this is Sony's first foray into an x86 console, and the PS3 was an entirely different, weird and wonderful architecture. The API may have bugs in it, or may simply need to mature. Likewise, the developers are probably still ironing out issues in their own software from the move from PS3 to PS4. I'd be surprised if it isn't fixed by the time the console launches. It may very well have been fixed already by the time E3 was shown; I doubt either company brought the most up-to-date hardware and software.