Lol. To say that GDDR5 is wasted on a gaming machine, you've gotta be insane.
If the machine is designed to only play games up to 1080p and movies at 4K, then yes, GDDR5 is wasted. There aren't enough pixels to justify the bandwidth/fill rate. 1080p at 60 fps is only about 3 Gbps of raw output, and no game developer is wasting cycles redrawing every pixel of every scene from scratch; they're reusing textures and lighting already resident in memory or in the ROPs.
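For what it's worth, here's the back-of-the-envelope math behind that ~3 Gbps figure, as a quick sketch. It assumes 24-bit color and ignores overdraw, intermediate render targets, and texture fetches, so it's a floor on final scan-out, not a measure of total GPU memory traffic:

```python
# Back-of-the-envelope bandwidth for raw 1080p/60 output.
# Assumes 24 bits per pixel (8 bits each for R, G, B) and no
# overdraw or intermediate render passes -- a deliberate
# lower bound, not real GPU memory traffic.

WIDTH, HEIGHT = 1920, 1080
FPS = 60
BITS_PER_PIXEL = 24

bits_per_second = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
print(f"{bits_per_second / 1e9:.2f} Gbps")  # ~2.99 Gbps
```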
In a perfect world, maybe something like that might be feasible. But this ain't a perfect world. There are too many issues that'll stop that whole 'cloud computing' idea from making the Xbox One a more powerful machine.
Doesn't make sense. The problems with games aren't really CPU/GPU; it's the fact that they're largely scripted, process-driven, and event-based: you play one, you've played them all. One of the announced features was dynamic maps and dynamic multiplayer, so the worlds you play in would be different each time you play them. That's possible because of the cloud. That's the kind of stuff that will make gaming fun, if you ask me.
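To make the idea concrete, here's a minimal sketch of how cloud-driven dynamic maps could work: the server hands each session a fresh world seed (and could run heavier simulation itself), so the client generates a different world every time. The `fetch_world_seed` function and everything around it is made up for illustration; it is not a real Xbox Live or Azure API.

```python
import hashlib
import random

def fetch_world_seed(session_id: str) -> int:
    # Stand-in for an imaginary cloud call that would return a
    # fresh seed per session; derived locally here so the
    # example is self-contained. NOT a real console API.
    digest = hashlib.sha256(session_id.encode()).digest()
    return int.from_bytes(digest[:4], "big")

def generate_map(seed: int, size: int = 8) -> list[str]:
    """Deterministic map from a seed: same seed -> same map,
    different seed -> a different world each session."""
    rng = random.Random(seed)
    tiles = "~.^#"  # water, plains, hills, mountains
    return ["".join(rng.choice(tiles) for _ in range(size))
            for _ in range(size)]

if __name__ == "__main__":
    for session in ("session-1", "session-2"):
        seed = fetch_world_seed(session)
        print(f"{session} (seed {seed}):")
        print("\n".join(generate_map(seed)), "\n")
```

The point of the sketch is that the expensive or authoritative part (deciding the seed, or simulating the world) lives server-side, while the console only does the cheap deterministic generation.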
The graphics are already amazing, but again, come on, we're talking 1080p. HD is already said and done. What we're talking about now is more interactivity, more personalized experiences, and more in store. More WIN.