PS4 Architect Mark Cerny: Cloud won't work well to boost graphics



Plus one notch to the "Appeal to Authority" fallacy column.

 

Oohhh, so you get to play the insult card, and then when insulted yourself you try to play the fallacy card. I love it! Keep it up! Absolutely rich!

 

I'm not sure what your point is anymore. Are you really saying you know best and everyone must believe you, just because you found some video from Sony that supports your argument, when there is plenty of evidence across many different fields of computing showing the true power of "cloud" compute technology, regardless of your disdain for buzzwords?

 

Sony chose brute-force hardware, Microsoft chose their route. Let them duke it out. Your self-professed claim to fame, and the insults you throw out when anyone doubts your opinions as being fact, are hilarious.


Oohhh, so you get to play the insult card, and then when insulted yourself you try to play the fallacy card. I love it! Keep it up! Absolutely rich!

 

I didn't play any insult card; I quite simply (truthfully) pointed out your history of posting absolute nonsense. (I can dredge up the post where Brandon corrected you, if you wish.) Nor does it change that you did in fact employ an appeal-to-authority fallacy.

 

You still haven't gotten back to me with an actual technical argument by the way.

 

Edit:

 

I'm not sure what your point is anymore. Are you really saying you know best and everyone must believe you, just because you found some video from Sony that supports your argument, when there is plenty of evidence across many different fields of computing showing the true power of "cloud" compute technology, regardless of your disdain for buzzwords?

 

Sony chose brute-force hardware, Microsoft chose their route. Let them duke it out. Your self-professed claim to fame, and the insults you throw out when anyone doubts your opinions as being fact, are hilarious.

 

Am I saying I know best? No, I just know a thing or two about computer graphics and the kind of numbers involved in rendering a frame. I'm also well aware of how both Microsoft and Sony are happy to manipulate people into these pathetic PR wars where all that happens is people vomit buzzwords and PR links over each other.


I ignored it because your post is a load of rubbish. Objects outside of view are not rendered to begin with, and you can't pre-render them, because you don't know where and when an object will come into view, or whether its state will change. Read the Ars link.

 

 

You can believe what you want honestly, at the end of the day I get the last laugh when you're left disappointed because the PR didn't live up to reality. That and the whole PC gaming master race, yaknow?

 

You obviously don't understand what you read, and you comment as if you do. I specifically said the cloud stuff is not used for graphics rendering. When object bodies are out of view, physics calculations, for example, can tolerate latency because, guess what, they are not in view, so they're not essential to calculate in real time.

 

If I throw a ball, then turn around while it is in the air, the ball doesn't freeze until I turn back around. When I turn around, I expect the ball to be resting on the ground, after having bounced a couple of times.


So I won't deny that I am not fond of the XB1 and will always be a PlayStation guy at heart... but...

 

Does anyone here actually even develop applications? No? It's showing. Time for a little education...

 

The cloud can and will increase graphics fidelity, and the magic happens with PRECACHING resources (textures and meshes, for example). Say you are playing a game and you're in a dungeon. It has two exits: one to another room, the other to the overworld. A developer like me (in the world of the cloud) can ask the cloud for resources within the sync distance of the playing entity's observer radius. This isn't requested the moment you enter a room; it is requested ahead of time. Wonder why these consoles have a big RAM increase over last gen? It's to keep this in memory, ready to send off to the GPU during the render loop.

 

The only problem comes down to potential sync issues. If your connection isn't fast enough, you may not get the resource in time. If that happens, the console's GPU is used as a fallback. Also keep in mind that the console's GPU will not be put to waste: we can save a lot of time generating meshes (for example) for the GPU by having the cloud give us this. This means the console's GPU can instead be used for dynamics such as particles and physics. That in itself will give the impression of "better graphics". A great example is the sheer difference between Borderlands 2 on Low and Borderlands 2 on Ultra (with Ultra PhysX). Looks completely different.

 

tl;dr The cloud will send textures/meshes "ahead of time" (precaching) for the console to store in RAM or persist on the HDD, and the console's GPU will be delegated as a fallback and dynamics component.
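As a rough illustration of that precache-then-fallback idea (every class and method name here is hypothetical, not any real console SDK):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/** Hypothetical sketch: cloud-delivered resources cached in RAM, local fallback. */
public class PrecacheSketch {
    private final Map<String, byte[]> ramCache = new ConcurrentHashMap<>();

    /** Called ahead of time for resources near the player (e.g. adjacent rooms). */
    public void prefetch(String resourceId, byte[] cloudData) {
        ramCache.putIfAbsent(resourceId, cloudData);
    }

    /** At render time: use the precached copy, else generate on the console. */
    public byte[] resolve(String resourceId) {
        byte[] cached = ramCache.get(resourceId);
        return cached != null ? cached : generateLocally(resourceId);
    }

    private byte[] generateLocally(String resourceId) {
        // Stand-in for console-side mesh/texture generation (the fallback path)
        return new byte[0];
    }
}
```

The point of the sketch is only the control flow: the render loop never waits on the network, it just checks the cache and falls back.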


You obviously don't understand what you read, and you comment as if you do. I specifically said the cloud stuff is not used for graphics rendering. When object bodies are out of view, physics calculations, for example, can tolerate latency because, guess what, they are not in view, so they're not essential to calculate in real time.

 

If I throw a ball, then turn around while it is in the air, the ball doesn't freeze until I turn back around. When I turn around, I expect the ball to be resting on the ground, after having bounced a couple of times.

 

You said nothing about physics at all, you simply talked about an object out of view. To quote:

 

I don't know how much clearer this can be explained. Also, about latency: stuff that's not in view can tolerate latency, as it can be processed in its own sweet time, and the Xbox can just update it so it's ready when we view it.

 

For someone that talked about "waste" when it comes to the cloud earlier, your physics example is rather wasteful. If the simulation is non-essential, it would be more economical to boot it to a low-priority thread.


You said nothing about physics at all, you simply talked about an object out of view. To quote:

 

 

For someone that talked about "waste" when it comes to the cloud earlier, your physics example is rather wasteful. If the simulation is non-essential, it would be more economical to boot it to a low-priority thread.

 

 

 

 

Here, I'll simplify it for some folks, because people are mistaking graphics processing in the cloud vs. compute processing in the cloud.

 

It couldn't have been clearer than this. Just because I didn't say the word physics doesn't mean I wasn't talking about physics. Someone like you, who speaks like they have a clue about how this stuff works, would have picked it up instantly.

 

And you just proved what I've been saying all along: that you don't understand the purpose of the cloud, yet you speak like you do. Yes, a simple thrown ball would go in a low-priority thread, but these games have complex simulations that demand serious power. Offloading such simulations to the cloud frees resources and enables the hardware to be used more for graphics. Any time the GPU/CPU are involved in simulations, resources are taken away from graphics.


You obviously don't understand what you read, and you comment as if you do. I specifically said the cloud stuff is not used for graphics rendering. When object bodies are out of view, physics calculations, for example, can tolerate latency because, guess what, they are not in view, so they're not essential to calculate in real time.

 

If I throw a ball, then turn around while it is in the air, the ball doesn't freeze until I turn back around. When I turn around, I expect the ball to be resting on the ground, after having bounced a couple of times.

Erm... if it is out of the sync range of an entity which can observe, it almost always isn't simulated.

 

Why would you simulate what isn't being observed? I am guessing you folks are confusing camera radius vs. sync radius.

 

Rendering culls based on the current view and projection of the camera, calculated as a frustum. It's the same concept as the sync distance, but some games will cull more than others (the more you cull, the less it renders and the more FPS you get). The goal is a smooth player experience; you should be able to turn around and see as far as you can see without noticing any rendering occurring. Depending on your meshes, textures, other rendering effects, and target hardware optimizations, this either happens seamlessly, or you see the mesh with "cloudy textures" before it finishes rendering, as objects are prioritized front to back.

 

Sync is different because it doesn't matter where you are looking. So long as it is within my "sync radius", the game will synchronize updates to me (typically through some NetworkSynchronizer system). You need only make sure that your sync radius is optimized.
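The distinction can be sketched in a few lines: the sync check is a plain distance test around the observer, with no dependence on where the camera points (all names here are illustrative only):

```java
/**
 * Illustrative sketch: sync is a pure distance test around the observer,
 * unlike frustum culling, which depends on the camera's view direction.
 */
public class SyncCheck {
    /** True if the entity at (ex, ey) is inside the observer's sync radius. */
    public static boolean inSyncRadius(double ex, double ey,
                                       double ox, double oy, double radius) {
        double dx = ex - ox;
        double dy = ey - oy;
        // Compare squared distances to avoid the square root
        return dx * dx + dy * dy <= radius * radius;
    }
}
```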

 

EDIT

 

As for offloading physics calculations to the cloud, you can only reliably do it for predetermined dynamics (precaching, like I mentioned in a prior post). Offloading physics calculations and awaiting the results is impossible if you expect a real-time response.


Erm... if it is out of the sync range of an entity which can observe, it almost always isn't simulated.

 

Why would you simulate what isn't being observed? I am guessing you folks are confusing camera radius vs. sync radius.

 

Rendering culls based on the current view and projection of the camera, calculated as a frustum. It's the same concept as the sync distance, but some games will cull more than others (the more you cull, the less it renders and the more FPS you get). The goal is a smooth player experience; you should be able to turn around and see as far as you can see without noticing any rendering occurring. Depending on your meshes, textures, other rendering effects, and target hardware optimizations, this either happens seamlessly, or you see the mesh with "cloudy textures" before it finishes rendering, as objects are prioritized front to back.

 

Sync is different because it doesn't matter where you are looking. So long as it is within my "sync radius", the game will synchronize updates to me (typically through some NetworkSynchronizer system). You need only make sure that your sync radius is optimized.

 

EDIT

 

As for offloading physics calculations to the cloud, you can only reliably do it for predetermined dynamics (precaching, like I mentioned in a prior post). Offloading physics calculations and awaiting the results is impossible if you expect a real-time response.

 

For the last time, I'm not talking about graphics simulations or rendering, and I sure as hell am not talking about realtime physics calculations either. I made this very clear many times. People need to read. That example of the ball was about its physics, not rendering it graphically. I will calculate its physics, so when I turn back around, I can render its final position when it's in view.


You said nothing about physics at all, you simply talked about an object out of view. To quote:

 

 

For someone that talked about "waste" when it comes to the cloud earlier, your physics example is rather wasteful. If the simulation is non-essential, it would be more economical to boot it to a low-priority thread.

 

Low-priority threads still have to be executed; they're still there on the CPU. If you don't do it, the ball will just hang there in the air until you observe it again...


For the last time, I'm not talking about graphics simulations or rendering, and I sure as hell am not talking about realtime physics calculations either. I made this very clear many times. People need to read. That example of the ball was about its physics, not rendering it graphically. I will calculate its physics, so when I turn back around, I can render its final position when it's in view.

If an object is within sync and has physics, it will always be done in realtime. Attempting to prioritize physics based on view is too expensive, as you must trace from the camera to the shape of the entity and determine whether any other object obstructs the view 100%.

 

One of the pros of the cloud is removing the barrier known as "sync realtime" (which is what I think you are trying to explain, vcfan). Locally, simulation should occur within the observer's sync radius (and rendering done based on camera matrices). With the cloud, you could have anything outside the sync radius still be simulated. Once the entity comes into an observer's sync radius, you could remove the cloud's network updates sync'ing that entity to that observer.

 

Should be fully possible and would be pretty cool.


What's this, one of the two most spammy PS fans posting more anti-Xbox FUD? Give me a break...

 

Geez, I don't think anybody ever said that the 'cloud' would improve graphics. They are looking to offload non-time-sensitive operations to the cloud, stuff which can tolerate a couple hundred milliseconds of latency. So we're looking at things like physics, AI, and possibly lighting, to free up local resources for more time-sensitive work.

 

Take, for example, the Drivatars in Forza: AI based upon real users' driving, computed in the cloud. Perfect example. Anything to do with graphics? No. A benefit to the game? Sure is.

 

Wow, people are so quick to jump on the anti-Xbox bandwagon. It is just boring now that the same people continually fail to grasp the concept they are trying to slate and just end up looking stupid.


If an object is within sync and has physics, it will always be done in realtime. Attempting to prioritize physics based on view is too expensive, as you must trace from the camera to the shape of the entity and determine whether any other object obstructs the view 100%.

 

One of the pros of the cloud is removing the barrier known as "sync realtime" (which is what I think you are trying to explain, vcfan). Locally, simulation should occur within the observer's sync radius (and rendering done based on camera matrices). With the cloud, you could have anything outside the sync radius still be simulated. Once the entity comes into an observer's sync radius, you could remove the cloud's network updates sync'ing that entity to that observer.

 

Should be fully possible and would be pretty cool.

 

Yes, that's exactly the concept I'm trying to explain, but I'm trying to simplify it so people can understand it; that's why I used the ball example. Now, how it's done internally, that I don't know, but of course a "simulate locally, or through the network" method based on sync radius sounds much more efficient than trying to hack together a method based on view, like you said.


Low-priority threads still have to be executed; they're still there on the CPU. If you don't do it, the ball will just hang there in the air until you observe it again...

Even better is the fact that with the cloud, there is no need for thread delegation of priorities (in terms of physics).

 

Treat all simulations within sync range of an observer as standard priority, and any entity that goes outside that range enters "cloud sync", where the cloud takes over and does the simulations.

/**
 * Example in Java
 *
 * This class is attached to any Entity (using a Component design) which will incur network updates from the cloud
 */
public class CloudNetworkComponent extends NetworkComponent {
    private final List<Entity> observingEntities = new ArrayList<>();

    /**
     * Callback which occurs when an Entity with the Observer component observes the owner of this component.
     */
    @Override
    public void onObserved(EntityObserveEvent event) {
        // Already force-sync'd from the cloud
        if (observingEntities.contains(event.getEntity())) {
            return;
        }

        final PlayerComponent player = event.getEntity().get(PlayerComponent.class);
        if (player != null) {
            player.getNetworkSynchronizer().sync(getOwner());
        }
        observingEntities.add(event.getEntity());
    }

    @Override
    public void onCloudSync(CloudEntitySyncEvent event) {
        // Transform is position, rotation, and scale
        getOwner().getPhysics().setTransform(event.getTransform());
    }

    public final void cloudSync(CloudEntitySyncEvent event) {
        // You could even upload the position of the entity that your console figured out
        // back to the cloud, so the cloud can share it with others :).
        if (!observingEntities.isEmpty()) {
            onCloudSync(event);
        }
    }
}

A basic code example. 


I think cloud computing can solve the nonsensical "rubber-band AI" in Need for Speed-type games. When you overtake an AI driver, you should actually be able to leave them far behind if their car is underpowered; not like you are driving a Veyron vs. the AI's Punto and at the end of the race the Punto is just 5 seconds behind you.

Some proper calculations could be made by the cloud when the cars are far behind. But if they are very near, the calculations are again entrusted to the local console. So when you are disconnected from the Internet, dynamic AI is disabled.

Isn't this exactly the way the cloud is supposed to work?


Yeah, you kinda do need to, because it would give you context. Your post makes no sense and almost seems to imply the 360 lacks internet connectivity.

 

Are you seriously going to act this clueless? I even separated the sentences into different paragraphs. You make it seem like, just "because why not?", Microsoft can implement cloud capabilities on any device (in this case the Xbox 360). You obviously didn't see anything regarding research, funding, or implementation in my first post, right? I suppose that makes ZERO sense. :rolleyes:

 

And I can't even begin to understand how hard it is to grasp quite a simple analogy (not sure how you even came to the conclusion that I suggested the 360 lacks Internet connectivity). Seriously, you just drift apart.

 

Read my post on the previous page, #5, and reply back. It's convenient that you ignore whatever you want to ignore.

 

I ignored it because your post is a load of rubbish

 

Why do you even come and "try" to have a discussion? Start a blog where you can write your facts, since apparently everyone else just writes rubbish, illogical, incomprehensible stuff, and everyone's wrong except you.

 

Maybe Microsoft should hold off on expanding their cloud datacenters; after all, it's just a waste of money!


I think cloud computing can solve the nonsensical "rubber-band AI" in Need for Speed-type games. When you overtake an AI driver, you should actually be able to leave them far behind if their car is underpowered; not like you are driving a Veyron vs. the AI's Punto and at the end of the race the Punto is just 5 seconds behind you.

Some proper calculations could be made by the cloud when the cars are far behind. But if they are very near, the calculations are again entrusted to the local console. So when you are disconnected from the Internet, dynamic AI is disabled.

Isn't this exactly the way the cloud is supposed to work?

 

Pretty much; I described another angle above, but the premise is the same. No matter what: if a car (entity) is in range of an observer (you, the player), your local console does the simulation. Otherwise, the cloud could do it.

 

This is regardless of internet status.


It couldn't have been clearer than this. Just because I didn't say the word physics doesn't mean I wasn't talking about physics. Someone like you, who speaks like they have a clue about how this stuff works, would have picked it up instantly.

 

And you just proved what I've been saying all along: that you don't understand the purpose of the cloud, yet you speak like you do. Yes, a simple thrown ball would go in a low-priority thread, but these games have complex simulations that demand serious power. Offloading such simulations to the cloud frees resources and enables the hardware to be used more for graphics. Any time the GPU/CPU are involved in simulations, resources are taken away from graphics.

 

You do understand there is more to an engine than rendering and physics, right? If you can talk specifics, then let's talk specifics. Let's not beat around the bush with vague statements, buzzwords, and PR speak. Unless, of course, you're being intentionally vague so you can weasel your way out of compromising statements.

 

It's also worth pointing out that your arguments support the earlier notion of using such offloading with the 360, which you called a "waste of resources". When, arguably, the 360 needs it far more than the One does.

 

So which is it? Is it a waste of resources or is it a viable approach? It can't be both.

 

Low priority threads still have to be done, they're still there on the CPU . if you don't do it the ball will just hang there in the air until you observe it again...

 

They do still have to be executed on the CPU, yes, but at the same time the CPU will not always be at 100% utilisation. Physics on a per-object basis aren't expensive either, especially if we stick to the "ball falling to the floor" narrative.
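The "boot it to a low-priority thread" suggestion is plain Java threading; a minimal sketch, where the physics step itself is just a placeholder Runnable:

```java
/** Sketch: run a non-essential simulation step on a low-priority daemon thread. */
public class BackgroundPhysics {
    public static Thread startLowPriority(Runnable simulationStep) {
        Thread t = new Thread(simulationStep, "offscreen-physics");
        t.setPriority(Thread.MIN_PRIORITY); // scheduled only when the CPU has slack
        t.setDaemon(true);                  // never keeps the game process alive
        t.start();
        return t;
    }
}
```

Thread priority is only a hint to the scheduler, but that is exactly the point being argued: the work still happens on the CPU, just when there is spare capacity.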

 

Are you seriously going to act this clueless? I even separated the sentences into different paragraphs. You make it seem like, just "because why not?", Microsoft can implement cloud capabilities on any device (in this case the Xbox 360). You obviously didn't see anything regarding research, funding, or implementation in my first post, right? I suppose that makes ZERO sense. :rolleyes:

 

Research, funding, and implementation for what? Are you saying R&D is required to connect, send, wait, and receive data from a client to a "cloud" server cluster?

 

And as other people keep mentioning, Azure already exists. So really, yes, your post does make zero sense, as you're acting like there is some sort of R&D required but you can't actually name what it constitutes.


Research, funding, and implementation for what? Are you saying R&D is required to connect, send, wait, and receive data from a client to a "cloud" server cluster?

 

And as other people keep mentioning, Azure already exists. So really, yes, your post does make zero sense, as you're acting like there is some sort of R&D required but you can't actually name what it constitutes.

 

Are you for real? Of course there is R&D involved in getting it all working together. It's not ######ing magic. You make it sound like MS has a Go-Go-Gadget-Cloud switch. :s


Here's what I'll say...

 

  • Cloud resources are highly unlikely to be useful for offloading any existing graphics work from the console.
    HOWEVER...
  • Offloading things like multiplayer game servers to the cloud frees up resources for graphics and other local tasks. This is the most direct way in which cloud services are likely to improve the graphical experience of the console.
  • Cloud services can enrich games in other ways, such as:
    • Leaderboards / stats
    • Persistent worlds
    • Content that evolves and is expanded over time by the developer
  • Clever developers will find cool new ways to make use of it. I wouldn't rule out someone coming up with a way to enrich the graphical experience using cloud resources...

You do understand there is more to an engine than rendering and physics, right? If you can talk specifics, then let's talk specifics. Let's not beat around the bush with vague statements, buzzwords, and PR speak. Unless, of course, you're being intentionally vague so you can weasel your way out of compromising statements.

 

It's also worth pointing out that your arguments support the earlier notion of using such offloading with the 360, which you called a "waste of resources". When, arguably, the 360 needs it far more than the One does.

 

So which is it? Is it a waste of resources or is it a viable approach? It can't be both.

 

 

They do still have to be executed on the CPU, yes, but at the same time the CPU will not always be at 100% utilisation. Physics on a per-object basis aren't expensive either, especially if we stick to the "ball falling to the floor" narrative.

 

 

Research, funding, and implementation for what? Are you saying R&D is required to connect, send, wait, and receive data from a client to a "cloud" server cluster?

 

And as other people keep mentioning, Azure already exists. So really, yes, your post does make zero sense, as you're acting like there is some sort of R&D required but you can't actually name what it constitutes.

You are free to respond to my posts on the matter. I would be glad to discuss it further :)


Are you for real? Of course there is R&D involved in getting it all working together. It's not ****ing magic. You make it sound like MS has a Go-Go-Gadget-Cloud switch. :s

 

Name it then. Microsoft has already invested the R&D in creating their Azure platform, and the Internet already exists. So what's different?

 

At the end of the day, cloud computing is fundamentally the same whatever the context: you have a workload, you send or allocate the workload, Azure processes it, and you get a result or service back.
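Stripped of platform specifics, that round trip is just a function applied across a network boundary; a toy sketch (the "cloud" here is any callback, and Azure itself is not modeled):

```java
import java.util.function.Function;

/** Toy sketch of the generic offload round trip: workload in, result back. */
public class CloudRoundTrip {
    public static <W, R> R offload(W workload, Function<W, R> cloud) {
        // In reality this is a network request/response; here the "cloud"
        // is simply any function that turns a workload into a result
        return cloud.apply(workload);
    }
}
```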

 

Edit:

You are free to respond to my posts on the matter. I would be glad to discuss it further :)

 

You've clarified a few things far more eloquently than I have; I get the feeling, though, that what you've said so far hasn't quite sunk in with certain posters.

 

I'll have a response and a couple of questions later, doing some research first.


 

Here's what I'll say...

 

  • Cloud resources are highly unlikely to be useful for offloading any existing graphics work from the console.

    HOWEVER...

  • Offloading things like multiplayer game servers to the cloud frees up resources for graphics and other local tasks. This is the most direct way in which cloud services are likely to improve the graphical experience of the console.
  • Cloud services can enrich games in other ways, such as:
    • Leaderboards / stats
    • Persistent worlds
    • Content that evolves and is expanded over time by the developer
  • Clever developers will find cool new ways to make use of it. I wouldn't rule out someone coming up with a way to enrich the graphical experience using cloud resources...

 

Existing graphics work, as in realtime (in sync range)? Then I agree. You can use prediction and precache resources from the cloud. Read my earlier posts with a short example.


Research, funding, and implementation for what? Are you saying R&D is required to connect, send, wait, and receive data from a client to a "cloud" server cluster?

 

And as other people keep mentioning, Azure already exists. So really, yes, your post does make zero sense, as you're acting like there is some sort of R&D required but you can't actually name what it constitutes.

 

Hahaha oh wow... exactly why I added that last sentence in my first post. Incredible.

 

Yes, I am pretty sure it's not just a matter of "connect, send, wait and receive data".


It's funny to me how people argue over this. My opinion is this: we'll see at the end of the day whether you notice any difference at all between cloud and non-cloud games. I doubt we will, personally.


Name it then. Microsoft has already invested the R&D in creating their Azure platform, and the Internet already exists. So what's different?

 

At the end of the day, fundamentally cloud computing is the same whatever the context. You have a workload, you send or allocated the workload, Azure processes it and you get a result/service back.

 

I'm not a dev, so I can't give you specifics, but using any inkling of ######ing logic SHOULD tell you that you have to come up with SOMETHING to go from point A (Xbox One) to point B (Azure cloud) and back. Hence: RESEARCH AND DEVELOPMENT.

Good god, man.

