RUMOR: Microsoft Underclocking Xbox One By 100-200MHz



I keep seeing people make these claims, but I've yet to see an actual use case laid out that isn't just "because we can".

It was stated by someone from MS, not a random no-name source. Go look it up.

As for implementation, the article clearly states a few use cases that can be (and will be) used on the Xbox One in the future.

Because we can? No one can do this without lots of resources (read: money), and no one with access to that much money is going to waste it just because they can. This is something you will start to see once the Xbox One sees mass adoption.

Link to comment
Share on other sites

It's worth noting that a weaker console doesn't necessarily mean it will be a failure as a product. Generally speaking, the weaker console has always been the more successful one.

That said, if it means games like BF4 need to cut their frame rate in half, then for a lot of core gamers that's a pretty big deal.

Link to comment
Share on other sites

*** IF *** it's true, it may also be so they can get higher chip yields in manufacturing.

I feel the Xbox One and PS4 are so similar in hardware that the gap won't really matter, but it will be used by fanboys to attack each other.

You'll have under-optimized games on one system, causing them to look better on the other when they shouldn't.

You'll have highly optimized exclusives, tuned for the system they're on, that will look amazing.

As a photographer, I can say that Canon likes making APS-C sensor cameras (Google it if you don't know) because they can get more sensors per wafer. I have a Canon 5D Mark II, which has a full-frame sensor, and it is more expensive because they don't get as many sensors per wafer... and that's aside from the new features, etc.

But to underclock the CPU?... lol, priceless. If I pay for a 2.5GHz processor, I want the whole enchilada.

Link to comment
Share on other sites

As for implementation, the article clearly states a few use cases that can be (and will be) used on the Xbox One in the future.

Because we can? No one can do this without lots of resources (read: money), and no one with access to that much money is going to waste it just because they can. This is something you will start to see once the Xbox One sees mass adoption.

What article is this?

Sorry, but I'm past believing that publicly traded companies are remotely smart enough not to waste vast sums of money on something they think is a good idea, especially when that idea has great potential for lock-in.

Link to comment
Share on other sites

About a month ago I was told by an unnamed source that Microsoft was having heating issues with the Xbox One. There weren't enough details to go forward with a story, so it was kept under wraps until more information surfaced. Today, a completely different source close to the Xbox One project informed me that Microsoft will have to underclock the Xbox One by about 100-200MHz to fix the system's heating issues.

The implication that Microsoft is only now finalizing hardware is a bit naive. It would definitely be too late to tell console developers, "sorry guys, but we need to sap around 10-15% of the CPU."

Launch titles are never as optimized as later games because developers do not yet have a full grasp of the actual hardware's capabilities, and taking away 800 MHz to 1.6 GHz of aggregate clock speed across the eight cores would be a nightmare for developers this late in development.
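
For scale, a back-of-the-envelope check of that aggregate figure; a minimal sketch assuming the rumored 1.6 GHz base clock for the eight Jaguar cores (rumored numbers, not confirmed specs):

```python
# Rough arithmetic behind the "800 MHz to 1.6 GHz across the eight cores" claim.
CORES = 8
BASE_CLOCK_MHZ = 1600  # rumored per-core clock (assumption, not confirmed)

for cut_mhz in (100, 200):  # the rumored underclock range
    aggregate_mhz = cut_mhz * CORES           # total MHz lost across all cores
    percent = cut_mhz / BASE_CLOCK_MHZ * 100  # share of each core's clock lost
    print(f"-{cut_mhz} MHz/core -> {aggregate_mhz / 1000:.1f} GHz aggregate, "
          f"{percent:.1f}% of each core's clock")
# -100 MHz/core -> 0.8 GHz aggregate, 6.2% of each core's clock
# -200 MHz/core -> 1.6 GHz aggregate, 12.5% of each core's clock
```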

CU count is quantifiable; the cloud is nebulous. If you know the number of stream processors, you can extrapolate the number of CUs, and since we know both are using AMD's GCN architecture, you can do a very real baseline comparison.
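
A minimal sketch of that extrapolation, using GCN's 64 stream processors per CU and the widely reported pre-launch shader counts (treat those counts as rumored figures):

```python
# GCN groups 64 stream processors into one compute unit (CU),
# so the CU count falls straight out of the shader count.
SP_PER_CU = 64

# Widely reported (pre-launch, unconfirmed) shader counts.
shader_counts = {"PS4": 1152, "Xbox One": 768}

for console, sps in shader_counts.items():
    print(f"{console}: {sps} stream processors -> {sps // SP_PER_CU} CUs")
# PS4: 1152 stream processors -> 18 CUs
# Xbox One: 768 stream processors -> 12 CUs
```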

The 300,000-server number is silly; I'd wager they are virtual servers running on a smaller amount of actual hardware.

I do not doubt the physical 300K count, but I suspect those machines are probably shared with other Azure services. On top of that, we can comfortably assume that they are split into weaker VMs, like every cloud service (they even mention VMs as the way their cloud architecture is set up for off-console rendering).
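
To illustrate how much the consolidation ratio matters to that argument, a quick sketch with purely hypothetical VMs-per-host figures (none of these are confirmed Azure numbers):

```python
# If the 300K figure counts VMs rather than physical boxes, the hardware
# footprint depends entirely on the consolidation ratio.
TOTAL_VMS = 300_000

for vms_per_host in (4, 8, 16):            # hypothetical consolidation ratios
    hosts = -(-TOTAL_VMS // vms_per_host)  # ceiling division
    print(f"{vms_per_host} VMs/host -> {hosts:,} physical servers")
# 4 VMs/host -> 75,000 physical servers
# 8 VMs/host -> 37,500 physical servers
# 16 VMs/host -> 18,750 physical servers
```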

Link to comment
Share on other sites

What article is this?

Sorry, but I'm past believing that publicly traded companies are remotely smart enough not to waste vast sums of money on something they think is a good idea, especially when that idea has great potential for lock-in.

http://www.tomshardware.com/news/Xbox-One-Cloud-Jeff-Henshaw-Matt-Booty-Adam-Pollington,22775.html#xtor=RSS-181

Link to comment
Share on other sites

Doesn't surprise me; rumours are they're having yield issues with the ESRAM as well.

This is what I've been saying all along, but apparently to some people I'm a fanboy and an idiot who knows nothing, because "Microsoft will have modified the GPU to plug the massive performance gap".

Every bad rumor about the Xbox One, you cheerlead... get a life, dude. LOL

Link to comment
Share on other sites

I'm a bit confused. Does it have DX11, or something equivalent? Or is the PS4 running Windows? Regardless, if they don't have DX11, that is a pretty big standard they'll be missing, although I don't think that will necessarily be a big hit depending on what they are using.

The GPU supports DirectX, but Sony can't implement it (and doesn't have to) because the PS4 doesn't run Windows (nor would Sony ask Microsoft for help). For PS4 development, developers will most likely be able to use OpenGL and a newer version of libGCM (Sony's own low-level graphics API with its own additions). Those give developers really low-level access to the GPU, letting them skip the overhead and squeeze out as much performance as possible. Microsoft is very unlikely to do this, as they have always required Xbox developers to strictly adhere to what is possible in DirectX. DirectX makes development easier but can really hurt performance in some scenarios (because of the overhead).

It'll take PS4 exclusives to really bring out that power, but it's available. If Microsoft doesn't give developers more possibilities, the Xbox One will most likely be the slower console this generation. If the PS4 allows the use of OpenGL (which is highly likely), it also means that porting games requires minimal effort. OpenGL is available on pretty much every platform (Windows, Linux, OSX, Android, PS3, ...) besides the Xbox, so that's quite convenient.

I've never owned a console and am not really a gamer (the last game I purchased, Assassin's Creed 3, is still lying around unopened) but just from a computing point of view there is absolutely no doubt that the PS4 will be faster in almost every scenario. It has faster memory, higher clocks, more shaders and lower-level access to the hardware. Pretty simple.
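
On the "faster memory" point, a rough sketch using the widely reported pre-launch specs: a 256-bit bus on both machines, GDDR5-5500 on the PS4 and DDR3-2133 on the Xbox One. Treat the figures as rumored, not confirmed:

```python
# Peak main-memory bandwidth = effective transfer rate x bus width in bytes.
def bandwidth_gbs(effective_mt_s, bus_bits=256):
    """Peak bandwidth in GB/s from mega-transfers/s and bus width."""
    return effective_mt_s * 1e6 * (bus_bits / 8) / 1e9

print(f"PS4 GDDR5-5500: {bandwidth_gbs(5500):.0f} GB/s")  # ~176 GB/s
print(f"XB1 DDR3-2133:  {bandwidth_gbs(2133):.0f} GB/s")  # ~68 GB/s
# The Xbox One's small ESRAM pool (reported ~102 GB/s) is meant to bridge
# part of that gap for framebuffer-sized working sets.
```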

Link to comment
Share on other sites

I'm a bit confused. Does it have DX11, or something equivalent? Or is the PS4 running Windows? Regardless, if they don't have DX11, that is a pretty big standard they'll be missing, although I don't think that will necessarily be a big hit depending on what they are using.

DirectX is a Microsoft-owned set of APIs. The GPUs in both are essentially the same, but obviously Sony doesn't have access to the DX APIs. The other (main) standard is OpenGL. This is no different from the situation in the current consoles, or from using the same graphics card in Windows and OSX.

Link to comment
Share on other sites

Every bad rumor about the Xbox One, you cheerlead... get a life, dude. LOL

Lol, that's why I said Sony should be paying him, because if he's not getting paid for all this "work" he's doing, then I don't know what to call him. Lol.

So, the tech has zero application to realtime computation and is purely targeted at pre-computable workloads.

Again, why not just pre-compute?

I kept hearing Microsoft talk about persistent worlds when talking about cloud computing, so we'll see what they had in mind soon enough.

Link to comment
Share on other sites

It's been said a few times: the 300k is just for Xbox Live servers. It doesn't take into account the already massive Azure datacenters they have, which developers can also take advantage of and use as they like. If you know anything about how Azure works, you know it's not really a simple VM server image; it's far more dynamic in the way it does things. This means a developer doesn't have to keep managing and paying for the large server capacity a game needed when it initially shipped and demand was high. Since it can scale up and down as needed, the costs will match: 2-3 years after a game is out, when demand is a third of what it was, the Azure costs will have dropped to the point that developers/publishers can just let the service run, probably covering the costs by selling in-game content, depending on the type of game.
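
A minimal sketch of that scale-with-demand idea; the per-instance capacity and hourly cost below are purely hypothetical:

```python
# Server count (and therefore cost) tracks the player population.
import math

PLAYERS_PER_INSTANCE = 500     # hypothetical capacity of one VM
COST_PER_INSTANCE_HOUR = 0.50  # hypothetical $/hour

def instances_needed(concurrent_players):
    return max(1, math.ceil(concurrent_players / PLAYERS_PER_INSTANCE))

for label, players in [("launch week", 100_000), ("year 3", 33_000)]:
    n = instances_needed(players)
    print(f"{label}: {players:,} players -> {n} instances "
          f"(${n * COST_PER_INSTANCE_HOUR:.2f}/hour)")
# launch week: 100,000 players -> 200 instances ($100.00/hour)
# year 3: 33,000 players -> 66 instances ($33.00/hour)
```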

Link to comment
Share on other sites

I'm just struggling to understand how 100-200MHz on such a chip would be the difference between heating issues and running absolutely fine, because surely after extended use, or once dust starts to build up, the issues would resurface. I'm not going to believe this one just yet.

Link to comment
Share on other sites

I'm just struggling to understand how 100-200MHz on such a chip would be the difference between heating issues and running absolutely fine, because surely after extended use, or once dust starts to build up, the issues would resurface. I'm not going to believe this one just yet.

It wouldn't, nor was CPU overheating ever the issue with the old Xboxes, and considering this is a newer processor made by AMD, heat dissipation should be even easier this time around.

So, that's why it's a rumor, because it's probably nonsense.
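
One way to see why a 100-200 MHz cut buys so little: dynamic power scales roughly with C·V²·f, so a frequency drop alone trims only a few percent unless the voltage comes down too. A sketch with illustrative numbers (not real Xbox One figures):

```python
# Dynamic power scales roughly as P ~ C * V^2 * f.
BASE_F = 1.6e9  # assumed base clock in Hz (rumored figure)
BASE_V = 1.0    # normalized supply voltage

def relative_power(f_hz, v):
    """Dynamic power relative to the baseline clock and voltage."""
    return (f_hz / BASE_F) * (v / BASE_V) ** 2

print(f"1.5 GHz, same voltage:  {relative_power(1.5e9, 1.00):.0%} of base power")
print(f"1.4 GHz, same voltage:  {relative_power(1.4e9, 1.00):.0%} of base power")
print(f"1.4 GHz at 0.95x volts: {relative_power(1.4e9, 0.95):.0%} of base power")
# ~94%, ~88%, and ~79% respectively: a small win unless voltage drops as well
```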

Link to comment
Share on other sites

If it's not meant to help out the rendering, then people should stop trying to say that it's going to magically make up for the big difference in power between the two consoles when the big difference is in the rendering performance.

GPUs today do a lot of stuff that takes a lot of power and doesn't need to be done fully in real time or synchronously. When you take all those things away from the GPU and put them on a cloud computing platform instead, then guess what: a lot of GPU resources that would otherwise be tied up become available for increasing graphics performance and quality. It's quite simple, really.
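
A minimal sketch of that bookkeeping: tag each chunk of GPU work as latency-sensitive or deferrable, then see how much frame time offloading the deferrable part would free. Task names and costs here are hypothetical:

```python
# Split GPU work into tasks that must stay local and tasks that tolerate
# stale results and could, in principle, be shipped to the cloud.
frame_tasks = [
    # (name, gpu_ms_per_frame, latency_sensitive)
    ("geometry + shading",    9.0, True),
    ("post-processing",       2.0, True),
    ("ambient light probes",  2.5, False),  # tolerates stale results
    ("background simulation", 1.5, False),
]

local = sum(ms for _, ms, sensitive in frame_tasks if sensitive)
offloadable = sum(ms for _, ms, sensitive in frame_tasks if not sensitive)
print(f"GPU budget kept local: {local:.1f} ms/frame")
print(f"Freed if offloaded:    {offloadable:.1f} ms/frame "
      f"({offloadable / (local + offloadable):.0%} of the frame)")
# Freed if offloaded: 4.0 ms/frame (27% of the frame)
```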

Link to comment
Share on other sites

1. Are the CPUs in the PS4 and X1 identical? If so, why wouldn't the PS4 have the same heat issues?

I was under the impression that both MS and Sony were going fanless this time.

2. From what I had "heard", the CPU design in the X1 was "heavily influenced" by MS. If so, doesn't that negate #1 and any comparisons that can be made between the systems? Because if that were true, I would assume the GPU portion would be affected as well.

Link to comment
Share on other sites

So, the tech has zero application to realtime computation and is purely targeted at pre-computable workloads.

Again, why not just pre-compute?

Lighting/photon maps can be pre-rendered, yes; of course, the cloud will do it in seconds where the consoles would require minutes or even tens of minutes for a complex level. The cloud will also be able to snapshot and update the lighting as the level changes due to destructible environments.

AI, in real time, can be offloaded from the console to the cloud easily.

Physics can be calculated in the cloud and offloaded from the GPGPU, freeing up GPU resources.
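
A minimal sketch of what latency-tolerant AI offload could look like: the game keeps simulating with its last plan and swaps in a new one whenever the cloud replies. Everything here (class names, delays) is hypothetical:

```python
# The game requests a plan each tick, tolerates the reply being late,
# and keeps using the last plan it received in the meantime.
import random

class CloudAI:
    """Stands in for a remote planner with variable network delay."""
    def request_plan(self, world_state):
        delay_ms = random.uniform(30, 120)  # simulated round trip
        return delay_ms, f"plan-for-{world_state}"

cloud = CloudAI()
current_plan = "initial-local-plan"
for tick in range(5):                       # one tick per second
    delay_ms, new_plan = cloud.request_plan(world_state=tick)
    if delay_ms < 1000:                     # reply arrived within the window
        current_plan = new_plan
    print(f"t={tick}s  using {current_plan}  (reply took {delay_ms:.0f} ms)")
```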

Link to comment
Share on other sites

1. Are the CPUs in the PS4 and X1 identical? If so, why wouldn't the PS4 have the same heat issues?

I was under the impression that both MS and Sony were going fanless this time.

2. From what I had "heard", the CPU design in the X1 was "heavily influenced" by MS. If so, doesn't that negate #1 and any comparisons that can be made between the systems? Because if that were true, I would assume the GPU portion would be affected as well.

You make a good point.

Link to comment
Share on other sites

I was under the impression that both MS and Sony were going fanless this time.

What would have given you the idea that a modern home console could possibly go fanless?

Lighting/photon maps can be pre-rendered, yes; of course, the cloud will do it in seconds where the consoles would require minutes or even tens of minutes for a complex level. The cloud will also be able to snapshot and update the lighting as the level changes due to destructible environments.

AI, in real time, can be offloaded from the console to the cloud easily.

Physics can be calculated in the cloud and offloaded from the GPGPU, freeing up GPU resources.

Those two things require being kept in decent sync with the game: the AI so the game actually functions correctly (e.g. things trigger when and how they should), and the physics so it can render the affected items correctly. The wildly variable latencies of the internet will prevent those from working without issue.

As for the lighting... precomputing it would mean the game has to wait until it has received that information before it can properly render the frame.
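
The numbers behind that objection, as a quick sketch: the renderer's per-frame budget versus plausible internet round trips (the RTT values are illustrative assumptions):

```python
# A renderer that blocks on remote results must get them inside the frame
# budget; typical internet round trips are several frames long.
FRAME_BUDGET_60FPS_MS = 1000 / 60  # ~16.7 ms
FRAME_BUDGET_30FPS_MS = 1000 / 30  # ~33.3 ms

for rtt_ms in (50, 100, 150):      # plausible consumer round trips (assumed)
    frames_late = rtt_ms / FRAME_BUDGET_60FPS_MS
    print(f"{rtt_ms:3d} ms RTT -> ~{frames_late:.0f} frames late at 60 fps")
# 50 ms -> ~3 frames, 100 ms -> ~6 frames, 150 ms -> ~9 frames
```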

Link to comment
Share on other sites

Anyone expecting massive differences in visual quality between the two consoles is in for disappointment. The specs are close enough; even with the PS4 having 50% more shaders and bandwidth, it's not enough to make a huge visual difference in any multiplatform title. If anything, those are always coded to the lowest common hardware, which would be the Xbox One. If they do end up underclocking, that's only going to mean worse performance on both consoles.

Link to comment
Share on other sites

1. Are the CPUs in the PS4 and X1 identical? If so, why wouldn't the PS4 have the same heat issues?

I was under the impression that both MS and Sony were going fanless this time.

2. From what I had "heard", the CPU design in the X1 was "heavily influenced" by MS. If so, doesn't that negate #1 and any comparisons that can be made between the systems? Because if that were true, I would assume the GPU portion would be affected as well.

It's not the CPU that's the problem.

The reason the ESRAM is there in the first place is that the 8 GB of DDR3 RAM is too slow for the GPU when doing framebuffer work, while the PS4 doesn't need it because it uses GDDR5 RAM, the kind you'll find on graphics cards, as you probably know.

It's not the CPU that needs the ESRAM, it's the GPU, and that's why it'll affect framerates. It's very unlikely that any game will be CPU-capped with the CPUs they are going to use, even though they are weak.

I think the reason Microsoft went with DDR3 RAM is its density: the motherboard would otherwise become way too complex. Sony is able to do it because they are putting memory chips not only on the top of the motherboard but also on the bottom.

The CPUs are most likely very similar, but the memory layout and GPU are a very different setup.
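
A quick worked example of why the 32 MB of ESRAM (the reported size) is framebuffer-scale storage: a 1080p color target plus a depth buffer fits with room to spare. Format sizes are standard; the ESRAM capacity is the widely reported figure:

```python
# Size of a 1080p render target in the ESRAM pool.
WIDTH, HEIGHT = 1920, 1080
BYTES_RGBA8 = 4  # 8-bit RGBA color target
BYTES_D24S8 = 4  # 24-bit depth + 8-bit stencil

color_mb = WIDTH * HEIGHT * BYTES_RGBA8 / 2**20
depth_mb = WIDTH * HEIGHT * BYTES_D24S8 / 2**20
print(f"1080p color target: {color_mb:.1f} MB")
print(f"1080p depth target: {depth_mb:.1f} MB")
print(f"Color + depth:      {color_mb + depth_mb:.1f} MB of the 32 MB ESRAM")
# 1080p color target: 7.9 MB; color + depth: ~15.8 MB
```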

Link to comment
Share on other sites

It's not the CPU that's the problem.

The reason the ESRAM is there in the first place is that the 8 GB of DDR3 RAM is too slow for the GPU when doing framebuffer work, while the PS4 doesn't need it because it uses GDDR5 RAM, the kind you'll find on graphics cards, as you probably know.

It's not the CPU that needs the ESRAM, it's the GPU, and that's why it'll affect framerates. It's very unlikely that any game will be CPU-capped with the CPUs they are going to use, even though they are weak.

I think the reason Microsoft went with DDR3 RAM is its density: the motherboard would otherwise become way too complex. Sony is able to do it because they are putting memory chips not only on the top of the motherboard but also on the bottom.

The CPUs are most likely very similar, but the memory layout and GPU are a very different setup.

Yes, but people are comparing the PS4's CPU to the X1's, and either MS has the same CPU or a customized one. If it's customized, you can't compare the two, and you have to consider the ramifications of that customization and how it interacts with everything else. Basically, if it's customized, comparisons are right up there with the ones made between the 360 and PS3, and saying one is dumbed down, or the other "OMG superior!", is a waste of time.

Link to comment
Share on other sites

What would have given you the idea that a modern home console could possibly go fanless?

Those two things require being kept in decent sync with the game: the AI so the game actually functions correctly (e.g. things trigger when and how they should), and the physics so it can render the affected items correctly. The wildly variable latencies of the internet will prevent those from working without issue.

As for the lighting... precomputing it would mean the game has to wait until it has received that information before it can properly render the frame.

Keeping AI and physics in sync isn't a big deal; EVERY multiplayer game already does it. Physics would be the biggest issue here. Unless it's a racing game, 50 ms isn't going to be noticeable, even for AI.

You don't update photon maps every frame; photon maps are rendered once unless there are major changes to the level.
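
A minimal sketch of that rebuild-on-major-change policy: hash the level geometry and re-bake the photon map only when the hash changes. All names here are hypothetical stand-ins:

```python
# Cache a baked photon map keyed by a hash of the level geometry,
# so a re-bake only happens when the level actually changes.
import hashlib

def geometry_hash(geometry: bytes) -> str:
    return hashlib.sha256(geometry).hexdigest()

class PhotonMapCache:
    def __init__(self):
        self._hash = None
        self._map = None

    def get(self, geometry: bytes):
        h = geometry_hash(geometry)
        if h != self._hash:                     # level changed: rebuild
            self._map = f"photon-map({h[:8]})"  # stands in for a real bake
            self._hash = h
        return self._map

cache = PhotonMapCache()
print(cache.get(b"level-v1"))                 # bakes
print(cache.get(b"level-v1"))                 # reuses the cached map
print(cache.get(b"level-v1-wall-destroyed"))  # re-bakes after destruction
```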

Link to comment
Share on other sites
