RUMOR: Microsoft Underclocking Xbox One By 100-200MHz



Anyone expecting massive differences in visual quality between the two consoles is in for disappointment. The specs are close enough that, even with the PS4 having 50% more shaders and bandwidth, it's not enough to make a huge visual difference in any multiplatform title. If anything, those are always coded to the lowest common hardware, which would be the Xbox One. If they do end up underclocking, that's only going to mean worse performance on both consoles.

On the current consoles there are usually a few differences, not in the design of the game but in the graphical settings. Most of the time they have different anti-aliasing settings and sometimes different resolutions.

http://forum.beyond3...ead.php?t=46241

PS3:

Batman: Arkham Asylum = 1280x720 (no AA)

Deus Ex: Human Revolution = 1280x720 (MLAA)

Dirt 2 = 1280x720 (QAA)

Fifa Street 3 = 1920x1080 (no AA)

Ghostbusters = 960x540 (QAA, pre-patch), 1024x576 (2xAA, post-patch)

Xbox 360:

Batman: Arkham Asylum = 1280x720 (2xAA)

Deus Ex: Human Revolution = 1280x720 (FXAA)

Dirt 2 = 1280x720 (4xAA)

Fifa Street 3 = 1920x1080 (4xAA) (yes 1080P with 4xAA!)

Ghostbusters = 1280x720 (2xAA)


Just came to post that. Not good news. Microsoft had better not be putting their head in the sand regarding all the key issues: DRM, always-on, used games, this underclocking, etc.

Major Nelson has announced the winners of the competition to attend the media briefing, so I assume it is still very much on.


Lighting/photon maps can be pre-rendered, yes. Of course the cloud will do it in seconds where the consoles would require minutes or even tens of minutes for a complex level. The cloud will also be able to snapshot and update the lighting as the level changes due to destructible environments.

I don't think you understand what pre-computing is. If a developer is leveraging high quality static lighting (UE3 Lightmass, Source Radiosity/VRAD) then the lightmaps are computed at compile-time and then keyed accordingly with scripted events. If your lighting is reacting to destructible geometry then it's not static and you're back to the issue of latency.
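To make that distinction concrete, here is a minimal toy sketch of the idea, in Python rather than any real engine's API (the falloff model and values are purely illustrative): the expensive lighting math is paid once at bake time, and runtime shading becomes a plain table lookup, which is exactly why it can't react to geometry that changes.

```python
def bake_lightmap(width, height, lights):
    """Precompute per-texel lighting once, 'at compile time'.
    Runtime then just reads stored values -- no light math per frame."""
    lightmap = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            total = 0.0
            for lx, ly, intensity in lights:
                d2 = (x - lx) ** 2 + (y - ly) ** 2
                total += intensity / (1.0 + d2)  # inverse-square-style falloff
            lightmap[y][x] = total
    return lightmap

# Bake once -- this is the slow, offline step tools like Lightmass/VRAD do.
lm = bake_lightmap(64, 64, [(10, 10, 100.0), (50, 40, 60.0)])

# Per-frame "shading" is a trivial lookup. That's why it's fast, and also
# why it goes stale the moment geometry or lights move.
def shade(x, y):
    return lm[y][x]
```

If a wall gets blown out, every texel that light touched is now wrong, and you're back to either re-baking (slow) or genuinely dynamic lighting.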

AI, in real time, can be offloaded from the console to the cloud easily.

This is feasible (as it's basically multiplayer), but more trouble than it's worth to simply offload AI.

Physics can be cloud-calculated and offloaded from the GPGPU, releasing GPU resources.

Introducing latency into environmental physics is undesirable.
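A back-of-the-envelope check makes the latency objection concrete. The numbers below are illustrative assumptions (typical WAN round-trip times, standard frame budgets), not measurements:

```python
# Rough frame-budget check: can a cloud round trip fit inside a frame?
FRAME_BUDGET_MS = {30: 1000 / 30, 60: 1000 / 60}  # ~33.3 ms and ~16.7 ms

def fits_in_frame(rtt_ms, fps):
    """A cloud-computed result is only usable in the same frame if the
    network round trip fits inside the frame budget."""
    return rtt_ms < FRAME_BUDGET_MS[fps]

# A decent home connection might see 40-80 ms RTT to a datacenter (assumption):
print(fits_in_frame(60, 30))  # False -> result arrives 1-2 frames late
print(fits_in_frame(60, 60))  # False -> even worse at 60 fps
print(fits_in_frame(5, 60))   # True  -> LAN-class latency would be required
```

Anything that must land in the same frame, like collision response, can't wait out a WAN round trip; cloud offload only suits work that tolerates being several frames (or seconds) stale.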


I'm just not seeing the connection between yield issues and having to downclock the processor.

If they can't get enough satisfactory processors, there would just be fewer available. That has nothing to do with underclocking the good processors.

Makes no sense to me.


So let me get this straight: this custom APU is blowing out a lot of hot air, and they couldn't spend a little more cash on better cooling? Come on, MS.


The GAF thread is saying this is true now.

"ucan remo.ve RUMUR frmo hte thread tittle TEAMMOD. esram yieldls. are "troubling"toput it lightly.

truthfact.

btw youiw lil not njhear abouthing about tech atMS E3 presentation. tech tarbaby (notracistst!) IS TSTICKIER THAN YOU WIL EVER KNOW. I LOOK FORWARF TO READING THE BOOK BY VENTUREBEAT.

theyMSrushed."

Seems legit to me. :s


"ucan remo.ve RUMUR frmo hte thread tittle TEAMMOD. esram yieldls. are "troubling"toput it lightly.

truthfact.

btw youiw lil not njhear abouthing about tech atMS E3 presentation. tech tarbaby (notracistst!) IS TSTICKIER THAN YOU WIL EVER KNOW. I LOOK FORWARF TO READING THE BOOK BY VENTUREBEAT.

theyMSrushed."

Seems legit to me. :s

LOLWUT??? I think I just lost a few brain cells reading that.



He apparently posts like that to fool search-engine indexing. :/ (Doesn't make sense to me.)


1.6 GHz is hardly enough to produce that much heat... I think it may be the GPU part of the chip in this case. (My laptop has a quad-core AMD APU; at 1.6 GHz the fan barely runs. It's overclocked to 2.6 GHz, though, and the fan would run at full power if it weren't for my small mod, which keeps it at minimum.)
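The laptop anecdote lines up with how CMOS dynamic power behaves: roughly P ≈ C·V²·f, and higher clocks usually require a voltage bump, so heat grows much faster than linearly with frequency. A sketch with made-up voltage and capacitance values:

```python
# Back-of-the-envelope CMOS dynamic power: P ~ C * V^2 * f.
# All values below are illustrative assumptions, not measured figures.
def dynamic_power(freq_ghz, volts, cap=1.0):
    """Relative dynamic power for a given clock and core voltage."""
    return cap * volts ** 2 * freq_ghz

base = dynamic_power(1.6, 1.0)     # stock clock at an assumed 1.0 V
boosted = dynamic_power(2.6, 1.2)  # overclock typically needs a voltage bump

print(boosted / base)  # ~2.34x the heat for a 1.625x clock increase
```

That nonlinearity is why a modest clock change can swing thermals disproportionately, in either direction.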


He apparently posts like that to fool search-engine indexing. :/ (Doesn't make sense to me.)

He's so important that the Googles steal his comments. :laugh:


Look at the cooling on a modern comparable AMD CPU and GPU, and then look at the massive cooling on the One. You'd have to be a pretty gullible fool to buy this rumor, or really want to believe it for "some" reason.


If this rumor is true, that means the case is gigantic and ugly for no reason. Seriously, all of those vents and fans in the pictures aren't enough to keep the hardware cool? Let's hope that's not the case.


1.6 GHz is hardly enough to produce that much heat... I think it may be the GPU part of the chip in this case. (My laptop has a quad-core AMD APU; at 1.6 GHz the fan barely runs. It's overclocked to 2.6 GHz, though, and the fan would run at full power if it weren't for my small mod, which keeps it at minimum.)

Or maybe MS decided to add another APU to the board to exceed PS4 specs :ninja:

(Yes, I just made that up and started a new rumor.)



That would be hilarious. Ballmer could run out on stage with a PS4 poster and start air-humping it.


So let me get this straight: adding 32MB of ESRAM to the die is now making it so unbearably hot that the chip needs downclocking, but the PS4's 50% more compute units on its die, plus some magical Sony physics-defying pixie-dust modifications, have their chip running so cool it doubles as an air-conditioning unit? Even a fool would find a hole in that story.



The GDDR5 RAM isn't exactly cool-running either.


http://www.neogaf.co...25#post61464125

I stand by what I said. I understand there are people here who are a much better-known quantity than I am, and I don't expect to suddenly convince people to go against their gut instincts simply because I feel confident that this information (about a downclock) isn't true.

But I said it, I had good reason for saying it, and I will accept whatever punishment is deemed necessary if I'm wrong. Mocking, banning, whatever is deemed necessary. I know how the site works. Nobody is above the basic rules of the road. I fully embrace and enjoy that aspect of this site. But as I said in a little jest before, give me a fair trial. We need real confirmation of a downclock to the GPU outside of forum rumor and hearsay.

The one constant that nobody can deny is that ESRAM had serious manufacturing issues at one time, and that they are essentially still a real challenge even now, although not impossible to overcome, largely because the ESRAM can be made in quite a few more places than is commonly the case in these circumstances. The rest is simply not confirmed, but I won't challenge the ESRAM issues, because they can't be challenged in any credible fashion. If you're in a position to hear anything at all from somebody who possibly knows what's going on, the ESRAM issues and concerns are things you've been hearing about since 2012. Clearly people at MS believe ESRAM is a design win, but it isn't coming without its headaches, headaches that were expected and planned to be countered by having it made in multiple places and separating the good from the bad. A process shrink for the ESRAM will likely be possible in the first year the system is on the market.

All I can say definitively is that these challenges weren't unexpected by Microsoft. It didn't blindside them, even if Sony's move to 8GB of GDDR5 may have; everything is in line with what they expected, even the problems and challenges of manufacturing the ESRAM. But at no point have I heard anything remotely implying that, in order to deal with ESRAM's manufacturing challenges, downclocking the GPU might be necessary. In fact, it's been said that downclocking the GPU wouldn't even help, because that isn't where the problem is. The problem is with the size and overall complexity of the part, which downclocking would hardly make up for in any significant, helpful way. And digging a little deeper (I wasn't entirely clear on this before), the problem is less with making the ESRAM itself and more with incorporating that ESRAM, once made, into the larger set of components it has to work in concert with.

And that's pretty much the gist of what I have right now. Anything further would get me in serious trouble, because I can't back it up with anything real that wouldn't also get someone else in trouble.

So that's another poster claiming there is no downclocking happening.

So let me get this straight: adding 32MB of ESRAM to the die is now making it so unbearably hot that the chip needs downclocking, but the PS4's 50% more compute units on its die, plus some magical Sony physics-defying pixie-dust modifications, have their chip running so cool it doubles as an air-conditioning unit? Even a fool would find a hole in that story.

At the reveal: Microsoft tried to save money by using cheap components.

Latest: X1's BOM is higher than the PS4's because they wasted money on Kinect and ESRAM.

At the reveal: X1 is doomed because it is using a slower GPU and RAM than the PS4.

Latest: X1 will be downclocked because heat is a problem.

A few days ago: X1's APU is overly complex compared to the PS4's.

Latest: X1 is a cheap design, just a bunch of components thrown together.

At the reveal: X1's memory is complex to program because of ESRAM and DMEs (data move engines).

Latest: DME is just PR speak for DMA, and the PS4 has it too.

At the reveal: PS4 has an advantage because it has Gaikai-powered backwards compatibility.

Latest: All this cloud talk from MS is just PR BS.

It's as if the FUDers can't make up their minds. :laugh:


Well, let's hope the PS4 doesn't run as hot as the Xbox 360; the last thing we need this gen is another fiasco like that. It would finish Sony as a company.


In defense of us Xbox "fanboys", the PS3 was supposed to be twice as powerful as the Xbox 360 (which was labeled Xbox 1.5 by Sony, IIRC), so we still have some hope. :rofl:

Late in its life, with games like Uncharted 3, The Last of Us, and even 60fps WipEout HD, it proved itself clearly more powerful, if not twice as powerful, though it was so hard to develop for. And power that can't be realized isn't worth very much.


Seriously wtf @ this thread? Does it matter? People will ****ing bitch about anything.

Bad publicity is better than no publicity :)


Don't worry, Microsoft has the infinite power of the cloud, lol.

Lol, irony? What if I buy a game and I don't have internet? Who's going to render it for me then? The sky?


This topic is now closed to further replies.