Microsoft: Xbox One CPU gets final speed boost to 1.75 GHz

With about two months to go before the Xbox One is due to launch in the U.S. and other countries, a Microsoft Xbox executive confirmed today that not only is the company's next-generation game console in full production, but the hardware inside the case has also received a final performance boost.

According to GeekWire, Microsoft Xbox chief marketing officer Yusuf Mehdi revealed at the Citi Global Technology Conference that the Xbox One has received one final CPU speed boost, going up from 1.60 GHz to 1.75 GHz. Last month, Microsoft announced that the GPU inside the console had received a performance increase from 800 MHz to 853 MHz.

Microsoft is using a chip it co-designed with AMD that combines the CPU, GPU and more on a single die. On paper, Sony's PlayStation 4 is supposed to be faster than the Xbox One, but it looks like Microsoft was able to squeeze some extra clock speed out of the system before the first retail units started production.

Microsoft has not revealed a final launch date for the Xbox One beyond "November," and Mehdi did not offer up any further info on that subject today. He also didn't reveal just how many Xbox One units will be ready for the launch, saying only, "This will be the biggest launch we've ever done by a wide margin in terms of units shipped at launch."

Source: GeekWire.com | Image via Microsoft

52 Comments

Even if it can't do the graphics all too well, Forza's Drivatars are a good use of it, processing the telemetry of your driving to make the AI cars harder to get past. And when one of your friends plays against your "ghost" car, it'll drive the way you'd drive in the game, making it harder, so it's good.

Who cares, as long as cloud computing boosts the console and makes it future-proof? Any calculation that doesn't need its result back within about 100 ms can be offloaded, and it's only going to get faster over time. Come the eve of the next next generation, we'll all be talking about how MSFT invented console cloud computing and Sony is busy copying it for the PS5 to keep up with the Xbox Two.
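For what it's worth, the rule being described boils down to a simple latency-budget check: work that can afford to wait for a network round trip goes to the server, while frame-critical work stays on the console. A minimal sketch of that idea, using entirely hypothetical names (this is not Microsoft's actual cloud API):

```python
def run_locally(task):
    # Placeholder: execute the work on the console's own CPU.
    return task()

def run_in_cloud(task):
    # Placeholder: serialize the work, send it to a server, wait for the reply.
    # In a real engine this would be asynchronous so a frame never blocks on it.
    return task()

def dispatch(task, deadline_ms, round_trip_ms=100):
    """Offload only when the result can afford to arrive a round trip late."""
    if deadline_ms > round_trip_ms:
        return run_in_cloud(task)   # latency-tolerant: AI planning, world simulation, lighting
    return run_locally(task)        # frame-critical: physics, input, rendering

# Example: a chunk of AI planning that only needs its answer within half a second.
result = dispatch(lambda: sum(range(1_000_000)), deadline_ms=500)
```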

It sounds like a marketing stunt.
In fact, for both Sony and MS, the specs of the console are irrelevant to the end user; what we care about is the end result, and much of that is up to the developers.

People are presuming MS boosted the speed just to pad the specs rather than to improve how all the hardware works together. lol.

150 MHz hardly makes a difference. Sony could pretty much do the same considering how these APUs work, plus Sony's chip still emits less heat because of its lower transistor count (and a more powerful GPU, ja!).

Actually, the Sony GPU has more transistors and generates more heat, so Sony can't really push things further without a massive cooling redesign, which is pretty much too late to do at this point.

The GPU has more transistors, making it hotter because it'll take more power to run. Sure, the ESRAM adds a lot of transistors to the chip, BUT it depends on how much power it draws, and since it's basically an on-die SRAM type thing, the power draw will be nowhere near what a CPU and GPU need, probably single-digit watts, maybe even 1-2 watts. So there's not much extra heat from that power draw and not much to worry about.

Probably why they've made the console so big and kept the power brick outside: reduce as much heat as possible and keep it well ventilated, even when it's shoved into an entertainment center.

The GPU is more powerful, but not by a lot tbh. I wonder what the memory bus feeding the GPU will be like; MS seems to have a 256-bit bus from the ESRAM, which is massive for a GPU of that "power". Graphics cards with around 768 stream processors usually get a 64-bit bus, and Sony's type probably gets 128-bit. I think they'll be similar, but MS will have the edge for a while because DirectX beats OpenGL. The best part of OpenGL is PRT, which it supports.

No it's not, lol. 150 MHz is nothing; I could overclock my CPU by 150 MHz right now on air if I wanted to and it would be fine. 150 MHz CPUs were like 15-20 years ago, not exactly last year, mate.

I am talking proportionately; a 150 MHz CPU would have run an entire OS with multiple applications open when it was top dog. It doesn't seem like much now, but it could well be enough to improve background operations. Here it works out to roughly a 9% bump (1.60 GHz to 1.75 GHz).
Most computers barely reach 100% CPU usage (although a 150 MHz system would have), so the increase, proportionally, is pretty good.

I'd imagine over the preceding months they've been stress testing the system and this will be a software update on first load.

Couldn't have said it better. I game on consoles because of the games, not because it looks super shiny!

I am perfectly happy with the current "look" of the current gen and don't wish for more.

lol, Microsoft raises the clock speed of their console to try and compete with the competition yet it's still slower. At least it's faster than the Wii U, right?

Tha Bloo Monkee said,
lol, Microsoft raises the clock speed of their console to try and compete with the competition yet it's still slower. At least it's faster than the Wii U, right?

And I'm still buying it over the competition.

Tha Bloo Monkee said,
lol, Microsoft raises the clock speed of their console to try and compete with the competition yet it's still slower. At least it's faster than the Wii U, right?

Yawn at the fan boy.

Not sure how many times I can say this, but these numbers mean nothing till we see some real-world performance, and I say that as a guy sitting on the fence about his next purchase.

I wouldn't say it's so much to compete with the competition; they're making it faster because the design can handle it. I'd be a lot more worried about the PS4 overheating than the X1: smaller design (less room for ventilation), and a PSU integrated into the console will add a lot of heat if not cooled properly, especially if people are gaming for 10 hours a day.

I'm sure the PS4 will be fine, with SATA cables taking a lot less room than IDE, etc., but if it does start getting too hot it'll cause system instability, poor GPU performance, poor CPU performance and so on. You just gotta hope.

The only problem with old PS3s/360s was bad solder; overheating really isn't a problem. The PS4/X1 will have been tested at 100% load for weeks solid at stupid temps.

Without knowing how the internal changes to the GPU and CPU work, the technical numbers are meaningless.

With the Xbox 360, a lot of speculation about the Xenos GPU underestimated its processing power and overestimated the PS3 GPU. Even estimates of how the GPU logic handled ALU latency were mistaken, not realizing that the OS and DirectX APIs were designed to handle some of the GPU shader thread management in the OS kernel.

Again, we have a basic 'reference' design for both APUs, but AMD has been very clear that the memory and GPU portions of the Xbox One were specifically designed by Microsoft to meet DirectX/NT needs.

Looking back at a few technical articles from the PS3/Xbox 360 timeframe, they were highly detailed about the known workings of the two technologies. However, they were confident that the PS3 would have no problem running 1080p games on the strength of its GPU alone. Some sites even estimated the PS3 would easily handle dual-display 1080p content with its 'advantageous' GPU.

As it turned out, the PS3's 'better' GPU performed worse than the Xbox 360 GPU, having trouble with 720p and never getting close to the estimated 1080p.

The difference was in the details: how the GPU's new DMA worked, how the 10 MB eDRAM cache kept the frame buffer from being starved, the UMA implementation Microsoft designed, and the way the OS assisted with the ALU threading logic, which was built into the newer NT kernel that managed the GPU.

Until we see the content and know the specifics of what Microsoft has done with the lower-level aspects of the GPU, we won't know whether it or the PS4 will perform better, no matter what base specifications are released.

My bet is still on the Xbox One, just knowing MS's history with GPU/CPU design, which ironically can be found in the PS4.

Plus, MSFT's console essentially is a DirectX box. It's what PC developers have known for years, and they'll have no issue tuning it to extreme detail, whereas the PS4 is yet another custom graphics stack on top of off-the-shelf PC parts. As you say, software, drivers and the graphics stack will play a greater role than MHz and GB/s when these consoles are so alike chip-wise.

My money is on MSFT. As the creators of the #1 technical gaming platform, the PC, they far out-spec Sony in software alone. It's just hard to measure software specs, because few people appreciate the massive advantage MSFT has over Sony there.

I like the Xbox dashboard more than the PS4's. I just wish Xbox Live Gold weren't required for things like Netflix.

Enron said,
I like the Xbox dashboard more than the PS4. I just wish Xbox Live Gold wasn't required for things like Netflix.

I agree that requiring Gold isn't competitive with Roku and other devices that offer free access to these apps.

In its defense, most users have a Gold subscription for gaming and other features anyway, and it isn't a huge cost.

The Gold requirement also goes back to when Netflix was starting its streaming service; Microsoft helped 'fund' part of Netflix's infrastructure and gave them access to its Smooth/Adaptive streaming technologies.

Microsoft justified that investment in Netflix at the time because it gave the Xbox an exclusive app, and Gold members were helping offset the cost difference, since the streaming-only service was offered more cheaply to Xbox users back then.

However, that is the past, and Microsoft needs to reconsider this requirement.

It would still be a 'loss' for Microsoft, as some of these apps use the Xbox servers; however, the existing 'Gold' gamer base might end up eating the cost with an extra $1 or $2 a month to offset the server costs of the users not paying for Gold.

Ah, I wasn't aware of the history there. Thanks for explaining. I was pretty late to Xbox, only got one in 2011.

I don't understand... Why not just ADD a better processor? This just comes across as a money-saving tactic - and mind you, the X1 is $100 more expensive than the PS4. Hardware-wise, I don't understand why it's priced at that level when the console doesn't meet the PS4's hardware specs.

still_rookie said,
I don't understand... Why not just ADD a better processor? This just comes across as a money-saving tactic - and mind you, the X1 is $100 more expensive than the PS4. Hardware-wise, I don't understand why it's priced at that level when the console doesn't meet the PS4's hardware specs.
A) It's a custom-built processor... I'm pretty sure it's more complicated than "add a better processor". B) $100 more expensive... and it comes with the Kinect sensor; the PS4's equivalent is a purchased add-on. You can argue the merits of requiring the Kinect sensor, but there's your $100.

MrHumpty said,
A) It's a custom-built processor... I'm pretty sure it's more complicated than "add a better processor". B) $100 more expensive... and it comes with the Kinect sensor; the PS4's equivalent is a purchased add-on. You can argue the merits of requiring the Kinect sensor, but there's your $100.

Exactly, the CPU and GPU are both highly customized by MS though at the core they're using AMD tech. These are not standard off the shelf PC parts guys.

A) It's a custom-built processor... I'm pretty sure it's more complicated than "add a better processor". B) $100 more expensive... and it comes with the Kinect sensor; the PS4's equivalent is a purchased add-on. You can argue the merits of requiring the Kinect sensor, but there's your $100.

And Sony's is too. And we already know that Sony's processor is more powerful than Microsoft's. So, minus the $100 for Kinect, given that the Xbox will have lower hardware specs, how is it fair to price it at $400 while claiming to be on par with Sony's? It's absolutely absurd. At least the 360 was cheaper than the PS3.

I really don't think there will be a terribly noticeable difference between the consoles. They both have powerful and similar hardware, they're both based on x86 processors so game ports should come much easier, and with several times the RAM of the last generation, I think it'll be a while before we hear something like, "The PS4 version has 7 birds flying in the background but the XBOne only has 6!!!" I think the crux of making money this generation is going to be exclusive titles.

Sony's apparently looking into VR, from the looks of it (though it's my understanding they've been at it for at least a year now).

A simple boost like this is unlikely to make up for that gap (as far as I know, Sony hasn't even announced specifics like this?). At the same time, the only place you'll likely see a difference is in the exclusives, and even then I doubt the difference will be drastic.

Gerowen said,
....
Also, there is the fact that game makers will be trying to push out games quickly, and targeting the lowest common denominator means most multi-console games will run exactly the same anyway.

Yeah, it seems they're panicked about the huge spec disparity. I can totally see another RROD situation considering how big the console is.

DPyro said,
I can totally see another RROD situation considering how big the console is.

Pretty silly thing to say, don't you think? You've got it backward.

The RROD had exactly zero to do with the size of the case and also zero to do with the chip not being cooled properly. Anyone who ever fixed the issue knows this. Nothing in this article in any way points to an RROD risk.

No, RRoD was due to mixed solder (because of the RoHS guidelines), which could soften at high temperatures, plus insufficient clamping of the heatsink, so the chip was able to move when it did.
A solder reflow fixed RRoD, and later iterations of the Xbox 360 had better-clamping heatsinks to keep the chips in place.

ILikeTobacco said,
The RROD had exactly zero to do with the size of the case and also zero to do with the chip not being cooled properly. Anyone who ever fixed the issue knows this. Nothing in this article in any way points to an RROD risk.

The thermal paste was the effect, not the cause. Cause = heat produced by a poor ventilation system and a hot CPU. Effect = thermal paste problem = RROD. Solution = a better fan and CPU, or at least that's what MS did.


Thermal paste had nothing to do with the cause of RRoD, but MS did put more there later on for better thermal conductivity to the HSF.

It wasn't the paste or the CPU that was the problem; it was the crappy clamp they used for the heat sink. The clamp holding the heat sink in place was a thin sheet of metal that, as it was repeatedly heated and cooled, warped and no longer kept the heat sink on the processor properly. To fix the issue, all you had to do was bolt the heat sink to the board properly. The only reason you had to replace the paste was that by the time you were doing this, you had already pushed the paste past its limits and broken down its compound. The ONLY problem was the flimsy piece of metal holding it all together. Nothing else.

Brony said,

The thermal paste was the effect, not the cause. Cause = heat produced by a poor ventilation system and a hot CPU. Effect = thermal paste problem = RROD. Solution = a better fan and CPU, or at least that's what MS did.
The cause was the clamp holding the heat sink in place. The system had perfectly adequate ventilation. Mine still runs fine and hasn't had a RROD since I bolted the heat sink in place. I did the fix back in 2009 and 4 years later, still no issues.

ILikeTobacco said,
The cause was the clamp holding the heat sink in place. The system had perfectly adequate ventilation. Mine still runs fine and hasn't had a RROD since I bolted the heat sink in place. I did the fix back in 2009 and 4 years later, still no issues.

Exactly. Ripping off that useless x-clamp and replacing it with $0.25 in screws almost always fixed the problem. There was also a problem with weak solder joints, which probably was due to the warping from the x-clamps. When I was in college, I bought a launch-day red-ringed 360 for $25, fixed it with the screws, and then when that stopped working a couple years later, I'd stick the motherboard in the oven at 350 degrees (I think), as a poor-man's solder "reflow". I got an amazingly long life out of that dumb thing.

Mine still works as a media center in the guest room. Got a newer model for free with a laptop a few years back. The thing plays games just fine though it can't handle the Kinect at all. Over the years, I have fixed a number of them and given them away. Most people throw them out not knowing how cheap and easy the fix is.