Report: Xbox One and PS4 use up to three times more power than Xbox 360 and PS3

Many folks will be getting an Xbox One or a PlayStation 4 console for Christmas this week (a few people might even be getting one of each). However, a recent report claims that both new consoles use up to three times more energy than the older Xbox 360 and PS3, even though Microsoft and Sony have built in features designed to make the Xbox One and PS4 more efficient.

The non-profit Natural Resources Defense Council claims to have "completed rigorous measurements of the power use" of both consoles. On the plus side, the group praises Microsoft and Sony for including features such as charging accessories while the consoles are in a lower-powered "sleep mode" and automatically reducing power when the consoles are left on but idle for 10 minutes.

However, the same report claims that because the Xbox One and PS4 use more power overall to play games and watch movies, the energy-efficiency features the companies have built into the consoles are outweighed. It states, "For example, the Xbox One uses approximately 40 percent more power to play a game than the Xbox 360, and the PS4 consumes almost twice as much as the PS3."

In a head-to-head study, the PS4 uses considerably more power than the Xbox One when playing games, streaming movies from services like Netflix and navigating through its interface. On the other hand, the Xbox One uses more power than the PS4 when it is in Connected Standby mode, as well as when both consoles are turned "off" but still connected to a power socket.

Because the Xbox One uses more power in Connected Standby mode, where it waits for its owner to say "Xbox On" to its Kinect sensor, the report claims it will consume more electricity annually than the PS4. It estimates that Microsoft's console will use 253 kilowatt-hours per year, which works out to about $150 in energy bills over five years. The PS4 is estimated to use 184 kilowatt-hours per year.
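For readers who want to sanity-check those figures, the arithmetic is just annual kilowatt-hours multiplied by an electricity rate and a number of years. The sketch below assumes a rate of roughly 12 cents per kWh, an illustrative figure chosen to line up with the report's estimate rather than a number taken from it:

```python
# A minimal sketch of the cost arithmetic behind the NRDC estimates.
# The $0.12/kWh rate is an assumption (a typical US residential rate),
# not a figure taken from the report.

RATE_USD_PER_KWH = 0.12   # assumed electricity rate
YEARS = 5

def energy_cost(kwh_per_year, rate=RATE_USD_PER_KWH, years=YEARS):
    """Estimated electricity cost of a console over the given period."""
    return kwh_per_year * rate * years

print(f"Xbox One: ${energy_cost(253):.0f} over {YEARS} years")  # ~$152
print(f"PS4:      ${energy_cost(184):.0f} over {YEARS} years")  # ~$110
```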

The NRDC claims that both companies could improve the energy efficiency of their new consoles via software updates, such as by reducing the amount of power needed to stream video. The report also says Microsoft should try to cut down the power the Xbox One draws in Connected Standby mode.

Source: Natural Resources Defense Council | Images via Microsoft and Sony


50 Comments


Maybe Sony and Microsoft could have used Intel Iris Pro for their new consoles, but it would have last gen graphics and cost over $1000.

When the Natural Resources Defense Council demands President Obama and the rest of the world ditch Air Force One and jetliners for prop-driven airliners and ditch cars for bicycles and rickshaws, then we will know they are actually serious about energy.

I really don't understand why power consumption is even being talked about with a set-top-box game system. It's never been an issue until this new generation. I want power, reliability and performance, not to save $5 a year on my energy bill.

So... Which hardware revision of the Xbox 360 and PS3 are they comparing energy usage with? I don't know about the PS3 - although I assume the situation's the same - but each hardware revision of the 360 has reduced the power requirements, which is why a power brick from a 1st-gen will fit into a newer console, but not vice versa, because the newer supplies are lower current.

Obviously the first hardware for a new console will consume more power than a newer iteration seven years down the line...

The original 360 drew 16.5 amps if I remember rightly, and the last iterations of old body style got that down to 10 or 11, no idea what the newest models draw...

I am definitely curious to see my power bill... I have pretty much played the Xbox One non stop since I got it haha

I'm in Canada, and if I were charged for the 253 kWh at peak-hour rates, it would be $25. Realistically, it would probably end up costing me $15.

It's because this article misquotes the original source. In the original they said it would cost $150 over a 5-year period. It's not $150 per year like Neowin has written it, so that's more in line with the roughly 10-11 cents per kWh that most are probably paying.

In reference to the questions, I thought I would include a throwback reference to the movie Fletch. As he said...

"Come on, guys. Maybe you need a refresher course. Hey, it's all ball bearings nowadays."

These two machines are likely to generate a shedload of heat ... even more so than their predecessors. If either of them suffers system failures due to overheating, based on how they're built, the PS4 seems more likely to suffer in this respect than the Xbox One.

Wrong, first paragraph...
"...use up to three times more energy than the older Xbox 360 and PS4..."

Surely the older Sony console is the PS3?????

Think I'll wait for the "Slim" version as the chip die will improve and other efficiencies will be made.


So basically, the Xbox One is pretty efficient except for the standby power consumption. Not sure if they can address that with software updates?

Given that it's constantly running 24/7, 365 days a year (that's the point of standby mode), I think 150 dollars is actually not THAT bad.

So let me get this straight... XBOne uses 40% more on average and PS4 50% more on average, yet they arrive at a figure of 'three times' which would be 300% more...

How much is electricity per unit in the states?! I worked this out for UK at 15p a unit and it worked out at £37.50 or ~$61
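For what it's worth, here is a minimal sketch of that same sum at the rough per-kWh rates mentioned in this thread; the rates are commenters' approximations, not official tariffs, and no currency conversion is applied:

```python
# Rough yearly cost of 253 kWh at the approximate rates from this thread.
ANNUAL_KWH = 253
rates = {
    "UK (15p/kWh, GBP)": 0.15,
    "Canada peak (~10c/kWh, CAD)": 0.10,
    "US average (~11c/kWh, USD)": 0.11,
}
for label, rate in rates.items():
    print(f"{label}: {ANNUAL_KWH * rate:.2f} per year")  # ~37.95, ~25.30, ~27.83
```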

The source is fine, it's how this article was written for Neowin. The original source said it was $150 over a 5 year period which works out to roughly $30 per year.

Oh no doubt they're going for the 'shock value' of a high number, it's just that they didn't even mention that it was over 5 years on here, so it makes it seem even more ridiculous in price.

So 40% more than 203 W is only 284 W for the Xbone? Seems acceptable.

So 253 kWh at 0.284 kW means they're assuming it gets used a bit more than 890 hours a year...
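That back-of-the-envelope is just annual energy divided by average draw; the 0.284 kW input is the commenter's "40% more than 203 W" estimate, not a measured value:

```python
# Implied hours of use: annual energy (kWh) divided by average draw (kW).
annual_kwh = 253          # NRDC's estimated annual consumption for the Xbox One
average_draw_kw = 0.284   # commenter's estimate: 40% more than 203 W
hours_per_year = annual_kwh / average_draw_kw
print(f"Implied usage: {hours_per_year:.0f} hours/year "
      f"(~{hours_per_year / 365:.1f} hours/day)")   # ~891 h/year, ~2.4 h/day
```

Note that the NRDC annual figure also includes standby consumption, so the implied active-use hours would actually be lower.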

Sensationalist article. The XB1 and PS4 both use much less power than the 360 and PS3 did when they were first released. The original fat PS3 used over 200 W under load.

You can also set the XB1 and PS4 to not use standby mode, where they use 1 W or less.

NoClipMode said,
Sensationalist article. The XB1 and PS4 both use much less power than the 360 and PS3 did when they were first released. The original fat PS3 used over 200 W

Thanks for bringing historical data to the article. IIRC my measurements of my (2010?) Xbox 360 model (which is now often off, as the One replaced it) put it at 70 W for apps and 120 W for games... so comparable for streaming, and not so far off for games. I think it was 5 W when off/standing by.

Xbox One's connected standby replaces my receiver's idle, since the Xbox turns it on/off. That is perhaps one additional point of guidance missed by the article: use the IR blaster more effectively to save power on other electronics you'd not always remember to turn on/off.

NoClipMode said,
Sensationalist article. The XB1 and PS4 both use much less power than the 360 and PS3 did when they were first released. The original fat PS3 used over 200 W under load.

You can also set the XB1 and PS4 to not use standby mode, where they use 1 W or less.


That's a great point, and it should have been stated somewhere too.

NoClipMode said,
Sensationalist article. The XB1 and PS4 both use much less power than the 360 and PS3 did when they were first released. The original fat PS3 used over 200 W under load.

You can also set the XB1 and PS4 to not use standby mode, where they use 1 W or less.


Thanks. I was asking if they were comparing it with their original counterparts or the Slim versions.

Or maybe more features but same power?

In your analogy, cars today have the same bhp or more but use less fuel than cars of the 1990s.

So yeah, while I didn't expect it to use less power, I thought maybe a little more or the same would be good engineering.

MikeChipshop said,
Were they expecting them to have more features, be more powerful yet use less power?

It's not impossible... Plenty of computer parts that have more features and use less power than previous generations. Computers are still advancing very quickly and power savings growing in leaps and strides as well (since there is so much emphasis on tablets/phones and those demand low power prowess).

$125-150+ a year... PER machine? Add up 4M+ machines and that number quickly becomes very large.

Hopefully they can release better power options because a patch that saves 1% of electricity would quickly add up.
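Scaling the per-console estimate up to an installed base is simple multiplication; the 4-million figure below is the commenter's round number, not a confirmed sales total:

```python
# Fleet-level scaling of the per-console estimate (rough, illustrative numbers).
consoles = 4_000_000                 # commenter's round installed-base figure
kwh_per_console_per_year = 253       # NRDC estimate for the Xbox One
total_kwh = consoles * kwh_per_console_per_year
print(f"Fleet consumption: ~{total_kwh / 1e9:.1f} billion kWh/year "
      f"({total_kwh / 1e6:,.0f} GWh)")   # ~1.0 billion kWh, ~1,012 GWh
```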

giantpotato said,
I upgraded from a Phenom X3 to an i7 4770K. It's 3.5x faster, supports more instruction sets and uses 10 W less power.

That's because AMD has had power usage issues when directly compared with Intel for a long time…

wernercd said,

It's not impossible... Plenty of computer parts that have more features and use less power than previous generations. Computers are still advancing very quickly and power savings growing in leaps and strides as well (since there is so much emphasis on tablets/phones and those demand low power prowess).

$125-150+ a year... PER machine? Add up 4M+ machines and that number quickly becomes very large.

Hopefully they can release better power options because a patch that saves 1% of electricity would quickly add up.

I agree with your point that it's not impossible, but AMD chips are simply not as power-efficient as equivalent Intel chips (equivalent is used loosely here; I think we all know AMD trails behind Intel in terms of powerful chips). The big "but", though, is that AMD is cheaper, hence if more power-efficient Intel chips were used the consoles' prices would also increase, not to mention the integration of CPU/GPU on the AMD front and the added expenses possibly incurred on the Intel route.

So I agree it can be done, but from the standpoint of creating a new console to compete NOW, it's not possible to both create it and succeed.

On to the point made about multiplying power use per year per console: sorry, but I can't take these statements seriously without very detailed stats and comparisons that help see past the trick of multiplying small figures into large numbers over long periods of use. It's like the stats about how much of your life you spend on the toilet, or sleeping, etc. They cause a reaction, but when put in perspective it's not too bad.

Ask any MacBook (all based on Intel) user who plays games or uses intensive software whether that semi-fictional Intel TDP is as "cool" as it sounds as the machine cooks on their lap.

You can recognize the people who use Intel based Macbooks by the funny way they walk.

MFH said,

That's because AMD has had power usage issues when directly compared with Intel for a long time…

It's also because of the change in technology. All of the Phenom series are 65 nm to 45 nm chips, whereas the more modern i7 series is 22 nm. Smaller circuits consume less power.

giantpotato said,
I upgraded from a Phenom X3 to an i7 4770K. It's 3.5x faster, supports more instruction sets and uses 10 W less power.

Also, you cannot record the last 15 mins, etc. And you DON'T HAVE all those PS4 features. What are you comparing?

The AMD Jaguar core is 28 nm, as are all the other newer AMD CPUs and GPUs. The higher power draw of this APU comes from the much higher-performing GPU part and has nothing to do with the CPU. If you took an Intel i5 CPU and added a discrete GPU card that offers the same performance as AMD's Jaguar APU, you would end up with a much higher power draw overall.

Konstantine said,

Also, you cannot record the last 15 mins etc etc. And you DON'T HAVE all those PS4 features. What are you comparing ?

I was replying to a comment about having more processing power and using less wattage, I don't know why you are turning this into a PC vs. Consoles debate. And FYI, yes, you can record gameplay on PC.

Gungel said,
If you take an Intel i5 CPU and add a discrete GPU card that offers the same performance as AMD's Jaguar APU you would end up with a much higher power draw overall.

Are there still discrete graphics cards with performance that low?