Intel claims big bucks

Intel is claiming that its Core line of processors has delivered $2bn in energy savings since its release.

First introduced two years ago, the Core chips were touted by Intel in part for their low power consumption in the face of ever-climbing energy rates.

A company report estimates that since their release, the Core desktop, notebook and server chips have used some 20 terawatt-hours (TWh) less energy than the previous line would have.

Intel then multiplied this figure by an average energy cost of $0.10 per kilowatt-hour to arrive at the claimed $2bn in energy savings.
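The arithmetic behind the headline figure checks out, as a quick sanity calculation shows (the 20 TWh and $0.10/kWh inputs are the article's reported numbers, not independently verified):

```python
# Sanity check of Intel's claimed savings figure.
# Inputs as reported in the article: 20 TWh saved, $0.10 per kWh.
TWH_SAVED = 20
KWH_PER_TWH = 1_000_000_000  # 1 TWh = 10^9 kWh
COST_PER_KWH = 0.10          # USD

savings_usd = TWH_SAVED * KWH_PER_TWH * COST_PER_KWH
print(f"${savings_usd / 1e9:.0f}bn")  # → $2bn
```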

"All the while we've been delivering these performance improvements, we have also been able to reduce the energy used by our microprocessors," wrote Lorie Wigle, general manager of Intel's eco-technology program office.

Energy savings have become a major selling point for all major chipmakers in recent years.

Both Intel and chief rival AMD have touted faster and more efficient processors of late, fuelled by smaller manufacturing processes and the use of more efficient materials.

View: vnunet


um, I don't know how they got the arbitrary cost per kWh, but here in New Zealand I'm paying 3x that per kWh, so by my reckoning they only managed to save a third of what they say

Just more Intel marketing garbage, as with any large corp. Back in the Pentium 4 days, Intel's marketing machine was all about clock speed; nothing else mattered. It wasn't until AMD (and Via, to some extent) proved Intel wrong that they finally ditched the horrid P4 microarchitecture. A 20-stage "hyperpipeline" sounds great on the marketing sheet, don't it?

Why aren't we using multi-core GPUs by now? They just keep making them faster. Speed isn't everything when crunching numbers, which is (mostly) what 3D is.

Why isn't there any power-saving technology similar to SpeedStep for GPUs by now? Surely it shouldn't be difficult to make a high-end GPU like the 8800 Ultra consume not much more power than an integrated graphics solution during less intensive tasks like web browsing?

(pagnaet said @ #3.2)
Why aren't we using multi-core GPUs by now? They just keep making them faster. Speed isn't everything when crunching numbers, which is (mostly) what 3D is.
GPUs are by their nature multi-core. For example, the new Nvidia GeForce GTX 350 has 480 unified shaders - each one is a separate core and can be run completely in parallel. The whole point about graphics rendering is that it can be done in parallel in hardware; that's why GPUs were invented! It's also why GPUs are now being used for other tasks and why AMD are integrating the GPU into the processor die.

And humcheepeng, I'm pretty sure that GPUs do consume less power when not used for gaming (etc).
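The point about rendering being inherently parallel can be illustrated in a few lines: each pixel's colour depends only on its own coordinates, so pixels can be shaded in any order, or all at once. This is a toy sketch (the checkerboard "shader" is invented for illustration; real GPUs run thousands of such threads in dedicated hardware, not a thread pool):

```python
# Toy illustration of data-parallel shading: each pixel is computed
# independently, so the work can be mapped across workers in any order.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 4, 4

def shade(pixel):
    # Hypothetical "shader": brightness from a checkerboard pattern.
    x, y = pixel
    return 255 if (x + y) % 2 == 0 else 0

pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]

with ThreadPoolExecutor() as pool:
    framebuffer = list(pool.map(shade, pixels))

print(framebuffer[:4])  # first row: [255, 0, 255, 0]
```

Because no pixel reads another pixel's result, the parallel output is identical to a sequential loop - which is exactly the property that makes GPU hardware parallelism work.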

(eAi said @ #3.4)
GPUs are by their nature multi-core. For example, the new Nvidia GeForce GTX 350 has 480 unified shaders - each one is a separate core and can be run completely in parallel. The whole point about graphics rendering is that it can be done in parallel in hardware; that's why GPUs were invented! It's also why GPUs are now being used for other tasks and why AMD are integrating the GPU into the processor die.

And humcheepeng, I'm pretty sure that GPUs do consume less power when not used for gaming (etc).

yeah, it's like, multi-core in a GPU is all about raw performance increase, not enabling things like parallel processing to be done...
so, like, sticking to the efficiency theme, is one big chip more efficient than a number of smaller chips working together? dunno... maybe...

(carmatic said @ #3.5)

yeah, it's like, multi-core in a GPU is all about raw performance increase, not enabling things like parallel processing to be done...
so, like, sticking to the efficiency theme, is one big chip more efficient than a number of smaller chips working together? dunno... maybe...

No it is not. The graphics performance you can get from 2 "cores" is worse than the performance you can get from one core with twice the execution units. Multi-core GPUs are just an advertising gimmick.

Then here come companies like ATI & nV, making more power-hungry graphics cards vs. the old GFX line!