Are hot Ivy Bridge CPUs the result of thermal paste?



Why is Ivy Bridge so hot? Ask that question in any forum currently, and you are likely to receive one of two popular (but not entirely correct) answers that everyone has been parroting:

  1. "Power density is greater on Ivy Bridge than Sandy Bridge"
  2. "Intel has problems with tri-gate/22nm"

The first answer is correct, but wrong at the same time - power density is greater, but it isn't what is causing temperatures to be as much as 20 °C higher on Ivy Bridge compared to Sandy Bridge when overclocked. The second answer is jumping to conclusions without sufficient evidence. If you aren't in the loop, there's evidence of a considerable temperature difference nearly everywhere you look - we confirmed it by mirroring settings in our Ivy Bridge review, and we have read similar reports in solid testing at Anandtech as well as from other sites.

So why is Ivy Bridge hot? Intel is using TIM paste between the Integrated Heat Spreader (IHS) and the CPU die on Ivy Bridge chips, instead of fluxless solder.

[Image: oc-ivydie-2.jpg - the Ivy Bridge CPU die]

How does TIM paste generally compare with fluxless solder for conducting heat? Heat conductivity can be measured in watts per meter kelvin (W/mK). To be technically exact, we would need to know exactly what Intel is using for TIM paste/solder. When I went to Intel and asked, their polite answer may not surprise you: "Secret sauce"! Given that, we can use some rough approximations. A solder attach could have a heat conductivity in the range of 80 W/mK. A TIM paste could have a heat conductivity in the range of 5 W/mK. That's your problem right there! Note that these values are not exact, as we don't know the exact heat conductivity of Intel's "secret sauce". However, they are representative of solder and TIM paste in general, and there is a giant gap between how the two perform at conducting heat. They are in different leagues.
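To put that conductivity gap in perspective, a quick back-of-the-envelope 1-D conduction estimate (ΔT = P·t / (k·A)) shows how much hotter the die runs across a paste layer than across a soldered joint. All the numbers below are illustrative assumptions, not Intel's figures - the die power, interface layer thickness, and die area are plausible round values only:

```python
# Rough sketch: temperature drop across the die-to-IHS interface layer,
# using the 1-D conduction formula dT = P * t / (k * A).
# Assumed (not official) values:
#   P = 100 W of heat through the interface (heavy overclocked load)
#   t = 100 micrometres of interface material
#   A = 160 mm^2 (roughly an Ivy Bridge-sized die)

P = 100.0    # watts
t = 100e-6   # metres
A = 160e-6   # square metres

for name, k in [("solder (~80 W/mK)", 80.0), ("TIM paste (~5 W/mK)", 5.0)]:
    dT = P * t / (k * A)  # kelvin of temperature drop across the layer
    print(f"{name}: die runs {dT:.1f} C hotter than the IHS")
```

Under these assumptions the solder joint costs under 1 °C while the paste layer costs over 12 °C - the same order of magnitude as the overclocked temperature gap reported between Sandy Bridge and Ivy Bridge.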

Read the full article over on Overclockers.

I've read one comment somewhere that Intel could be doing this as a cost-cutting measure teamed with the (unfortunate) fact that AMD are no longer a serious threat to them, so they can do whatever the hell they want. I'll just say that I'm glad I've got a Sandy Bridge CPU here...


I remember the last time Intel got complacent in their throne and AMD snuck up behind them and kicked them up their arse - Intel would be wise to learn their history!


But the heating issue is ONLY an issue if you overclock, so the everyday user is completely unaffected by this. IB runs perfectly fine at stock speeds. I also believe that the OC community has tested this by removing the IHS and applying direct cooling to the die (with TIM), and the chip still ran hot as lava - so the spreader not being soldered is not the source of the chips running hot.


^ Edit: if this is a result of cost-cutting (and reportedly it is), it's still troubling - it makes you wonder what they'll be cutting next.

Edit: ah yes, found about IHS removal:

http://www.eteknix.c...er-ihs-removal/

I wonder if v2 Xeons will have this sort of secret crap, too. Then again, Xeons are locked, so there's not much point anyway.


this as a cost-cutting measure teamed with the (unfortunate) fact that AMD are no longer a serious threat to them, so they can do whatever the hell they want.

tumblr_lnwltqhA7R1qzj7lm.png


But the heating issue is ONLY an issue if you overclock so the every day user is completely unaffected by this.

Then why buy a K series Ivy Bridge CPU? The stock HS/F that came with my 2600k is a friggin joke, now Intel makes the problem worse. Me no like.


Then why buy a K series Ivy Bridge CPU? The stock HS/F that came with my 2600k is a friggin joke, now Intel makes the problem worse. Me no like.

You buy a K series Intel CPU because it's unlocked for overclocking...? The stock cooler Intel provides does a fine job of keeping the CPU within the thermal limits Intel sets. If Intel deem that an Ivy Bridge CPU can operate at temperatures over 70 degrees, if not hotter, before thermal protection kicks in, you have nothing to worry about.


I remember the last time Intel got complacent in their throne and AMD snuck up behind them and kicked them up their arse - Intel would be wise to learn their history!

AMD publicly conceded defeat and announced a focus on mobile devices, to the victor goes the spoils.


This only really becomes a "situation" when you increase the voltage high enough to go for OCs of 4.6GHz or higher. People buying these CPUs won't be going for those kinds of overclocks. That is what the LGA 2011 systems are for, which btw have an adjustable BCLK.

To give an example of why this is important: if I run my 3930K at 4.375GHz with a 100MHz BCLK, I get 12 points in Cinebench. If I do that same OC with a 125MHz BCLK, I get 12.58 points - over half a point more just by raising the QPI bus speed. That is part of the reason why the LGA 2011 systems are for enthusiasts and the 1155 systems are for mainstream users. It just so happens that Intel has included K series chips on the 1155 platform because overclocking is becoming more mainstream, where consumers, even average Joes, expect to get a little bit more value out of their purchase with a couple of minutes of fiddling. Some motherboards even come with a single-button OC feature, and water coolers like the H100 make it a breeze for novices to pull off 4GHz.
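As a rough sketch of the arithmetic behind that comparison - the clock and scores are the ones reported above, while the implied multipliers are just back-calculated here, not verified BIOS settings:

```python
# Same 4.375 GHz core clock reached with two different BCLK settings,
# with the Cinebench scores reported in the post above.
configs = [
    {"bclk_mhz": 100, "score": 12.00},
    {"bclk_mhz": 125, "score": 12.58},
]

for c in configs:
    # core clock (MHz) divided by BCLK (MHz) gives the implied multiplier
    c["multiplier"] = 4375 / c["bclk_mhz"]
    print(f"BCLK {c['bclk_mhz']} MHz -> x{c['multiplier']:.2f} multiplier, "
          f"score {c['score']}")

# Relative gain from the faster bus at an identical core clock:
gain = (configs[1]["score"] - configs[0]["score"]) / configs[0]["score"]
print(f"Gain: {gain:.1%}")  # roughly 4.8%
```

So the half-point difference works out to about a 4.8% gain purely from the higher bus speed, with the core clock held constant.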

But don't be fooled, these 1155 processors are still mainstream parts and Intel wants people buying LGA 2011 for extreme overclocks closer to 5GHz and above.


AMD publicly conceded defeat and announced a focus on mobile devices, to the victor goes the spoils.

Thing is, of course, that while AMD is slower than Intel today, what do you need the fastest Intel CPUs for? Even the fastest AMD CPUs are more than fast enough for even the most demanding games today. After all, it's not CPU power they need, so get a fast AMD CPU and a good graphics card and you can max any game as well as with an Intel CPU.

The only ones needing the power of the fastest Intel CPUs are workstations, specifically and mostly those doing 3D work needing fast CPUs for preview renders and for render stations. But for gaming, AMD is still very much a worthwhile contender, especially as it's cheaper too. Of course, people don't buy them because they believe they need the faster Intel CPUs. Heck, even the stuff that used to require a lot of CPU is now relying less and less on it, such as AI, simply because the CPU architecture is inefficient for the tasks it used to do in games, while the massively parallel cores on the GPU can do all that stuff far more efficiently.

So spending twice as much to get the most powerful Intel CPU for your gaming rig is a huge waste when you could get a top-end AMD and get the same performance - or better performance, if you put the extra money into a better or extra graphics card.


You buy a K series Intel CPU because it's unlocked for overclocking...? The stock cooler Intel provides does a fine job of keeping the CPU within the thermal limits Intel sets. If Intel deem that an Ivy Bridge CPU can operate at temperatures over 70 degrees, if not hotter, before thermal protection kicks in, you have nothing to worry about.

No, it doesn't. Not when you're running 100% CPU usage 24/7, even at the stock 3.4 GHz. I was hitting 80+ degrees until I changed the cooler.

I asked why buy a K chip when IB gets hotter when you overclock. "Well, Intel's cooler will work and Ivy Bridge is fine... as long as you don't overclock. But you buy a K series CPU to overclock."

So if you want to overclock, buy a Sandy Bridge CPU. Yes?


I remember the last time Intel got complacent in their throne and AMD snuck up behind them and kicked them up their arse - Intel would be wise to learn their history!

So do I, 1997.

If I'm being honest, I just read the first few lines, and using fluxless solder? Errr, you have got to be joking - everything would conduct, i.e. the whole CPU would short itself out.

Interesting to see that die though... It's massive! (The last die I saw/still have lying around is a P4, and it's a small square!)


If I'm being honest, I just read the first few lines, and using fluxless solder? Errr, you have got to be joking - everything would conduct, i.e. the whole CPU would short itself out.

Interesting to see that die though... It's massive! (The last die I saw/still have lying around is a P4, and it's a small square!)

The surface of the die is insulated (with silicon nitride*) and is not conductive. An indium-based alloy*, one with a very low melting point just above 100 °C, is used as the solder in question.

* this is what I seem to remember from classes, but it should not be very far from the truth.


So do I, 1997.

If I'm being honest, I just read the first few lines, and using fluxless solder? Errr, you have got to be joking - everything would conduct, i.e. the whole CPU would short itself out.

Interesting to see that die though... It's massive! (The last die I saw/still have lying around is a P4, and it's a small square!)

Good man - someone who knows their history! This also carried on for a fair few years (at least 3 or 4) if memory serves me correctly.

AMD publicly conceded defeat and announced a focus on mobile devices, to the victor goes the spoils.

True, but who says it should be AMD that Intel should worry about? There's always more than one competitor out there - more specifically, ARM manufacturers. Qualcomm and nVidia, with their Snapdragon and TEGRA processors, have a good chance of overthrowing Intel's dominance in the lower-end and even mid-end race, relegating Intel to the "hardcore enthusiast" crowd only. I'd say if Intel aren't careful, this could happen within 3-5 years.

It may sound like fanboy hogwash but quite frankly I'm actually rooting for Intel - or, at least, a competitive Intel. Fat-Turkey-Eating-Sit-Down-All-Day-King Intel deserve to get their rear ends handed to them. ARM is becoming a very popular architecture of choice for tablet and phone manufacturers and this is going to be exacerbated when Windows RT/WOA comes out.

Picture this: nVidia releasing TEGRA 3/4 processors in a PC for $299, or in a mobo kit for $150-$199, and marketing the fact that THEIR PCs can run both the latest Windows 8 as well as Android 5 - giving them access to over half a million apps to choose from on day one, as well as two of the hottest and latest operating systems out. Already purchased apps on your Android phone? No problem, just sign into your Google Play account and all of your apps are available straight away. Have a W8 or Android tablet? No problem - app uniformity, just sign into your accounts!

Intel, if faced with some heavy competition from nVidia and Qualcomm, could definitely be put on the back foot - especially with a deal this attractive. I'd definitely be interested in an Android 5/Windows 8 box for just £199.99 (or even £249.99) if it was offered to me in a compact "sit under the TV" form factor, and I'm an enthusiast build-your-own-PC type of guy!

I'm deviating/fantasising too much, but yeah :p :D


No, it doesn't. Not when you're running 100% CPU usage 24/7, even at the stock 3.4 GHz. I was hitting 80+ degrees until I changed the cooler.

I asked why buy a K chip when IB gets hotter when you overclock. "Well, Intel's cooler will work and Ivy Bridge is fine... as long as you don't overclock. But you buy a K series CPU to overclock."

So if you want to overclock, buy a Sandy Bridge CPU. Yes?

If you have an Ivy Bridge, download and run CoreTemp or RealTemp and you will see the TJ Max is 105 degrees. So if you're hitting 80 degrees with the stock cooler under 100% load 24/7, then the stock cooler is obviously doing its job - though you can argue it isn't doing it effectively.

I couldn't find official info from Intel, but Bit-tech states that the i7 3770k will hit 105 degrees before throttling begins: http://www.bit-tech.net/hardware/2012/04/23/intel-core-i7-3770k-review/1

Otherwise yes, I do agree Sandy Bridge is better for overclocking.
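For what it's worth, tools like CoreTemp and RealTemp report temperatures the way they do because Intel's on-die digital thermal sensor actually exposes a "distance to TJ Max" value rather than an absolute temperature, and the utility subtracts that from TJ Max. A minimal sketch of that conversion, assuming the 105-degree figure above (the example readings are illustrative):

```python
# Ivy Bridge TJ Max as shown by CoreTemp/RealTemp (see above).
TJ_MAX = 105  # degrees C

def core_temp(distance_to_tjmax):
    """Convert a DTS 'distance to TJ Max' reading to an absolute temperature."""
    return TJ_MAX - distance_to_tjmax

print(core_temp(25))  # a raw reading of 25 -> 80 C, the stock-cooler load temp above
print(core_temp(0))   # a raw reading of 0 -> 105 C, the point where throttling begins
```

This is also why a wrong TJ Max assumption in a monitoring tool shifts every displayed temperature by the same fixed amount.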


Thing is, of course, that while AMD is slower than Intel today, what do you need the fastest Intel CPUs for? Even the fastest AMD CPUs are more than fast enough for even the most demanding games today. After all, it's not CPU power they need, so get a fast AMD CPU and a good graphics card and you can max any game as well as with an Intel CPU.

The only ones needing the power of the fastest Intel CPUs are workstations, specifically and mostly those doing 3D work needing fast CPUs for preview renders and for render stations. But for gaming, AMD is still very much a worthwhile contender, especially as it's cheaper too. Of course, people don't buy them because they believe they need the faster Intel CPUs. Heck, even the stuff that used to require a lot of CPU is now relying less and less on it, such as AI, simply because the CPU architecture is inefficient for the tasks it used to do in games, while the massively parallel cores on the GPU can do all that stuff far more efficiently.

So spending twice as much to get the most powerful Intel CPU for your gaming rig is a huge waste when you could get a top-end AMD and get the same performance - or better performance, if you put the extra money into a better or extra graphics card.

Video rendering on an AMD CPU is stupidly slow. Phenom II 955BE at that.


If you have an Ivy Bridge, download and run CoreTemp or RealTemp and you will see the TJ Max is 105 degrees. So if you're hitting 80 degrees with the stock cooler under 100% load 24/7, then the stock cooler is obviously doing its job - though you can argue it isn't doing it effectively.

I couldn't find official info from Intel, but Bit-tech states that the i7 3770k will hit 105 degrees before throttling begins: http://www.bit-tech....-3770k-review/1

Otherwise yes, I do agree Sandy Bridge is better for overclocking.

Maybe it's just for AMD CPUs, but CoreTemp is way off - it shows me a TJ Max of 90, whilst my CPU would be fried by then, as the max it should go is 62 degrees. Maybe it works the other way around with Intel?

Video rendering on an AMD CPU is stupidly slow. Phenom II 955BE at that.

That's why the 3770k is only 6-8% faster in video rendering than the 8150? Yeah, I'd call that "stupidly slow".


Maybe it's just for AMD CPUs, but CoreTemp is way off - it shows me a TJ Max of 90, whilst my CPU would be fried by then, as the max it should go is 62 degrees. Maybe it works the other way around with Intel?

Personally, I use RealTemp; that seems to be the most accurate one in my case (plus it can also show the temperature of my graphics card, which is a big plus). The most inaccurate one for me is ASUS AI Suite. At the moment, my CPU's around 35-36 degrees Celsius according to CoreTemp, but ASUS AI Suite always says 10 degrees cooler (and just for the hell of it, my GPU's currently at 51 degrees).


Personally, I use RealTemp; that seems to be the most accurate one in my case (plus it can also show the temperature of my graphics card, which is a big plus). The most inaccurate one for me is ASUS AI Suite. At the moment, my CPU's around 35-36 degrees Celsius according to CoreTemp, but ASUS AI Suite always says 10 degrees cooler (and just for the hell of it, my GPU's currently at 51 degrees).

Tried RealTemp and got this: "The processor is not supported" :(


Thing is, of course, that while AMD is slower than Intel today, what do you need the fastest Intel CPUs for? Even the fastest AMD CPUs are more than fast enough for even the most demanding games today. After all, it's not CPU power they need, so get a fast AMD CPU and a good graphics card and you can max any game as well as with an Intel CPU.

The only ones needing the power of the fastest Intel CPUs are workstations, specifically and mostly those doing 3D work needing fast CPUs for preview renders and for render stations. But for gaming, AMD is still very much a worthwhile contender, especially as it's cheaper too. Of course, people don't buy them because they believe they need the faster Intel CPUs. Heck, even the stuff that used to require a lot of CPU is now relying less and less on it, such as AI, simply because the CPU architecture is inefficient for the tasks it used to do in games, while the massively parallel cores on the GPU can do all that stuff far more efficiently.

So spending twice as much to get the most powerful Intel CPU for your gaming rig is a huge waste when you could get a top-end AMD and get the same performance - or better performance, if you put the extra money into a better or extra graphics card.

Intel doesn't only make super-expensive CPUs though; the i5-2500 is pretty damn good bang for buck for gaming.


The point is that there's no point in exclusively looking at Intel for a gaming rig, as it won't give a practical benefit.


This topic is now closed to further replies.