
Intel is reportedly going to kill the CPU socket


#46 Phouchg

Phouchg

    has stopped responding

  • 5,689 posts
  • Joined: 28-March 11

Posted 27 November 2012 - 13:22

That's a load of BS, imo. Intel did make CPUs that needed to be soldered to the motherboard in the past, back in the 286-386 era. The market is a completely different beast right now....

Sadly, this idea indeed appears to have resurfaced:
http://www.intel.com...troduction.html

Next Unit of Computing my lower back. IT'S A NETTOP! And a pretty lame one at that. I can see it being a reasonable HTPC, among other things, what with the (dual) HDMI or the 'Bolt (which no one really cares about), but there's no USB 3.0, no analog audio, and no S/PDIF either. While we're at it, the BY model (at the very least) suffers from the WiFi card overheating the SSD. Intel promises to fix it, but it's more or less a design defect - they're simply too close to each other. What is one going to do about it? Nothing! Because there's nothing you can do except pray that Intel indeed fixes it, go exchange it, or cry for your money.

Note that SODIMMs, mSATA SSD and half-size WiFi card must be bought separately - I guess that's good, except that the price of the little crap alone is $300.
I wrote two articles about this for the local site, compiling a lot of sources. Unfortunately, a lot of people seem to be genuinely impressed with it, for some reason. :|


#47 Glassed Silver

Glassed Silver

    ☆♡Neowin's portion of Crazy♡☆

  • 10,729 posts
  • Joined: 10-June 04
  • Location: MY CATFORT in Kassel, Germany
  • OS: OS X ML; W7; Elementary; Android 4
  • Phone: iPhone 5 64GB Black (6.0.2)

Posted 27 November 2012 - 13:43

...no there is another...

Nobody but Intel and AMD produces CPUs based on the x86 architecture, so no.
And if you think ARM is a viable alternative right now, I'm sorry to have to disagree very, very strongly here.

Upgrading is not the point. The ability to do so is. See, motherboards arrive dead or fail, I'd say, three orders of magnitude more often than processors do. How will it be more reliable and cheaper to replace the whole thing? For example: if my big huge motherboard with all the bells and whistles dies, I can buy some entry-level crap for 30 $/€, still stuff the 3770K in it, and enjoy most if not all of its capabilities. If my CPU dies, I can buy a 30 $/€ Celeron G to get my system going.

Now, as some have already said, it will not end with CPU sockets. Systems will become integrated to the teeth, just like smartshytes and schmablets have - unopenable, with no swappable batteries, no removable storage, etc. What does one have to do when they fail in the slightest? Send them to the nearest RMA center, where, in most cases, after a month (30 days being the limit in consumer law here) of couldn't-be-arsed-to-look-at-it, they'll exchange it for a new device, because nothing can be repaired regardless of the damage. If you were lucky enough to get the data out or keep a copy, that's all you have left. With a proper machine, I could just attach my disk to another box and keep going the whole time. If the big huge RAM dies, I can put in a 10 $/€ ValueRAM 2 GB stick and keep going. If a discrete GPU dies, I can put in the oldest PCI-E crap that barely moves bits around, but keep going.

Or welcome to the out-of-control generation...

Post of the day.
Fu** it, post of the month! (Y)

I think I'll create a couple of dupe accounts to give this the amount of likes it deserves... :shifty:

edit: / addition:
Sorry guys, I've read page 3 now and apparently it's not what it seemed to be.
Apart from that, I still think Phouchg's post is excellent: it's a nice take on how some people would be totally careless about this happening, and the message rings true with other things going on in the tech industry, where many people downplay the importance of repairability, maintaining your gear yourself, and so forth.

I often feel alienated in my sentiments around some people, so yeah, nice read!

Glassed Silver:ios

#48 n_K

n_K

    Neowinian Senior

  • 5,425 posts
  • Joined: 19-March 06
  • Location: here.
  • OS: FreeDOS
  • Phone: Nokia 3315

Posted 27 November 2012 - 13:59

Nobody but Intel and AMD produces CPUs based on the x86 architecture, so no.
And if you think ARM is a viable alternative right now, I'm sorry to have to disagree very, very strongly here.

Wrong. VIA and Cyrix both produce/produced x86 CPUs.
Transmeta did too.

#49 ViperAFK

ViperAFK

    Neowinian Senior

  • 10,872 posts
  • Joined: 07-March 06
  • Location: Vermont

Posted 27 November 2012 - 13:59

Upgrading is not the point. The ability to do so is. See, motherboards arrive dead or fail, I'd say, three orders of magnitude more often than processors do. How will it be more reliable and cheaper to replace the whole thing? For example: if my big huge motherboard with all the bells and whistles dies, I can buy some entry-level crap for 30 $/€, still stuff the 3770K in it, and enjoy most if not all of its capabilities. If my CPU dies, I can buy a 30 $/€ Celeron G to get my system going.

Now, as some have already said, it will not end with CPU sockets. Systems will become integrated to the teeth, just like smartshytes and schmablets have - unopenable, with no swappable batteries, no removable storage, etc. What does one have to do when they fail in the slightest? Send them to the nearest RMA center, where, in most cases, after a month (30 days being the limit in consumer law here) of couldn't-be-arsed-to-look-at-it, they'll exchange it for a new device, because nothing can be repaired regardless of the damage. If you were lucky enough to get the data out or keep a copy, that's all you have left. With a proper machine, I could just attach my disk to another box and keep going the whole time. If the big huge RAM dies, I can put in a 10 $/€ ValueRAM 2 GB stick and keep going. If a discrete GPU dies, I can put in the oldest PCI-E crap that barely moves bits around, but keep going.

Or welcome to the out-of-control generation...


Agreed, I can't believe anyone in their right mind thinks this is a good thing for desktop PCs.

#50 f0rk_b0mb

f0rk_b0mb

    Neowinian Senior

  • 2,526 posts
  • Joined: 02-June 12

Posted 27 November 2012 - 14:02

oh no.JPG

Why is the new trend fixing what isn't broken?

#51 Detection

Detection

    Detecting stuff...

  • 8,369 posts
  • Joined: 30-October 10
  • Location: UK
  • OS: 7 SP1 x64

Posted 27 November 2012 - 14:06

Don't Intel force you to buy a new socket board for each new CPU release anyway?

#52 Rickkins

Rickkins

    Neowinian

  • 649 posts
  • Joined: 04-April 07
  • OS: Windows 8, Desktop Mode
  • Phone: Galaxy S3

Posted 27 November 2012 - 14:18

AMD is our last hope...



Always was. The only thing ever holding Intel's greed in check has been AMD's very existence.

#53 threetonesun

threetonesun

    Neowinian Senior

  • 11,943 posts
  • Joined: 26-February 02

Posted 27 November 2012 - 14:22

Apart from that, I still think Phouchg's post is excellent: it's a nice take on how some people would be totally careless about this happening, and the message rings true with other things going on in the tech industry, where many people downplay the importance of repairability, maintaining your gear yourself, and so forth.

I often feel alienated in my sentiments around some people, so yeah, nice read!

Glassed Silver:ios


Well, this is basically the situation with laptops. Yes, it might be possible to replace things, but the average consumer won't, and the trend is to make the cases near impossible to open / tamper with. If this happened to consumer desktops, well, it wouldn't surprise me.

The thing you fail to note is that if, say, your GPU blows up, there's no fixing it... you send it off to get RMA'd, or toss it in the trash. We're well beyond the days of breaking out the soldering iron and attaching some new capacitors, so it's really just a matter of throwing it out in pieces or throwing it out all at once. From a quality control standpoint, something soldered on in-house in a cleanroom is obviously going to be more reliable than something pieced together in someone's office workshop.

Beyond all this is the fact that if you plug a component in and it works, the odds of it failing within its usable lifespan are pretty slim. I think PC enthusiasts still worry about this because we're often dealing with components that are intentionally overworked or relatively poorly built (specifically graphics cards, which are the only things I've had die on me in my years of PC building) and will fail randomly. The failure rate on a properly designed CPU / board / memory combo has to be incredibly small.

#54 +LogicalApex

LogicalApex

    Software Engineer

  • 6,516 posts
  • Joined: 14-August 02
  • Location: Philadelphia, PA
  • OS: Windows 7 Ultimate x64
  • Phone: Nexus 5

Posted 27 November 2012 - 14:29

Well, this is basically the situation with laptops. Yes, it might be possible to replace things, but the average consumer won't, and the trend is to make the cases near impossible to open / tamper with. If this happened to consumer desktops, well, it wouldn't surprise me.

The thing you fail to note is that if, say, your GPU blows up, there's no fixing it... you send it off to get RMA'd, or toss it in the trash. We're well beyond the days of breaking out the soldering iron and attaching some new capacitors, so it's really just a matter of throwing it out in pieces or throwing it out all at once. From a quality control standpoint, something soldered on in-house in a cleanroom is obviously going to be more reliable than something pieced together in someone's office workshop.

Beyond all this is the fact that if you plug a component in and it works, the odds of it failing within its usable lifespan are pretty slim. I think PC enthusiasts still worry about this because we're often dealing with components that are intentionally overworked or relatively poorly built (specifically graphics cards, which are the only things I've had die on me in my years of PC building) and will fail randomly. The failure rate on a properly designed CPU / board / memory combo has to be incredibly small.


That was the point of the post GS quoted, really...

The failure rate may be low (there's no data on whether it is or not), but when you fall on the unfavorable side of the statistic you can at least recover quickly. Otherwise, you're at the mercy of someone else to get you up and running, and they won't care half as much as you do about how quickly that happens.

Additionally, even with super-high-tech clean rooms and so on, you still have parts that fail prematurely or arrive dead. Soldering it all into one unit and building it all in-house won't eliminate this. It may lower it, but again, you're hoping not to fall on the wrong side of the statistic, because you're at the mercy of someone else when you do.
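
As a rough illustration of that "wrong side of the statistic" point, here is a minimal Python sketch. Every number in it is a hypothetical placeholder, not real failure-rate or pricing data; the only thing carried over from the thread is the assumption that boards fail more often than CPUs and that a soldered unit forces you to replace both together.

# A minimal expected-replacement-cost sketch. All numbers are hypothetical
# illustrations, not measured failure rates or real prices.
p_fail_board = 0.03   # hypothetical lifetime failure probability of a motherboard
p_fail_cpu   = 0.003  # hypothetical lifetime failure probability of a CPU

cost_board = 100.0    # hypothetical board price
cost_cpu   = 300.0    # hypothetical CPU price

# Socketed: a failure only costs you the part that actually died.
expected_socketed = p_fail_board * cost_board + p_fail_cpu * cost_cpu

# Soldered: either failure forces replacing the combined board+CPU unit.
p_any_failure = 1 - (1 - p_fail_board) * (1 - p_fail_cpu)
expected_soldered = p_any_failure * (cost_board + cost_cpu)

print(f"Expected replacement cost, socketed: ~{expected_socketed:.2f}")
print(f"Expected replacement cost, soldered: ~{expected_soldered:.2f}")

With these made-up numbers the soldered case costs roughly three times as much in expected replacements, even though the per-part failure rates are identical in both scenarios.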

#55 Mordkanin

Mordkanin

    Neowinian Senior

  • 5,405 posts
  • Joined: 15-February 04

Posted 27 November 2012 - 14:30

It's inevitable for some basic engineering reasons that every EE has to deal with: cost, thermals, parasitics, size.

These sockets are EXPENSIVE.

They offer decreased thermal performance. With a BGA package you can use thermal vias and suck heat out the back of the board.

Electrically, sockets tend to suck, a lot. You're sticking a little LC network on every pin. That's bad, and it hurts switching. You can get lower voltages and better performance out of a BGA.

And size. Ah, size. LGA2011 has a pitch of ~1 mm in a hex array. 0.3 mm square pitches exist in BGA form. The available pin density is insanely better. You can't achieve that with a socket.
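
To put rough numbers on that density claim, here is a small Python sketch using only the pitches quoted in the post above; the packing formulas are standard geometry, and the pitch figures are the post's, not official package specs.

import math

def pins_per_cm2_hex(pitch_mm):
    # Hexagonal (triangular) array: each pin occupies (sqrt(3)/2) * pitch^2 of board area.
    return 100.0 / ((math.sqrt(3) / 2) * pitch_mm ** 2)

def pins_per_cm2_square(pitch_mm):
    # Square grid: each pin occupies pitch^2 of board area.
    return 100.0 / pitch_mm ** 2

lga = pins_per_cm2_hex(1.0)     # ~1 mm hex pitch, as quoted for LGA2011
bga = pins_per_cm2_square(0.3)  # 0.3 mm square pitch, as quoted for fine-pitch BGA

print(f"LGA-style socket: ~{lga:.0f} pins/cm^2")
print(f"Fine-pitch BGA:   ~{bga:.0f} pins/cm^2 ({bga / lga:.1f}x denser)")

With those figures the fine-pitch BGA works out to roughly ten times the pin density of the socketed layout, which is the gap the post is pointing at.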

#56 1941

1941

    Banned

  • 18,175 posts
  • Joined: 17-July 06

Posted 27 November 2012 - 14:36

The name of the source is Semiaccurate.com. That should leave some room for error.

#57 Jason S.

Jason S.

    Neowinian Senior

  • 12,192 posts
  • Joined: 01-September 03
  • Location: Cleveland, Ohio

Posted 27 November 2012 - 14:43

I could see Intel leaving the enthusiast market where it is... say, the current 2011 socket remains available to enthusiasts with swappable CPUs, while the 1155 socket CPUs are the ones that get soldered.

It's inevitable for some basic engineering reasons that every EE has to deal with: cost, thermals, parasitics, size.

These sockets are EXPENSIVE.

They offer decreased thermal performance. With a BGA package you can use thermal vias and suck heat out the back of the board.

Electrically, sockets tend to suck, a lot. You're sticking a little LC network on every pin. That's bad, and it hurts switching. You can get lower voltages and better performance out of a BGA.

And size. Ah, size. LGA2011 has a pitch of ~1 mm in a hex array. 0.3 mm square pitches exist in BGA form. The available pin density is insanely better. You can't achieve that with a socket.

sounds like you just finished some related college course :p

#58 Hum

Hum

    totally wAcKed

  • 63,565 posts
  • Joined: 05-October 03
  • Location: Odder Space
  • OS: Windows XP, 7

Posted 27 November 2012 - 14:46

Intel kills off the desktop, PCs go with it. What will we do if we can't upgrade our rigs?


AMD gets very rich :D

#59 vetneufuse

neufuse

    Neowinian Senior

  • 17,256 posts
  • Joined: 16-February 04

Posted 27 November 2012 - 15:03

Great, so now to upgrade a CPU I have to buy a new motherboard... and if I have motherboard issues, I have to buy a new CPU too... and vice versa...

People wonder why the desktop market is going down; it's stuff like this that takes it down for the builder market...

#60 Crisp

Crisp

    To infinity and beyond

  • 5,529 posts
  • Joined: 06-May 10
  • Location: 127.0.0.1

Posted 27 November 2012 - 15:06

Always was. The only thing ever holding Intel's greed in check has been AMD's very existence.


Amen.