
Posted

What?

http://www.techspot....770k/page4.html

Techspot and every other review site say you are wrong.

Cost will go down significantly - why? Why does Intel want to save you money? What benefit does that give Intel? Please do tell me.

Actually, it's hard to understand what exactly you don't understand. Anyway... the cost will go down because Intel will be able to sell the motherboard and the processor together, gaining a lot of scale. Also, if manufacturing costs are lowered, Intel will be able to lower prices for consumers, who are definitely looking at other options, like tablets. And tablets can't change processors, can they?


Posted

Intel is a company, and companies like to make money. What kind of business would undercut itself to save you money and lose money in the process?

If manufacturing prices are lowered, they'll keep retail prices the same and pocket the extra cash!


Posted

Intel is a company, and companies like to make money. What kind of business would undercut itself to save you money and lose money in the process?

If manufacturing prices are lowered, they'll keep retail prices the same and pocket the extra cash!

Not if there's a threat the size of ARM coming for them. :)


Posted

ARM is threatening their tablet/mobile side; ARM does not pose a threat to Intel's desktop/server side.

ARM is pretty efficient, but the trade-off is much lower performance. Intel has high performance at the trade-off of lower energy efficiency.


Posted

All I can say is go AMD.


Posted

I don't see it as a big issue, actually. Example: right now I have an Asus P7P55 with a Core i7 750. If I want an upgrade, I'll have to get another motherboard and another CPU anyway. The same was true for the other PCs I've built.


Posted

Actually I'm a graphic designer; I had a Mac for many years! ;P lol

A graphic designer using Adobe products (not installed through the App Store) on a Mac?

LOCKDOWN. :laugh:


Posted

It will be interesting to see the rationale behind the decision, because this idea was floated by Sun a few years ago to get around a technical limitation, and I'm surprised we've hit it so soon.

As for enthusiasts - don't think too highly of your position, given that the bulk of sales certainly don't go to you or me.


Posted

Intel kills off the desktop, PCs go with it. What will we do if we can't upgrade our rigs?

http://semiaccurate....pcs-go-with-it/

I've kiboshed this on two other *tech* sites - now it's Neowin's turn.

They are referring specifically to Broadwell, which is the upcoming replacement to Atom/Cedar Trail/Clover Trail. Atom ships only in BGA packaging today.

The ONLY reference to Broadwell in the mainstream space is if - and ONLY if - devices supplant desktops as the mainstream; as it is, the earliest THAT would happen is late second half of 2015 (according to the leaked roadmap from Intel RoC).

Broadwell is, in fact, based on Haswell; however, it is NOT the planned *desktop* version of the technology! The desktop version is Lynx Point, and it will succeed Ivy Bridge (a *tock* - remember, Ivy Bridge itself is a *tick*).

There will also be a Haswell-EX (the successor to Sandy Bridge-E, as well as the Xeons based on SB-E), and it will be in LGA2011 or a successor socket thereof.

It is NOT known (or even speculated) whether Lynx Point will require a new socket.


Posted

It is NOT known (or even speculated) whether Lynx Point will require a new socket.

That is good to know.

I think there will come a point where 'enthusiast' and 'mainstream' are more clearly separated. As in, enthusiasts won't simply be using faster versions of mainstream parts, since 'mainstream' is now about tablets and portability, but rather things based more on enterprise/professional gear, like workstations and servers.


Posted

A graphic designer using Adobe products (not installed through the App Store) on a Mac?

LOCKDOWN. :laugh:

No, I used to use a Mac; then I woke up. ;)


Posted

I have no problem with it, even as an enthusiast; as long as they lower the board+CPU price, it's fine by me.

I won't leave Intel for AMD, and neither will 90% of those who say they will.


Posted

I have no problem with it, even as an enthusiast; as long as they lower the board+CPU price, it's fine by me.

I won't leave Intel for AMD, and neither will 90% of those who say they will.

If you had to combine the board and CPU, the price might actually go up! Now the motherboard maker has to account for the different CPU speeds and models for every motherboard they make. Right now we can put whatever we want into a generic board that supports that CPU line. They are moving the configuration from two options - Intel plus the motherboard maker - to who knows how many boards from a single seller, which they have to stock and inventory themselves. And now the motherboard maker is responsible if the CPU doesn't work, putting more cost on them.
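To put a rough number on that inventory problem, here's a quick back-of-envelope sketch. Every figure in it (board counts, vendor counts, CPU bins) is a made-up illustration, not real market data:

```python
# Back-of-envelope: retail SKUs when boards and CPUs are stocked separately
# versus pre-soldered together. All counts are illustrative assumptions.

boards_per_vendor = 10   # assume ~10 board models per vendor for one chipset
board_vendors = 5        # assume 5 motherboard vendors
cpu_models = 8           # assume 8 CPU speed/model bins for that platform

# Today: a retailer stocks boards and CPUs independently and the buyer combines them.
separate_skus = boards_per_vendor * board_vendors + cpu_models

# Soldered world: every board model has to exist in every CPU flavour the vendor offers.
combined_skus = boards_per_vendor * board_vendors * cpu_models

print(f"SKUs stocked separately:   {separate_skus}")   # 58
print(f"SKUs stocked pre-combined: {combined_skus}")   # 400
```

Even with these toy numbers the pre-combined catalogue is roughly seven times larger, which is exactly the stocking and inventory burden described above.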


Posted

Not only that, but to test whether the system works you have to test a motherboard with a CPU soldered to it; if there are any problems with the BGA points, the whole motherboard and CPU will need scrapping. For a refund or reuse, the CPU will need to be desoldered from the board and sent back to Intel (costs of shipping, desoldering and testing), and if it's reused, you will be getting a motherboard + CPU bundle that is, in effect, a second-hand CPU.

Anyway, this is all unproven rumour and wouldn't make any sense. Dell and Intel were told by the US government years ago to make their desktop systems more upgradable to be more environmentally friendly; I don't think they'll suddenly stop that and **** the government off somehow.


Posted

Not only that, but to test whether the system works you have to test a motherboard with a CPU soldered to it; if there are any problems with the BGA points, the whole motherboard and CPU will need scrapping. For a refund or reuse, the CPU will need to be desoldered from the board and sent back to Intel (costs of shipping, desoldering and testing), and if it's reused, you will be getting a motherboard + CPU bundle that is, in effect, a second-hand CPU.

Anyway, this is all unproven rumour and wouldn't make any sense. Dell and Intel were told by the US government years ago to make their desktop systems more upgradable to be more environmentally friendly; I don't think they'll suddenly stop that and **** the government off somehow.

And could you imagine 1000+ BGA solder points on a high-wattage processor? Those things *can* get hot. At least laptop lines are designed to run cooler to prevent this, via clock throttling and other methods. At least with LGA you can pop the CPU out, see a bent pin, and say "yeah, my socket is shot"; BGA can fracture from heat changes over time.


Posted

Let's wait for confirmation and prices; don't jump to conclusions. If it's that bad for business, they will change this supposed path. Intel normally does things the right way; I can't imagine them doing badly at this point of their evolution.


Posted

Actually, those PLCC chips were pretty easy to replace; you just needed a grabber tool to pull them out... the only problem was there weren't too many options to upgrade to.

Not only that, but an IRQ nightmare as well! But that's not the point; the point here is that no one wants a PC market with less choice. Also, if this technology came to the PC space (instead of mobile/tablet), the cost of mainboards would not only rise (mainboard + CPU), but RMAs would become a serious problem for the customer (one problem with the mainboard = a new, expensive mainboard with CPU). The mainboard selection would also shrink, due to the extra costs for mainboard OEMs, resulting in fewer choices for the customer.

Again, that would be a massively backwards decision and would be like returning to the '80s and '90s.

Interesting that home consoles (as well as arcades) are looking more and more like PCs.


Posted

I'm not sure less choice is a bad thing, per se. I mean, there are six Sandy Bridge and six Ivy Bridge chipsets, and X number of boards from each manufacturer, but a lot of them are junk, and a lot shouldn't even be on the enthusiast market.


Posted

I've kiboshed this on two other *tech* sites - now it's Neowin's turn.

They are referring specifically to Broadwell, which is the upcoming replacement to Atom/Cedar Trail/Clover Trail. Atom ships only in BGA packaging today.

The ONLY reference to Broadwell in the mainstream space is if - and ONLY if - devices supplant desktops as the mainstream; as it is, the earliest THAT would happen is late second half of 2015 (according to the leaked roadmap from Intel RoC).

Broadwell is, in fact, based on Haswell; however, it is NOT the planned *desktop* version of the technology! The desktop version is Lynx Point, and it will succeed Ivy Bridge (a *tock* - remember, Ivy Bridge itself is a *tick*).

There will also be a Haswell-EX (the successor to Sandy Bridge-E, as well as the Xeons based on SB-E), and it will be in LGA2011 or a successor socket thereof.

It is NOT known (or even speculated) whether Lynx Point will require a new socket.

Thank you for posting this. I was tired of responding on various sites, including this one, that the information was misinterpreted and a rumour at best.


Posted

ARM is threatening their tablet/mobile side; ARM does not pose a threat to Intel's desktop/server side.

ARM is pretty efficient, but the trade-off is much lower performance. Intel has high performance at the trade-off of lower energy efficiency.

You have too little knowledge of how things work in business. Maybe you're not even 18, for god's sake. But if you are, please give up.


Posted

I've kiboshed this on two other *tech* sites - now it's Neowin's turn.

They are referring specifically to Broadwell, which is the upcoming replacement to Atom/Cedar Trail/Clover Trail. Atom ships only in BGA packaging today.

The ONLY reference to Broadwell in the mainstream space is if - and ONLY if - devices supplant desktops as the mainstream; as it is, the earliest THAT would happen is late second half of 2015 (according to the leaked roadmap from Intel RoC).

Broadwell is, in fact, based on Haswell; however, it is NOT the planned *desktop* version of the technology! The desktop version is Lynx Point, and it will succeed Ivy Bridge (a *tock* - remember, Ivy Bridge itself is a *tick*).

There will also be a Haswell-EX (the successor to Sandy Bridge-E, as well as the Xeons based on SB-E), and it will be in LGA2011 or a successor socket thereof.

It is NOT known (or even speculated) whether Lynx Point will require a new socket.

Once again, people need to read this reply...

Broadwell is, in fact, based on Haswell; however, it is NOT the planned *desktop* version of the technology! The desktop version is Lynx Point, and it will succeed Ivy Bridge.

It is NOT known (or even speculated) whether Lynx Point will require a new socket.


Posted

You have too little knowledge of how things work in business. Maybe you're not even 18, for god's sake. But if you are, please give up.

Hmm? I have little knowledge...? Let's go back to where you said Intel would happily change to BGA-only and pass the cost savings on to you (which makes ZERO sense from a business perspective):

If you look back, upgrading a processor is not that beneficial anymore, because to achieve real performance gains, you actually have to change everything: from motherboard, to RAM and storage.

Also, I think the cost will go down significantly.

Just lol.


Posted

once again, people need to read this reply ...

Broadwell is, in fact, based on Haswell; however, it is NOT the planned *desktop* version of the technology! The desktop version is Lynx Point, and will succeed Ivy Bridge

It is NOT known (or even speculated) whether Lynx Point will require a new socket.

Lynx Point is the upcoming Intel 8-Series chipset for LGA1150 and not a "desktop version of Haswell". It's just like Ivy Bridge and its respective Intel 7-series "Panther Point" chipset.

From ZDNet:

I have now independent confirmation from a PC building OEM, who declined to be named, along with two motherboard makers, that Intel has briefed them of the switch from LGA to BGA for Broadwell architecture processors, which are expected to make an appearance next year.

http://www.zdnet.com...pus-7000008024/


Posted

And could you imagine 1000+ BGA solder points on a high-wattage processor? Those things *can* get hot. At least laptop lines are designed to run cooler to prevent this, via clock throttling and other methods. At least with LGA you can pop the CPU out, see a bent pin, and say "yeah, my socket is shot"; BGA can fracture from heat changes over time.

I can. I design power supplies with very high power dissipation - certainly more than some puny computer gives off. I've also worked extensively with LED packagers who are forced to use water cooling in even their most modest power systems (industrial processes needing UV lighting). These are guys who bond dies to heatsinks directly and then wirebond straight from chip to their power busses. Cool stuff (but not literally...).

A BGA build that fails was designed with poor CTE matching or insufficient cooling, or has a defect after reflow (which should have been caught in automated X-ray inspection). Period. (And in the event of *failed* cooling, the design can detect this and prevent physical failure.)
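For anyone wondering what "CTE matching" actually means here, below is a rough back-of-envelope strain estimate for an outer solder ball over one idle-to-load temperature swing. Every number in it (CTE values, package size, ball height, temperature delta) is a generic textbook-style assumption, not data for any real Intel package:

```python
# Rough first-order estimate of thermal-cycling shear strain on an outermost BGA ball.
# All values below are assumed "typical" figures, not measurements of a specific part.

cte_package_ppm = 7.0    # package/die-side CTE, ppm per degC (assumed)
cte_board_ppm   = 17.0   # FR-4 motherboard CTE, ppm per degC (assumed)
delta_t_c       = 60.0   # idle-to-load temperature swing, degC (assumed)
dnp_mm          = 20.0   # distance from neutral point to the corner ball, mm (assumed)
ball_height_mm  = 0.4    # solder ball stand-off height, mm (assumed)

# Differential expansion between board and package at the corner ball.
mismatch_mm = (cte_board_ppm - cte_package_ppm) * 1e-6 * delta_t_c * dnp_mm

# First-order shear strain: lateral mismatch divided by ball height.
shear_strain = mismatch_mm / ball_height_mm

print(f"Lateral mismatch per cycle: {mismatch_mm * 1000:.1f} um")   # ~12 um
print(f"Approximate shear strain:   {shear_strain * 100:.1f} %")    # ~3 %
```

A few percent of strain repeated over thousands of heat cycles is the fatigue mechanism the post above worries about; closer CTE matching, underfill, or keeping the temperature swing down is how a sound design keeps it in check.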


Posted

"I design power supplies with very high power dissipation"

So already you're classing intel producing a CPU for the 'average joe' with making an enterprise product.

That's like saying a $400 PC out of bestbuy has the same reliability as a very high end spec enterprise server running mission critical software - they're not even in the same league.

