Gordon Moore Predicts 15 More Years for His Law

Moore's Law, one of the most widely known laws in the computing world, may not have much longer to live. According to Gordon Moore, co-founder of Intel and best known for his observation that the number of transistors on a chip doubles roughly every 18 to 24 months, fundamental physical limits will prevent engineers from shrinking chips much further within the next 15 years. He made the prediction at the Intel Developer Forum.

"In another decade, decade and half or something, we will hit something that is fundamental," Moore said when asked if there would be an end to his 'law'. But he also pointed out that there always have been fundamental barriers that prevented chip technologies from further advancing. "There really are some fundamental limits. It's been amazing to me how the technologies have been able to keep pushing those out ahead of us. As long as I can remember, the fundamental limits are two, three generations out. So far we have been able to get around them."

View: Full Story on vnunet.com

18 Comments

15 years ago, someone along the line said the exact same thing. Then we developed transistors that were half the size, and then that was cut in half again.

Years ago, a single transistor was huge and cost roughly $5; now you can buy millions for the same price, and we're into nanotechnology.

We will always invent something new, smaller, faster, and better; that's what makes this world great.

We will always invent something new, smaller, faster, and better; that's what makes this world great.

Yeah, but this article is about transistors, not about "something new". I don't think Moore is saying the IT industry will come to a full stop, just that we'll soon enough need "something new, faster, and better" than transistors as their size-reduction curve flattens out.

Moore's law is all about how to squeeze more money out of Joe Consumer. It's marketing buzz, nothing more and nothing less.

1. Release gradually upgraded CPUs
2. ???
3. Profit

Intel is already talking about 32nm and chips with 1.9 billion transistors, so they ought to have the technology to go that far. They could just skip 45nm and go straight to 32nm.

There's nothing stopping AMD and Intel from going berserk with billions of transistors and massive die shrinks that don't follow the usual gradual cadence. They simply never will, because they know it would cost them a ton of money.
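For a rough sense of scale on the 45nm-to-32nm jump mentioned in the comment above, here is a minimal sketch of idealized geometric scaling; the node list and the clean-scaling assumption are illustrative, and real processes shrink less neatly than this:

```python
# Idealized die-shrink arithmetic: if every feature scaled linearly with the
# process node, area per transistor would scale with the square of the node
# ratio. Real processes scale less cleanly, so these are best-case numbers.
nodes_nm = [65, 45, 32]

for old, new in zip(nodes_nm, nodes_nm[1:]):
    area_ratio = (new / old) ** 2
    print(f"{old}nm -> {new}nm: ~{area_ratio:.2f}x area per transistor, "
          f"i.e. ~{1 / area_ratio:.1f}x the transistors in the same die area")

# Each full node step roughly halves transistor area - one Moore's Law
# doubling - so skipping 45nm entirely would mean jumping two generations
# of density (and of process learning) in a single step.
```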

Yes logically they should have left everyone waiting for the last 28 years and just released the Core 2 Quad from the start. I am sure all consumers could have easily lived without a computer for that time. </sarcasm>

Grow up.

backslash said,
Yes logically they should have left everyone waiting for the last 28 years and just released the Core 2 Quad from the start. I am sure all consumers could have easily lived without a computer for that time. </sarcasm>

Grow up.

What's the problem? I think he has a good point. I'll bet Intel/AMD roll stuff out slowly instead of just putting out what they COULD put out - basically, like he said, to milk the public for even more money. I'll bet we aren't seeing the full potential of Intel/AMD; they could release things at least a little quicker than they do, if you ask me. I just think they wait so they can milk the public for as much as they can. Oh well, you can't blame them too much, since most businesses would do the same.

I'm not sure how much of this they're doing, but it's got to be at least a little bit. Do you at least agree with that much?

P.S. About what you said: he didn't mean it that way, as in not releasing a CPU at all. He basically meant that Intel/AMD release stuff slower than they could, just to squeeze more money out of the public.

ThaCrip said,

I'll bet we aren't seeing the full potential of Intel/AMD... I just think they wait so they can milk the public for as much as they can.

I agree with you.

The problem is that this kind of manufacturing is always at the forefront of today's technology, and therefore takes time to develop into a reliable and stable product for the everyday consumer. Rushing these CPUs out would inevitably lead not only to higher market prices but also to an increased risk of malfunctions.

Andre and ThaCrip - you guys are wrong. What you're saying implies that everybody in the computer hardware market is going slow on purpose, consciously deciding to make customers pay more and more with each new generation of hardware. They may be going slower than they theoretically could, but that doesn't make it evil or unethical to move slower than the maximum possible pace of hardware improvement. Even if it were all true as you say and Moore's Law were just a marketing scheme, why isn't Intel making those 32nm CPUs, or even smaller and faster ones it isn't talking about, and selling them to markets other than regular customers like you and me - say NASA or a wealthy research group? If it could profit from such endeavours, it would do so, and it would get a lot of news coverage for it too. Two research groups have already built quantum computers, but note that each built exactly one. They couldn't and wouldn't build more and start selling them to you and me; it's just not possible to do it that way. Everybody knows the amazing possibilities for computer technology. It's simply not cost-efficient or logical to try to go as fast as possible at large scale.

What about video card developers, monitor makers and hard-drive makers? If you think it's only about CPUs, then explain why. Is it harder to advance hard-drive and monitor technology (in terms of size) than CPUs? Not generally, and they're not all equal either. However, CPUs tend to be the main prerequisite for the other technologies to keep improving and reaching the public. Some video cards are already so fast that they are limited by CPUs. Do you see the problem? If video card technology kept racing ahead, it would get so far in front of the CPU market that the extra speed would be useless. In most cases, CPUs are much more complex and require far more testing and planning than almost any other piece of computer hardware. They also have a very lengthy development and adoption curve, for those reasons among others. If you hadn't noticed, most game developers have had a hard time getting used to and developing for PS3 technology, and a big part of that is the CPU in the console. Now, for argument's sake, do you think that IF they could, Sony should release the next-generation CPU and the PS4 tomorrow? How would that make any sense? It's not a matter of whether something is possible; in fact most large companies such as Intel, AMD, Nvidia and Microsoft do look ahead, forecast and plan years in advance, and run internal projects that try out new ideas. They have plenty of radical, and plenty of simply eventual, ideas for smaller CPUs and faster video cards, of course, but they move at whatever pace is deemed best at the present time.

Finally, you might not realize it, but large businesses have one primary goal: profit. It drives the computer market and every company in it to make new technology. Why would they even work on new 32nm CPUs if they knew they would make LESS money than by progressively scaling their CPUs? Large businesses don't make new technology simply to please nerds with a transistor fetish. There are Nobel Prize-winning economists and mathematicians who have spent their lives creating and proving theorems and ideas that help businesses maximize profit, efficiency and quality and excel at their craft, and an integral part of those ideas is always the customer/consumer - their dollar AND their satisfaction. Do you really think these companies are just being dishonest and greedy, and that you know better?

Anyway, I think that's enough to make my point. Don't make silly comments like that in the future unless you know what you're discussing and provide logical evidence for your arguments.

illz55 said,

Andre and ThaCrip - you guys are wrong... Do you really think these companies are just being dishonest and greedy, and that you know better?

Your argument is somewhat vague.

1. If Intel were to provide better chip technology to NASA/CIA/FBI or a research group, Intel would have to sign an NDA-like agreement that prevents both parties from leaking ANY information, since it could be considered a matter of national security (or something more serious, like EARTH security??). So how do you know for sure nothing like that has happened?

2. Moore's Law is about transistors. It's NOT about hard drives, which have mechanical parts, or display technology, which involves optics and other technologies not directly related to transistor shrinkage.

3. You said large businesses have one primary goal: profit. That is very correct. But then why would you ask why they would "work on new 32nm CPUs if they knew they would make LESS money than by progressively scaling their CPUs"?
Well, you said it yourself: if they don't keep progressing, they will earn LESS money.

4. Large businesses don't make new technology simply to please nerds. True. But maximizing profit already implies managing, or even restricting, technological progress. If you have a bunch of technologies you want to sell or contribute to human well-being (let's pretend they're that great here), do you sell them ALL at once, or do you hold them back and sell them one at a time?

Nobody is saying these businesses are dishonest. But they are greedy, as all businessmen are; that's the plain-English translation of "maximize profit". Greedy businessmen decide how technology progresses: if they foresee more profit in slowing down technical progress, they will undoubtedly do that. You won't? Then you'll be out of business very soon.

ThaCrip said,
He basically meant that Intel/AMD release stuff slower than they could, just to squeeze more money out of the public.

There *is* fierce market competition, and if AMD is slower than Intel on something and falls behind on the performance curve, it *does* lose market share, as it has recently. I doubt this is anything more than another crazy conspiracy theory. They obviously don't have a "super sekret deal" to do it like this, and they would need one for this strategy to work at all. Come back when you at least have a few observations to back that theory up.

Richteralan said,
If they foresee more profit in slowing down technical progress, they will undoubtedly do that.

Right. But unless a company has an absolute and unassailable monopoly on its product lines, there is some serious risk in pursuing that particular strategy – as Intel found out a couple of years ago when it decided that 64-bit processor technology wasn't necessary or justified and continued cranking out 32-bit processors while AMD forged ahead with 64-bit.

The result? AMD took a huge bite out of Intel's market share – which in turn drove Intel into massive overdrive churning out newer and faster processors like the Core 2 Duo and Quad lines.

Which, BTW, makes it so funny that people are now accusing these companies of instituting artificial slowdowns. What the hell! These things are coming so fast that you can't even buy a state-of-the-art processor any more, because a newer, faster one is always on the verge of release before you can get the current one!

Slow indeed!

"In another decade, decade and half or something, we will hit something that is fundamental,"

So... in some time frame... something will happen... if not, something else will happen instead... or, for that matter, something may not happen.

Hope that clears up the mud.