Recommended Posts

Transistors inside new Intel CPUs unveiled last week are hundreds of times thinner than a human hair, thanks to a 22-nanometer manufacturing process that the company says "fuels Moore's Law for years to come."

Not everyone agrees.

Theoretical physicist Michio Kaku believes instead that an end to Moore's famous theory is -- at last -- in sight.

"In about 10 years or so, we will see the collapse of Moore's Law," said Kaku, professor of theoretical physics at City University of New York (CUNY), in a videotaped interview on BigThink.com.

"In fact, already we see a slowing down of Moore's Law. Computing power simply cannot maintain this rapid, exponential rise using standard silicon technology."

Is it possible? Could the end really be in sight for Moore's Law?

In 1965, an article in Electronics magazine by Gordon Moore, the future co-founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? In fact, nowhere in the article did Moore actually spell out that famous declaration, nor does the word "law" appear in the article at all.

'In about 10 years or so, we will see the collapse of Moore's Law.'

- Michio Kaku, professor of theoretical physics

Yet the idea has proved remarkably resilient over time, entering the public zeitgeist and latching on like a tick on a dog -- or maybe a stubborn computer virus you just can't eradicate.

Moore's law is a rule of thumb in the history of computing hardware whereby the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years.

Nowhere does it say speeds will double, just that the count of "inexpensive" transistors will double. That processing speed roughly correlates with transistor count is a consequence of the law, not the law itself.

The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper. The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965 and predicted that the trend would continue "for at least ten years".
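For a sense of scale, that doubling rule is easy to sketch numerically. A minimal illustration in Python; the starting count and time span below are picked for convenience, not taken from Moore's paper:

```python
def transistor_count(initial, years, doubling_period=2.0):
    """Project a transistor count under the popular "doubles every
    two years" formulation of Moore's law."""
    return initial * 2 ** (years / doubling_period)

# Illustrative: starting from the ~2,300 transistors of the Intel 4004
# (1971) and doubling every 2 years for 40 years:
print(round(transistor_count(2_300, 40)))  # ~2.4 billion
```

Twenty doublings from a few thousand transistors lands in the billions, which is the right order of magnitude for modern chips.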

Idiots such as Kaku have extended it to today. That anyone would consider comments made during the transition from the vacuum tube to the integrated circuit as some sort of binding "law" is absurd, egocentric and ....

If Michio Kaku said it, then I believe it. Not like we couldn't see it coming.

Idiots such as Kaku . . .

Your theory may be correct, but calling him an idiot? He's a very smart man. Surely smarter than both of us.

If Michio Kaku said it, then I believe it. Not like we couldn't see it coming.

Your theory may be correct, but calling him an idiot? He's a very smart man. Surely smarter than both of us.

Even considering he is one of the few who have a way of translating highly technical concepts so the layperson can understand them, he has been given more credence than he should have been entitled to.

For him to act, unabashedly, as if he were the next Carl Sagan... yeah, I'd say he's an idiot.

Perhaps I should just say "over-extended"?

Over-extended I can agree with; he's just trying to get a taste of everything. Not an idiot though, not by a long shot.

The key part is "...using standard silicon technology."

Silicon replacements are already in the lab, including different flavors of nano-carbon, memristors, and molecular computing. Nano-carbon may be here a lot sooner than many people think, what with South Korea, China, Japan, etc. in the hunt.

This has been well-known for, well, ever. The Pentium 4 hit 3.2 GHz in 2004; Intel's latest CPU is now 3.5 GHz. Moore's law stopped yielding faster clock speeds a while ago. Since the P4, it's yielding more transistors on a chip hence more similarly-clocked cores. This is great for parallelizable computations, but not everything is perfectly parallelizable. Running things in parallel presents unavoidable overhead that often makes it not even worth it. Especially in video games, where frame n+1 is a function of frame n, so it's impossible to compute both in parallel; and each frame must be computed very quickly (< 1/30th of a second), so the overhead of spreading the job across multiple cores is often greater than the potential gains. Programmers are only starting to figure out how to program for truly parallel architectures, and it's unclear how that approach will evolve.
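Those diminishing returns can be put in numbers with Amdahl's law; the 80% figure below is purely illustrative:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: best-case speedup when only part of a workload
    can be spread across cores (this ignores coordination overhead,
    which only makes things worse)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If 80% of a frame's work parallelizes, the serial 20% caps the
# speedup at 5x no matter how many cores you throw at it:
for n in (2, 4, 8, 1_000_000):
    print(f"{n} cores: {amdahl_speedup(0.8, n):.2f}x")
```

Real-world scaling is worse still, since the synchronization overhead mentioned above tends to grow with the number of cores.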

There might be replacements for silicon, but will they be as flexible and inexpensive, and will they really allow the same kind of exponential growth we've had for 50 years? Only time will tell.

Since the P4, it's yielding more transistors on a chip hence more similarly-clocked cores

Actually, if I remember correctly, Moore's law was based around transistors on a chip doubling every 15-20 months-ish. I don't think clock-speed had ever had anything to do with it.

You know, I was thinking about this the other day, and it raised some interesting questions. I've heard some gloom and doom scenarios, but I don't think it's really a bad thing... Not good, but I don't see it hurting the industry too much. Maybe I'll write an editorial...

Michio Kaku is our time's Einstein.

Well, actually I think that title belongs to Dr Stephen Hawking. But Kaku, and that black guy with the hyphenated last name are great - and crazy smart. Anytime I see either of them on TV I stop and watch it.

Regression, why do you say he has been "given more credence than he should have been entitled to"?

Yes, he likes to get his face on TV a lot more these days, but is he not highly regarded in a field full of really smart people?

astropheed, you're right, it says nothing about clocks and cores. I think he was simply mentioning how deviations from the "shove more in there to make it faster" concept seemed to be the beginning of the end for Moore's Law, but it has always held its ground.

With the 3.2 Prescott (I think) the chips were getting too hot, so something had to be done -- enter multiple cores. At least I think he was alluding to that :)

Well, actually I think that title belongs to Dr Stephen Hawking.

His stock went way down when the Hawking Paradox was resolved, and not really in his favor in that he had to admit that his 30 years of theories about information loss in black holes had been wrong - the information isn't lost.

His stock went way down when the Hawking Paradox was resolved, and not really in his favor in that he had to admit that his 30 years of theories about information loss in black holes had been wrong - the information isn't lost.

Didn't that have something to do with Hawking radiation and string theory? I drank a lot of beer in college, so I'm sorry if I just made an ass out of myself.

I guess I always admired him and didn't even realize the credibility at stake. I always thought it was more like "well, nobody's perfect, Stevie - don't sweat it" LOL

He's more comparable to Sagan.

Neither; he's made claims with no scientific backing, and his "shows" on the Science channel are more sci-fi than science.

And neither is Hawking, another overrated scientist, making all kinds of straight-up stupid comments.

Our generation's Einstein will be none of the talking heads. It will be, like Einstein, someone quietly doing his thing, and only later will he be classified at Einstein's level. That anyone can claim any living scientist is equal to Einstein is the height of arrogance.

Neither; he's made claims with no scientific backing, and his "shows" on the Science channel are more sci-fi than science.

And neither is Hawking, another overrated scientist, making all kinds of straight-up stupid comments.

Our generation's Einstein will be none of the talking heads. It will be, like Einstein, someone quietly doing his thing, and only later will he be classified at Einstein's level. That anyone can claim any living scientist is equal to Einstein is the height of arrogance.

Oh deary me...

Neither; he's made claims with no scientific backing, and his "shows" on the Science channel are more sci-fi than science.

And neither is Hawking, another overrated scientist, making all kinds of straight-up stupid comments.

Our generation's Einstein will be none of the talking heads. It will be, like Einstein, someone quietly doing his thing, and only later will he be classified at Einstein's level. That anyone can claim any living scientist is equal to Einstein is the height of arrogance.

Einstein quietly doing his thing? Why talk about stuff as if you know what you're on about? Einstein was a celebrity in his time. He was not quiet. He had a lot to say about the building of the atomic bomb, and he was constantly making public pronouncements.

This is pretty obvious if you understand what Moore's Law means. "The number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years." Well, there's a hard limit: you can't make a transistor smaller than an atom.
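Back-of-the-envelope, and assuming density doubling means linear feature sizes shrink by √2 every two years (and taking ~0.2 nm as a silicon atom's rough diameter -- both assumptions mine), the 22 nm process puts that limit only a few decades out:

```python
import math

def years_to_limit(start_nm, limit_nm, doubling_period=2.0):
    """Years until feature size hits a physical floor, assuming
    transistor density doubles every period, i.e. linear dimensions
    shrink by sqrt(2) per period."""
    shrink_periods = math.log(start_nm / limit_nm, math.sqrt(2))
    return shrink_periods * doubling_period

# Illustrative: from the 22 nm node down to ~0.2 nm (about one
# silicon atom across):
print(round(years_to_limit(22, 0.2)))  # roughly 27 years
```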

This is pretty obvious if you understand what Moore's Law means. "The number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years." Well, there's a hard limit: you can't make a transistor smaller than an atom.

I don't think that's exactly true. The atom itself is not a 'hard limit' to anything, really. There are certainly levels of structure below atoms, though that area gets very weird and complex. Whether we can use it is another matter, but I don't think it's reasonable to suggest the atom is a point of finality.

I am no expert and can't prove it, but I get the feeling Michio Kaku is just a rehearsed physicist. I think he wants to come off as a sociable scientist who can give clear explanations to a layman, when in reality it's just an act, one that doesn't require him to actually know anything.

Further, in any show I've seen him in, he is simply explaining someone else's theory. Even if he is smart, nothing he says is his own idea.

And we have reached peak oil. The optimistic economist will always trump the pessimistic scientist. At least this has been the case for the past century or so.

Actually, if I remember correctly, Moore's law was based around transistors on a chip doubling every 15-20 months-ish. I don't think clock-speed had ever had anything to do with it.

This! I don't understand why everyone always thinks that Moore's Law is about clock speed.

This topic is now closed to further replies.