Intel Claims Breakthrough With 40Gbps Optical Chips

Intel Corp. researchers are a step closer to creating chips that transmit data at high speeds using light instead of electrons, though products based on the technology appear to remain over the horizon. On Wednesday, a team of Intel researchers unveiled a laser modulator made of silicon that is capable of encoding data at speeds of up to 40 gigabits per second (Gbps), a significant speed increase for the company.

The new modulator, which converts electrical data into light, opens the door to high-speed optical interconnects for computers and, when combined with 25 hybrid silicon lasers on a single chip, could be capable of transmitting terabits of data per second, wrote Ansheng Liu, principal engineer at Intel's Corporate Technology Group, in a blog post. Optical interconnects are desirable because fiber optics offer more bandwidth and carry data farther than copper, which is currently used to connect chips and move data inside a computer. Because they use laser light to transmit data, optical interconnects also eliminate the heat created by resistance as electrons pass through a copper trace.
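The terabit figure follows from simple aggregation: if each of the 25 hybrid silicon lasers drives its own 40Gbps modulator lane, the combined raw throughput is 25 × 40 = 1,000 Gbps. A quick back-of-envelope sketch (the lane count and per-lane rate come from the article; treating the lanes as simply additive is an assumption for illustration):

```python
# Sanity check of the article's aggregate-bandwidth claim:
# 25 hybrid silicon lasers, each feeding a 40 Gbps modulator lane.
LANES = 25        # hybrid silicon lasers on a single chip (from the article)
RATE_GBPS = 40    # per-lane modulator speed (from the article)

aggregate_gbps = LANES * RATE_GBPS      # assumes lanes add up with no overhead
aggregate_tbps = aggregate_gbps / 1000  # convert Gbps to Tbps

print(f"{aggregate_gbps} Gbps = {aggregate_tbps} Tbps")  # 1000 Gbps = 1.0 Tbps
```

Real links would lose some of that raw rate to encoding and protocol overhead, so the 1 Tbps number is best read as a ceiling rather than usable throughput.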

News source: PCWorld


10 Comments


Hmm... it's so hard with technology nowadays to keep up. I think I'll just upgrade every 2-4 years, meaning only 2-3 more upgrades until I have myself a laser CPU.

It looks like it, but this CPU will be ready for the mainstream only in 2010-2015. And it should come after Intel releases its 2^5-core (32-core) CPU, as I remember.

All of a sudden my 2.56GHz P4 becomes worthless x)

Intel seem to be far ahead in discovering new technologies, although AMD tend to make better use of existing technology.

BTW, presumably this will mean lower power consumption too, won't it?

The idea of using light instead of electrons has been around for quite some time. I believe the first attempt/initial research was done at MIT, so Intel should not be credited with the discovery. However, Intel does look to be ahead of the pack in leading this next evolutionary transition.

Energy is still energy, no matter how you slice it. I'd be willing to bet that an Optical CPU would still generate enough heat to require some sort of cooling solution. All depends on how it's engineered.