Nvidia Creates First Graphics Chip with Embedded DRAM

Taiwan Semiconductor Manufacturing Company (TSMC) and Nvidia Corporation have announced that TSMC has successfully produced a fully-functional sample of a graphics chip for handheld devices with embedded DRAM. "Nvidia is pleased to have collaborated with TSMC on their new 65nm embedded DRAM process, which has proven to be an excellent platform for our latest handheld GPU product. The efficiencies of the embedded DRAM process allowed us to raise the bar for features found in mainstream cell phones," said Michael Rayfield, general manager of the handheld division of Nvidia Corp.

Embedded Dynamic Random Access Memory (eDRAM) allows a larger amount of memory to be integrated into a chip than the Static Random Access Memory (SRAM) currently used. More on-chip memory means higher bandwidth and faster processing, which translates into more features for mainstream handhelds and opens the door to solutions for applications that demand higher graphics performance. TSMC's 65nm embedded DRAM process is built with up to 10 metal layers using copper low-k interconnect and nickel silicide transistor interconnect. Because the logic and memory functions are built on a single device, the form factor is smaller and system reliability is enhanced. The process features a cell size less than a quarter of its SRAM counterpart and macro densities ranging from 4Mb to 256Mb. It also uses a low thermal budget module and is compatible with all 65nm logic libraries, making it an efficient process for IP reuse. The embedded DRAM design also features improved retention time and special power-saving options.
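To put the density claim in perspective, here is a rough back-of-the-envelope sketch of the die-area savings implied by a bit cell "less than a quarter" the size of SRAM. The absolute cell-area figure used below is an illustrative placeholder, not a TSMC specification:

```python
# Rough area comparison for the macro densities the article mentions,
# assuming an eDRAM cell a quarter the area of an SRAM cell.
# SRAM_CELL_UM2 is a hypothetical placeholder value, not a real TSMC figure.

SRAM_CELL_UM2 = 0.5                    # assumed 65nm SRAM bit-cell area (um^2)
EDRAM_CELL_UM2 = SRAM_CELL_UM2 / 4     # "less than a quarter" per the article

def macro_area_mm2(megabits: int, cell_um2: float) -> float:
    """Raw bit-cell area for a memory macro, ignoring peripheral circuitry."""
    bits = megabits * 1024 * 1024
    return bits * cell_um2 / 1e6       # convert um^2 to mm^2

for mb in (4, 64, 256):                # macro densities from the article
    sram = macro_area_mm2(mb, SRAM_CELL_UM2)
    edram = macro_area_mm2(mb, EDRAM_CELL_UM2)
    print(f"{mb:3d} Mb: SRAM ~{sram:.1f} mm^2, eDRAM ~{edram:.1f} mm^2")
```

Whatever the real cell sizes, the 4:1 ratio is what makes a 10 MiB-class frame buffer practical on a handheld part where the equivalent SRAM would dominate the die.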

News source: Xbit Laboratories


pretty quick on the reply :p

I was still editing my comment, but yeah it is a little misleading.

There are a few videos on the net showing the new Nvidia phone GPUs in action; it looks pretty impressive.

Septimus said,
Bit of a misleading title. ATI did it 'first' but this is a first for mobile devices.

Nope, Bitboys was really the first for PC and for mobile devices, even though it was a prototype. ATI acquired them, and later AMD acquired ATI, so the real credit goes to Bitboys.

SHS said,
Nope, Bitboys was really the first for PC and for mobile devices, even though it was a prototype. ATI acquired them, and later AMD acquired ATI, so the real credit goes to Bitboys.

Yes, Bitboys did the first prototype, but it was never produced, so ATI made the first chip that was actually produced.
Still, the title is totally misleading and should be changed.

Didn't ATI already do this with the Xenos Xbox 360 GPU?

...the Xbox 360 uses a chip designed by ATI called Xenos. The chip was developed under the names "C1" and "R500".[50] Xenos contains 48 unified shader units, which are capable of both vertex and pixel shading operations. This is in contrast to older graphics processor designs which utilize separate specialized units for these tasks. The GPU package contains two separate silicon dies, each on a 90 nm chip with a clock speed of 500 MHz: the GPU proper, manufactured by TSMC, and a 10 MiB eDRAM daughter die. Thanks to the daughter die, the Xenos can do 4x FSAA, z-buffering, and alpha blending with no appreciable performance penalty on the GPU.[51] The GPU also houses additional capabilities typically separated into a motherboard chipset in PC systems, effectively replacing the northbridge chip. An aluminum heat sink is also implemented to cool the GPU; it is wider and shorter than the CPU heat sink.

I suppose the article does mention "handheld devices", but the title is a little misleading in that it suggests Nvidia made the first GPU chip with eDRAM, which isn't true.

While I'm not familiar with either AMD's or Nvidia's design, it seems the ATI chip uses two separate dies, while the Nvidia chip might have the GPU and memory on one die. The article about the Nvidia chip isn't very clear on whether that's the case, and I might be way off here, but it mentions that the eDRAM would replace the SRAM, which is usually integrated on-die in modern microprocessors.

I think this might be the case; the ATI 360 GPU's eDRAM is off-die but connected with a super-fast link. If the Nvidia GPU has it on-die, it should run faster, I guess, or at least at the same speed as the core clock.

Danrarbc said,
Does the embedded memory on the Nintendo/ATI's Flipper and Hollywood not count?

No, it doesn't, as they use SRAM and not DRAM.