Samsung to produce faster graphics memory next year

Samsung Electronics next year plans to begin mass production of a new type of graphics memory that both consumes less power and is significantly faster than existing chips.

Called GDDR (Graphics Double Data Rate) 5, the new chips can transfer data at speeds up to 6Gbps, compared to transfer speeds of 3.2Gbps offered by GDDR 4 chips, currently the fastest graphics memory available. The difference is even greater when compared to GDDR 3, which is the most commonly used graphics memory and offers transfer speeds of 1.6Gbps.
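
These figures are per-pin data rates; the total bandwidth a graphics card actually gets out of its memory also depends on the width of the memory bus, which the article does not specify. A rough back-of-the-envelope sketch in Python, assuming an illustrative 256-bit bus (an assumption for illustration, not a figure from the article):

    # Rough conversion from the per-pin data rates quoted above to aggregate
    # memory bandwidth. The 256-bit bus width is an illustrative assumption;
    # the article does not state a bus width.
    per_pin_rates_gbps = {
        "GDDR 3": 1.6,
        "GDDR 4": 3.2,
        "GDDR 5": 6.0,
    }
    bus_width_bits = 256  # assumed for illustration; varies from card to card

    for name, rate in per_pin_rates_gbps.items():
        # bandwidth (GB/s) = per-pin rate (Gbps) x bus width (bits) / 8 bits per byte
        bandwidth_gbs = rate * bus_width_bits / 8
        print(f"{name}: {rate} Gbps per pin -> ~{bandwidth_gbs:.0f} GB/s on a {bus_width_bits}-bit bus")

On those assumptions, GDDR 5 at 6Gbps per pin works out to roughly 192GB/s, against about 102GB/s for GDDR 4 and 51GB/s for GDDR 3.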

View: The full story @ InfoWorld

6 Comments

Yes, but then ATI cripples its fast GDDR4 RAM with a measly 256-bit bus, or less if you buy a 2600/2400 series card. Personally, until they can make cards that run 50fps at stupid resolutions like 1600x1050 with AA and anisotropic filtering at full, it's a waste of time and money (don't get me wrong, I'm not bashing ATI... well, actually I am, but nVidia is just the same). It costs you a boatload of money for the best, and what do you get? I'll tell you what: without AA and aniso filtering they all perform well enough, but kick those two in and hello, where did all my lovely FPS go? It's pretty much like running a K6-2 500 with 4GB of DDR2-800: sure, the RAM isn't the bottleneck, but the processor is (or the GPU, in this case).
New technology should never be slower than previous-generation tech, nor should it use more power. And as for DX10.1 and Shader Model 4.1, how many years are you willing to wait for a program to take advantage of them? One year? Two? Three? By that time there will be a brand new crop of cards out using DX11, Shader Model 5.x and GDDR6 or 7. At the end of the day, if the GPU isn't up to it, it's just another waste of money. :disappointed:
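
The bus-width complaint above comes down to simple arithmetic: at a fixed per-pin rate, aggregate bandwidth scales linearly with bus width, so halving the bus halves the bandwidth. A quick sketch along the same lines as the one above (the 128-bit and 64-bit widths are illustrative assumptions, not the specifications of any particular card):

    # Same arithmetic applied to the bus-width point above: at a fixed per-pin
    # rate, aggregate bandwidth scales linearly with bus width. The widths
    # below are illustrative, not the specs of any one card.
    def aggregate_bandwidth_gbs(per_pin_rate_gbps, bus_width_bits):
        """Aggregate memory bandwidth in GB/s for a given per-pin rate and bus width."""
        return per_pin_rate_gbps * bus_width_bits / 8

    gddr4_rate_gbps = 3.2  # per-pin rate quoted in the article

    for width in (256, 128, 64):
        print(f"GDDR 4 at {gddr4_rate_gbps} Gbps per pin on a {width}-bit bus: "
              f"{aggregate_bandwidth_gbs(gddr4_rate_gbps, width):.1f} GB/s")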

Some people like to bash the new ATI 3800 cards, but the truth is, ATI is further ahead in technology than Nvidia. ATI uses GDDR4 (Nvidia uses GDDR3), and ATI has DirectX 10.1, Shader Model 4.1, a 55nm core, HDMI output... Nvidia has none of that.