Samsung accelerates graphics memory to 2000 MHz

Posted by Wolfgang Gruener

Seoul (Korea) - The gigahertz race may be over in the microprocessor space, but it is alive and kicking in the memory segment. Samsung today announced a 66% speed boost for high-end graphics memory, which is expected to debut in enthusiast graphics cards later this year.

The new GDDR4 memory, a memory type that so far has only been used in ATI's Radeon X1950 graphics cards, bumps the clock speed to 2000 MHz, up 66% from the 1100 MHz and 1200 MHz memory in today's high-end cards. Accordingly, the per-pin data rate of the memory climbs from today's 2.4 Gb/s to 4 Gb/s.
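As a back-of-the-envelope check, the quoted data rates follow directly from the clock: GDDR4 transfers two bits per pin per clock cycle. A quick sketch of the arithmetic (the 256-bit bus width used for the total-bandwidth figure is an assumption typical of high-end cards of this era, not a number from Samsung):

```python
# Back-of-the-envelope GDDR4 bandwidth arithmetic.
# Clock figures come from the article; the 256-bit bus is an assumed value.

def per_pin_rate_gbps(clock_mhz, bits_per_clock=2):
    """GDDR4 is double data rate: two bits transfer per pin per clock."""
    return clock_mhz * bits_per_clock / 1000  # MHz -> Gb/s

def card_bandwidth_gbs(clock_mhz, bus_width_bits=256):
    """Total bandwidth in GB/s across an assumed 256-bit memory bus."""
    return per_pin_rate_gbps(clock_mhz) * bus_width_bits / 8  # Gb -> GB

print(per_pin_rate_gbps(1200))   # today's 1200 MHz parts -> 2.4 Gb/s per pin
print(per_pin_rate_gbps(2000))   # new 2000 MHz parts     -> 4.0 Gb/s per pin
print(card_bandwidth_gbs(2000))  # assumed 256-bit bus    -> 128.0 GB/s total
```

At the assumed 256-bit bus width, the move from 2.4 Gb/s to 4 Gb/s per pin would lift total card bandwidth from 76.8 GB/s to 128 GB/s.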

Samsung said that it has shipped samples of the memory, which is manufactured on an 80 nm process in a 512 Mb (16 Mb x32) density, to manufacturers and expects the memory to appear on commercially available graphics cards later this year.
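The 512 Mb (16 Mb x32) organization also hints at how the chips would populate a card. A rough sketch, assuming the same 256-bit bus configuration common on high-end cards at the time (the 8-chip layout is an inference, not a Samsung figure):

```python
# How the 512 Mb (16 Mb x32) density could translate into card capacity.
# The 256-bit bus, and hence the 8-chip layout, is an assumption.

CHIP_DENSITY_MBIT = 512   # per-chip density from the article
CHIP_WIDTH_BITS = 32      # x32 organization from the article
ASSUMED_BUS_WIDTH = 256   # typical high-end bus width, assumed

chips = ASSUMED_BUS_WIDTH // CHIP_WIDTH_BITS  # chips needed to fill the bus
capacity_mb = chips * CHIP_DENSITY_MBIT // 8  # megabits -> megabytes

print(chips)        # 8 chips on the assumed bus
print(capacity_mb)  # 512 MB of graphics memory
```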

"Our new GDDR4 memory will add even more zip in video applications, making gaming, computer-aided design and video editing faster than ever before," said Mueez Deen, marketing director of the graphics memory division of Samsung Semiconductor. "This will enable ultra-smooth movements in animation and make games incredibly realistic, resulting in a truly immersive user experience," he added.

The quick progress of GDDR4 memory raises the question of how far the memory will scale and when the next generation of graphics memory will knock on our doors. According to Deen, GDDR4 could climb to clock speeds just under 2.5 GHz; GDDR5 will take over at that clock speed, with 5 Gb/s bandwidth per pin, in late 2008 or early 2009.

With Samsung pushing GDDR4 memory aggressively into the market, it becomes less likely that Rambus' XDR memory will make it into graphics cards down the road. Deen conceded that XDR provides superior performance in environments where pin counts are limited, and that XDR is no more expensive to produce than GDDR4. However, he noted that XDR technology has to be licensed from Rambus and is therefore available from only one source, which could be a major reason why XDR will not be widely adopted.