GDDR4 at the door

Zealot

GDDR4 Knocking the Door

New Memory for Graphics to Emerge in Months
by Anton Shilov
05/05/2004 | 12:28 PM

Sources with knowledge of the matter indicate that a new memory technology is in the works. Apparently, the technology will come to life by the end of the year and may be used commercially after that as a replacement for GDDR3.

GDDR3 memory is expected to be widely used on high-end and mainstream graphics cards this year. Both NVIDIA and ATI are now beginning to use GDDR3 with their latest graphics processors, the GeForce 6800 Ultra and the RADEON X800 series.

GDDR3 evolves from GDDR2, but sports some pretty important differences. Firstly, GDDR3 makes use of a single-ended, unidirectional strobe that separates reads and writes; GDDR2, by contrast, uses differential bi-directional strobes. Secondly, GDDR3 utilizes a “pseudo-open drain” interface technique that is based on voltage rather than current, which was done so that graphics chips can remain compatible with DDR, GDDR2 and GDDR3. Like GDDR2, the GDDR3 interface uses 1.8V SSTL signaling. Such memory is generally better suited for the point-to-point links used on graphics cards and allows GPU developers to reach new performance and feature heights with their products.

GDDR4 memory builds upon the GDDR3 standard, just as the latter evolved from the GDDR2 specification, so the technology can be expected to retain the same point-to-point nature. GDDR4 is said to bring no revolutions, only targeted tweaks to bolster the clock speeds of the DRAMs used on graphics cards.

The current goals for GDDR4 are to complete the standardization process by the end of 2004 and to push frequencies towards the 1.40GHz (2.80GHz effective) level. Lower clock speeds, e.g. 1.00GHz (2.00GHz effective), are achievable with GDDR3 technology, according to Samsung Electronics, which plans to debut such memory by the end of the year.
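To put those frequencies in perspective, here is a minimal sketch of the peak-bandwidth arithmetic behind DDR-type memory; the 256-bit bus width is an assumption (typical of the 2004 high-end cards named above), not a figure from the article.

# Minimal sketch of peak-bandwidth arithmetic for DDR-type graphics memory.
# The 256-bit bus width is an assumption, not stated in the article.

def peak_bandwidth_gbs(clock_ghz: float, bus_width_bits: int = 256) -> float:
    """Peak memory bandwidth in GB/s.

    DDR memory transfers data on both clock edges, so the effective
    data rate is twice the base clock.
    """
    effective_rate_gts = clock_ghz * 2        # transfers per second (GT/s)
    bytes_per_transfer = bus_width_bits / 8   # 256-bit bus -> 32 bytes
    return effective_rate_gts * bytes_per_transfer

print(peak_bandwidth_gbs(1.00))  # GDDR3 at 1.00 GHz -> 64.0 GB/s
print(peak_bandwidth_gbs(1.40))  # GDDR4 target 1.40 GHz -> 89.6 GB/s

On that assumed 256-bit bus, the jump from 1.00GHz to the 1.40GHz target would raise peak bandwidth from 64 GB/s to roughly 89.6 GB/s.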

The development process is led by JEDEC and leading graphics companies, such as ATI Technologies and NVIDIA Corp.

Source: http://www.xbitlabs.com/news/memory/display/20040505122429.html

Wow! Already? :eek:
 
It really is incredible how technology keeps evolving. It is almost an exponential evolution. At this rate I will still be young when "Real Lifelike Graphics" become possible :D
 