
Samsung Develops GDDR5 Memory at 6Gbps

malware
Samsung Electronics has announced that it has developed the world's fastest graphics memory: a GDDR5 (graphics double-data-rate, version 5) chip that can transfer data at six gigabits per second (Gbps) per pin. Samsung's GDDR5, which will be introduced as 512 Mb (16 Mb x 32) chips, is capable of transmitting moving images and associated data at 24 gigabytes per second (GB/s) per chip. The new Samsung graphics memory operates at 1.5 volts, representing an approximate 20% improvement in power consumption over today's most popular graphics memory, GDDR3. Samples of Samsung's new GDDR5 chip were delivered to major graphics-processor companies last month, and mass production is expected in the first half of 2008. Samsung expects GDDR5 memory chips to become standard in the top-performing segment of the market, capturing more than 50% of the high-end PC graphics market by 2010.
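The quoted figures are consistent with each other, which a quick back-of-the-envelope check confirms (a minimal sketch; the formula is just per-pin rate times interface width divided by eight bits per byte):

```python
def chip_bandwidth_gb_per_s(per_pin_gbps: float, io_width_bits: int) -> float:
    """Peak bandwidth of one memory chip in gigabytes per second."""
    return per_pin_gbps * io_width_bits / 8  # 8 bits per byte

# Samsung's figure: 6 Gbps per pin on a 32-bit (x32) chip interface.
print(chip_bandwidth_gb_per_s(6, 32))  # 24.0 GB/s, matching the article
```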



View at TechPowerUp Main Site
 
Damnit! I am still on DDR1. :eek:
 
That's a tremendous amount of bandwidth.
 
My god...Who wants to bet ATI will be all over this?
 
Damnit! I am still on DDR1. :eek:

Actually you are not - this is graphics memory, and you are using GDDR4 ;) I'm still on 3, and even after my next GPU upgrade I'll be on it.
 
It looks like this will require a powerful GPU in order to take full advantage of the bandwidth.
 
Damnit! I am still on DDR1. :eek:

lol, you are fine, it's GDDR, not DDR :laugh:
And I also think that not every GPU will be able to really take advantage of this memory.
 
My god...Who wants to bet ATI will be all over this?

I doubt it. With AMD struggling on component pricing and Samsung wanting to drive RAM prices up, I can't see anyone rushing to implement this technology at a premium just yet.
 
Can't wait to see how well these will work :)

Think it's possible that someone will use these for an SSD, even though they are meant as graphics memory? 6 Gbps is a hella lot of speed!
 
At last, maybe GDDR2 will now finally drop off the face of the earth. I mean my god, wtf were they thinking with GDDR2? It ran hotter, performed slower even with its massively higher clock speeds, and was only capable of a 128-bit memory bus. I hope whoever had the idea of "improving" GDDR with GDDR2 got fired.
 
I hope whoever had the idea of "improving" GDDR with GDDR2 got fired.

lol, he died when a gddr2 chip exploded and hit him in the eye:roll:
 
Haha, I hope that actually happened :D
 
guess that may be going into the 8950's :)
 
If GDDR(x) is faster and more power-efficient than regular DDR(x), then why doesn't the industry move over to GDDR(x) for system memory? Obviously it's more expensive, but with SO MANY enthusiast mobos and components, surely this is what we need to BREAK the benchmarks.
 
If GDDR(x) is faster and more power-efficient than regular DDR(x), then why doesn't the industry move over to GDDR(x) for system memory? Obviously it's more expensive, but with SO MANY enthusiast mobos and components, surely this is what we need to BREAK the benchmarks.

because the architecture is too different.
 
Holy S***! GDDR5. I can't wait till GDDR6.
wow
 
I think ATI/Nvidia will buy these to use on their flagship graphics cards, at least.
 
You know this will be implemented by ATI ASAP, just like they did with the X1950 XTX's GDDR4 memory to compete with the then-about-to-release Nvidia 8800s :laugh:
 
You know this will be implemented by ATI ASAP, just like they did with the X1950 XTX's GDDR4 memory to compete with the then-about-to-release Nvidia 8800s :laugh:

um, gddr4 was implemented quickly by ati because they created it.
 
How does the jump to each consecutive form of graphics RAM work, then? Is it a smaller process, or different chips, or a different method of addressing the RAM? I kinda understand it with system RAM.
 
How does the jump to each consecutive form of graphics RAM work, then? Is it a smaller process, or different chips, or a different method of addressing the RAM? I kinda understand it with system RAM.

Well, GDDR4 isn't really a big enough improvement over GDDR3 to warrant a new name, but ATI uses weird marketing :shadedshu It's really more like GDDR3.5.
The new name denotes the architecture type.
There was never a GDDR1, because early graphics cards actually used DDR system RAM chips. GDDR2 is specialized DDR2 system memory, and GDDR3 was a completely new architecture. GDDR4 is optimized GDDR3, and GDDR5 is based on DDR3 system memory, reworked to reduce power usage while still allowing high frequencies and throughput.
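One concrete way to see what changes between generations is the per-pin data rate relative to the command clock: GDDR3-style memory is double data rate (2 bits per pin per command-clock cycle), while GDDR5 adds a doubled write clock and effectively moves 4 bits per pin per command-clock cycle. A minimal sketch (the 1.0 GHz and 1.5 GHz clock figures below are illustrative assumptions, not from the thread):

```python
def per_pin_gbps(command_clock_ghz: float, bits_per_cycle: int) -> float:
    """Per-pin data rate in Gbps, given the command clock and the number
    of bits each pin transfers per command-clock cycle."""
    return command_clock_ghz * bits_per_cycle

# GDDR3 (double data rate): assumed 1.0 GHz command clock -> 2 Gbps per pin.
print(per_pin_gbps(1.0, 2))
# GDDR5 (effectively quad rate via the doubled write clock): an assumed
# 1.5 GHz command clock yields the 6 Gbps per pin of Samsung's part.
print(per_pin_gbps(1.5, 4))
```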
 
There is no such term as GDDR per se; the "G" simply denotes "Graphics", and thus makes it easier for people to differentiate between system and graphics memory. If it were me, I'd just use the term I invented to differentiate between the two - VidRAM; it just makes more sense to me.
 
This just gives me the giddies: lower power consumption. Might make for more efficient vid cards with these bad boys. Implement these into the next-gen cards and I think I'll be set for a LOOOOOONG time :respect:
 