
GDDR vs DDR: What's the Difference?

What's the difference between the GDDR RAM on graphics cards and the DDR RAM on the motherboard? If GDDR has advanced up to 4 and soon 5, how come we don't use that on our motherboards, and instead use DDR, which has only advanced to the less popular 3 and more common 2?

I know that GDDR stands for Graphics Double Data Rate and DDR simply for Double Data Rate, but why is it GDDR and not DDR?

I remember some old cards being labeled as having DDR2 and such, but then all of a sudden it's GDDR?

Might be a dumb question, but honestly I'm confused here.
 
Is this just a ridiculous question to ask, or does no one know?
 
It's very simple: we don't use that power at all.
 
They use GDDR instead of DDR for graphics to help distinguish the two. Other than that, there are no differences.
 
It's very simple: we don't use that power at all.

Wait, what do you mean? Could you clarify? Are you saying that GDDR uses more volts (power), so we don't use it? If that's what you're saying, then that's not true, because GDDR uses fewer volts than DDR.
 
They use GDDR instead of DDR for graphics to help distinguish the two. Other than that, there are no differences.

If that's the case, then why don't we use 1GB of GDDR4 RAM instead of 1GB of DDR2 or 3?
 
If that's the case, then why don't we use 1GB of GDDR4 RAM instead of 1GB of DDR2 or 3?

My guess would be because video cards use more bandwidth.
 
GDDR "G" = Graphics I believe...
-Higher Clock Rates
-Different voltage requirements
 
That, and right now that much bandwidth on a CPU is wasted.
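
To put rough numbers on the bandwidth point (the clocks and bus widths below are illustrative assumptions, not figures from this thread), peak bandwidth is just transfers per second times bus width in bytes:

# Rough peak-bandwidth comparison (Python; example clocks and bus widths assumed).
def peak_bandwidth_gbs(mega_transfers_per_sec, bus_width_bits):
    """Peak bandwidth in GB/s = transfers/s * bus width in bytes."""
    return mega_transfers_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

# Dual-channel DDR2-800 system memory: 800 MT/s on a combined 128-bit bus.
print(peak_bandwidth_gbs(800, 128))    # ~12.8 GB/s

# A GDDR3 card at 2000 MT/s effective on a 256-bit bus.
print(peak_bandwidth_gbs(2000, 256))   # ~64.0 GB/s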
 
The 'G' in GDDR represents something to do with the IC's termination voltage control.

The function of that controller allows more stable operation at high speeds, and is not compatible with motherboard (cpu) memory controllers.

- what I heard in a casual conversation with a friend who works in the vga business
 
My guess would be because video cards use more bandwidth.

Actually, I don't believe that to be the case.

I was reading another forum (apparently someone asked roughly the same question as me) and someone brought up a very good point. If you go out and buy a graphics card with GDDR4 for $200, you can figure the RAM accounts for at least half of that price, so about $100 for the RAM on the card. Now if you go to buy DDR3, it also costs about $100 (at the cheapest, ranging up to $500), yet it's essentially worse than GDDR4, so why is this? A scam perhaps, I have no idea.
 
The 'G' in GDDR represents something to do with the IC's termination voltage control.

The function of that controller allows more stable operation at high speeds, and is not compatible with motherboard (cpu) memory controllers.

- what I heard in a casual conversation with a friend who works in the vga business

Well, isn't a graphics card like a small computer itself? You have your GPU and RAM just like you have your CPU and RAM on your motherboard; I thought they both worked rather similarly.
 
Hmmm.
I guess it could be like, DDR1 is incompatible with DDR2, DDR2 with DDR3, etc.
Maybe there are certain requirements that don't allow motherboards to work with GDDR(?)
 
This just in from the Wiki:
"Using "DDR2" to refer to GDDR-2 is a colloquial misnomer. In particular, the performance-enhancing doubling of the I/O clock rate is missing. It had severe overheating issues due to the nominal DDR voltages. ATI has since designed the GDDR technology further into GDDR3, which is more true to the DDR2 specifications, though with several additions suited for graphics cards."
 
This just in from the Wiki:
"Using "DDR2" to refer to GDDR-2 is a colloquial misnomer. In particular, the performance-enhancing doubling of the I/O clock rate is missing. It had severe overheating issues due to the nominal DDR voltages. ATI has since designed the GDDR technology further into GDDR3, which is more true to the DDR2 specifications, though with several additions suited for graphics cards."

OK, so GDDR is better suited for the graphics card, but that in itself makes it better than DDR, since graphics cards are more sensitive, plus it's way cheaper. I think they could move the technology over but don't; then again, I'm probably way wrong.:ohwell:
 
We had a thread about this a while ago.

Summary: GDDR *used* to be different. Lots of places got it confused and still do, so even though DDR RAM is what's used on video cards now, many places still (mistakenly) call it GDDR.

The reason DDR4 isn't out on motherboards is that it takes a few years to design systems that work with it - you don't think it takes six months to redesign hardware, do you? Memory controllers are complex.
The reason we don't see much DDR4, and video cards are stuck at DDR3, is that they ran out. There simply isn't enough DDR4 to go around for all the video cards.
 
We had a thread about this a while ago.

Summary: GDDR *used* to be different. Lots of places got it confused and still do, so even though DDR RAM is what's used on video cards now, many places still (mistakenly) call it GDDR.

The reason DDR4 isn't out on motherboards is that it takes a few years to design systems that work with it - you don't think it takes six months to redesign hardware, do you? Memory controllers are complex.
The reason we don't see much DDR4, and video cards are stuck at DDR3, is that they ran out. There simply isn't enough DDR4 to go around for all the video cards.

My bad, I didn't know a thread like this had been posted here before.:banghead:

Thanks for the info though. Clears things up.
 
Ha... Read this line

"Microsoft's Xbox 360 is also shipped with 512 MiB of GDDR3 memory, and is helping to pioneer the use of this memory as standard system memory rather than only video memory."
 
Without going into much detail, there are some minor differences between GDDRx and DDRx, mainly to increase the bandwidth of the memory. Some graphics cards use DDRx as opposed to GDDRx to save money (yes, the more generic modules are cheaper). Generally, GDDRx is one step ahead of the motherboard market (for example, GDDR4 is becoming more commonplace in high-end GPUs, whilst DDR3 is becoming more commonplace amongst high-end motherboards).
 
How I view it is that since the memory is built onto the PCB, video cards get the benefit of newer memory tech with every GDDR revision, because there don't have to be set standards for slot technology, resistances, or the voltage fluctuations that happen over time with contacts such as those in DIMMs. Because GDDR is soldered to the PCB itself, the contact is better, allowing easier implementation and more freedom to tune the GDDR to the application (GPUs).

Like was said earlier, it takes longer to implement such technologies in the standard CPU/motherboard arena... DIMMs don't help... if people had been able to use a standard set of memory soldered to the motherboard itself, with a different standard of memory controller hub, things would be more even between video cards and the rest of the PC build. Granted, this is my two cents, and it could be worth even less than that, but that's how I've viewed it over the years.

:toast:
 
DDR1 and GDDR1 are pretty much the same, except that system-memory DDR1 chips have 8-bit-wide I/O while GDDR1 chips are 16 bits wide. Basically, GDDR1 just allows slightly higher bandwidth than DDR1.
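
A quick sketch of what that I/O width means per chip (assuming the same transfer rate for both, purely for illustration; real parts didn't necessarily share clocks):

# Per-chip peak bandwidth from I/O width alone (Python; equal clocks assumed).
def chip_bandwidth_mbs(mega_transfers_per_sec, io_width_bits):
    """Per-chip peak bandwidth in MB/s = transfers/s * I/O width in bytes."""
    return mega_transfers_per_sec * (io_width_bits / 8)

print(chip_bandwidth_mbs(400, 8))    # DDR1 chip, 8-bit I/O  -> 400.0 MB/s
print(chip_bandwidth_mbs(400, 16))   # GDDR1 chip, 16-bit I/O -> 800.0 MB/s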

But, on the other hand:
DDR2 vs. GDDR2
DDR3 vs. GDDR3/GDDR4

The difference?
Basically, they are all completely different DRAM architectures. GDDR2 has nothing to do with DDR2; GDDR2 is a bit like a half-breed of DDR1 and DDR2, with features from both. The real GDDR2 hasn't been used on video cards for several years now; the "GDDR2" you see these days is merely the standard DDR2 we all love. GDDR3 is sort of an enhanced DDR2 - it certainly isn't anything like DDR3. And GDDR4 is just a tweaked GDDR3.

You ask why we don't see GDDR3 used as system memory, since it's clearly superior to DDR2 when it comes to bandwidth? That's because GDDR3 is quite a complex chip with a nasty I/O architecture: GDDR3/4 chips are 32 bits wide and DDR2 chips are 8 bits wide. This is a problem because the number of signal traces that can sensibly be built into a desktop motherboard is very limited - by cost, that is. 128 bits is complex (expensive) enough for motherboards in the desktop price range. So with a motherboard that has just a 128-bit bus, you could only access 4 GDDR3/4 chips (128 bits / 32 bits) at a time. That would seriously limit the RAM capacity of the system, as GDDR3 densities are what they are - 256 MB per chip, tops; usually it's just 128 MB per chip. This could be countered by making boards with a huge array of DIMM slots, but that would be even more expensive.
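
The arithmetic in that last part, spelled out (the bus and chip widths and the GDDR3 density are the figures quoted above; the DDR2 per-chip density is my own assumption for comparison):

# Chips reachable on a 128-bit bus and the resulting capacity (Python sketch).
bus_width_bits   = 128   # what a desktop board can affordably route
gddr3_chip_width = 32    # per-chip I/O width of GDDR3/GDDR4
ddr2_chip_width  = 8     # per-chip I/O width of DDR2

gddr3_chips = bus_width_bits // gddr3_chip_width   # 4 chips addressable at a time
ddr2_chips  = bus_width_bits // ddr2_chip_width    # 16 chips addressable at a time

print(gddr3_chips * 256)   # 1024 MB max with 256 MB GDDR3 chips (top density quoted above)
print(ddr2_chips * 128)    # 2048 MB with assumed 128 MB DDR2 chips, before adding ranks or slots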
 
GDDR = "Graphics Double Data Rate".
The standard is different in clock speed, bandwidth, and power-management capabilities.
 