
GDDR3 vs. DDR2

so in gfx cards, is there a significant difference between these two memory types??
 
Yes, mainly DDR2 runs hotter and is only 128bit, not to mention slower. DDR3 all the way.
 
so is regular GDDR really bad?
 
DDR for graphics cards is actually better as it can be 256bit. DDR3 is obviously what anyone should be looking at when choosing a new card.
 
I would go for DDR3 if you are looking to get a new graphics card. It is definitely the fastest out right now, until the X2800 series cards roll out. I think one type of those cards is going to have GDDR4 memory, but I'm not sure which one.. maybe the XT? :)
 
ya well i'm definitely buying a new one, but i was just wondering cuz mine is GDDR, a geforce 6600. it kinda sucks, but hey, i got it for free.....
 
Well hey, can't go wrong there.
 
lol, the 6600 is quite decent. DDR2/GDDR2 isn't very efficient and not very great in terms of performance; GDDR3 is good and powerful, as is GDDR4. Plain DDR is basically great when clocked high, but it generates a lot of heat.
 
Yes, mainly DDR2 runs hotter and is only 128bit, not to mention slower.
It was GDDR2 that was too hot and slow; DDR2, on the other hand, is a good budget solution and it actually consumes less power than GDDR3. GDDR2 has more in common with DDR1 than with DDR2, and in fact, GDDR2 is an older tech compared to plain DDR2, which is the chip of choice for budget GFX today; the exact same chips are used as main system memory on most recent AMD/Intel platforms.
DDR3 all the way.
DDR3 is not the same as GDDR3. GDDR3 is a DDR2 chip modified to be used as high-bandwidth texture buffer memory where latency is not an issue.

And btw, the 64bit/128bit/256bit/512bit memory interface on graphics cards is not a feature of a single memory chip, nor is it limited by the type of chip used (DDR2/GDDR3/etc.). In fact, you could build a 1024bit bus using chips of any given generation. Memory bus width is the product of chip count and chip width.
For example, the X1900XT I had some time ago had 8 Samsung GDDR3 chips (K4J52324QC-BJ11). As you know, the X1900XT has a 256bit wide memory bus. The "32" in the part number is the width of the chip; chip width is basically the number of parallel cell arrays inside the chip, and thus it is the number of bits the chip can send or receive within a single clock cycle.

One "x32" chip forms a 32bit bus - 8 such chips form a 256bit wide bus together (8*32bit=256bit), as long as the GPU's memory controller has a dedicated bus for each of the chips.
 
you can also "stack" mem chips (<for lack of knowing the technical name)

^a great example of this is system memory

you can have 16x 32-bit mem chips form a 256-bit bus, where each pair, instead of being 64 bits total, remains 32 bits (2x 32-bit chips only adding 32 bits to the total bus width). there are many possible configurations, these being only a few of them

i suppose a good analogy for what i'm talking about would be 8 pairs of 32V, 1A batteries, each pair hooked together in parallel, then hooking the pairs up in series

(32V 1A + 32V 1A [in parallel] = 32V 2A) x8 [in series] = 256V 2A (right?) lol :p
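rough sketch of what i mean in Python (the "chips_per_stack" name is just something i made up for chips sharing the same lanes, not an official term):

Code:
# Chips that share the same data lanes ("stacked", like ranks on a DIMM)
# add capacity, not bus width - only independent groups of lanes count.
def bus_width_bits(total_chips, chip_width_bits, chips_per_stack=1):
    independent_groups = total_chips // chips_per_stack
    return independent_groups * chip_width_bits

print(bus_width_bits(8, 32))      # 8 x32 chips, no stacking -> 256 bit
print(bus_width_bits(16, 32, 2))  # 16 x32 chips in pairs -> still 256 bit
print(bus_width_bits(16, 32))     # same 16 chips, each on its own lanes -> 512 bit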


Hahahhaahahah finally Ketxxx gets pwned :nutkick:
Ket got pwned Ket got pwned so bad

heh (don't take this the wrong way dude) but i've noticed him "assuming" things a lot, but it's all good, i know he is just trying to help out and i've seen a lot of that from him ;)

Err isn't that the interface for the CARD and not the GDDR2?
largon did a good job of explaining the answer to that question

but basically no :) it's the interface for all the memory chips on the card combined
 
i love how people ask questions to get answers and other people get carried away with other crap......
 
also, what's the difference between 128-bit and 256-bit?
 
Difference between 128-bit and 256-bit?

256bit has double the bandwidth of 128bit, given the frequency is equal.
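In numbers (peak figures only; the 1400 MHz effective clock below is just a placeholder, not any particular card):

Code:
# Peak bandwidth = (bus width in bytes) * effective data rate.
def peak_gbs(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

clock = 1400  # MHz effective (700 MHz DDR), placeholder figure
print(peak_gbs(128, clock))  # 22.4 GB/s
print(peak_gbs(256, clock))  # 44.8 GB/s - exactly double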
i love how people ask questions to get answers and other people get carried away with other crap......
Well, I would rather read a long thread with correct answers instead of short one with inaccurate ones.
 
Difference between 128-bit and 256-bit?

256bit has double the bandwidth of 128bit, given the frequency is equal.
theoretically ;)
Well, I would rather read a long thread with correct answers instead of short one with inaccurate ones.
cheers :)
 