
GDDR5 Memory - Under the Hood

HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
In the graphics business, there's no such thing as too much memory bandwidth. Real-time graphics always wants more: more memory, more bandwidth, more processing power. Most graphics cards on the market today use GDDR3 memory, a graphics-card optimized successor to the DDR2 memory common in PC systems (it's mostly unrelated to the DDR3 used in PC system memory).

A couple of years ago, ATI (not yet purchased by AMD) began promoting and using GDDR4, which lowered voltage requirements and increased bandwidth with a number of signaling tweaks (8-bit prefetch scheme, 8-bit burst length). It was used in a number of ATI graphics cards, but it was not picked up by Nvidia and, though it became a JEDEC standard, it never really caught on.

AMD's graphics division is at it again now with GDDR5. Working together with the JEDEC standards body, AMD expects this new memory type to become quite popular and eventually all but replace GDDR3. Though AMD plans to be the first with graphics cards using GDDR5, the planned production by Hynix, Qimonda, and Samsung speaks to the sort of volumes that only come with industry-wide adoption. Let's take a look at the new memory standard and what sets it apart from GDDR3 and GDDR4.

Source: ExtremeTech
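As a rough, back-of-the-envelope illustration of why the prefetch depth matters (the clock figures below are assumed examples, not numbers from the article): the per-pin data rate scales with core clock times prefetch, so an 8n-prefetch part can stream twice the data of a 4n-prefetch part at the same core clock.

Code:
# Rough illustration only: per-pin data rate ~ core clock x prefetch depth.
# The clock values are assumed examples, not figures from the article.

def per_pin_data_rate_gbps(core_clock_mhz, prefetch_bits):
    # Each core-clock cycle fetches 'prefetch_bits' bits per pin internally,
    # which are then streamed out at a proportionally higher I/O rate.
    return core_clock_mhz * prefetch_bits / 1000.0  # Gbit/s per pin

print(per_pin_data_rate_gbps(500, 4))  # GDDR3-style 4n prefetch -> 2.0 Gbps/pin
print(per_pin_data_rate_gbps(500, 8))  # GDDR4/GDDR5-style 8n prefetch -> 4.0 Gbps/pin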
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
But what's the point when only one company uses it, and that too on its 'top-of-the-line' product (HD4870), while another company used GDDR3 across 5 generations of products?
 

spearman914

New Member
Joined
Apr 14, 2008
Messages
3,338 (0.57/day)
Location
Brooklyn, New York 11223
System Name Mine | Dad + Mom
Processor E8500 E0 Wolfdale @ 4.6GHz 1.5V | E2180 M0 Allendale @ 3.0GHz 1.3V
Motherboard Asus Maximus Formula (X48) w/ Rampage BIOS | Asus P5Q Pro (P45)
Cooling Xigmatek Rifle HDT-S1283 w/ SFF21F Fan | Arctic Cooling Freezer 7 Pro
Memory G.Skill Pi Black 2x2GB 1.02GHz CL5 | OCZ Reaper 2x2GB 1.05GHz CL5
Video Card(s) Sapphire 4870X2 2GB 820/1020MHz | Sapphire 4850 1GB 700/1100MHz
Storage WD VR 150GB 10K RPM + WD 500GB 7.2K RPM | WD 200GB 7.2K RPM
Display(s) Acer P243WAID 24" 1920x1200 LCD | Acer V193W 19" 1440x900 LCD
Case Cooler Master HAF 932 Full-Tower | Antec Twelve Hundred Mid-Tower
Audio Device(s) Fatal1ty Xtreme Gamer w/ Z-5500 5.1 | On-Board Audio w/ S-220 2.1
Power Supply PC Power and Cooling 750W Non-Modular | Corsair HX-520W Modular
Software Windows Vista Home Premium X64 | Windows Vista Home Premium X64
Benchmark Scores Not Wasting Time!
But what's the point when only one company uses it, and that too on its 'top-of-the-line' product (HD4870), while another company used GDDR3 across 5 generations of products?

Yea, I know. But really, there's no real difference in gaming between GDDR3, 4, and 5 yet.
 
Joined
May 9, 2006
Messages
2,116 (0.32/day)
System Name Not named
Processor Intel 8700k @ 5Ghz
Motherboard Asus ROG STRIX Z370-E Gaming
Cooling DeepCool Assassin II
Memory 16GB DDR4 Corsair LPX 3000mhz CL15
Video Card(s) Zotac 1080 Ti AMP EXTREME
Storage Samsung 960 PRO 512GB
Display(s) 24" Dell IPS 1920x1200
Case Fractal Design R5
Power Supply Corsair AX760 Watt Fully Modular
Just have to weigh the costs between things. I guess one of the big questions would be whether it's cheaper to go GDDR5 with a 256-bit bus or GDDR3 with a 512-bit bus.
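For what it's worth, the bandwidth side of that trade-off is easy to work out. A minimal sketch, with per-pin data rates that are assumed for illustration rather than taken from any datasheet:

Code:
# Theoretical peak bandwidth = per-pin data rate x bus width / 8 (bits -> bytes).
# The data rates below are assumed examples, purely for illustration.

def peak_bandwidth_gbs(data_rate_gbps_per_pin, bus_width_bits):
    return data_rate_gbps_per_pin * bus_width_bits / 8.0  # GB/s

print(peak_bandwidth_gbs(2.0, 512))  # e.g. 2.0 Gbps GDDR3 on a 512-bit bus -> 128 GB/s
print(peak_bandwidth_gbs(4.0, 256))  # e.g. 4.0 Gbps GDDR5 on a 256-bit bus -> 128 GB/s

Same headline bandwidth either way, so the question really does come down to which combination is cheaper to build.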
 

HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
Have you dudes checked the page? There's more, you know!
 
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL 2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
But what's the point when only one company uses it, and that too on its 'top-of-the-line' product (HD4870), while another company used GDDR3 across 5 generations of products?

Have you read the link?
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.22/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
In the graphics business, there's no such thing as too much memory bandwidth.

Too true, the increased speed doesn't hurt either :p
 
Joined
Nov 14, 2007
Messages
1,773 (0.30/day)
Location
Detroit, MI
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS TUF Gaming X570-Pro Wifi II
Cooling Hyper 212 EVO v2
Memory 2x16GB G.Skill DDR4-4000
Video Card(s) AMD RX 6750 XT
Storage WD Black SN850 2TB, various other SSD's from ages past
Display(s) LG 27GL850 1440@144, AG Neovo EM2701QC 1440@75
Case Zophos EVO Silent by Raijintek
Audio Device(s) HyperX Cloud II Wireless headphones
Power Supply Corsair RM850x
Mouse ProtoArc EM01
Keyboard Razer Blackwidow X Chroma Mercury
Software Windows 11 Pro
That was a pretty good read. In the end they figure GDDR5 to be about as cost-effective as GDDR3, so why not just replace the whole lot of the stuff? It'd be overkill for weaker cards, but they'd probably get better pricing if they went exclusively GDDR5 with Samsung, Qimonda, and Hynix.

Every card I've owned has always benefited more from a higher memory clock than from a higher core clock. I'm down with a 4870 if it gets GDDR5, but I'd rather stick with a 4850 if they both get the same stuff.
 
Joined
Nov 8, 2006
Messages
5,052 (0.80/day)
Location
Manchester, United Kingdom
Processor AMD FX 8320 @ 4GHz
Motherboard Gigabyte GA-990FXA-UD5 rev1
Cooling Corsair H70
Memory 4 x 4GB DDR3 Ripjawz 1600Mhz
Video Card(s) Sapphire Vapor-X AMD R9 280X
Storage 1 x 500GB Samsung Evo 850, 1 x 500GB Vrap Data Drive, 3 x 2TB Seagate, 1 x 1TB Samsung F1
Display(s) 3 x DGM IPS-2402WDH
Case Coolermaster HAF X
Audio Device(s) Onboard
Power Supply Coolermaster 1000W Silent Pro M
Mouse Logitech G502
Keyboard Logitech G510
Software Windows 10 Pro x64
Yea, I know. But really, there's no real difference in gaming between GDDR3, 4, and 5 yet.

Says who? GDDR5 has never been seen in a real card.

And GDDR4 is better than GDDR3, it's just offset by the crappy bus widths nvidia and ATi use.

Also, it won't cost more. Think about it: GDDR5 costs slightly more than GDDR4, OK, but GPUs are getting smaller and more of them fit on a wafer, so each one costs less. In the end, counting everything overall, the graphics card will be cheaper to make.
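The "smaller die, more per wafer" part of that argument can be sketched with the standard gross-dies-per-wafer approximation; the die areas and wafer cost below are assumed examples, not actual figures for any GPU:

Code:
import math

# Gross dies per wafer ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A),
# where d is wafer diameter (mm) and A is die area (mm^2).
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

WAFER_COST = 5000.0  # assumed cost of a 300 mm wafer, illustration only
for area in (575, 256):  # hypothetical "big" vs "small" die areas in mm^2
    n = dies_per_wafer(300, area)
    print(f"{area} mm^2: ~{n} dies/wafer, ~${WAFER_COST / n:.0f} per gross die")

Less than half the die area gives well over twice the dies per wafer (edge losses shrink too), which is where the "each one costs less" saving comes from, before yield is even considered.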

And to btarunr: because ATi looks to the future (and generally fails lol), whereas nvidia is still stuck with its brute-force method (bigger, badder GPUs).
 
Joined
Dec 20, 2005
Messages
245 (0.04/day)
Processor 2500K
Motherboard Asus P8Z68-V
Cooling Stock
Memory Samsung MV-3V4G3D/US 4x4 1866@99927 1.41v
Video Card(s) Sapphire 280x
Display(s) crossover
Case junk
Audio Device(s) usbstick
Power Supply enermax 82+pro 5years+ still good
There was some news where someone stated that it's more profitable to produce flash memory for wannabe SSDs than GDDR5, so this "shortage" might last for quite some time :mad:
 
Joined
Nov 14, 2007
Messages
1,773 (0.30/day)
Location
Detroit, MI
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS TUF Gaming X570-Pro Wifi II
Cooling Hyper 212 EVO v2
Memory 2x16GB G.Skill DDR4-4000
Video Card(s) AMD RX 6750 XT
Storage WD Black SN850 2TB, various other SSD's from ages past
Display(s) LG 27GL850 1440@144, AG Neovo EM2701QC 1440@75
Case Zophos EVO Silent by Raijintek
Audio Device(s) HyperX Cloud II Wireless headphones
Power Supply Corsair RM850x
Mouse ProtoArc EM01
Keyboard Razer Blackwidow X Chroma Mercury
Software Windows 11 Pro
What shortage?

Switching to GDDR5 would mean smaller chips, fewer PCB layers, more efficient bus widths, and fewer "shortages".
 

HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
Says who? GDDR5 has never been seen in a real card.

And GDDR4 is better than GDDR3, it's just offset by the crappy bus widths nvidia and ATi use.

Also, it won't cost more. Think about it: GDDR5 costs slightly more than GDDR4, OK, but GPUs are getting smaller and more of them fit on a wafer, so each one costs less. In the end, counting everything overall, the graphics card will be cheaper to make.

And to btarunr: because ATi looks to the future (and generally fails lol), whereas nvidia is still stuck with its brute-force method (bigger, badder GPUs).

Yeah: compare the die sizes of both nVidia's and ATI's next gen cards :twitch:
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
And to btarunr: because ATi looks to the future (and generally fails lol), whereas nvidia is still stuck with its brute-force method (bigger, badder GPUs).

I'm just looking at the near future of GDDR4/5 memory: a tiny minority of cards actually use them, and those come from ATI, which commands less than half of the discrete graphics market share. Since NV makes powerful GPUs that end up faring better than the competition, they needn't use better memory and can keep using GDDR3, which is dirt cheap these days....profit. Whereas ATI uses GDDR4/5 more to build aspirational value into its products. They need performance increments from wherever they can manage, so stronger, more expensive memory gets used. And it's expensive because companies like Qimonda are pushed into making memory that's produced on a small scale, at lower profit.
 
Joined
Nov 8, 2006
Messages
5,052 (0.80/day)
Location
Manchester, United Kingdom
Processor AMD FX 8320 @ 4GHz
Motherboard Gigabyte GA-990FXA-UD5 rev1
Cooling Corsair H70
Memory 4 x 4GB DDR3 Ripjawz 1600Mhz
Video Card(s) Sapphire Vapor-X AMD R9 280X
Storage 1 x 500GB Samsung Evo 850, 1 x 500GB Vrap Data Drive, 3 x 2TB Seagate, 1 x 1TB Samsung F1
Display(s) 3 x DGM IPS-2402WDH
Case Coolermaster HAF X
Audio Device(s) Onboard
Power Supply Coolermaster 1000W Silent Pro M
Mouse Logitech G502
Keyboard Logitech G510
Software Windows 10 Pro x64
I'm just looking at the near future of GDDR4/5 memory: a tiny minority of cards actually use them, and those come from ATI, which commands less than half of the discrete graphics market share. Since NV makes powerful GPUs that end up faring better than the competition, they needn't use better memory and can keep using GDDR3, which is dirt cheap these days....profit. Whereas ATI uses GDDR4/5 more to build aspirational value into its products. They need performance increments from wherever they can manage, so stronger, more expensive memory gets used. And it's expensive because companies like Qimonda are pushed into making memory that's produced on a small scale, at lower profit.

OK, but did you read the rest of the article? Yes, the memory itself is more expensive, but it allows for less complex PCB designs (lower cost), the die shrinks (lower cost), and the experience of producing dies at 55 nm (lower cost).

So all in all, there won't be that much of a price hike, if any.

Not only that, but GDDR5 is going to be a bigger performance jump than going from GDDR3 to GDDR4. Just because GDDR3 is dirt cheap doesn't make it better to use on your next-generation GPUs.

And you're wrong: you NEED to pair a strong GPU with stronger memory. With GPUs getting more and more powerful, they need more bandwidth, and GDDR5 is the next logical step.
It's just like in a PC: if you start bottlenecking the GPU, it doesn't matter how powerful you make it, because the GDDR3 will be holding it back.
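The bottleneck point can be put in simple roofline terms. A minimal sketch with assumed peak numbers (not real card specs): attainable throughput is capped by whichever is lower, raw compute or memory bandwidth times arithmetic intensity.

Code:
# Roofline-style check: usable throughput = min(peak compute, bandwidth x intensity),
# where intensity is FLOPs performed per byte of memory traffic.
# All numbers below are assumed for illustration, not real card specs.

def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

peak, intensity = 1000.0, 4.0       # hypothetical GPU, 4 FLOPs per byte of traffic
for bw in (64.0, 128.0, 256.0):     # hypothetical GDDR3- to GDDR5-class bandwidths, GB/s
    print(f"{bw:5.0f} GB/s -> {attainable_gflops(peak, bw, intensity):6.0f} GFLOPS usable")

Below a certain bandwidth the extra shader power simply sits idle, which is exactly the "holding it back" scenario above.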
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
Exactly DN, the memory price will be higher but the PCB price offsets that.

Also, the more complex the PCB, the more likely you are to have failures in the PCB itself due to production flaws.

The more complex something is, the more likely it is to fail. This has always been true.

Now, as to a strong GPU not needing strong RAM.........I can't say what I am thinking without getting infracted, so I will put it a different way.

Only a fanboi would say that a strong GPU doesn't need good/strong RAM, and in this case I see a lot of nvidiots saying that kind of crap because nvidia is sticking with GDDR3. Honestly, the reason they are sticking with GDDR3 is that IT'S CHEAP and they have large contracts that make it even cheaper; not because it's the best tech for the job, not because it gives the best performance, but because they want to make as much per card as they can. Look at their card prices, they are always TOO DAMN HIGH per card. I have an 8800GT 512 (it's being replaced now...); it was an "ok" buy at the time, but the price most people were paying for them was 320 bucks, that's insane........

OK, the 9600 and 8800GT/9600GSOs are decently priced, BUT they are still high for what you're getting, in my view....the 3870 would be a better buy in that price range and far more future-proof.

Blah, I don't want to argue, I'm tired, it's late, and I need some rest........

Read the article, and understand that lower power and higher clocks/bandwidth mean you don't need to make an insanely complex card that costs a ton to build; you can build a cheaper card (PCB) and get the same or better performance.

Also note 3 makers are already on board, and more will follow suit too.......can't wait to see this stuff in action.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
And you're wrong: you NEED to pair a strong GPU with stronger memory. With GPUs getting more and more powerful, they need more bandwidth, and GDDR5 is the next logical step.
It's just like in a PC: if you start bottlenecking the GPU, it doesn't matter how powerful you make it, because the GDDR3 will be holding it back.

Well, that's what NVidia chooses not to do. They're making the GT200 use GDDR3, but part of the reason is also that the GPU itself is very expensive ($125 per die, $150 per package), so that's $150 for the GPU alone. More in this contentious article. So NVidia is using GDDR3 more for economic reasons, and if this is the scheme of things, they'll keep away from GDDR4/5 for quite some time even though both are already JEDEC-standard technologies.
 
Joined
Nov 8, 2006
Messages
5,052 (0.80/day)
Location
Manchester, United Kingdom
Processor AMD FX 8320 @ 4GHz
Motherboard Gigabyte GA-990FXA-UD5 rev1
Cooling Corsair H70
Memory 4 x 4GB DDR3 Ripjawz 1600Mhz
Video Card(s) Sapphire Vapor-X AMD R9 280X
Storage 1 x 500GB Samsung Evo 850, 1 x 500GB Vrap Data Drive, 3 x 2TB Seagate, 1 x 1TB Samsung F1
Display(s) 3 x DGM IPS-2402WDH
Case Coolermaster HAF X
Audio Device(s) Onboard
Power Supply Coolermaster 1000W Silent Pro M
Mouse Logitech G502
Keyboard Logitech G510
Software Windows 10 Pro x64
Well, that's what NVidia chooses not to do. They're making the GT200 use GDDR3, but part of the reason is also that the GPU itself is very expensive ($125 per die, $150 per package), so that's $150 for the GPU alone. More in this contentious article. So NVidia is using GDDR3 more for economic reasons, and if this is the scheme of things, they'll keep away from GDDR4/5 for quite some time even though both are already JEDEC-standard technologies.

Have you ever wondered WHY nvidia is making such an expensive GPU? As I've said before, it's just a brute-force method to make a more powerful GPU. Unless they stop, scrap what they have, and create a really efficient GPU architecture (like ATi did), they don't stand a cat in hell's chance against ATi this coming year.

Considering how powerful the 4870 is meant to be, I honestly don't see anyone with any knowledge of GPUs going for nvidia with that hefty a price tag...
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
On paper the 2900XT should have crushed all comers; instead it barely put up a fight against the 8800GTS 640. AMD can look great on paper, but give me some proof they can compete. As for the 3870 being future-proof, I beg to differ. It has 64 groups of 5 shaders. Only 1 in each group can do complex shader work, 2 can do simple work, one does integer and the other does floating point. Now in the real world this means that 128 of those shader units won't get used, if at all; the floating-point and integer units, and the simple shaders, go unused thanks to AMD's failure to supply a compiler for their cards. Let it look as good as you want, but if AMD can't supply a code compiler so code works right on their design, they are still screwed.
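For what it's worth, that utilization complaint boils down to whether the compiler can co-issue independent operations into each wide shader group. A toy sketch of the packing problem (completely hypothetical instruction stream, uniform 5-wide slots, ignoring the per-slot specialization described above):

Code:
# Toy model: each shader group issues up to 5 independent ops per cycle.
# An op that depends on an earlier op's result can't share a bundle with it,
# so serial code (or a weak compiler) leaves most slots empty.

def pack_vliw(ops, width=5):
    """ops: list of (name, set_of_dependency_names). Greedy bundle packing."""
    bundles, done, pending = [], set(), list(ops)
    while pending:
        bundle, rest = [], []
        for name, deps in pending:
            if deps <= done and len(bundle) < width:
                bundle.append(name)
            else:
                rest.append((name, deps))
        bundles.append(bundle)
        done |= set(bundle)
        pending = rest
    return bundles

# Hypothetical fully dependent chain: one slot in five filled per cycle (20%).
serial = [(f"op{i}", {f"op{i-1}"} if i else set()) for i in range(5)]
print(pack_vliw(serial))
# Hypothetical independent ops: all five pack into one bundle (100%).
parallel = [(f"op{i}", set()) for i in range(5)]
print(pack_vliw(parallel))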
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
But what's the point when only one company uses it, and that too on its 'top-of-the-line' product (HD4870), while another company used GDDR3 across 5 generations of products?
What's the point of Intel @ DDR3? :p
Adopting new technologies is good.
Even the memory companies agree and will make GDDR5 a standard, so what's the problem?
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
The problem is GDDR5 will suffer like GDDR4 did when it was new: insane latency. Also, Nvidia started work on the GT200 right after the G80 shipped; at the time GDDR4 wasn't viable and GDDR5 was unheard of. Do you expect Nvidia to stop working on their next gen just to include a new memory controller?
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
The problem is GDDR5 will suffer like GDDR4 did when it was new: insane latency. Also, Nvidia started work on the GT200 right after the G80 shipped; at the time GDDR4 wasn't viable and GDDR5 was unheard of. Do you expect Nvidia to stop working on their next gen just to include a new memory controller?
Insane what?
It's not ATi's problem that nvidia is lacking a proper memory controller ^^
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
The problem is GDDR5 will suffer like GDDR4 did when it was new: insane latency. Also, Nvidia started work on the GT200 right after the G80 shipped; at the time GDDR4 wasn't viable and GDDR5 was unheard of. Do you expect Nvidia to stop working on their next gen just to include a new memory controller?

ATI started work on the R700 architecture about the same time they released the HD 2900XT. Granted, GDDR5 was unheard of then, but later the RV770 did end up with a GDDR5 controller, didn't it? It goes to show that irrespective of when a company starts work on an architecture, something as modular as a memory controller can be added even weeks before the designs are handed over to the fabs to make an ES and, eventually, go into mass production.

So, when NV started work on the GT200 is a lame excuse.
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
ATI made GDDR5, bta, didn't you get the memo? They have most likely been working on it just as long. I approve of what Nvidia is doing: using known tech with a wider bus is just as effective, and there is less chance of massive latency issues like there will be with GDDR5. I prefer tried and true; this will be the 2nd time AMD has tried something new with their graphics cards, and this will be the 2nd time they fail. I was dead right about the 2900XT failing, I said it would before it even went public, and I'll be right about this.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
GDDR5 is a JEDEC standard. Irrespective of who makes it, any licensed company can use it. The HD4870 is more of something that will beat the 9800 GTX and come close to the GTX 260. It's inexpensive, cool, and efficient. Don't try to equate the HD4870 to the GTX 280; you'll end up comparing a sub-400-dollar card to something that's 600+ dollars. The better comparison would be to the HD4870 X2, which is supposed to be cheaper than the GTX 280 and has win written all over it.
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
GDDR5 is a JEDEC standard. Irrespective of who makes it, any licensed company can use it. The HD4870 is more of something that will beat the 9800 GTX and come close to the GTX 260. It's inexpensive, cool, and efficient. Don't try to equate the HD4870 to the GTX 280; you'll end up comparing a sub-400-dollar card to something that's 600+ dollars. The better comparison would be to the HD4870 X2, which is supposed to be cheaper than the GTX 280 and has win written all over it.

Yes bta, exactly, but you gotta remember candle has an irrational hate for AMD/ATi; logic: he fails it....

Though I'm shocked to see you say the 4870/4870X2 has win written all over it.....did you forget your nvidia pillz today?
 