
Production of 21 Gbps and 24 Gbps GDDR6X Memory Chips Underway at Micron

btarunr

Editor & Senior Moderator
Memory giant Micron Technology has commenced mass-production of 21 Gbps and 24 Gbps-rated GDDR6X memory chips that will be exclusively used by NVIDIA in its next-generation RTX 40-series "Ada" graphics cards. GDDR6X is a derivative of GDDR6 co-developed by NVIDIA and Micron, which leverages PAM4 signaling to increase data-rates. Depending on the graphics card model, NVIDIA will use 8 Gbit (1 GB) or 16 Gbit (2 GB) density memory chips. We're hearing that 21 Gbps will be the standard data-rate used by SKUs that succeed the RTX 3080 and RTX 3080 Ti; while 24 Gbps will be used by the faster RTX 3090/Ti successors. The part numbers of these memory chips are listed below.
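As a rough back-of-the-envelope on what those data rates mean for card-level bandwidth, here is a minimal sketch; the 384-bit bus width and the card pairings are assumptions for illustration based on current Ampere configurations, not confirmed Ada specs:

```python
# Peak theoretical memory bandwidth in GB/s: bus width (bits) x per-pin data rate (Gbps) / 8
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

configs = [
    ("384-bit @ 19.5 Gbps (RTX 3090 today)", 384, 19.5),
    ("384-bit @ 21 Gbps (assumed RTX 3080/3080 Ti successor)", 384, 21.0),
    ("384-bit @ 24 Gbps (assumed RTX 3090/3090 Ti successor)", 384, 24.0),
]

for name, width, rate in configs:
    print(f"{name}: {bandwidth_gb_s(width, rate):.0f} GB/s")
# 936 GB/s, 1008 GB/s, and 1152 GB/s respectively
```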



View at TechPowerUp Main Site | Source
 
which leverages PAM4 signaling to increase data-rates
Does it actually though? I thought the promise was to double the effective data rate at the same clocks, yet the only benefit for Ampere was the higher clocks.

So if that's true, it could make for a hell of a lot of bandwidth at the same memory bus widths, relatively speaking, and justify the power usage of GDDR6X.

That's partly why these cards are so power hungry; the GDDR6X on my 3080 wants ~80 W+ when stressed. I mean, you only need to look as far as the 3070 vs 3070 Ti, 220 W vs 290 W for a 7% increase in performance... yeah, the GDDR6X is massively dragging down overall efficiency.

If the bandwidth is actually going to be double at the same bus width and clock rate.... wowee :eek: But if not... meh, better off with fast GDDR6.
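For reference, here's roughly how the PAM4 math is supposed to work; a quick sketch assuming a 12 Gbaud signaling rate for the 24 Gbps part, not anything Micron has published for these exact chips:

```python
# NRZ signaling carries 1 bit per symbol; PAM4 uses four voltage levels, so 2 bits per symbol.
# At the same symbol (signaling) rate, PAM4 therefore doubles the per-pin data rate.
def data_rate_gbps(symbol_rate_gbaud: float, bits_per_symbol: int) -> float:
    return symbol_rate_gbaud * bits_per_symbol

symbol_rate = 12.0  # Gbaud, assumed for the 24 Gbps part

print(data_rate_gbps(symbol_rate, 1))  # NRZ:  12.0 Gbps per pin
print(data_rate_gbps(symbol_rate, 2))  # PAM4: 24.0 Gbps per pin
# The doubling is relative to the symbol rate, not on top of GDDR6's existing data rate;
# in practice GDDR6X simply reaches higher absolute per-pin rates at lower signaling rates.
```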
 
6X sounds cooler than 6, which I feel is the entire reason for the existence of this. Well, this and a good relationship between Nvidia and Micron. Didn't Samsung already announce their 24 Gbps GDDR6 sans-X?
 
You are correct.

July 13th
 
Does it actually though? I thought the promise was to double the effective data rate at the same clocks...
The data rate is going to be 12 gigatransfers per second (for the faster variant), multiplied by 2 bits per transfer. That's the idea, and it should lead to some power savings (15%, they say). However, even if the transfers consume less, there will necessarily be other parts of the DRAM chips operating at 24 GHz, which means running hotter.

As for "clocks", it's complicated and there are at least two clocks involved, CK will be 3 GHz and WCK will be 6 GHz.
 
Yeyy, get ready to see 115°C VRAM temps...
 

The difference being, it took nearly two years for Samsung to mass-produce that memory (and nearly three to double the density), while the GDDR6X parts typically ship within a few months of being announced!


The 19-20 Gbps GDDR6 is a lot more likely for Ada.
 
Which fabs are being used for them? Hopefully not in China or Taiwan... drums of war and all that.
 

We've been hearing about war between China and Taiwan for fifty years now, and it's still just as infeasible today as it was 50 years ago!

As an island nation 100 miles off the coast, it would take massive forces to hold the island in the first place, and then it's painless to simply blockade all trade and make all those chip fabs worthless overnight. Imagine if we actually tried to do a full military blockade, instead of doing "due diligence" and leaving it up to corporations to police their imports?

China isn't stupid; they know they would lose all the freebies they currently enjoy (endless piles of advanced tech that they can slowly absorb by being beholden to our businesses). But imagine the permanent blockade they would get if they pulled a Ukraine?

They just like to talk big on the world stage, as it's good for internal oppression campaigns (never question our global superiority by being the last one to speak in the global slap fight!)

Also, Samsung has no fabs in either country.
 
Also, Samsung has no fabs in either country.
Samsung has a fab in China
 
But imagine the permanent blockade they would get if they pulled a Ukraine?
China being sanctioned and/or blockaded if they invaded Taiwan isn't so certain. If China gains control of the only leading-edge facilities in the world, who is going to come to Taiwan's aid, just to see their chip supply cut off, along with a lot of other trade? I think I saw some plans to stop sending ASML machines/tools to China if they invade, but if China thinks they can circumvent that plan, Taiwan without serious international support is doomed imo.

Edit: I mean, I hope you're right and all, but let's face reality: China has been rapidly expanding their navy, their air force, and their belligerent North Korean-style rhetoric in the last 5 years.
 
Sheesh, it's armchair general hour again already? Wow kids.

Which still has no more than one or two of what we'd really call modern aircraft carriers. Friggin lol.

China being sanctioned and/or blockaded if they invaded Taiwan isn't so certain.
You have to be joking. Please be joking.
 
My armchair general uniform has sixteen stars and a hundred medals, what does yours have, anonymous internet user? :):laugh:
 
Just knowledge that we don't know anything.

So, a pickle.
 
If the bandwidth is actually going to be double at the same bus width and clock rate.... wowee :eek: But if not... meh, better off with fast GDDR6.

Maybe a cost advantage? Something like 21 Gbps GDDR6X costing the same as 16 Gbps GDDR6?

The only Nvidia GPU that has 16 Gbps GDDR6 is the 2080S; even the 3070 has 14 Gbps GDDR6.
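For a rough sense of what the faster parts buy in bandwidth at the 3070's 256-bit bus width (bus width held constant purely for illustration):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak theoretical bandwidth in GB/s
    return bus_width_bits * data_rate_gbps / 8

for rate in (14, 16, 19, 21):  # Gbps: 3070's GDDR6, 2080S-class GDDR6, current GDDR6X, new GDDR6X
    print(f"256-bit @ {rate} Gbps: {bandwidth_gb_s(256, rate):.0f} GB/s")
# 448, 512, 608, and 672 GB/s: roughly +50% over 14 Gbps at the same bus width,
# which is the bandwidth side of the cost/power trade-off in question.
```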
 
Definitely a possibility that I didn't consider; I just hope it's worth it overall for cost/performance/power consumption.
 