
GeForce RTX 4070 with Slower GDDR6 Memory Priced on-par with Regular RTX 4070 Cards

btarunr

Editor & Senior Moderator
NVIDIA GeForce board partners are preparing a silent launch of a variant of the GeForce RTX 4070 with slower 20 Gbps GDDR6 memory in place of the 21 Gbps GDDR6X that's standard on the RTX 4070, which results in a 5% reduction in memory bandwidth. Other specs, such as GPU clocks or core configuration, aren't changed to compensate for the reduced memory bandwidth. ASUS is among the first board partners with an RTX 4070 GDDR6 card, the ASUS DUAL RTX 4070 GDDR6, which was briefly listed on Newegg for $569 before it went out of stock. This is reported by VideoCardz as being the same price as the regular ASUS DUAL RTX 4070 with GDDR6X.

ASUS isn't the only NVIDIA board partner with an RTX 4070 GDDR6: Wccftech spotted a GALAX-branded card with the model string "RTX 4070 D6 1-click OC." Its retail box features a large specs sheet on the front face that clearly lists GDDR6 as the memory type. NVIDIA's decision to re-spec the RTX 4070 with 20 Gbps GDDR6 was originally seen as a move to reduce costs, letting the card be sold closer to the $500 mark. It remains to be seen whether real-world prices settle below those of the original RTX 4070 cards.
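As a quick sanity check on the quoted 5% figure, here is a minimal sketch that derives the bandwidth of both variants from their per-pin data rates. The 192-bit bus width is the RTX 4070's published memory bus spec rather than something stated in the article.

```python
# Back-of-envelope memory bandwidth for the two RTX 4070 variants.
# Bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8

BUS_WIDTH_BITS = 192  # RTX 4070 memory bus width (published spec)

def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Return memory bandwidth in GB/s for a given per-pin data rate."""
    return data_rate_gbps * bus_width_bits / 8

gddr6x = bandwidth_gbs(21.0)  # original RTX 4070: 21 Gbps GDDR6X
gddr6 = bandwidth_gbs(20.0)   # new variant: 20 Gbps GDDR6

print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR6: {gddr6:.0f} GB/s, "
      f"reduction: {100 * (1 - gddr6 / gddr6x):.1f}%")
# -> GDDR6X: 504 GB/s, GDDR6: 480 GB/s, reduction: 4.8%
```

The reduction works out to roughly 4.8%, which the article rounds to 5%.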



View at TechPowerUp Main Site | Source
 
Any reviews of the GDDR6 version? Seen none. Maybe the perf diff is minor. Should be slightly cheaper tho.
 
I don't, but people buy it anyway...
 
Same price as the GDDR6X version. They list GDDR6 in the specs on Newegg and put it on the box, but I doubt most gamers look at that. They probably just see the 4070 name. The price should be lower, but I have yet to see how much difference the slower VRAM makes, so it's hard to tell what the price should be. Might just amount to a minuscule difference in benches.
 
Plans to make the new version cheaper angered Nvidia's accountants.
 
The last time this graphics card came up, I mentioned that it's a nothingburger and the G6 rendition might actually have some strengths over the normal variant. Hopefully we will see reviews soon.
 
@Dr. Dro
Right. I don’t think anything indicated that the 4070 was ever bandwidth starved (complaints were about the AMOUNT of VRAM) and 5% is a negligible decrease. And the regular GDDR6 should be less power hungry and cooler than 6X, so it might be an arguable improvement.
 
Looking forward to a TPU review on them to see, but I've had a 4070 for almost 1 1/2 years; can't imagine wanting one this late in the release cycle.
 
@Dr. Dro
Right. I don’t think anything indicated that the 4070 was ever bandwidth starved (complaints were about the AMOUNT of VRAM) and 5% is a negligible decrease. And the regular GDDR6 should be less power hungry and cooler than 6X, so it might be an arguable improvement.

5% decrease in throughput but also tighter timings/lower access latency, the power and heat reduction is also a real factor. Should hit 21Gbps very easily anyway.

Looking forward to a TPU review on them to see, but I've had a 4070 for almost 1 1/2 years; can't imagine wanting one this late in the release cycle.

I mean, this is clearly not targeted at 4070 owners nor sold as an improvement, it's just a late cycle subvariant.
 
Looking forward to a TPU review on them to see, but I've had a 4070 for almost 1 1/2 years; can't imagine wanting one this late in the release cycle.
Well, some people are still buying Radeon 6000 and RTX 3000 series. I can't imagine that either, yet it happens.
 
Of course people will buy them. Not everyone needs cutting edge performance and everything indicates that NV will go with the top of the stack when Blackwell releases in, what, January at the earliest? The 4070 is a perfectly serviceable GPU performance-wise, someone buying it isn’t aiming for 4K, Ultra, RT Reflections on everything, Final Destination only anyway.
 
Any reviews of the GDDR6 version? Seen none. Maybe the perf diff is minor. Should be slightly cheaper tho.
Perf diff has to be minor. Memory is just 5% slower; the card would only be 5% slower if it were maxing VRAM transfers all the time.
I'm more curious about what this change does to the power draw.
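
That reasoning can be written down as a crude upper-bound model: only the bandwidth-bound share of the workload scales with memory speed. The bandwidth-bound fractions below are purely illustrative assumptions, not measurements.

```python
# Crude upper-bound model for the performance impact of slower VRAM:
# only the bandwidth-bound fraction of frame time scales with memory speed.

def expected_slowdown(bw_ratio: float, bandwidth_bound_fraction: float) -> float:
    """Return fractional performance loss for a given new/old bandwidth ratio
    and the share of frame time limited by memory bandwidth."""
    return bandwidth_bound_fraction * (1.0 - bw_ratio)

bw_ratio = 20.0 / 21.0  # GDDR6 vs GDDR6X per-pin data rate
for fraction in (1.0, 0.5, 0.2):  # hypothetical bandwidth-bound shares
    print(f"{fraction:.0%} bandwidth-bound -> ~{expected_slowdown(bw_ratio, fraction):.1%} slower")
# -> 100% -> ~4.8%, 50% -> ~2.4%, 20% -> ~1.0% slower
```

So even in the worst case the loss is capped at roughly 4.8%, and realistically it should be a fraction of that.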
 
Perf diff has to be minor. Memory is just 5% slower; the card would only be 5% slower if it were maxing VRAM transfers all the time.
I'm more curious about what this change does to the power draw.
True. I would say 25-50 watts lower, considering the 3070 Ti used about 50-75 watts more than the 3070 while the performance difference was ~5%. It had a few more cores too, but GDDR6 vs GDDR6X was the main difference.
 
The shrinking box of cereal or bag of chips has hit PC components. Awesome.
 
The shrinking box of cereal or bag of chips has hit PC components. Awesome.
When the price stays the same, but the quantity goes down, it's called "Shrinkflation" in the UK.
 
Just call it Nvidia Loyalty Tax.

Keep quiet and pay your overlords, they are there to screw with you.
 
True. I would say 25-50 watts lower considering 3070 Ti used like 50-75 watts more than 3070 while performance diff was ~5%, a few more cores too but GDDR6 vs GDDR6X was the main diff
25-50 W sounds about the right range from what I remember from my RTX 4070. The card power limit is still the same, and if they save 20-30 W, that will offset the memory disadvantage with higher clocks on the core.
I would much prefer they made a new SKU, but on the other hand, if the end result is the same, I am not sure which would confuse customers more.
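
The fixed-power-limit point can be made concrete with a toy calculation. Only the 200 W board power limit reflects the RTX 4070's nominal spec; the memory savings and the non-core share of the budget below are hypothetical numbers.

```python
# Toy calculation: with a fixed board power limit, whatever the memory
# no longer draws becomes headroom the GPU core can boost into.

BOARD_POWER_LIMIT_W = 200  # RTX 4070 total board power (nominal spec)
MEMORY_SAVINGS_W = 25      # hypothetical GDDR6X -> GDDR6 saving
NON_CORE_POWER_W = 60      # hypothetical memory + VRM + fan share of the budget

core_budget_old = BOARD_POWER_LIMIT_W - NON_CORE_POWER_W
core_budget_new = core_budget_old + MEMORY_SAVINGS_W

print(f"Core power budget: {core_budget_old} W -> {core_budget_new} W "
      f"(+{100 * MEMORY_SAVINGS_W / core_budget_old:.0f}% headroom for boost clocks)")
# -> Core power budget: 140 W -> 165 W (+18% headroom for boost clocks)
```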
 
Just call it Nvidia Loyalty Tax.

Keep quiet and pay your overlords, they are there to screw with you.

Nonsense, hardware revision for high-volume SKUs is common practice. How many variants of Polaris did AMD release? ;)
 
Just call it Nvidia Loyalty Tax.

Keep quiet and pay your overlords, they are there to screw with you.
Has nothing to do with brand loyalty, most people don't really care - If AMD GPUs were competitive, they would sell.

Reality is that AMD is cheaper for a reason, and still doesn't sell well. Looking at pure raster perf and nothing else in 2024 is a big nope. People want good upscaling. Good frame gen. Useful RT. Just features that work. Meanwhile AMD releases beta features like Anti-Lag+ that got people VAC banned on Steam. Reflex is far superior, just like DLSS/DLAA beats FSR and DLDSR beats VSR. And I could keep going. Also, RTX features are in like 600+ games at the moment. We are not talking about a small handful of games here.

Nvidia sits at ~90% gaming GPU market share because they offer superior products with vastly better features and lower power usage in general, both in games and in multi-monitor and video-playback scenarios (Nvidia is roughly twice as efficient in multi-monitor and video playback, and wins in games by 10-25% depending on the GPU; go look at the power consumption tests in any TPU review).

Also, its even worse in esport titles.

Take a look at 2:30 here:

 
The shrinking box of cereal or bag of chips has hit PC components. Awesome.

Bad analogy. That would imply that Nvidia will just replace GDDR6X with GDDR6 VRAM in all GPUs and keep the same MSRP.
 
it's called "Shrinkflation" in the UK.
Never met an anglophone unaware of this term. Not a UK-exclusive word, for sure. I go a tad more blunt and call it daylight robbery.

However, I don't see how this "new" 4070 will be measurably better or worse than the normal 4070. Even the 4070 Ti struggles to max its b/w at resolutions below 4K and this GPU doesn't promise significant OC even if G6X→G6 shaves >20 W off.
Bad analogy. That would imply that Nvidia and AMD will just replace GDDR6X with GDDR6 VRAM in all GPUs and keep the same MSRP.
They replaced ~50% dies for $500 with ~33% dies for $600. 4070 is just a THIRD of what a fully developed Ada die can do, whereas 3070 is about 50% of what you can squeeze from Ampere. With competition being non-existent I don't see it stopping.
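Those fractions line up with the published CUDA core counts; here is a quick sketch (core count alone is of course only a rough proxy for what a fully enabled die can do).

```python
# How much of the full flagship die each x70 card actually gets,
# measured by CUDA core count (a rough proxy for die utilization).

cards = {
    "RTX 3070 (5888 cores) vs full GA102": (5888, 10752),
    "RTX 4070 (5888 cores) vs full AD102": (5888, 18432),
}

for name, (cores, flagship_cores) in cards.items():
    print(f"{name}: {100 * cores / flagship_cores:.0f}% of the flagship's cores")
# -> RTX 3070: ~55%, RTX 4070: ~32%
```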
 
They replaced ~50% dies for $500 with ~33% dies for $600. 4070 is just a THIRD of what a fully developed Ada die can do, whereas 3070 is about 50% of what you can squeeze from Ampere. With competition being non-existent I don't see it stopping.

That's not what this topic is about, and the GDDR6 decision isn't an across-the-board change, so it was a bad analogy on Nater's part. Right?
 