Friday, September 6th 2024

ASUS Launches GeForce RTX 4070 with GDDR6 Memory

ASUS became the first NVIDIA add-in card partner to debut a GeForce RTX 4070 graphics card with GDDR6 memory, moving away from the card's original GDDR6X. The RTX 4070 originally comes with 12 GB of 21 Gbps GDDR6X memory, yielding 504 GB/s of memory bandwidth. This ASUS card (which we hope is part of a larger refresh by NVIDIA to reduce the cost of the RTX 4070) uses GDDR6 of a yet-unspecified speed. 21 Gbps GDDR6 (non-X) does exist on the market, but we imagine those chips are expensive, which leaves NVIDIA with the more readily available 20 Gbps GDDR6 chip (which AMD uses in its RDNA 3 graphics cards), and the 18 Gbps GDDR6 that the company itself uses in cards such as the RTX 4060 Ti. Any reduction in memory bandwidth would have to be compensated for with an increase in GPU clocks, but we don't see that happening here: the regular variant of the ASUS DUAL RTX 4070 GDDR6 EVO comes with the reference 2475 MHz maximum boost speed, while the DUAL OC variant only slightly cranks this up to 2520 MHz.
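For context, the bandwidth math is simple: per-pin data rate times bus width, divided by eight bits per byte. A minimal sketch in Python, assuming the RTX 4070's published 192-bit memory bus and the candidate speeds named above:

# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
# The 192-bit bus is the RTX 4070's published spec; the speeds below are the
# candidate GDDR6 data rates discussed above.
BUS_WIDTH_BITS = 192

def bandwidth_gb_s(data_rate_gbps: float) -> float:
    """Return memory bandwidth in GB/s for a given per-pin data rate."""
    return data_rate_gbps * BUS_WIDTH_BITS / 8

for rate in (21, 20, 18):
    print(f"{rate} Gbps -> {bandwidth_gb_s(rate):.0f} GB/s")
# 21 Gbps -> 504 GB/s (the original GDDR6X figure)
# 20 Gbps -> 480 GB/s (roughly 5% less)
# 18 Gbps -> 432 GB/s (roughly 14% less)

At 20 Gbps, the likeliest candidate, the card would give up roughly 5% of its memory bandwidth; at 18 Gbps, closer to 14%.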
Source: VideoCardz

19 Comments on ASUS Launches GeForce RTX 4070 with GDDR6 Memory

#2
wolf
Better Than Native
I'm curious how much a circa 5% reduction in memory bandwidth affects the 4070's performance. Does TPU plan to test one?

A straight 5% performance drop would make this product slimy as hell, but I can't say I expect it. If it's 1-2%, it's still slimy for the bait and switch, but that's within the margin of error for silicon lottery and sample variance.
#3
Quicks
Nvidia finding a new way to screw with their customers.
#4
P4-630
At least it's not DDR this time... :D
#5
Ruru
S.T.A.R.S.
I still think these should be called 4070 SE or 4060 Ultra.
#6
Philaphlous
Ruru: I still think these should be called 4070 SE or 4060 Ultra.
Isn't GDDR6 a step below GDDR6X? Edit: just looked it up and it looks like 6X is way slower than 6... maybe it's just the way my brain works, but I'd think "X" is better... apparently not? Maybe lower voltage/wattage memory is the X factor? I mean, my GDDR6 is running at nearly the same speed as the GDDR6X on the desktop 4070, so that's kind of a bummer....
#7
Kessara
Could there be potential for better power efficiency at idle and low loads compared to GDDR6X? My 4070 Ti tends to draw as much as 30-40 W during video playback or web browsing, since any acceleration demand dials it up to working clock speeds. I rein in power draw when gaming by undervolting and power limiting, but since my machine often runs 24/7, every 10 W of consumption is noticeable on the yearly bill.
#8
AnotherReader
Philaphlous: Isn't GDDR6 a step below GDDR6X? Edit: just looked it up and it looks like 6X is way slower than 6... maybe it's just the way my brain works, but I'd think "X" is better... apparently not? Maybe lower voltage/wattage memory is the X factor? I mean, my GDDR6 is running at nearly the same speed as the GDDR6X on the desktop 4070, so that's kind of a bummer....
GDDR6X is faster; look at MT/s rather than clock speed.
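For example, a rough sketch using the commonly cited multipliers between the memory clock that tools like GPU-Z report and the effective per-pin data rate (8x for GDDR6's NRZ signaling, 16x for GDDR6X's PAM4):

# GDDR6X (PAM4) carries two bits per symbol, so it reaches a higher
# effective data rate from a lower reported clock. The 8x and 16x
# multipliers are the commonly cited figures, not a spec quote.
def effective_gbps(reported_clock_mhz: float, bits_per_clock: int) -> float:
    return reported_clock_mhz * bits_per_clock / 1000

print(effective_gbps(1313, 16))  # ~21 Gbps GDDR6X, as on the stock RTX 4070
print(effective_gbps(2500, 8))   # 20 Gbps GDDR6: slower despite the higher clock

That's why the "slower-looking" GDDR6X is actually the faster memory.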
#9
Dave65
Quicks: Nvidia finding a new way to screw with their customers.
But fanboys will say this is a magical good thing.
#10
holyprof
Also notice which brand is leading the slimy DRAM replacement, heh. Anyone surprised?
#11
Ruru
S.T.A.R.S.
holyprof: Also notice which brand is leading the slimy DRAM replacement, heh. Anyone surprised?
Won't be an Asus exclusive, going by Nvidia's own website.

#12
holyprof
Ruru: Won't be an Asus exclusive, going by Nvidia's own website.
Of course they can't do it without permission from Nvidia, but Asus leads the way among slimy GPU/MoBo OEMs right now :(. As other posters mentioned above, I'd only consider this a non-slimy launch if they changed the name so it's clearly stated this isn't the real 4070 people see in the benchmarks. It is stated on the package, but in "fine print" compared to the rest of the text.
#13
Visible Noise
Quicks: Nvidia finding a new way to screw with their customers.
AMD is a good teacher. 6900 -> 6950 was a 3% uplift in boost clocks and a bit more memory bandwidth for a $150 price increase.

Or the great 7600 XT, 16GB scam card that runs out of compute power long before it runs out of even the original 8GB of memory. But now you get to say you have more memory for “only” a 20% price increase.
#14
photonboy
"Any reduction in memory bandwidth would have to be compensated with increase in GPU clocks"

You're joking, right?
This reveals a fundamental misunderstanding of how computers work.

If the memory is too slow for the GPU, then making the GPU even faster serves no purpose. You've got this exactly BACKWARDS.
#15
mama
Visible Noise: AMD is a good teacher. 6900 -> 6950 was a 3% uplift in boost clocks and a bit more memory bandwidth for a $150 price increase.

Or the great 7600 XT, 16GB scam card that runs out of compute power long before it runs out of even the original 8GB of memory. But now you get to say you have more memory for “only” a 20% price increase.
Not the same. Value-for-money arguments are one thing. Changing a product mid-cycle, removing components to make it worse, is another. I expect the price will stay the same, so what does that tell you? Reviews are in for the original, but the downgrade is what you'll get.
#16
photonboy
Visible Noise: AMD is a good teacher. 6900 -> 6950 was a 3% uplift in boost clocks and a bit more memory bandwidth for a $150 price increase.

Or the great 7600 XT, 16GB scam card that runs out of compute power long before it runs out of even the original 8GB of memory. But now you get to say you have more memory for “only” a 20% price increase.
16 GB vs 8 GB even on an RX 7600 XT already benefits several games, and 8 GB will increasingly be a problem over this card's life cycle. So you're just wrong there. Just one video (see the conclusion):

I've tested this myself. So, price aside (I have no idea), you're just wrong on the facts about VRAM.

As for the 6950 XT, it offered an average 6% boost for a 15% cost increase (it was about $1,100 USD). That's actually not out of line at the high end.

There are examples of bad cards, such as 8 GB cards in 2024 that cost too much and have issues with modern games due to lack of VRAM. It's almost like people buy them not knowing that's a problem even at the level of the RX 7600 XT, but I know nobody here would think that....
#17
LabRat 891
An RX 7700 XT competitor?
Or are AMD/AIBs planning to release a further stripped-down Navi 31 XL? 12 GB, 192-bit, 4608 shaders, a 7900M/7900 GRE cutdown on a 7700's PCB?
(7700 XTX? 7800 GTO? 7900 XL?)
#18
TheinsanegamerN
photonboy: 16 GB vs 8 GB even on an RX 7600 XT already benefits several games, and 8 GB will increasingly be a problem over this card's life cycle. So you're just wrong there. Just one video (see the conclusion):

I've tested this myself. So, price aside (I have no idea), you're just wrong on the facts about VRAM.

As for the 6950 XT, it offered an average 6% boost for a 15% cost increase (it was about $1,100 USD). That's actually not out of line at the high end.

There are examples of bad cards, such as 8 GB cards in 2024 that cost too much and have issues with modern games due to lack of VRAM. It's almost like people buy them not knowing that's a problem even at the level of the RX 7600 XT, but I know nobody here would think that....
Well, now you've summoned the 8GB brigade to tell you how 2013 VRAM capacity is perfectly fine and you never need more than that, despite all evidence to the contrary.
#19
photonboy
TheinsanegamerN: Well, now you've summoned the 8GB brigade to tell you how 2013 VRAM capacity is perfectly fine and you never need more than that, despite all evidence to the contrary.
??
(UPDATE: I apologize if that was aimed at the guy I responded to, but it shows up as a reply to me.)

If responding to ME:
That's literally the OPPOSITE of what I said. Can you not read?