
12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

btarunr

Editor & Senior Moderator
It looks like 12 GB will be the standard memory size for the NVIDIA GeForce RTX 4070 graphics card the company plans to launch in mid-April 2023. It is very likely that the card has 12 GB of memory across the 192-bit memory bus width of the "AD104" silicon the SKU is based on. The RTX 4070 is already heavily cut down from the RTX 4070 Ti that maxes out the "AD104," with the upcoming SKU featuring just 5,888 CUDA cores, compared to the 7,680 of the RTX 4070 Ti. The memory sub-system, however, could see NVIDIA use the same 21 Gbps-rated GDDR6X memory chips, which, across the 192-bit memory interface, produce 504 GB/s of memory bandwidth. Confirmation of the memory size came from regulatory filings of several upcoming custom-design RTX 4070 board models by MSI and GIGABYTE with the Eurasian Economic Commission (EEC) and the Korean NRRA.
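For reference, the 504 GB/s figure is easy to verify. A quick sketch of the arithmetic, assuming the 21 Gbps GDDR6X chips and 192-bit interface mentioned above:

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte
def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gb_s(21, 192))  # 504.0 GB/s for the rumored RTX 4070 configuration
```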



View at TechPowerUp Main Site | Source
 
400~450 bucks and not a cent more is where this needs to be placed. 450 being the super duper AIB variant.

See, saving money is easy, it's all about expectations.
 
400~450 bucks and not a cent more is where this needs to be placed. 450 being the super duper AIB variant.

See, saving money is easy, it's all about expectations.
At best $300, 400-450 is a rip-off for what is a 60-series GPU.
 
At best $300, 400-450 is a rip-off for what is a 60-series GPU.
12GB, >500 GB/s is not 60-series level, come on.
Shader count isn't either, and 300 for an x70 isn't realistic to begin with.

Let's refresh our memories a bit - a crippled 3.5GB 970 was already MSRP 329,-
Nine years ago.



But we all know this x70 won't release for 400-450, it'll do 550+ at least.
 
I don't think Nvidia listens to customer feedback in random forums.




And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.
 
Let's be realistic here, the 1060 was $300 and that's 7 years ago, and thanks to the Fed going printer-goes-brrrrr we have very substantial inflation.
If we're being realistic then we can look at NVIDIA's gross margins and see that, outside of the 2021-2022 fluke, their margins today are only 7% higher than they were a decade ago. It's not just them, the whole industry has gotten far more expensive.

But people hate thinking that the prices today are caused by anything but greed.

This card should be priced in the $400-$500 range, but we all know it's going to be $649+
If the market will pay it there's no reason not to, leaving money on the table would be silly.
 
400~450 bucks and not a cent more is where this needs to be placed. 450 being the super duper AIB variant.

See, saving money is easy, it's all about expectations.
I could go as high as $500. Anything above that, idgaf.
Then again, $500 for a custom model means you got the MSRP right.
 
12GB, >500 GB/s is not 60-series level, come on.
A 192-bit memory bus isn't 70 series level, come on. And Nvidia itself set 12GB as the standard for a 60 series card with the 3060 via panicked undoing of their own cost-cutting, so that's their problem. I'm not sure why you're crediting Nvidia for memory technology getting faster either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason. Instead, Nvidia just make up for faster VRAM being available by crippling the memory bus. There had never been a 70 series card in history with less than a 256-bit bus before these cut down, comically overpriced pieces of trash arrived on the market, so please don't only selectively cite historical precedent.
 
A 192-bit memory bus isn't 70 series level, come on. And Nvidia itself set 12GB as the standard for a 60 series card with the 3060 via panicked undoing of their own cost-cutting, so that's their problem. Instead, Nvidia just make up for faster VRAM being available by crippling the memory bus. There had never been a 70 series card in history with less than a 256-bit bus before these cut down,
The GeForce 6700 was only 128-bit. Hell, the xx80 series were 256-bit for YEARS. And bits are not everything. Remember when the 512-bit 290X was getting stomped by the 256-bit 980?
I'm not sure why you're crediting Nvidia for memory technology getting faster either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason.
And it is plentiful. So plentiful that you don't need a 256-bit-wide bus to feed an xx70-tier card anymore.
so please don't only selectively cite historical precedent.
"don't use evidence I wont like because I don't have an argument for it"

How about some current precedent? The 6900 XT with a 256-bit bus is able to keep up in raster with the 320-bit 3080 and 384-bit 3090, depending on whether AMD bothered to release optimized drivers. The 7900 XT, a 320-bit card, averages out to the same speed as the 4070 Ti, a 192-bit card, and loses to the 4080, a 256-bit card.


Bits != speed.
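If you want the raw numbers behind that, here's a quick sketch. Bus widths and data rates are quoted from memory, so treat them as approximate, and remember raster results also lean heavily on big on-die caches (Infinity Cache, Ada's large L2), not just this figure:

```python
# Peak bandwidth (GB/s) = data rate (Gbps) * bus width (bits) / 8
cards = [
    ("RTX 3080",    320, 19.0),
    ("RTX 3090",    384, 19.5),
    ("RX 6900 XT",  256, 16.0),
    ("RX 7900 XT",  320, 20.0),
    ("RTX 4070 Ti", 192, 21.0),
    ("RTX 4080",    256, 22.4),
]
for name, bus_bits, gbps in cards:
    print(f"{name:12s} {bus_bits:3d}-bit @ {gbps} Gbps -> {bus_bits * gbps / 8:.0f} GB/s")
```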
 
I'm not sure why you're crediting Nvidia for memory technology getting faster either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason.

Nah, Jensen said that this era is over, that Moore's Law is dead. And months later, Moore actually died. Coincidence?
 
A 192-bit memory bus isn't 70 series level, come on. And Nvidia itself set 12GB as the standard for a 60 series card with the 3060 via panicked undoing of their own cost-cutting, so that's their problem. I'm not sure why you're crediting Nvidia for memory technology getting faster either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason. Instead, Nvidia just make up for faster VRAM being available by crippling the memory bus. There had never been a 70 series card in history with less than a 256-bit bus before these cut down, comically overpriced pieces of trash arrived on the market, so please don't only selectively cite historical precedent.
I don't think you understand what's going on here. Wide memory buses mean more complicated PCBs (because of more traces), thus higher costs*. What we are looking at is a game where, as the VRAM chips offer increasingly more bandwidth, GPU makers try to gauge how much of that is really needed and use as narrow a bus as they can without starving the GPU.

*No, the irony of talking cost saving when video cards cost as much as they do today is not lost on me.
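To put rough numbers on that "as narrow as they can without starving the GPU" game, here's an illustrative sketch. The bandwidth target and chip speeds are my own assumptions, not anything NVIDIA has published: for a given target, faster chips let you hit it with fewer 32-bit channels, i.e. fewer memory chips and fewer traces to route.

```python
import math

def min_bus_width(target_gb_s: float, data_rate_gbps: float, channel_bits: int = 32) -> int:
    # Smallest bus width (in whole 32-bit chip channels) that meets the bandwidth target
    bits_needed = target_gb_s * 8 / data_rate_gbps
    return math.ceil(bits_needed / channel_bits) * channel_bits

# ~500 GB/s target: 14 Gbps GDDR6 needs a 288-bit bus, while 21 Gbps GDDR6X gets there at 192-bit
print(min_bus_width(500, 14))  # 288
print(min_bus_width(500, 21))  # 192
```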
 
And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.

They wish, a massive chunk of their revenue still comes from consumer products.
 
They wish, a massive chunk of their revenue still comes from consumer products.
And all of those datacenter and pro level products come with the far more expensive to maintain pro tier drivers that have to be guaranteed to work and thoroughly tested before release.
 
12GB, >500 GB/s is not 60-series level, come on.
Shader count isn't either, and 300 for an x70 isn't realistic to begin with.

Let's refresh our memories a bit - a crippled 3.5GB 970 was already MSRP 329,-
Nine years ago.


But we all know this x70 won't release for 400-450, it'll do 550+ at least.

I'll be semi shocked if it's less than 700 USD at this point. Don't hate on the 970 though; in SLI it was pretty beastly in 2014, handling most games at 4K just fine for less than $700. This new card will likely cost more than two of them did and be a joke of a 2023 card.


I agree with you though, 300-400 USD is not happening ever again on an XX70 card; people will be lucky if the 4060 is remotely close to 300 USD.
 
I got pushed over the edge; my next purchase is price-based, and I don't care if it's slower than what I've got. So it's gonna be until my card dies or the card becomes a compatibility issue.
 
If "massive chunk" means less than half, then yes: https://s201.q4cdn.com/141608511/files/doc_financials/2023/Q123/Rev_by_Mkt_Qtrly_Trend_Q123.pdf
Nvidia's data center revenue % has been growing for years.
49% of your revenue does, in fact, qualify as a "massive chunk". Datacenter % is nearly the same in Q4 23 as in Q2 21, according to your source, so I'm not sure where this "its % has been growing for years" is coming from.

I got pushed over the edge; my next purchase is price-based, and I don't care if it's slower than what I've got. So it's gonna be until my card dies or the card becomes a compatibility issue.
Which is how it should be. The more recent gens were driven by resolution anyway. If you are still gaming at 1080p today you don't need these wundercards to get by; a 1660 Super or 5600 XT is still plenty, and the difference between high and ultra settings is just a lower FPS number 99% of the time.

We got spoiled by years of cheap cards thanks to the 2008 recession and slow recovery.
 
What about those two 3060 Ti SUPERs at the end of the list? :kookoo:
 
If we're being realistic then we can look at NVIDIA's gross margins and see that, outside of the 2021-2022 fluke, their margins today are only 7% higher than they were a decade ago. It's not just them, the whole industry has gotten far more expensive.

But people hate thinking that the prices today are caused by anything but greed.


If the market will pay it there's no reason not to, leaving money on the table would be silly.

Their margins topped 64% during the pandemic. That their margins during the current dip, after historic demand and a recession, are still 7% higher than normal goes to show how grossly overpriced their products are.

Poor Nvidia, only making 7% above their average during an economic expansion while everyone else is struggling to put food on the table. Woe is them.
 
Well, it makes sense. It has about 20% more performance than the RTX 3080 and 20% more VRAM to go with it.

I still don't think that it's nearly enough for that potent of a video card though.
 
The 4070 won't really be any faster than a 3080, maybe the 3080 12 GB variant, which didn't have an MSRP, but its last known price was reduced to $799.
 
Nvidia getting away with murder, what the F*** is the competition even DOING?

Milking too
 
Nvidia getting away with murder, what the F*** is the competition even DOING?

Milking too

Pricing their cards as high as the market will allow, and in the case of the 7900 XT, at least 100 USD too expensive lol.

I was pretty underwhelmed with the 7000 series, to the point that I'm not surprised at 4000 series pricing. I feel like AMD took at least a step back vs the 6000 series, which in general competed better with Nvidia's 90-tier card. Not that the performance is bad, it's actually pretty good, but the 4080 is one of the most underwhelming Nvidia cards from a price perspective, literally a 71% price increase vs its predecessor. Which is kinda sad, because even at the ridiculous 1200 USD MSRP Nvidia left AMD with a huge window to obliterate the 4080/4070 Ti, and at best they are matching them.


I really hope the 7000 series is much more impressive at the XX70 tier and lower, where RT matters a lot less.
 