Monday, March 27th 2023

12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

It looks like 12 GB will be the standard memory size for the NVIDIA GeForce RTX 4070 graphics card the company plans to launch in mid-April 2023. The card very likely has 12 GB of memory across the 192-bit memory bus of the "AD104" silicon the SKU is based on. The RTX 4070 is already heavily cut down from the RTX 4070 Ti that maxes out the "AD104," with the upcoming SKU featuring just 5,888 CUDA cores compared to the 7,680 of the RTX 4070 Ti. The memory sub-system, however, could see NVIDIA use the same 21 Gbps-rated GDDR6X memory chips, which, across the 192-bit memory interface, produce 504 GB/s of memory bandwidth. Confirmation of the memory size came from regulatory filings for several upcoming custom-design RTX 4070 board models by MSI and GIGABYTE with the Eurasian Economic Commission (EEC) and the Korean NRRA.
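The arithmetic behind that 504 GB/s figure is easy to verify; here is a minimal sketch using the 21 Gbps per-pin rate and 192-bit width reported above:

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte
data_rate_gbps = 21    # 21 Gbps-rated GDDR6X, per the filings-based reporting
bus_width_bits = 192   # AD104's 192-bit memory interface on the RTX 4070

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 504 GB/s, matching the figure above
```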
Sources: harukaze5719 (Twitter), VideoCardz

62 Comments on 12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

#1
Vayra86
400~450 bucks and not a cent more is where this needs to be placed. 450 being the super duper AIB variant.

See, saving money is easy, it's all about expectations.
#2
Chaitanya
Vayra86: 400~450 bucks and not a cent more is where this needs to be placed. 450 being the super duper AIB variant. See, saving money is easy, it's all about expectations.
At best $300; 400-450 is a rip-off for what is a 60-series GPU.
#3
Vayra86
Chaitanya: At best $300; 400-450 is a rip-off for what is a 60-series GPU.
12 GB and >500 GB/s is not 60-series level, come on.
Shader count isn't either, and 300 for an x70 isn't realistic to begin with.

Let's refresh our memories a bit: the crippled 3.5 GB GTX 970 already had an MSRP of $329, nine years ago.

But we all know this x70 won't release for 400-450, it'll do 550+ at least.
#4
Verpal
Chaitanya: At best $300; 400-450 is a rip-off for what is a 60-series GPU.
Let's be realistic here: the 1060 was $300, and that was 7 years ago; thanks to the Fed's money printer going brrrrr, we have very substantial inflation.
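For rough context on that inflation argument, a back-of-the-envelope sketch; the ~25% cumulative US CPI figure for 2016-2023 is a ballpark assumption, not something from the thread:

```python
# Ballpark: a $300 GTX 1060 (July 2016) restated in early-2023 dollars,
# assuming roughly 25% cumulative US CPI inflation over that span (approximation).
launch_price_2016 = 300.0
cumulative_cpi = 0.25  # assumed ballpark, not an official figure

price_in_2023_dollars = launch_price_2016 * (1 + cumulative_cpi)
print(f"${price_in_2023_dollars:.0f}")  # -> ~$375 in 2023 dollars
```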
#5
DeeJay1001
Verpal: Let's be realistic here: the 1060 was $300, and that was 7 years ago; thanks to the Fed's money printer going brrrrr, we have very substantial inflation.
This card should be priced in the $400-$500 range, but we all know it's going to be $649+.
#6
Bwaze
I don't think Nvidia listens to customer feedback in random forums.

And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.
#7
TheinsanegamerN
Verpal: Let's be realistic here: the 1060 was $300, and that was 7 years ago; thanks to the Fed's money printer going brrrrr, we have very substantial inflation.
If we're being realistic, then we can look at Nvidia's gross margins and see that, outside of the 2021-2022 fluke, their margins today are only 7% higher than they were a decade ago. It's not just them; the whole industry has gotten far more expensive.

But people hate thinking that the prices today are caused by anything but greed.
DeeJay1001: This card should be priced in the $400-$500 range, but we all know it's going to be $649+.
If the market will pay it, there's no reason not to; leaving money on the table would be silly.
#8
bug
Vayra86: 400~450 bucks and not a cent more is where this needs to be placed. 450 being the super duper AIB variant. See, saving money is easy, it's all about expectations.
I could go as high as $500. Anything above that, idgaf.
Then again, $500 for a custom model means you got the MSRP right.
#9
Aretak
Vayra86: 12 GB and >500 GB/s is not 60-series level, come on.
A 192-bit memory bus isn't 70-series level, come on. And Nvidia itself set 12GB as the standard for a 60-series card with the 3060, via a panicked undoing of its own cost-cutting, so that's their problem. I'm not sure why you're crediting Nvidia for memory technology getting faster, either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason. Instead, Nvidia just makes up for faster VRAM being available by crippling the memory bus. There had never been a 70-series card in history with less than a 256-bit bus before these cut-down, comically overpriced pieces of trash arrived on the market, so please don't selectively cite historical precedent.
#10
TheinsanegamerN
Aretak: A 192-bit memory bus isn't 70-series level, come on. And Nvidia itself set 12GB as the standard for a 60-series card with the 3060, via a panicked undoing of its own cost-cutting, so that's their problem. Instead, Nvidia just makes up for faster VRAM being available by crippling the memory bus. There had never been a 70-series card in history with less than a 256-bit bus before these cut-down,
The GeForce 6700 was only 128-bit. Hell, xx8x-series cards were 256-bit for YEARS. And bits are not everything. Remember when the 512-bit 290X was getting stomped by the 256-bit 980?
Aretak: I'm not sure why you're crediting Nvidia for memory technology getting faster, either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason.
And it is plentiful. So plentiful that you don't need a 256-bit-wide bus to feed an xx7x-tier card anymore.
Aretak: so please don't selectively cite historical precedent.
"don't use evidence I wont like because I don't have an argument for it"

How about some current precedent? The 6900 XT, with a 256-bit bus, is able to keep up in raster with the 320-bit 3080 and 384-bit 3090, depending on whether AMD bothered to release optimized drivers. The 7900 XT, a 320-bit card, averages out to the same speed as the 4070 Ti, a 192-bit card, and loses to the 4080, a 256-bit card.

www.techspot.com/review/2642-radeon-7900-xt-vs-geforce-rtx-4070-ti/#1440p

Bits != speed.
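To make that "bits != speed" point concrete, a quick sketch computing raw bandwidth for the cards named above (the per-pin data rates are the commonly cited launch specs, supplied here as assumptions rather than taken from the thread):

```python
# Raw bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbps) / 8
cards = {
    "R9 290X (512-bit @ 5 Gbps)":      (512, 5.0),
    "GTX 980 (256-bit @ 7 Gbps)":      (256, 7.0),
    "RTX 4070 Ti (192-bit @ 21 Gbps)": (192, 21.0),
    "RX 7900 XT (320-bit @ 20 Gbps)":  (320, 20.0),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits * gbps / 8:.0f} GB/s")
# -> 320, 224, 504, 800 GB/s. The 980 beat the 290X with far less raw bandwidth,
#    and the 4070 Ti keeps pace with the 7900 XT with far less, so neither bus
#    width nor raw bandwidth alone predicts performance.
```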
#11
Bwaze
Aretak: I'm not sure why you're crediting Nvidia for memory technology getting faster, either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason.
Nah, Jensen said that this era is over, that Moore's Law is dead. And months later, Moore actually died. Coincidence?
#12
bug
Aretak: A 192-bit memory bus isn't 70-series level, come on. And Nvidia itself set 12GB as the standard for a 60-series card with the 3060, via a panicked undoing of its own cost-cutting, so that's their problem. I'm not sure why you're crediting Nvidia for memory technology getting faster, either. It's supposed to do that. Technology is supposed to advance. Memory bandwidth should be ridiculously plentiful now for that very reason. Instead, Nvidia just makes up for faster VRAM being available by crippling the memory bus. There had never been a 70-series card in history with less than a 256-bit bus before these cut-down, comically overpriced pieces of trash arrived on the market, so please don't selectively cite historical precedent.
I don't think you understand what's going on here. Wide memory buses mean more complicated PCBs (because of more traces), thus higher costs*. What we are looking at is a game where, as the VRAM chips offer increasingly more bandwidth, GPU makers try to gauge how much of that is really needed and use as narrow a bus as they can without starving the GPU (a rough sketch of that trade-off follows below).

*No, the irony of talking cost saving when video cards cost as much as they do today is not lost on me.
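A minimal sketch of that trade-off, using hypothetical configurations (not actual board figures): the same bandwidth target can be hit with a narrower bus and faster chips, which means fewer PCB traces to route.

```python
# Two hypothetical ways to reach roughly 500 GB/s:
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

wide_slow   = bandwidth_gb_s(256, 16.0)  # wide bus, slower chips  -> 512 GB/s
narrow_fast = bandwidth_gb_s(192, 21.0)  # narrow bus, faster chips -> 504 GB/s
print(wide_slow, narrow_fast)  # comparable bandwidth; the 192-bit board routes
                               # 64 fewer data lines, i.e. a simpler, cheaper PCB
```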
#13
Vya Domus
Bwaze: And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.
They wish; a massive chunk of their revenue still comes from consumer products.
#14
TheinsanegamerN
Vya Domus: They wish; a massive chunk of their revenue still comes from consumer products.
And all of those datacenter and pro-level products come with the far more expensive-to-maintain pro-tier drivers that have to be guaranteed to work and thoroughly tested before release.
#16
oxrufiioxo
Vayra86: 12 GB and >500 GB/s is not 60-series level, come on. Shader count isn't either, and 300 for an x70 isn't realistic to begin with. Let's refresh our memories a bit: the crippled 3.5 GB GTX 970 already had an MSRP of $329, nine years ago. But we all know this x70 won't release for 400-450, it'll do 550+ at least.
I'll be semi-shocked if it's less than 700 USD at this point. Don't hate on the 970; in SLI it was pretty beastly in 2014, handling most games at 4K just fine for less than $700 total. This new card will likely cost more than two of them and be a joke of a 2023 card.


I agree with you though: 300-400 USD is not happening ever again on an xx70 card. People will be lucky if the 4060 is remotely close to 300 USD.
#17
xorbe
I got pushed over the edge; my next purchase is price-based, and I don't care if it's slower than what I've got. So I'm waiting until my card dies or becomes a compatibility issue.
#18
TheinsanegamerN
bugIf "massive chunk" means less than half, then yes: s201.q4cdn.com/141608511/files/doc_financials/2023/Q123/Rev_by_Mkt_Qtrly_Trend_Q123.pdf
Nvidia's data center revenue % has been growing for years.
49% of your revenue does, in fact, qualify as a "massive chunk". Datacenter % is nearly the same in Q4 '23 as Q2 '21, according to your source, so I'm not sure where this "its % has been growing for years" is coming from.
xorbe: I got pushed over the edge; my next purchase is price-based, and I don't care if it's slower than what I've got. So I'm waiting until my card dies or becomes a compatibility issue.
Which is how it should be. The more recent gens were driven by resolution anyway. If you are still gaming at 1080p today you don't need these wundercards to get by; a 1660 Super or 5600 XT is still plenty, and the difference between high and ultra settings is just a lower FPS number 99% of the time.

We got spoiled by years of cheap cards thanks to the 2008 recession and slow recovery.
#19
Bjørgersson
What about those two 3060 Ti SUPERs at the end of the list? :kookoo:
#20
evernessince
TheinsanegamerN: If we're being realistic, then we can look at Nvidia's gross margins and see that, outside of the 2021-2022 fluke, their margins today are only 7% higher than they were a decade ago. It's not just them; the whole industry has gotten far more expensive. But people hate thinking that the prices today are caused by anything but greed. If the market will pay it, there's no reason not to; leaving money on the table would be silly.
Their margins topped 64% during the pandemic. That their margins during the current dip, after historic demand and a recession, are still 7% higher than normal goes to show how grossly overpriced their products are.

Poor Nvidia, only making 7% above their average during an economic expansion while everyone else is struggling to put food on the table. Woe is them.
#21
Avro Arrow
Well, it makes sense. It has about 20% more performance than the RTX 3080 and 20% more VRAM to go with it.

I still don't think that it's nearly enough for that potent of a video card though.
#22
N/A
The 4070 won't be any faster than a 3080, really; maybe the 3080 12 GB variant, which didn't have an MSRP, but whose last known price was reduced to $799.
#23
Fluffmeister
Nvidia getting away with murder, what the F*** is the competition even DOING?

Milking too
#24
oxrufiioxo
Fluffmeister: Nvidia getting away with murder, what the F*** is the competition even DOING? Milking too
Pricing their cards as high as the market will allow, and in the case of the 7900 XT, at least 100 USD too expensive lol.

I was pretty underwhelmed with the 7000 series, to the point I'm not surprised at 4000-series pricing. I feel like AMD took at least a step back vs the 6000 series, which in general competed better with Nvidia's 90-tier card. Not that the performance is bad; it's actually pretty good. But the 4080 is one of the most underwhelming Nvidia cards from a price perspective, literally a 71% price increase vs its predecessor at the ridiculous 1200 USD MSRP. Which is kinda sad, because Nvidia left AMD with a huge window to obliterate the 4080/4070 Ti, and at best they are matching them.


I really hope the 7000 series is much more impressive at the xx70 tier and lower, where RT matters a lot less.
#25
Minus Infinity
Chaitanya: At best $300; 400-450 is a rip-off for what is a 60-series GPU.
106 series die for 103 pricing, thank you Huang.