
NVIDIA GeForce RTX 4080 Comes in 12GB and 16GB Variants

btarunr

Editor & Senior Moderator
Staff member
NVIDIA's upcoming GeForce RTX 4080 "Ada," successor to the RTX 3080 "Ampere," reportedly comes in two distinct variants based on memory size, memory bus width, and possibly even core configuration. MEGAsizeGPU reports having seen two reference designs for the RTX 4080: one with 12 GB of memory and a 10-layer PCB, and the other with 16 GB of memory and a 12-layer PCB. Increasing the number of PCB layers enables denser wiring around the ASIC. At debut, NVIDIA's flagship product is expected to be the RTX 4090, with its 24 GB memory size and 14-layer PCB. Apparently, the 12 GB and 16 GB variants of the RTX 4080 feature vastly different PCB designs.

We've known from past attempts at memory-based variants, such as the GTX 1060 (3 GB vs. 6 GB), or the more recent RTX 3080 (10 GB vs. 12 GB), that NVIDIA turns to other levers to differentiate variants, such as core-configuration (numbers of available CUDA cores), and the same is highly likely with the RTX 4080. The RTX 4080 12 GB, RTX 4080 16 GB, and the RTX 4090, could be NVIDIA's answers to AMD's RDNA3-based successors of the RX 6800, RX 6800 XT, and RX 6950 XT, respectively.



View at TechPowerUp Main Site | Source
 
Hopefully Nvidia will name the 16 GB variant the 4080 Ti to avoid confusion. The GTX 1060 was an unnecessary mess for gamers who weren't informed: the 3 GB variant had half the VRAM and 10% fewer cores, and developers listed the 1060 in requirements without distinguishing which one. Nvidia could have easily named the 1060 6GB a 1060 Ti.
 
Nvidia has always limited the future-proof value of its "cheaper" strong cards by limiting their memory capacity or memory bus. I think they've been doing this for the last 20 years.
 
Give me a 4080 16 GB for €500 and we have a deal, ngreedia.
 
Give me a 4080 16 GB for €500 and we have a deal, ngreedia.

Realistically you will need to double that at least.
 
I'll be very interested in the 16 GB card's performance. I wonder how closely the two will perform...
Firstly, the price you want is entirely unrealistic for launch, so adjust that expectation now. I've also yet to find any big tech company not interested in profits, but hey, you do you.
 
Realistically you will need to double that at least.

Well, then the product is not for me. :D I'll look at something else instead.


Firstly, the price you want is entirely unrealistic for launch, so adjust that expectation now. I've also yet to find any big tech company not interested in profits, but hey, you do you.

That's right, I'll choose what I want, not what companies tell me, lol.
 
It's either getting the ones with lots of VRAM or not buying them at all. I say we choose the latter, just to hurt NoVideo's bottom line.
 
I'll be very interested in the 16 GB card's performance. I wonder how closely the two will perform...

Depends on how many more cores the 16 GB variant has over the 12 GB variant. I look forward to reviews when they are available, but I don't expect them anytime soon. The 4090 is slated to be released first.
 
As a general rule, I have always tried to keep my GPU memory at half my system memory. My system memory is 32 GB, so I would have liked a 16 GB GPU to go with it; this has always served me well for future-proofing.

As always, thank you for ruining it with your stupidity, nVidia. AIBs could, at one point in time, actually offer models with more VRAM. I don't always play the latest AAA games, but I do mod old games to the brim and I need all of it.
 
So RTX 4080 Super 16GB and RTX 4080 Ti 12 GB?
 
It's sad that such an expensive card will have as much VRAM as the 3060 and AMD's RDNA2 mid-range cards, both of which are a generation older. Heck, it has a mere 1 GB more than the three-generation-old 1080 Ti.

Firstly, the price you want is entirely unrealistic for launch, so adjust that expectation now. I've also yet to find any big tech company not interested in profits, but hey, you do you.

I think it's Nvidia that needs to adjust its expectations if it thinks it'll waltz in with overpriced cards, no increase in VRAM capacity, and an increase in power consumption. The market is being flooded with cards and the world is entering a recession.

I may very well just sit on my 1080 Ti another generation. Still performs excellently in every game and doesn't chug power.

I see, $1200 and $1600.

Those prices would not work with a competitive AMD and in the current market.
 
That's right, I'll choose what I want, not what companies tell me, lol.
Naturally; I wasn't saying that at all, more lolling to myself at the name-calling.

I'm not loyal to either 'Nvgreedia' or 'AMgreeD' lol, both of them want my money pretty badly so let the best product (for me) win.
 
There will only be a few games that take advantage of and benefit from the 16 GB of VRAM, but there will be a benefit!
Running a 4K120 panel, I've yet to find a case where my 10 GB 3080 has run out, but I know future games will push past that. Gaming being my only major hobby these days, and chasing a top-tier experience, I'm happy to move the 3080 on and get a 4080/7800 XT (or thereabouts). I fear that beyond that, the price-to-performance of the halo products could just be silly, carrying unnecessarily large VRAM buffers for my needs.
 
It's either getting the ones with lots of VRAM or not buying them at all. I say we choose the latter, just to hurt NoVideo's bottom line.
NoVideo is officially an Intel meme now, because with them it's literal. Can't use that one for green anymore, sorry.

As a general rule, I have always tried to keep my GPU memory at half my system memory. My system memory is 32 GB, so I would have liked a 16 GB GPU to go with it; this has always served me well for future-proofing.

As always, thank you for ruining it with your stupidity, nVidia. AIBs could, at one point in time, actually offer models with more VRAM. I don't always play the latest AAA games, but I do mod old games to the brim and I need all of it.
Euhhh, that makes absolutely no sense at all. 32 GB of RAM and a 16 GB GPU for gaming is also complete and utter nonsense right now. You can't fill them even if you try.

A 12 GB GPU, sure; 16 GB of RAM, sure. That's where it's at right now, and for this coming console generation at best. 32 GB is only needed if you do more than gaming.
 
There will only be a few games that take advantage of and benefit from the 16 GB of VRAM, but there will be a benefit!

It goes without saying that games cannot increase VRAM utilization until VRAM size increases on graphics cards. The fact that $700+ next gen graphics cards might not have enough VRAM to cover some existing games says enough. At that price point you expect the video card to have more than enough for current games and room for future titles.

There is typically an ebb and flow to arguments about VRAM sizes. That said, we are at a point where Nvidia has kept the $700 price point at or below 12 GB (it went from 11 GB to 8 GB to 11 GB and now to 12 GB) for four GPU generations. That is unprecedented stagnation in VRAM size at this price point.
 
I think it's Nvidia that needs to adjust its expectations if it thinks it'll waltz in with overpriced cards, no increase in VRAM capacity, and an increase in power consumption. The market is being flooded with cards and the world is entering a recession.
Be that as it may, I think 500 USD/500 EUR is thoroughly unrealistic for a 16 GB 4080 at launch; that's all I'm saying here.
 
Realistically you will need to double that at least.
Yeah, I do not believe we will see the 4080 for less than €1,000 in Europe. Most likely quite a bit more, considering GPU prices are far from having dropped everywhere.
 
Be that as it may, I think 500 USD/500 EUR is thoroughly unrealistic for a 16 GB 4080 at launch; that's all I'm saying here.
I agree, though 700 would be the top-end acceptable price for such a slot IMHO, with AIB models ranging toward 800~850.
 
This seems like a mistake... 12 GB does not seem sufficient for the future with RT.
 