
NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

btarunr

Editor & Senior Moderator
When we first got news that NVIDIA's upcoming GeForce RTX 4080 "Ada" would come in 12 GB and 16 GB variants, we knew there was more setting the two apart than just memory size and memory bus width. Turns out there's a lot more. According to detailed specifications leaked to the web, the 16 GB variant of the RTX 4080 is based on the AD103, the second-largest chip after the AD102, while the 12 GB RTX 4080 is based on the smaller AD104 chip, which has a physically narrower memory bus.

It looks like NVIDIA is debuting the RTX 40-series with at least three models: the RTX 4090 24 GB, RTX 4080 16 GB, and RTX 4080 12 GB. The RTX 4090 is the top-dog part, with the ASIC code "AD102-300-xx." It's endowed with 16,384 CUDA cores, a boost frequency of up to 2.52 GHz, 24 GB of 21 Gbps GDDR6X memory, and a typical graphics power (TGP) of 450 W, which is "configurable" up to 600 W. The RTX 4080 16 GB, based on the "AD103-300-xx," comes with 9,728 CUDA cores, a boost frequency of 2.50 GHz, and 16 GB of 23 Gbps GDDR6X memory on a narrower memory bus than the RTX 4090's. This card reportedly has a 340 W TGP, configurable up to 516 W.



The GeForce RTX 4080 12 GB is positioned a notch below its 16 GB namesake, but is based on the smaller AD104 chip, with 7,680 CUDA cores running at speeds of up to 2.61 GHz, 12 GB of 21 Gbps GDDR6X memory, and a TGP of 285 W that's configurable up to 366 W. It's interesting that the leak includes not just the TGP, but also the maximum configurable TGP. Board partners will use the latter as the power limit for their overclocked designs. Even the NVIDIA Founders Edition board is technically a "custom design," and so it could feature a higher-than-stock TGP.
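
To put the leaked numbers in perspective, here is a quick back-of-envelope comparison of paper FP32 throughput (CUDA cores × 2 FLOPs per clock × boost clock) and memory bandwidth. Note that the memory bus widths used below (384-bit, 256-bit, and 192-bit) are our assumptions based on the memory configurations, not part of the leak, and paper specs never translate directly into gaming performance:

```python
# Back-of-envelope math from the leaked specs. The bus widths (384/256/192-bit)
# are assumptions inferred from the memory capacities, not part of the leak,
# and real-world performance depends on far more than cores and clocks.
leaked_specs = {
    # name:            (CUDA cores, boost GHz, memory Gbps, assumed bus bits)
    "RTX 4090 24 GB": (16384, 2.52, 21, 384),
    "RTX 4080 16 GB": (9728, 2.50, 23, 256),
    "RTX 4080 12 GB": (7680, 2.61, 21, 192),
}

for name, (cores, boost_ghz, gbps, bus_bits) in leaked_specs.items():
    tflops = cores * 2 * boost_ghz / 1000   # FP32: 2 FLOPs per core per clock
    bandwidth = bus_bits / 8 * gbps         # GB/s
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32, ~{bandwidth:.0f} GB/s")
```

On this napkin math, the 12 GB card lands roughly 18% behind its 16 GB namesake on compute and over 30% behind on memory bandwidth, a wider gap than the shared name suggests.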

View at TechPowerUp Main Site | Source
 
Just looking at the specs, the lower-tier 4080 looks like it should be a 4070 Ti or an OEM-only variant....

Never been a huge fan of two different SKUs carrying the same name, though.
 
The GeForce RTX 4090 12 GB is positioned a notch below its 16 GB namesake, but is based on the smaller AD104 chip, with 7,680 CUDA cores
I guess you mean the RTX 4080 12GB....
 
These are tricks to keep NVIDIA's outrageous 67 percent gross margin going (or at least an attempt to keep the good times going for them, not for us).

The RTX 3080 used the top 102 chip; if the 4080 12GB uses AD104 instead of 103 or 102, you are actually paying the same money for 2 tiers lower performance in the stack compared to last gen.
 
you are actually paying the same money for 2 tiers lower performance in the stack compared to last gen.
If the performance is there, I have no issues with it, even consuming less power....
 
If the performance is there, I have no issues with it, even consuming less power....

I agree, if the 4080 is 30-40% faster at the same-ish price as the 3080, it'll be fine.

The 600-, 700-, 1000-, and 2000-series non-Ti 80-class cards all used the 104 die. Ampere was the first time since the GTX 580 that we got above that for an 80-class non-Ti card.
 
"Buy a 4080 today! Starting at $699.99."

Back in the day, these "4080"s would have been a 4070. Now they want to charge more without saying they've once again upped the price on the tiers, so they're disguising it by calling it a 4080. Far from "who cares if it performs," I'd care, because they're charging you more for a performance tier that prior generations would have sold you for less under a lower-tier name. This way, they get to reduce per-tier relative performance, because if THIS is considered a 4080, then imagine the 4070 below it. Or the 4060 below that.
 
Sure, you can say people buying a GPU will be tech-savvy enough, but I still feel this is done just to confuse and borderline scam people.
They are already doing too many SKUs, but to then also have different versions of cards with the same name...

And yes, I know both AMD and Nvidia have done this in the past (which I hated then as well); this needs to stop, honestly.

Reviewers now have a lot more work telling people how each version of the same GPU performs....
 
Which one utilizes PCI Express x8?
 
Which one utilizes PCI Express x8?
AD106, AFAIK, not these.

Don't like this personally, and there is likely to be a performance disparity between the 12 GB and 16 GB parts; how could there not be?!
 
Why not just call it a 4070 Ti and avoid confusion, unless that's the aim.
Always the aim; why not make 30 cards with very similar names and then make people get an inferior card?
 
Lol, regarding model rumors for the full AD104, we went from RTX 4070 12GB (best case) to RTX 4080 12GB (worst case).
So from the 3070 successor ($499) we went to the 3080 successor ($699), skipping the 3070 Ti successor ($599) price level entirely.
So I don't know what pricing level this rumor suggests; maybe something like the scenario below:

$999-$799: 16GB, cut-down AD103 based
$799-$649: 12GB, full AD104 based
$649-$499: 10GB, cut-down AD104 based

If true, pricing seems to be getting worse by the day!
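
Just to put rough numbers on that jump (and treating the brackets above as pure speculation, as noted): take the midpoint of each rumored range and compare it with the last-gen launch price of the card it would notionally replace, using the $499 and $699 figures mentioned above:

```python
# Pure speculation on top of speculation: midpoint of each rumored price
# bracket vs. the last-gen launch price it would notionally replace
# (the $499 RTX 3070 and $699 RTX 3080 figures quoted above).
rumored = {
    # tier:                 ((low, high), last-gen reference price)
    "full AD104, 12GB": ((649, 799), 499),
    "cut-down AD103, 16GB": ((799, 999), 699),
}

for tier, ((low, high), last_gen) in rumored.items():
    mid = (low + high) / 2
    change = (mid / last_gen - 1) * 100
    print(f"{tier}: midpoint ${mid:.0f}, {change:+.0f}% vs. last gen")
```

Even at the bottom of those brackets, both parts land above the cards they'd nominally succeed.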
 
Lol, regarding model rumors for the full AD104, we went from RTX 4070 12GB (best case) to RTX 4080 12GB (worst case).
[...]
If true, pricing seems to be getting worse by the day!
Oof, do the Black Friday deals change the prices much at all?
 
Oof, do the Black Friday deals change the prices much at all?
Don't take the rumors too seriously (and even more so the prices I quoted, since I don't have any info and the reply was spontaneous, without much thought).
In less than one week, we will probably have the real deal from Nvidia themselves at GTC.
 
Don't take the rumors too seriously (and even more so the prices I quoted, since I don't have any info and the reply was spontaneous, without much thought).
In less than one week, we will probably have the real deal from Nvidia themselves at GTC.
Yeah, the rumor mill is going strong. Personally, I'm building my first setup on Black Friday to get the best deals I can. When are we going to get 6-slot coolers, or better yet...
 
Another generation of GPUs I won't be buying because it costs more than my mortgage.

I don't know why people buy them at this price. £250 maximum is a sensible amount to spend on a GPU in my opinion.
 
Another generation of GPUs I won't be buying because it costs more than my mortgage.

I don't know why people buy them at this price. £250 maximum is a sensible amount to spend on a GPU in my opinion.
If my build hadn't been delayed like 2 years, that would be the case for me. But I have to buy basically another setup, excluding mic, mouse, and keyboard. So I might as well go insane and get a great setup with upgrade options. (Due to AM5, ATX 3.0, DDR5, and PCIe 5.0.)
 
I have to agree about the 12GB: it should be labeled a 4070/Ti; calling it a 4080 is more like a bait-and-switch cash grab. :shadedshu:
 
I think we need to stop seeing the 4080 as a direct replacement upgrade for the 3080. It's not. It won't be. It's marketing. Whatever Nvidia chooses to name a GPU, it should be reviewed on price-for-performance alone.
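
One simple way to do that, whatever the name on the box: divide a review's average frame rate by the street price. A minimal sketch with placeholder figures (not real benchmark data):

```python
# Name-agnostic price-for-performance comparison. The FPS and price values
# below are placeholders for illustration, not real benchmark results.
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    return avg_fps / price_usd

cards = {
    "Card A": (120.0, 699.0),  # (hypothetical average FPS, hypothetical price in USD)
    "Card B": (150.0, 899.0),
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} FPS per dollar")
```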
 
Truly some ugly mismarketing. I feel bad for those who don't know that much about computers and get the cheaper one just because "12GB is fine for me," when the card is actually way slower than the similarly named one.
 