
NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

If the performance is there, I have no issues with it, especially if it consumes less power...

The problem is that it's clearly a tactic designed to take advantage of customer expectations. The name implies that it lands close to the 4080 16GB, but rumors suggest it's more like a 4070 Ti. If the performance is too far from the 4080, then the naming is misleading, plain and simple.

In addition, it sets Nvidia up for pulling more shenanigans in the future. PC gamers might have to worry about 4080s, and future generations of cards, with significant differences in performance despite the name suggesting they are reasonably the same. If people don't flag this now, there's nothing stopping Nvidia from bifurcating other SKUs and potentially increasing the performance gap between said SKUs.

I really hope PC gamers put their foot down because this kind of trend only hurts consumers.
 
The problem is that it's clearly a tactic designed to take advantage of customer expectations. The name implies that it lands close to the 4080 16GB, but rumors suggest it's more like a 4070 Ti. If the performance is too far from the 4080, then the naming is misleading, plain and simple.

In addition, it sets Nvidia up for pulling more shenanigans in the future. PC gamers might have to worry about 4080s with significant differences in performance despite the name suggesting they are reasonably the same. If people don't flag this now, there's nothing stopping Nvidia from bifurcating other SKUs and potentially increasing the performance gap between said SKUs.

I really hope PC gamers put their foot down because this kind of trend only hurts consumers.

Nvidia has been pulling shenanigans since the GTX 680.
 
Nvidia has been pulling shenanigans since the GTX 680.
In other news, refreshed 3060, 3060 Ti, and 3070 Ti cards using a cut-down 102 die are on the way too.

So yeah, they're not alone, but they do it best.
 
Another generation of GPUs I won't be buying because it costs more than my mortgage.

I don't know why people buy them at this price. £250 maximum is a sensible amount to spend on a GPU in my opinion.

Yep. I just wait and pick up used cards nowadays. Last couple cards have been second-hand market. I refuse to contribute to normalizing the pricing they are pushing.
 
No, because the 4080 will not only have two different VRAM sizes, but also different shader counts.
It happened to the 1060 and 2060 before and they were the actual opposite of a clusterfuck.
They'll just be different cards with the same name on the box. It won't matter the minute you install the card in your system and throw away the box.
 
It happened to the 1060 and 2060 before and they were the actual opposite of a clusterfuck.
They'll just be different cards with the same name on the box. It won't matter the minute you install the card in your system and throw away the box.
Yeah, the 1060 3GB was just ~11% slower, and most knew to avoid it. The 2060 was a special case; it was released during the Ampere generation to make up for the lack of stock. But neither had a 26% difference in CUDA core count.
Anyway, we'll see once they are out whether they are really GPUs of the same caliber deserving of sharing the same name. (Maybe it's time to bring back suffixes: "RTX 4080 GS" or "RTX 4080 MX".)
 
what a mess...
 
Yeah, the rumor mill is going strong. Personally, I'm building my first setup on Black Friday to get the best deals I can. When are we going to get 6-slot coolers, or better yet...?

At the start of the RTX 20-series announcement, Jensen said he was there to announce the GTX 1180.

It would be so cool if they followed up by showing a comically large 4080 and 4090 in honor of that April 1 video. I would have so much respect for them. :D


As for the article, it looks like a rebrand of the 4070 to the 4080 12 GB. If they sell that for $700, I will be skipping this generation. No point in upgrading for anything less than 30%. I usually prefer 50%. And my 2070 SUPER to 3080 upgrade gave me 80% in 4K.
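For what it's worth, here's my rule of thumb as a quick sketch. The FPS numbers below are invented for illustration; only the 80% figure matches my actual 2070 SUPER to 3080 experience at 4K:

```python
# Rule of thumb for whether an upgrade is worth it.
# FPS numbers are made up for illustration.
def uplift_percent(old_fps: float, new_fps: float) -> float:
    return (new_fps / old_fps - 1) * 100

MINIMUM = 30.0    # below this, not worth the hassle
PREFERRED = 50.0  # what I actually aim for

gain = uplift_percent(45.0, 81.0)  # e.g. 45 -> 81 fps is an 80% jump
print(f"{gain:.0f}% uplift, worth it: {gain >= PREFERRED}")  # 80% uplift, worth it: True
```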
 
As for the article, it looks like a rebrand of the 4070 to the 4080 12 GB. If they sell that for $700, I will be skipping this generation. No point in upgrading for anything less than 30%. I usually prefer 50%. And my 2070 SUPER to 3080 upgrade gave me 80% in 4K.
Exactly. What matters when upgrading is performance (like you, I feel that if there's not an extra 50% to be gained, it doesn't really let you use higher settings to the point where you can actually see the difference). If the performance is there, would you dismiss the card just because it said 4090 on the box? Or maybe 4050? Or even 1234567890?
 
Exactly. What matters when upgrading is performance (like you, I feel that if there's not an extra 50% to be gained, it doesn't really let you use higher settings to the point where you can actually see the difference). If the performance is there, would you dismiss the card just because it said 4090 on the box? Or maybe 4050? Or even 1234567890?
My upgrades are like 6-8 years apart, but I get 300% per upgrade. (560 SE to 1060 3GB.) This setup will get me at least 500% of my performance! (1060 3GB to 4080/4090.)
 
My upgrades are like 6-8 years apart, but I get 300% per upgrade. (560 SE to 1060 3GB.) This setup will get me at least 500% of my performance! (1060 3GB to 4080/4090.)
I'm not that stingy, but I always skip a generation or two. Now that I don't really have time for gaming anymore, I can probably slow down.
 
I'm not that stingy, but I always skip a generation or two. Now that I don't really have time for gaming anymore, I can probably slow down.
I just need a good setup at 4K@120 fps for a very long time. I don't play the newest games or run the biggest prosumer workloads, but I do want very large headroom.
 
So according to the VideoCardz site, the 12GB card uses 21 Gbps GDDR6X and the 16GB uses 23 Gbps GDDR6X.

With the bus width and shader differences, these are not going to perform the same at all; the 16GB is looking like a Ti version by comparison.
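If those memory speeds and the rumored bus widths (192-bit vs. 256-bit) hold, the bandwidth math alone is pretty stark. A quick sketch, using the rumored figures rather than confirmed specs:

```python
# Peak memory bandwidth implied by the rumored specs:
# bus width in bytes times per-pin data rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

bw_12 = bandwidth_gb_s(192, 21.0)  # 4080 12GB: 21 Gbps GDDR6X on a 192-bit bus (rumored)
bw_16 = bandwidth_gb_s(256, 23.0)  # 4080 16GB: 23 Gbps GDDR6X on a 256-bit bus (rumored)

print(f"4080 12GB: {bw_12:.0f} GB/s")  # 504 GB/s
print(f"4080 16GB: {bw_16:.0f} GB/s")  # 736 GB/s
print(f"16GB bandwidth advantage: {(bw_16 / bw_12 - 1) * 100:.0f}%")  # ~46%
```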
 
So according to the VideoCardz site, the 12GB card uses 21 Gbps GDDR6X and the 16GB uses 23 Gbps GDDR6X.

With the bus width and shader differences, these are not going to perform the same at all; the 16GB is looking like a Ti version by comparison.

The rumor is the 12GB version was going to be the 4070 or 4070 Ti, but Nvidia changed the name at the last minute so they could charge $700+ for it. I guess we will know more on the 20th.
 
So according to the VideoCardz site, the 12GB card uses 21 Gbps GDDR6X and the 16GB uses 23 Gbps GDDR6X.

With the bus width and shader differences, these are not going to perform the same at all; the 16GB is looking like a Ti version by comparison.

That is what I think should happen too. Call the 12GB a 4080 and the 16GB with more cores a 4080 Ti. That would end the confusion between the two GPUs and be the sensible way forward for Nvidia. If they wanted to release faster 4080s later on, they could just tack Super onto the names.
 
The rumor is the 12GB version was going to be the 4070 or 4070 Ti, but Nvidia changed the name at the last minute so they could charge $700+ for it. I guess we will know more on the 20th.
These last-minute changes aren't usually done on a whim; they're most often made because one player learns something about the other's lineup.
 
These last-minute changes aren't usually done on a whim; they're most often made because one player learns something about the other's lineup.

Personally, I couldn't care less what they call them, just what the performance is over what I currently own.
 