
NVIDIA RTX 4080 12GB and 16GB Based on Different Chips, Vastly Different Shader Counts

Sure, you can say people buying a GPU will be tech-savvy enough, but I still feel this is done just to confuse and borderline scam people.
If you buy solely based on the sticker on the box, nothing can save you from making wrong choices.
If, however, you do the smallest amount of due diligence and read a review before buying, then you know exactly what performance you're buying. Internal organization of the GPU is pretty much irrelevant to the average buyer (and even for some enthusiasts).

I'm not a fan of using the same moniker for essentially different GPUs, but that's not the end of the world.
 
but that's not the end of the world.
LIES! Nvidia wants you to see their flagship cards as the end-all, be-all.

End-all of your wallet. :laugh:
 
Truly some ugly mismarketing. I feel bad for those who don't know that much about computers and get the cheaper one just because "12GB is fine for me," when the card is actually way slower than the similarly named one.

I think the last Nvidia GPU I bought that I didn't feel fucked after buying was the 8800 GTX SLI config. The last GPU I bought that I felt good about, and was really happy with the price of, was a 6800 GT for a second rig. The first rig had dual 6800 Ultras and was really nice, but that 6800 GT only took one power connector, was single-slot, and spat out Doom 3 and HL2 at 1600 vertical just fine. The 9700 Pro was also a gem. Before that it goes back to Glide graphics, and good lord, what a mess.
 
I think the last Nvidia GPU I bought that I didn't feel fucked after buying was the 8800 GTX SLI config. The last GPU I bought that I felt good about, and was really happy with the price of, was a 6800 GT for a second rig. The first rig had dual 6800 Ultras and was really nice, but that 6800 GT only took one power connector, was single-slot, and spat out Doom 3 and HL2 at 1600 vertical just fine. The 9700 Pro was also a gem. Before that it goes back to Glide graphics, and good lord, what a mess.
Yeah, since the GT wasn't cut down in any way, just clocked slightly lower, and those cards practically always OC'd to Ultra clocks.
 
AD106 afaik, not these.

I don't like this personally, and there is likely to be a performance disparity between the 12GB and 16GB parts. How could there not be?!

It was meant as sarcasm, but continue on :D
 
50% price difference confirmed, lol.
 
The 4080 will be a cluster-fuck.
 
It was meant as sarcasm, but continue on :D
Continue on with what?!

My reply had no snark, and your original post had no :p either.
 
Because of the label on the box? Sure.

No, because the 4080 will not only have two different VRAM sizes, but also different shader counts.
 
No, because the 4080 will not only have two different VRAM sizes, but also different shader counts.
Not to mention memory bus width, and hence bandwidth.
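
For anyone wanting to put numbers on that: peak memory bandwidth is just (bus width / 8) × per-pin data rate. Quick Python sketch below, using the widely reported launch specs (so treat the exact bus widths and clocks as assumptions until reviews land):

```python
# Back-of-the-envelope GDDR6X bandwidth: (bus width in bits / 8) * per-pin data rate.
# The bus widths and data rates below are the widely reported launch specs,
# not independently verified figures.

def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbps(256, 22.4))  # 4080 16GB: 256-bit @ 22.4 Gbps -> ~716.8 GB/s
print(peak_bandwidth_gbps(192, 21.0))  # 4080 12GB: 192-bit @ 21.0 Gbps -> ~504.0 GB/s
```

That works out to roughly 40% more bandwidth for the 16GB card, before you even count the extra shaders.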
 
No, because the 4080 will not only have two different VRAM sizes, but also different shader counts.

I personally don't really care, as anyone with half a brain should be able to do their own research before buying a likely $700+ GPU... My worry is that Nvidia is doing this to hit the $699 price point with the 12GB version, while the 16GB variant (the real 4080) will be closer to $999.
 
I personally don't really care, as anyone with half a brain should be able to do their own research before buying a likely $700+ GPU... My worry is that Nvidia is doing this to hit the $699 price point with the 12GB version, while the 16GB variant (the real 4080) will be closer to $999.

IMO most gamers don't visit tech sites, so they won't even know the difference. It's like the two 1060s: even game developers didn't distinguish the two in their system requirements at first.
 
IMO most gamers don't visit tech sites, so they won't even know the difference. It's like the two 1060s: even game developers didn't distinguish the two in their system requirements at first.

So you're on a crusade to help gamers who don't educate themselves and are likely buying an Alienware or iBuyPower system? Cool.

Get the good word out there, I guess.

Personally, I think those gamers are better off on a PS5/Series X.

Edit: On a $200-400 product I'm with you, but on a $700+ one I don't feel bad for people who make bad decisions, especially on whole-system purchases that likely cost $1,500-2,000. I also told countless people not to buy the 1060 3GB; most didn't listen, insisting 3GB is plenty for 1080p...
 
I personally don't really care, as anyone with half a brain should be able to do their own research before buying a likely $700+ GPU... My worry is that Nvidia is doing this to hit the $699 price point with the 12GB version, while the 16GB variant (the real 4080) will be closer to $999.
And we all know how reliable their MSRPs are, eh?
 
And we all know how reliable their MSRPs are, eh?

For realz... Here's your 12GB 4080 for the low, low price of $699 (but really $800, sucka). I am hopeful that in the current market they won't be able to get away with this, though.
 
Personally, I think those gamers are better off on a PS5/Series X.

There are tens of millions of PC gamers out there. The word is already out here, but there's no mention of it on the PC Gamer site.
 
There are tens of millions of PC gamers out there. The word is already out here, but there's no mention of it on the PC Gamer site.

People who can't even type "what's the difference between X GPU vs X GPU" into Google have bigger problems than Nvidia's choice to segment its products. For those who don't even know what a GPU is, ignorance is bliss, I guess.
 
People who can't even type "what's the difference between X GPU vs X GPU" into Google have bigger problems than Nvidia's choice to segment its products. For those who don't even know what a GPU is, ignorance is bliss, I guess.

But there's no reason for the uninformed to question the difference between the 4080s, except of course for the price. That is what Nvidia is counting on, and game developers will just say "4080" in their 4K requirements.

I don't even know if the differences will even show up on sites like PC Gamer. Their tech editor is mostly clueless about tech. He once stated in a GTX 690 review that it would give better 4K performance because it had twice as much VRAM (never mind that a dual-GPU card mirrors its memory, so the effective frame buffer was still 2GB per GPU).
 
But there's no reason for the uninformed to question the difference between the 4080s, except of course for the price. That is what Nvidia is counting on, and game developers will just say "4080" in their 4K requirements.

I don't even know if the differences will even show up on sites like PC Gamer. Their tech editor is mostly clueless about tech. He once stated in a GTX 690 review that it would give better 4K performance because it had twice as much VRAM (never mind that a dual-GPU card mirrors its memory, so the effective frame buffer was still 2GB per GPU).

I actually liked the GTX 690; it was a pretty neat card. I preferred my SLI 680s, but only taking up one slot was pretty cool. A tear for the death of SLI, although I can't even imagine doing it with my 3080 Ti :laugh:

I only see the two 4080s as being an issue if they are priced the same, or at least very close in price. I doubt that will be the case; the specs are pretty different. I'm also not convinced this isn't an OEM-only variant, for systems where 300+ watt cards are a bad idea as it is.

This isn't the same situation as with the 1030, which came in both DDR4 and GDDR5 variants that cost about the same but performed very differently, preying on the market that would be most impacted by it.
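
Funnily enough, the same back-of-the-envelope math from earlier shows just how lopsided those two 1030s were (data rates below are the commonly cited specs for the two variants, so take them as assumptions):

```python
# Same formula as the earlier sketch: (bus width in bits / 8) * per-pin data rate.
# Data rates are the commonly cited specs for the two GT 1030 variants.

def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(peak_bandwidth_gbps(64, 6.0))  # GT 1030 GDDR5: ~48.0 GB/s
print(peak_bandwidth_gbps(64, 2.1))  # GT 1030 DDR4:  ~16.8 GB/s, about a third
```

Same name on the shelf, roughly a third of the memory bandwidth.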
 
I actually liked the GTX 690; it was a pretty neat card. I preferred my SLI 680s, but only taking up one slot was pretty cool. A tear for the death of SLI, although I can't even imagine doing it with my 3080 Ti :laugh:

I only see the two 4080s as being an issue if they are priced the same, or at least very close in price. I doubt that will be the case; the specs are pretty different. I'm also not convinced this isn't an OEM-only variant, for systems where 300+ watt cards are a bad idea as it is.

This isn't the same situation as with the 1030, which came in both DDR4 and GDDR5 variants that cost about the same but performed very differently, preying on the market that would be most impacted by it.
Isn't it?

Because I expect the 16GB to be out and reviewed first.

Then they ship the lower-performance part off the back of that customer excitement, as the slightly cheaper one.
Hopefully W1zzard will straighten it all out for people, but I will shit in the thread if I am right and the 12GB is slightly delayed.

Because that's a scum tactic.
 
Isn't it?

Because I expect the 16GB to be out and reviewed first.

Then they ship the lower-performance part off the back of that customer excitement, as the slightly cheaper one.
Hopefully W1zzard will straighten it all out for people, but I will shit in the thread if I am right and the 12GB is slightly delayed.

Because that's a scum tactic.

Not really sure what Nvidia is thinking, but going by the specs I expect these cards to be $200+ apart, making them hard to confuse. They'd look better just calling the 12GB variant the 4070 Ti; they could price it around $650 if it's indeed faster than or similar to a 3090, and look good. People smarter than me when it comes to marketing a product are making these decisions, so if they think people are stupid enough to confuse them, I guess it must be true.

None of these companies are our friends; they are simply here to make money. So really, it comes down to how good both variants are as products, not what they're named. Hopefully they both end up decent products without highly inflated prices versus the 30 series.
 
Not really sure what Nvidia is thinking, but going by the specs I expect these cards to be $200+ apart, making them hard to confuse. They'd look better just calling the 12GB variant the 4070 Ti; they could price it around $650 if it's indeed faster than or similar to a 3090, and look good. People smarter than me when it comes to marketing a product are making these decisions, so if they think people are stupid enough to confuse them, I guess it must be true.

None of these companies are our friends; they are simply here to make money. So really, it comes down to how good both variants are as products, not what they're named. Hopefully they both end up decent products without highly inflated prices versus the 30 series.
Paul's Hardware mentioned a very good point.

A lot of the cases people have now are not going to handle 800 watts under load well.

There's going to be a glut of random restarts etc. too, since many will chance that their "future-proof" 850-watt PSU can do it, and some can't.
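
Rough numbers make the point. Here's a quick sketch of a worst-case power budget, with every wattage an illustrative assumption rather than a measurement of any specific card:

```python
# Illustrative worst-case power budget. Every number here is an assumption
# for the sake of the math, not a measured spec for any particular card.

gpu_steady_w = 450        # assumed steady-state board power of a high-end card
transient_factor = 1.7    # recent GPUs spike well above steady state for milliseconds
rest_of_system_w = 250    # assumed CPU + motherboard + drives + fans

peak_spike_w = gpu_steady_w * transient_factor + rest_of_system_w
print(peak_spike_w)  # 1015.0 -> a spike past 1 kW can trip an 850 W unit's protection
```

Even when the steady-state draw fits comfortably, a millisecond spike like that is exactly what trips the over-current protection on an "adequate" 850W unit.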
 
I really, really do not want to know how much they will cost. This is totally unnecessary, but I guess the marketing people at Nvidia know what they are doing.
 
Paul's Hardware mentioned a very good point.

A lot of the cases people have now are not going to handle 800 watts under load well.

There's going to be a glut of random restarts etc. too, since many will chance that their "future-proof" 850-watt PSU can do it, and some can't.

The 12GB variant may end up being the better card for a lot of people. Thankfully my O11 Dynamic XL with seven Phanteks T30 fans should be up to the task, I hope, lol... I've been recommending people grab at least a 1000W PSU since Ampere launched if they plan on going higher than a 3070, just in case. It should be interesting regardless.

My 3080 Ti actually runs pretty cool at 450W with maxed-out power limits, 65-70°C, but I doubt 600W is realistically doable. I still might try, lol.
 