
NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name

Comparison with the previous gen won't matter when the previous gen isn't available anymore - that is, when your only choice is limited to buying the newest shiny crap or not.

That works when there's no global economic downturn: we'll have to wait and see if it will still hold true after both manufacturers release their lineups, a few months from now.
I can imagine two scenarios:
1. AMD slightly undercuts Nvidia with pricing, but people will still buy Nvidia due to their technological advantage and/or fanboyism.
2. AMD joins the game of big, heavy and expensive GPUs and comes out with something massive while still selling RDNA 2 for the people who care about price.

I do not expect RDNA 3 to be significantly cheaper than Ada.
 
It's an illusion. Between the higher-specced proper 24G RTX 3090 Ti and a limited 12G RTX 4080 I would choose the former any day.
"Power resides where men believe it resides. It’s a trick. A shadow on the wall. And a very small man can cast a very large shadow."
The real power is in the hands of buyers! If enough of them believe these things are worthless (or not worth $900) then Nvidia is effed, sadly not many do that :shadedshu:
 
Comparison with the previous gen won't matter when the previous gen isn't available anymore - that is, when your only choice is limited to buying the newest shiny crap or not.

That's true only if you ignore that the RTX 3090 Ti will exist for quite a while, either brand new or on the second-hand market.

Why instantly choose a 450 W GPU when you can get the same performance from a 285 W GPU (37% less power)?

It's not 450 watts and one can always undervolt.

24 GB is 24 GB. Double that of 12 GB.
 
That's true only if you ignore that the RTX 3090 Ti will exist for quite a while, either brand new or on the second-hand market.

It's not 450 watts and one can always undervolt.

24 GB is 24 GB. Double that of 12 GB.

Yeah, I also put a 24 GB monster into my PC, which only has 16 GB of system memory. But I could buy it for $730 a couple of weeks ago, while for the 4080 (really a 4060 Ti) I would still have to wait.
 
The key here is: people just don't think before they buy.
No it isn't. The key is that people want those things. We can live with candles instead of lightbulbs and a cheap beat-up car, or we buy luxuries, because we can afford to buy them. Nvidia is simply pricing things according to the market: the market spends more, Nvidia raises prices. Back in the day, 80-series cards were a lot cheaper and people still didn't buy them. When the 1080 Ti came out, A LOT of people bought one; that was the catalyst for Nvidia to realize they had underpriced it.
 
No it isn't. The key is that people want those things. We can live with candles instead of lightbulbs and a cheap beat-up car, or we buy luxuries, because we can afford to buy them. Nvidia is simply pricing things according to the market: the market spends more, Nvidia raises prices. Back in the day, 80-series cards were a lot cheaper and people still didn't buy them. When the 1080 Ti came out, A LOT of people bought one; that was the catalyst for Nvidia to realize they had underpriced it.
Comparing a candle to a lightbulb is the same as comparing GPUs, yeah, all right.
You are just saying the same thing: people buying those stupidly overpriced GPUs is exactly what shows they don't think before they buy; they are compulsive buyers who happen to have money to waste.
 
It's not low performance.
It's OK for a 70-series card to match the Ti/90 of the previous generation.
The 3070 did the same to the 2080 Ti.
The 4070 has the advantage of actually beating the 3090 Ti in newer games, which the 3070 doesn't do against the 2080 Ti.

For $900? Absolutely it would be low performance.

The problem is that NVIDIA appears to be going backwards in terms of perf/dollar. It looks more and more credible to me that they frontloaded the 4090 because the 4090 is where they put the bulk of their generational performance increase. So sure, the 4090 looks good if you compare it to previous halo cards that were an absolutely terrible value--and no one ever pretended otherwise, by the way. Everyone acknowledged that 80 and below were where the value lies, yet all of a sudden the 4090 releases and we're supposed to fall all over ourselves congratulating NVIDIA for releasing a $1600 Ada card that handily beats a ludicrously overpriced Ampere card whose MSRP was based on a concurrent and unprecedented GPU shortage.

But the 3090 and its Ti used the same die as the 3080. Any projections we discuss must take that into account; in other words, we can't depend on the lower tier cards to perform at traditional levels, relative to the halo product. NVIDIA's own charts bear me out on that, at least so far.

I'm glad they relented on the goofy naming scheme for the $900 "4080," but it isn't obvious that this card won't come back with a different name and at an equally ridiculous price--or, as some in this thread have suggested, perhaps this card will just disappear entirely for a long time, while NVIDIA burns off its remaining stock of Ampere. Either way, the $1200 version of the 4080 doesn't exactly look like a value king, either, ATM. It looks like it will give you fewer frames per dollar than its predecessor, once you strip away the hocus pocus about "DLSS 3.0"--which is nothing of the kind, incidentally; 3.0 has nothing to do with the extant versions of DLSS, and it's far less useful. The name implies that 3.0 (interpolation, "fake" or visual-only FPS) is better than 2.0 (upscaling, real FPS), which is nonsense.
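To put the frames-per-dollar comparison in concrete terms, here is a minimal sketch; the MSRPs are the launch prices, but the FPS values are placeholders I made up, not review data:

```python
# Frames-per-dollar at MSRP. The FPS values below are placeholders,
# not measurements -- substitute numbers from an actual review.
cards = {
    "RTX 3080 10GB ($699 MSRP)":  (699, 100),   # (price in USD, average FPS)
    "RTX 4080 16GB ($1199 MSRP)": (1199, 145),
}

for name, (price, avg_fps) in cards.items():
    print(f"{name}: {avg_fps / price * 1000:.0f} FPS per $1000")
```

At a roughly 71% higher price, the newer card needs roughly 71% more performance just to break even on value, which is exactly the bar the post above suggests it doesn't clear.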

NVIDIA isn't just comparing apples to oranges in this case; they're selling you a single grape and telling you that it's the hottest new thing in orange technology.

That could be another reason that NVIDIA led with the 90 card, this time--DLSS 3.0 looks better visually at higher frame rates. Essentially they're selling you a feature that purports to raise frame rates dramatically, and thus a feature that would logically appeal most to lower-end consumers, but the feature really only works well when you already have high frame rates (and when you have an extremely high refresh rate monitor to take advantage of the extra AI-generated frames; since these frames don't reduce input latency, any of them above your monitor's max refresh are pointless).
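A rough sketch of that argument, assuming (as a simplification) that frame generation roughly doubles the presented frame rate; the FPS and refresh-rate numbers are hypothetical:

```python
# Generated frames beyond the monitor's refresh rate never reach the screen,
# and they don't reduce input latency either. Simplified model, made-up numbers.
def displayed_fps(rendered_fps: float, refresh_hz: float, frame_gen: bool) -> float:
    presented = rendered_fps * 2 if frame_gen else rendered_fps  # assume frame gen ~doubles presented frames
    return min(presented, refresh_hz)

print(displayed_fps(120, 144, frame_gen=True))  # 144 -- most of the generated frames are wasted
print(displayed_fps(120, 240, frame_gen=True))  # 240 -- a high-refresh panel actually shows them
print(displayed_fps(40, 144, frame_gen=True))   # 80  -- low base FPS, where the latency cost bites hardest
```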

And I'm not fanboying for AMD here, either. You'd think this situation presents a great opportunity for AMD to seize the day with aggressively priced products, but I'll believe it when I see it. NVIDIA sure doesn't seem to be worried. For the moment, I suppose at least Intel's getting a breather to refine Arc's drivers. Remember when we all lamented the timing of Intel's launch? Well, Team Blue may not have to worry about next-gen competition at their current price bracket any time soon. A third credible player in this market can't come soon enough.
 
It's not 450 watts and one can always undervolt.

24 GB is 24 GB. Double that of 12 GB.
The 3090 Ti has a 450 W TDP at stock and can go as high as 550 W when overclocked.
You can also undervolt the 285 W card, so there's no efficiency gain there.
And I agree on the 24 GB for the cases where you know you need that much memory.
In any other case, the 4080 12GB will do just fine, with way less power and heat, and will be the better choice for most if both are priced equivalently.
 
It just boggles my mind how they keep getting away with naming products in ways that are clearly misleading on purpose. They've been doing this for so long, especially on mobile parts: if you're looking at a laptop with an Nvidia GPU, you'll pretty much have no idea whatsoever what it is you're buying going by the name alone unless you do extensive research, and even then you won't know for sure, because manufacturers do a very good job of straight up hiding or obfuscating the TDPs.

Someone has to sue their asses and put an end to this once and for all.
 
I literally had a GTX 1070 + A2000 in my rig (with gaming and workstation drivers) and then swapped the 1070 for an RX580 + A2000 in my system. So I had Adrenaline and RTX Studio drivers installed. Never did an uninstall/reinstall between cards on either swap. They all worked flawlessly together, although I only ran it that way for a few hours to make sure the cards were working.
Yeah, like I said, I know that works in Windows 10 and newer. I was just trying to imagine a scenario that made sense. I've been building PCs since Windows 95 and there have been plenty of bad drivers and bad OSes in that time. In the past 7 to 10 years, I haven't seen many BSODs that could be attributed to a GPU driver, maybe I've been lucky.

It just boggles my mind how they keep getting away with naming products in ways that are clearly misleading on purpose. They've been doing this for so long, especially on mobile parts: if you're looking at a laptop with an Nvidia GPU, you'll pretty much have no idea whatsoever what it is you're buying going by the name alone unless you do extensive research, and even then you won't know for sure, because manufacturers do a very good job of straight up hiding or obfuscating the TDPs.

Someone has to sue their asses and put an end to this once and for all.
Laptops in general are bad in every way when it comes to that. All these CPUs with configurable TDPs on top of the GPU shenanigans. You really have to find a review of a specific laptop model and then only buy that particular model and make sure it is configured exactly as in the review to know what you are getting. It's a real pain!
 
Comparison with the previous gen won't matter when the previous gen isn't available anymore - that is, when your only choice is limited to buying the newest shiny crap or not.
This assumes AMD would roll out RDNA3 with 7800 perf on par with 6800, but PRICIER.

I wonder why they would do that, though.
 
All these CPUs with configurable TDPs on top of the GPU shenanigans.
Well, at least with CPUs you can underclock/fine-tune them, or, say, plug the laptop in while running, and more or less get what you're paying for. With GPUs there's very little way around the "TGP" limit BS, in large part because the cooling wouldn't be sufficient on such laptops anyway! GPUs are definitely much worse in that regard.
 
And I agree on the 24 GB for the cases where you know you need that much memory.
In any other case, the 4080 12GB will do just fine, with way less power and heat, and will be the better choice for most if both are priced equivalently.

If 24 GB is too much for you, you can always buy a 16 GB RX 6900 XT or 6950 XT, still more than the pathetic 12 GB.
No in-game VRAM warnings, no lowered texture resolution (cheating by the Nvidia driver), no questionable future-proofing.
 
Yeah, I decided to look into the benchmarks of the 4080 that Nvidia was showcasing, and it's rather pathetic for something that costs $1200.

So the only GPU worth it in the 4xxx series is the 4090, if you've got a 3070 or equivalent.
 
the $1200 version of the 4080 doesn't exactly look like a value king
A card priced 71% higher than the previous 3080 doesn't exactly look like a value king? Really? I must be dreaming; how are people not sure this GPU is a complete rip-off of gargantuan proportions? Dude, they added $500 to a card that previously launched at $700. Add inflation and whatever else you want, and it still doesn't even touch $900.
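For what it's worth, the percentage checks out; a minimal sketch using the launch prices quoted above, ignoring inflation:

```python
# RTX 3080 launch price vs. the $1200 RTX 4080 16GB, using the figures quoted above.
old_msrp, new_msrp = 700, 1200
increase_pct = (new_msrp - old_msrp) / old_msrp * 100
print(f"+${new_msrp - old_msrp} (+{increase_pct:.0f}%)")  # +$500 (+71%)
```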
 
Hmm, wonder how true?

Edit: in either case, DLSS 3.0 with frame injections isn't exactly what I care about or want. Raw performance with raw visuals instead.
 
Comparing a candle to a lightbulb is the same as comparing GPUs, yeah, all right.
You are just saying the same thing: people buying those stupidly overpriced GPUs is exactly what shows they don't think before they buy; they are compulsive buyers who happen to have money to waste.
People think before they buy an iPhone Ultra Max for sending text messages, and they still do it. Graphics cards are luxury purchases anyway.
 
I don't think too many people think before buying that god-awful ugly piece of brick; no, not the 2-kilo one :ohwell:
 
People think before they buy an iPhone Ultra Max for sending text messages, and they still do it. Graphics cards are luxury purchases anyway.

Some think, others don't. For example, I always research what I need - water resistance, a normal price tag - all the things that an Apple iPhone canNOT deliver.
 
Edit: in either case, DLSS 3.0 with frame injections isn't exactly what I care about or want. Raw performance with raw visuals instead.

According to HWB you don't want DLSS 3 at anything less than 120 FPS, since it adds what can be perceived as mouse input latency. The 4070 only manages 40 FPS in 4K, at which point having it doesn't make sense at all, and on the 20/30 series, which lack the newer optical flow hardware, it's even worse. And there go 20 billion transistors on the 4080 12GB, wasted on tensor cores, ray tracing, and an L2 cache that seemingly does nothing, or compensates poorly for the 64 bits of memory bus that the similarly sized 1070/1080 had and this card lacks.
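A back-of-the-envelope way to see why a low base frame rate is the worst case, assuming (as a simplification, not a measured figure) that interpolation holds back roughly one rendered frame:

```python
# Rough model: if frame generation has to hold back about one rendered frame,
# the added delay is roughly one frame-time. Illustration only, not measured data.
for fps in (40, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps} FPS rendered -> ~{frame_time_ms:.1f} ms of added delay")
```

At 40 FPS that is about 25 ms of extra delay, which is easy to feel on the mouse; at 120 FPS it shrinks to roughly 8 ms.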
 
Share other graphics benchmarks of 4080, be so kind.


That's ok and rather obvious.


Well, here is a "non-proper" one (from her majesty nGreedia) that is supposed to highlight the 4000 series' greatness... I guess (note how the 4080 12GB stacks up against the 3080):



No one has done the simple math for that DLSS 3 graph...

if you have an RTX 4090 24GB you get a 65% increase in frame rate from DLSS 3
if you have an RTX 4080 16GB you get a 132% increase in frame rate from DLSS 3
if you have an RTX 4080 12GB you get a 150% increase in frame rate from DLSS 3

there's a drop-off as you get to the bigger & more powerful cards....
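Here's the arithmetic behind those percentages, as a minimal sketch; the FPS pairs are made-up numbers chosen only to reproduce the figures above, not readings from NVIDIA's chart:

```python
# Percentage uplift from DLSS 3 = (fps_with / fps_without - 1) * 100.
# The (without, with) FPS pairs are made up purely to reproduce the figures above;
# they are not values read off NVIDIA's chart.
readings = {
    "RTX 4090 24GB": (95, 157),
    "RTX 4080 16GB": (62, 144),
    "RTX 4080 12GB": (48, 120),
}

for card, (without_fg, with_fg) in readings.items():
    uplift = (with_fg / without_fg - 1) * 100
    print(f"{card}: +{uplift:.0f}% from DLSS 3")
```

The smaller cards start from a lower base frame rate, so the same frame-generation trick shows up as a larger percentage, which is the drop-off being described.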
 
According to HWB you don't want DLSS 3 at anything less than 120 FPS, since it adds what can be perceived as mouse input latency. The 4070 only manages 40 FPS in 4K, at which point having it doesn't make sense at all, and on the 20/30 series, which lack the newer optical flow hardware, it's even worse. And there go 20 billion transistors on the 4080 12GB, wasted on tensor cores, ray tracing, and an L2 cache that seemingly does nothing, or compensates poorly for the 64 bits of memory bus that the similarly sized 1070/1080 had and this card lacks.
Which means it makes no sense to get anything less than a 4090.

Wonder if they did this all on purpose. Ignore anything less than a 4090 or just grab a 30 series card.
 
there's a drop-off as you get to the bigger & more powerful cards....

And you know why that is. The GPU load is 60% because of a CPU bottleneck. Hilarious; on the 12900 it's less, but it's still there.

 
And you know why that is. The GPU load is 60% because of a CPU bottleneck. Hilarious; on the 12900 it's less, but it's still there.

Now see, that's totally the opposite of what Jensen said DLSS 3 did. Jensen claimed that it removed the CPU bottleneck.
 
No it isn't. The key is that people want those things. We can live with candles instead of lightbulbs and a cheap beat-up car, or we buy luxuries, because we can afford to buy them. Nvidia is simply pricing things according to the market: the market spends more, Nvidia raises prices. Back in the day, 80-series cards were a lot cheaper and people still didn't buy them. When the 1080 Ti came out, A LOT of people bought one; that was the catalyst for Nvidia to realize they had underpriced it.

Candles and cheap beat-up cars are lost at sea! ...that's where the problem is. The higher premiums at the top level are deliberately manipulating bottom/mid-segment prices. Looking at the 40 series, a $1200 4080 sets an extremely high reference point from which to tolerably cater to the mid-segment (or below). NVIDIA tried masking this massive price gap by throwing in the 4080 12GB, which failed. We all get it, profits come first and we're more than happy to offer charitable donations in pursuit of our performance targets, but come on, don't charge luxury-car prices for standard premium models, otherwise you end up with cheap beat-up cars going for the premium-model rate.

The way I see it, eventually we'll end up with lacklustre, feature/performance-trimmed $350-$500 SKUs, which is hardly something to be excited about. For me, the 40 series is just tiresome, over-hyped, and in its current SKU formation financially inaccessible to 99.9% of game-dedicated consumers. Obviously it's early days... but it's not looking pretty!
 