
Why does everyone hate the 4080?

At least the 7900 XTX is launching with the same MSRP as the 6900 XT did. When was the last time we saw that from Nvidia?
Probably when AMD gave them real competition, some time in the past, so they were forced to keep it the same in order to sell.
Today NV is stronger, so they 'cash in' as much as they can while it lasts.
AMD did the same with Zen 3 when they had the advantage over Intel (until Alder Lake arrived), while blatantly neglecting the low and mid-range market.
With Zen 4 the situation is only worse.
Try to look surprised.
 
Look at the AU prices


$3K for a 4090
[screenshot: AU retail listing for the 4090]


$2.5k for a 4080
[screenshot: AU retail listing for the 4080]



Why would anyone getting a 4080 not feel like it's bad value compared to the 4090?

(3080 and 3090 Ti for comparison)
[screenshots: AU retail listings for the 3080 and 3090 Ti]
 
It's an integral part of the GPU, though.
Everyone would love a Chevrolet Bolt if it were $1. The price cannot be separated from the product.

So... yeah. The answer is just a bad performance-to-cost ratio.

A product can be "the best thing ever", but if it's priced "out of reach", it's useless, no?
 
A product can be "the best thing ever", but if it's priced "out of reach", it's useless, no?
That's exactly what I was saying, yes. Price is an important part of the product, just like everything else about it.
 
The last time Nvidia innovated
Nvidia Ada, on the other hand, is nothing more than Ampere on a die shrink.
I'm not sure that's entirely correct.

Improvements to their RT pipeline are in Ada, IIRC, as well as 4x the tensor flops of the 3090 Ti, vastly increased cache sizes, the new/vastly improved optical flow accelerator that enables DLSS 3 frame generation, shader execution reordering, and improvements to the power circuitry that keep possible insane power spikes better in check.

Perhaps not massive changes in some people's book, but indisputably more than Ampere on a die shrink, and some are definitely innovative in my book.
 
I personally didn't like it, as it was a huge MSRP boost over the 3080. The performance is nice, but at over $1000 it's too expensive for many. Those who have more than $1000 to spend can find and buy a 4090 at MSRP instead.
 
I don't hate it. I like the competition, I like that crypto is no longer gobbling up our hardware for greed, and I like that companies are refocused on gaming instead of crypto.

As a product it's overpriced, but it does contain an amazing amount of technology. GPUs have come a long way, and the RT performance argument will only end with every product and game having a reasonable amount of it as the tech matures, just like tessellated concrete barriers.
 
I personally didn't like it, as it was a huge MSRP boost over the 3080. The performance is nice, but at over $1000 it's too expensive for many. Those who have more than $1000 to spend can find and buy a 4090 at MSRP instead.
I agree - if you're going to spend $1200, then paying the extra $399 for 30-40% more performance is kind of a no-brainer...

But also, it's way faster than a 3090 Ti, which is $1100 right now... so from that end, $100 more for a much faster card also makes sense.
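Quick back-of-the-envelope on that (a rough sketch; the prices are the two MSRPs, and the 1.35x uplift is just the midpoint of the 30-40% range quoted above):

```python
# Dollars per unit of relative performance, 4080 vs 4090 at MSRP.
# The 1.35x figure is the midpoint of the 30-40% uplift quoted above.

cards = {
    "RTX 4080": {"price": 1199, "perf": 1.00},  # baseline
    "RTX 4090": {"price": 1599, "perf": 1.35},  # ~30-40% faster
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['perf']:.0f} per unit of 4080-level performance")

# RTX 4080: $1199 per unit
# RTX 4090: $1184 per unit -> at MSRP the 4090 is actually the better value
```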
 
I agree - if you're going to spend $1200, then paying the extra $399 for 30-40% more performance is kind of a no-brainer...

But also, it's way faster than a 3090 Ti, which is $1100 right now... so from that end, $100 more for a much faster card also makes sense.
Used 3090s in my country sell at a very good price, and many were used very little for mining. It's not far behind the 3090 Ti either. The 4080 is expensive here right now, and the 4090 is more attractive. I currently have two cards, a GTX 1080 and an RTX 3070; the RTX 3070 is more than enough for me. Even the GTX 1080 is more than enough at 1080p.
 
I agree - if you're going to spend $1200, then paying the extra $399 for 30-40% more performance is kind of a no-brainer...

But also, it's way faster than a 3090 Ti, which is $1100 right now... so from that end, $100 more for a much faster card also makes sense.
It's a yes-brainer.
The 4090 isn't being sold at MSRP, so realistically you'd have to spend ~$600-800 more for that same 30-40% more performance.

Not saying the 4080 is actually worth its price, but it's definitely not as simple as deciding between a 4080 @ MSRP and a 4090 @ MSRP, since those prices are non-existent.
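Re-running the same dollars-per-performance math at street prices (using the ~$600-800 premium above; these are this thread's rough figures, not live market data):

```python
# Same comparison, but with the 4090 at street price
# (4080 MSRP + the ~$600-800 premium mentioned above).

msrp_4080 = 1199
perf_uplift = 1.35  # the ~30-40% from earlier posts

for premium in (600, 800):
    street_4090 = msrp_4080 + premium
    print(f"4090 at ${street_4090}: ${street_4090 / perf_uplift:.0f} "
          f"per unit of 4080-level performance (4080 at MSRP: ${msrp_4080})")

# ~$1333-1481 per unit vs $1199 -> the "no-brainer" disappears at street prices
```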
 
It's a yes-brainer.
The 4090 isn't being sold at MSRP, so realistically you'd have to spend ~$600-800 more for that same 30-40% more performance.

Not saying the 4080 is actually worth its price, but it's definitely not as simple as deciding between a 4080 @ MSRP and a 4090 @ MSRP, since those prices are non-existent.

Sure, if you need a card today, then yes. Soon, though, it will just be a choice between a $2200 4090 or an $1800 4080...
 
Sure, if you need a card today, then yes. Soon, though, it will just be a choice between a $2200 4090 or an $1800 4080...
Soon

but for those (like yourself) that don't have a Micro Center nearby and are left only being able to obtain a GPU from online sources
Following that, I just happened to come across Gigabyte's post while scrolling through Facebook.
They are "securing" GPUs for people in the US, but only in 19 states?
Why would they need to do that if there's so much stock?
CA, CO, IL & OH are all states that, according to your research, have plenty of stock in Micro Center stores.

Very odd. Maybe they were instructed by nVidia to create the illusion of being out of stock, but that's just me putting on my conspiracy hat.

[screenshot: Gigabyte's Facebook post]
 
I'm not sure that's entirely correct.

Improvements to their RT pipeline are in Ada, IIRC, as well as 4x the tensor flops of the 3090 Ti, vastly increased cache sizes, the new/vastly improved optical flow accelerator that enables DLSS 3 frame generation, shader execution reordering, and improvements to the power circuitry that keep possible insane power spikes better in check.

Perhaps not massive changes in some people's book, but indisputably more than Ampere on a die shrink, and some are definitely innovative in my book.
That's what they're telling us, although the block diagram is exactly the same, and the performance drop in each game with RT on compared to RT off appears to be exactly the same as it was on Ampere or Turing. So where's the improvement? As for power, the die shrink keeps that in check. I see no improvement at all.
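That claim is easy to sanity-check from any review's numbers; roughly like this (the FPS values below are illustrative placeholders, not real benchmark data):

```python
# If the *relative* hit from enabling RT is unchanged between generations,
# then the absolute RT gains are just the rasterisation uplift carried over.
# FPS values are illustrative placeholders, not benchmark results.

def rt_hit_percent(fps_rt_off: float, fps_rt_on: float) -> float:
    """Percentage of performance lost when RT is enabled."""
    return (1 - fps_rt_on / fps_rt_off) * 100

print(f"Ampere: {rt_hit_percent(100, 55):.0f}% hit")  # placeholder 3090 numbers
print(f"Ada:    {rt_hit_percent(160, 88):.0f}% hit")  # placeholder 4090 numbers
# Same 45% relative hit -> no visible architectural RT gain in that title.
```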
 
Because customers feel that they are being screwed. This is a rebadged RTX 4070 that should sell for as low as $400-499, but Nvidia charges three times more!
 
Soon


Following that, I just happened to come across Gigabyte's post while scrolling through Facebook.
They are "securing" GPUs for people in the US, but only in 19 states?
Why would they need to do that if there's so much stock?
CA, CO, IL & OH are all states that, according to your research, have plenty of stock in Micro Center stores.

Very odd. Maybe they were instructed by nVidia to create the illusion of being out of stock, but that's just me putting on my conspiracy hat.

[screenshot: Gigabyte's Facebook post]

Miners in my country are starting to dump used GPUs en masse, probably because of the FTX collapse; used 3080s are selling for $400 now. You might want to hold out for a cheap Ampere GPU :)
 
I'm not sure if that's really the case.
I'm nVidia-centric myself, but I am rooting for AMD as much as I can.
I think the majority of us are just tired of nVidia's monopoly and the way it treats its users.
We want a decent competitor, not only in price-to-performance but also in pure performance.
I want to see AMD competing against nVidia's high-end (ex-Titan / 4090 Ti) cards and leaving them in the dust.

Even if I choose to stay with nVidia, at least it'll be because I CHOSE to do it, not because nVidia is the only choice for top-tier performance and I have to go with it.


Can confirm this is correct. I know I'm being screwed with the 4080s, and I'd hate to support nVidia for what they're doing, but I also feel like I've waited too long, and meh.
Luckily, I had yet to pick up my cards before opening this thread, and I won't be picking them up / opening them until the AMD 7900 XTX is released.
I can freely return them until January 14th, so I have ~1 month to see what's going to happen with the market.
If the 7900 XTX reviews really show it's better than the 4080 (especially in VR games & RT), I'll definitely be returning the 4080s.
Either that, or if I manage to grab a 4090 at MSRP.
There is a beautiful thing about PC hardware: waiting is always profitable. Anyone with FOMO in this business just doesn't quite get how it works.

Waiting isn't just for financial advantages either.
- Post-release support is better. Driver quality improves continuously from launch; it isn't at its best on day one.
- Game support is better. Games start using the hardware better over time, quite often a few years post-launch, and the likelihood of a feature being playable also increases over time (see RT: even if it's slow, there is progress), even on the same GPU.
- Not early adopting means avoiding the fun stuff, like a melting connector plus the accompanying correspondence with the vendor/manufacturer and waiting for a replacement, etc. The irony: being first might mean being nowhere while others spend that time gaming.
- You become a hunter instead of the prey: always on the prowl with money to burn, ready to jump on the tastiest snack. Prey buys at launch, diving deep into the marketing story to justify losing all the above advantages, and then gets the additional (even if minor) stress of watching subsequent offers that complete the stack turn out quite a lot better, plus the effects of competition on prices (free games, etc.). See, for them FOMO never ends; there is always something new around the corner. And that's the self-feeding, self-confirming cycle right there: you must be first, because 'you were already waiting so long' for release X/Y.

You've managed to break that cycle and you're now officially on the prowl ;)

 
Lots of 4080s in stock in the UK. Very different times. It cheers me to see that people aren't swiping these £1300-1500 second-tier GPUs from the shelves.
 
There is a beautiful thing about PC hardware: waiting is always profitable. Anyone with FOMO in this business just doesn't quite get how it works.

Waiting isn't just for financial advantages either.
- Post-release support is better. Driver quality improves continuously from launch; it isn't at its best on day one.
- Game support is better. Games start using the hardware better over time, quite often a few years post-launch, and the likelihood of a feature being playable also increases over time (see RT: even if it's slow, there is progress), even on the same GPU.
- Not early adopting means avoiding the fun stuff, like a melting connector plus the accompanying correspondence with the vendor/manufacturer and waiting for a replacement, etc. The irony: being first might mean being nowhere while others spend that time gaming.
- You become a hunter instead of the prey: always on the prowl with money to burn, ready to jump on the tastiest snack. Prey buys at launch, diving deep into the marketing story to justify losing all the above advantages, and then gets the additional (even if minor) stress of watching subsequent offers that complete the stack turn out quite a lot better, plus the effects of competition on prices (free games, etc.). See, for them FOMO never ends; there is always something new around the corner. And that's the self-feeding, self-confirming cycle right there: you must be first, because 'you were already waiting so long' for release X/Y.

You've managed to break that cycle and you're now officially on the prowl ;)

I'd rephrase that as "buying a higher-end model of an outgoing series is better than a lower-end model of the incoming one". Now is the best time to grab an RDNA 2 or Ampere GPU. Other than that, you're right.
 
Hi,
None at Best Buy, so plenty of suckers in the US are standing in line.
Micro Center has some in stock, but third-party, so even more money.

Miners in my country are starting to dump used GPUs en masse, probably because of the FTX collapse; used 3080s are selling for $400 now. You might want to hold out for a cheap Ampere GPU :)
So do they also come with the miner's vBIOS too? :eek:
 
That's what they're telling us ... although the block diagram is exactly the same ... I see no improvement at all.

Of course it's what they're telling us; they designed the GPU. RT improvements almost certainly need to be accounted for in game code, but the RT cores themselves definitely do have increased capability relative to Ampere.

NVIDIA engineers have developed three new features in the Ada RT Core to enable high-performance ray tracing of highly complex geometry:
● First, Ada's Third-Generation RT Core features 2x Faster Ray-Triangle Intersection Throughput relative to Ampere; this enables developers to add more detail into their virtual worlds.
● Second, Ada's RT Core has 2x Faster Alpha Traversal; the RT Core features a new Opacity Micromap Engine to directly alpha-test geometry and significantly reduce shader-based alpha computations. With this new functionality, developers can very compactly describe irregularly shaped or translucent objects, like ferns or fences, and directly and more efficiently ray trace them with the Ada RT Core.
● Third, the new Ada RT Core supports 10x Faster BVH Build in 20x Less BVH Space when using its new Displaced Micro-Mesh Engine to generate micro-triangles from micro-meshes on demand. The micro-mesh is a new primitive that represents a structured mesh of micro-triangles that the Ada RT Core processes natively, saving the storage and processing normally required when describing complex geometries using only basic triangles.
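To put the third bullet in rough numbers (a sketch only: the triangle counts and subdivision level below are illustrative, not whitepaper figures; only the idea of building the BVH over coarse base triangles comes from the quote):

```python
# Displaced micro-meshes in rough numbers: the BVH is built over coarse
# base triangles, and each base triangle expands into micro-triangles
# on demand inside the RT core. All counts here are illustrative.

base_triangles    = 100_000
subdivision_level = 3                       # each level splits one triangle into 4
micro_per_base    = 4 ** subdivision_level  # 64 micro-triangles per base triangle

classic_bvh_tris = base_triangles * micro_per_base  # what a plain BVH would index
print(f"Plain triangle BVH: {classic_bvh_tris:,} leaf triangles")
print(f"Micro-mesh BVH:     {base_triangles:,} base triangles")
print(f"Geometry the BVH never has to store or build over: {micro_per_base}x less")
```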

Also, a node shrink ≠ improved power circuitry for spikes. Then there's the massive extra cache, the 4x improvement in tensor flops (which outstrips the increase in tensor cores × clock speed), the 3x faster OFA...
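On the tensor flops point specifically, the published specs show why the ~4x outstrips cores × clock (approximate figures; exact boost clocks vary by board):

```python
# Why the ~4x tensor-flops gain outstrips tensor cores x clock speed.
# Core counts and boost clocks are approximate published specs.

cores_3090ti, ghz_3090ti = 336, 1.86
cores_4090,   ghz_4090   = 512, 2.52

from_cores_and_clock = (cores_4090 * ghz_4090) / (cores_3090ti * ghz_3090ti)
print(f"cores x clock alone: {from_cores_and_clock:.2f}x")      # ~2.06x

# Ada's 4th-gen tensor cores add FP8 at twice the FP16 rate,
# which supplies the remaining factor of ~2:
print(f"with FP8 doubling:   {from_cores_and_clock * 2:.2f}x")  # ~4.13x
```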

Here's the Ada vs Ampere block diagram; note the capabilities in the RT cores and the L0 i-cache.

[image: Ada vs Ampere SM block diagram]

You can choose to see no improvement if you want, but this is factually not a die-shrunk Ampere, despite how the swing of reactions goes. It may share the vast majority of its architectural design, but there are indisputable additions and differences; ergo, in an absolute sense, it is not a die-shrunk Ampere.

You can check out the NVIDIA ADA GPU ARCHITECTURE whitepaper for more info.
 
Of course it's what they're telling us; they designed the GPU. RT improvements almost certainly need to be accounted for in game code, but the RT cores themselves definitely do have increased capability relative to Ampere.

In other words, no games out now can support these new RT core improvements? I mean, that's fine for future titles, but the PR touted RT improvements, while the current RT performance increases for Ada in current-gen titles come from the sheer rasterisation uplift (as I mentioned previously when looking at the % hit from turning RT on, comparing Ada and Ampere).
 