
NVIDIA GeForce RTX 5080 Founders Edition

RT performance is really inconsistent across titles. The 7900 XTX lost a whopping 9% vs the 3090 Ti since the 4070 review, and is even getting beaten by the 3090, when it was 15% faster before.

Just a guess, could it be because of the introduction of ray reconstruction?
 
If AMD also won't be aggressive with its pricing, these will be a very long and boring two years. Basically, that means sitting on a single generation for four years for me; a refresh does not count.
 
Well, Multi-Frame Generation, for now, hardly justifies the price premium. Maybe if they iron out the kinks and it becomes a better story, like how DLSS 1.0 evolved to DLSS 4.0
I was thinking the same, but it will probably be more difficult to see the kind of evolution that DLSS has seen. And the problem is that even with DLSS, while there is image quality degradation, it is swept under the carpet of "a must-have because it is better than the alternatives and increases framerates." What if multi frame generation ends up the same? A feature that degrades image quality even further but is hidden under the same carpet? Even with a multi frame generation version 4.0 in a few years, we will be dealing with worse image quality than what we get today. We will probably be getting better lighting, because frame rates with path tracing will be acceptable, but in the end we will have ghosting and artifacts and who knows what else, with just better(?) lighting.
AMD could pull a rebrand of the RX 7000 series, like they did in the past, lowering the asking price a notch, and just keep the RX 9070 XT as their top offering, undercutting the 5080 in price while keeping their latest posture of "no high-end stuff".
There, competition through price with minimal overhead for AMD.
I'm still crossing my fingers that Intel will go after the 300~400€ price range with an offering above the B580, and maybe a full BMG-20 die to take a shot at 499€.

As for this launch, I guess for now I'll just keep recommending Nvidia's 40-series, whenever prices drop due to discounts, for anyone still on Turing or earlier.
Maybe they don't want to, or can't, sell much. Maybe they don't get enough wafers from TSMC to cover the gaming market, and they prefer to sell little at a profit rather than a lot without one. Because even that "a lot without a profit" would be questionable. If there is demand and they don't have wafer supply, prices will go up and everyone will accuse AMD of doing the same s_it Nvidia does by limiting supply. And unfortunately we have all seen that when people suspect AMD of doing the same thing as Nvidia, all the negativity turns toward AMD and Nvidia gets away with it.

Intel needs to fix its manufacturing. What we see now, limited availability of the B570 and B580, would never have happened if Intel could build those chips in its own fabs. The market would be flooded with those cards. I don't think they have enough wafer supply from TSMC either. I think only two companies enjoy enough wafers from TSMC: Apple and Nvidia.
 
Really good question. For the short term they will be included in the RT section, and not in the "regular" games list. Eventually these two sections will be merged, because RT will become the standard, and I'll have separate summary charts: avg all, raster only, RT only. At least that's what I came up with while thinking about the problem. Happy to discuss this further, please start a new thread

Have you thought about removing Elden Ring and Doom Eternal from the RT benchmarks? I know why they are there, but then we won't have people saying they are holding back the 5080... By my napkin math it would only add about 2% vs the 4080 Super if you did, so the gap would still be disappointing.
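If it helps, here's that napkin math spelled out; the per-game ratios below are invented purely to illustrate how much two light-RT titles can move a geometric-mean summary, they are not the review's actual numbers:

```python
# Hypothetical RT performance ratios (5080 vs 4080 Super) per game.
# Invented for illustration only; not TPU's measured data.
ratios = {
    "Alan Wake 2": 1.14,
    "Cyberpunk 2077": 1.13,
    "Doom Eternal": 1.08,   # light RT workload
    "Elden Ring": 1.07,     # light RT workload
    "F1 24": 1.12,
    "Ratchet & Clank": 1.11,
}

def geomean(values):
    """Geometric mean, as typically used for relative-performance summaries."""
    values = list(values)
    product = 1.0
    for v in values:
        product *= v
    return product ** (1.0 / len(values))

full = geomean(ratios.values())
trimmed = geomean(v for game, v in ratios.items()
                  if game not in ("Doom Eternal", "Elden Ring"))

print(f"All titles:       +{(full - 1) * 100:.1f}%")     # ~ +10.8%
print(f"Without the two:  +{(trimmed - 1) * 100:.1f}%")  # ~ +12.5%
```

Dropping the two only shifts the average by a couple of percent, which is why the gap would still look disappointing.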
 
Similar to GTX 770 but worse than GTX 580! It's a clear refresh, nothing more.
480 --> 580 and 680 --> 770 weren't even new generations of GPU, making Blackwell look even more pathetic. Those were just Fermi --> Fermi and Kepler --> Kepler.

Lowest generational performance jump ever (at least back to the early '00s when I started paying attention) for a new architecture.
 
From what I see, the Nvidia/AMD sponsored title thing these days isn't so much about performance as about what is included in the game. It's not uncommon for a game to only have DLSS or FSR.

We really need a vendor-agnostic solution soon.
 
It has been done like that since forever; I need to cut off at some point, usually at 125% and 75% relative performance (rough sketch of that below). But good point, let me add the 3080, it's an important comparison

Graphs are rerendering now
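For anyone wondering what that cutoff means in practice, a minimal sketch of the idea (not the actual chart code; card names and numbers are hypothetical):

```python
# Keep only cards within the 75%-125% relative-performance window
# around the reviewed card (values below are made up for illustration).
relative_perf = {
    "RTX 5080": 1.00,
    "RTX 4080 Super": 0.89,
    "RTX 4090": 1.23,
    "RTX 3080": 0.52,   # falls below the 75% cutoff
    "RTX 5090": 1.49,   # falls above the 125% cutoff
}

chart = {card: perf for card, perf in relative_perf.items() if 0.75 <= perf <= 1.25}
print(chart)  # the 3080 and 5090 get trimmed from the chart
```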
Question: do you remember how AnandTech (RIP) had semi-interactive benchmark results, which would help you compare your CPU or GPU (not combined, only one of the two) with newer results?

Would such a tool be possible given all the data that you have collected?

Just place a big disclaimer stating that it won't be 100% accurate for obvious reasons (different hardware combinations), but it would still offer some kind of info for everyone.
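Something along these lines is what I have in mind; a minimal sketch assuming each review is stored as a relative-performance table, with a shared GPU used as a bridge (all numbers below are invented for illustration, not actual TPU results):

```python
# Rough sketch: chain relative-performance numbers from two reviews
# through a card that appears in both. All numbers are hypothetical.
review_old = {"RTX 3080": 100.0, "RTX 3090 Ti": 125.0, "RX 6800 XT": 98.0}
review_new = {"RTX 4080 Super": 100.0, "RTX 5080": 111.0, "RTX 3090 Ti": 72.0}

def estimate(card_old, card_new, old_review, new_review):
    """Estimate card_new vs card_old using any GPU present in both reviews as a bridge."""
    common = set(old_review) & set(new_review)
    if not common:
        raise ValueError("No shared GPU to bridge the two reviews")
    bridge = common.pop()
    # Express both cards relative to the bridge card, then combine the two ratios.
    old_vs_bridge = old_review[card_old] / old_review[bridge]
    new_vs_bridge = new_review[card_new] / new_review[bridge]
    return new_vs_bridge / old_vs_bridge

ratio = estimate("RTX 3080", "RTX 5080", review_old, review_new)
print(f"RTX 5080 is roughly {ratio:.2f}x an RTX 3080 (different test systems, so big grain of salt)")
```

That's really all the tool would need, plus the disclaimer.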
 
It feels like a reach to call this "highly recommended".

This release feels like a giant nothing burger. I don't really care about frame gen when it makes latency worse. I want the card to actually render the stupid game at a reasonable frame rate to begin with. Sure, we can use DLSS, which is yet another band-aid unless you're using it to render at a higher resolution than your display, but that also results in artifacts, and I don't want to have to use it.

Not to mention, an xx80 card used to be $750-800 adjusted for inflation, not $1,000 and sure as hell not $1,200. So, saying it's $200 less than a card that was absurdly overpriced by $400 isn't really something that belongs in the positive category. The same price as the Super still isn't great either, because it's still $200 too much.
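To put rough numbers on the inflation point: the launch MSRPs below are the well-known figures, but the inflation multipliers are rounded approximations, so treat the outputs as ballpark only.

```python
# Back-of-the-envelope inflation adjustment for past xx80 launch prices.
# The cumulative inflation factors to today's dollars are rough estimates.
launches = {
    "GTX 980 (2014)":  (549, 1.36),   # (launch MSRP in USD, approx. inflation factor)
    "GTX 1080 (2016)": (599, 1.32),   # Founders Edition was $699
}

for card, (msrp, factor) in launches.items():
    print(f"{card}: ${msrp} at launch, roughly ${msrp * factor:.0f} in today's money")
```

Both land right around that $750-800 mark, nowhere near $1,000.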

The ONLY reason anyone should buy this card is if the RTX 5070 Ti and RX 9070 XT just aren't enough for what you're doing and you can justify hundreds of dollars for a tiny improvement. Zero people with a previous-gen 4070 Ti, 4080, or 7900 XT/GRE/XTX should buy it, given this price and the lack of performance gains.

I waited years to buy this generation, and now I'm probably going to wait until the next generation so that I can buy something that's remotely worth the money. Maybe.
 
Question: do you remember how AnandTech (RIP) had semi-interactive benchmark results, which would help you compare your CPU or GPU (not combined, only one of the two) with newer results?

Would such a tool be possible given all the data that you have collected?

Just place a big disclaimer stating that it won't be 100% accurate for obvious reasons (different hardware combinations), but it would still offer some kind of info for everyone.
Isn't that basically what the comparison box in the GPU database is?
 
I feel a bit disappointed in this GPU :/ I feel like this card does not deserve the 5080 name and would have been better called, at best, a 5070 Ti. That's my personal opinion. Don't like it...
That's right, this is the real 5070 Ti, upsold. This card could have received a 330 W TDP without any meaningful performance loss.
The 5080 Ti will be the legit 5080: a heavily cut-down GB202 die somewhat faster than the 4090, with a fake, seemingly good MSRP that in practice will still be high, and with likely artificially created scarcity.

At this point Nvidia is not that hard to predict in its market behaviour.
 
Isn't that basically what the comparison box in the GPU database is?
Not entirely.

One, it's not interactive.

Two, as W1zzard stated, he removes older cards after a certain threshold is reached.
 
Getting mad at consumers/customers here is a really weird take. Monopolies have nearly complete power over their customer base. The 4080 12GB / 4070 Ti had competition from AMD. The 5080 does not. This isn't a "why are customers so braindead" moment, it's a "monopolies suck and we need AMD or someone else to become competitive" moment.

Nvidia has zero competition on the high end, and the AI bubble means they will sell cards at whatever price Nvidia wants even if gamers completely ignore them. Blaming consumers for that is a little... odd.
I'm not mad, I'm baffled!

Nvidia's monopoly doesn't mean that you can't criticise their practices, especially when it worked before.

The problem here is that this community (the one with interest in and knowledge of GPUs), which was vocal about the same problem when it was obvious, is now only mildly disappointed with this card, when it's the same thing happening but even worse.

Word of mouth counts in the PC market; if the worst this card gets is a "nothing special", it will sell well.
 
That's right, this is the real 5070 Ti, upsold. This card could have received a 330 W TDP without any meaningful performance loss. The 5080 Ti will be the legit 5080: a heavily cut-down GB202 die which will be somewhat faster than the 4090, with a fake, seemingly good MSRP that in practice will still be high, and with likely artificially created scarcity.

At this point Nvidia is not that hard to predict in its market behaviour.

It's always hilarious reading these posts. Yes, the $3 trillion company should listen to you, random internet user, regarding how to name and price their products. Your word is the law, you have the divine right to declare that this particular GPU must be named what you choose and coincidentally priced at exactly the amount of money in your checking account. Everybody knows the fundamental constants of the universe: the speed of light, the mass of a proton, the gravitational constant, and that the second-largest chip can only be used to make an xx70 card. We all learned that in school.

Come on man. The names are just branding, and Nvidia is divvying up the chips and pricing them according to what they think the market can handle. If they're wrong, they'll correct it (like they did with the 4070 Ti and 4080). It's fine to not be happy about the price or the performance compared to the last gen, but it's just silly to quibble over the name. And again, if the price is too high, the "street prices" will eventually drop and Nvidia will readjust the lineup.
 
Really don't think 'not increasing price' compared against mid-gen refreshes (or more accurately, course corrections) is any kind of upside. Like its price, that is a nominal trait. Not good or bad. Just the extant status quo. If this thing was $2 cheaper than the 4080S at MSRP, sure. Whatever. But it's not. It's a holding pattern, and for barely 12ish% improvement overall and almost nothing for perf/watt? Just not worth it.

See y'all in about 8 months if/when Blackwell SUPER gets announced.
 
I'm not mad, I'm baffled!

Nvidia's monopoly doesn't mean that you can't criticise their practices, especially when it worked before.

The problem here is that this community (the one with interest in and knowledge of GPUs), which was vocal about the same problem when it was obvious, is now only mildly disappointed with this card, when it's the same thing happening but even worse.

Word of mouth counts in the PC market; if the worst this card gets is a "nothing special", it will sell well.
I think you are attributing to word of mouth what can only be properly attributed to marketplace competition. If there is no competition for the 5080 except for the even more hilariously expensive 4090 and 5090, word of mouth won't matter.

Just look at the Ampere generation. 100% word-of-mouth agreement on tech forums that crypto-scam prices were absurd. It did nothing - you either got lucky with your F5 key when there was a restock, or you paid scalper prices if you wanted a card.
 
Just because there was no outcry doesn't mean Nvidia is going to have any more success than if there was an outcry.
You'll have to check the sales for that. $1,000 for any computer part is too much, so anyone who buys such a thing must convince themselves it's worth it somehow.
I have no idea how anyone will convince themselves a 5080 is worth it, in the same way that a 4080 wasn't either. These are simply rip-offs. If they aren't sexy, they won't sell, unless the market is held captive (which it sort of is).

Explain the RTX 4090, which absolutely nobody needs, and yet has outsold anything AMD has produced in the last several years. That tells me that $1,000 isn't too much. Price is only part of the value equation.

Just a guess, could it be because of the introduction of ray reconstruction?

I think RR might be slower than other denoisers, just much higher quality.
 
Just because there was no outcry doesn't mean Nvidia is going to have any more success than if there was an outcry.
You'll have to check the sales for that. $1,000 for any computer part is too much, so anyone who buys such a thing must convince themselves it's worth it somehow.
I have no idea how anyone will convince themselves a 5080 is worth it, in the same way that a 4080 wasn't either. These are simply rip-offs. If they aren't sexy, they won't sell, unless the market is held captive (which it sort of is).
The outcry worked, didn't it?

No one can say how well it will sell, but it will sell a lot better than if there was an outcry.

For a lot of people the perception of a product is more important than the product itself, especially when it's expensive.

Nvidia's monopoly was only possible because of the inflated perception of their products among the wider public.

I built a decent number of PCs for other people when I was younger, and I can tell you that people can be oblivious on this subject.
I had people asking me to put in more expensive parts even when, for their use case, it was just wasting money, or to use Nvidia because they thought that Radeon cards were all slower and broke easily.
 
I think you are attributing to word of mouth what can only be properly attributed to marketplace competition. If there is no competition for the 5080 except for the even more hilariously expensive 4090 and 5090, word of mouth won't matter.

Just look at the Ampere generation. 100% word-of-mouth agreement on tech forums that crypto-scam prices were absurd. It did nothing - you either got lucky with your F5 key when there was a restock, or you paid scalper prices if you wanted a card.
On tech forums, yes, but on the other hand the average Joe was bombarded by people and social media saying that crypto was free money, and yet some didn't make that mistake because of the tech forums.

For example, I personally dissuaded six people from buying Vegas for crypto after they had been convinced by others to try their hand at it.

Perception of a product is very important. If everyone on tech forums is just mildly disappointed with the 5080, there's no chance the wider public will notice, and if they don't notice, they will buy, when at least some wouldn't if they had a notion of what's happening.

And for Nvidia what matters is profit; if the card sells less but the increased margin outweighs that, it's great for them. The only way to get lower prices out of a monopoly is to create a bad perception of its products so that sales dip.
 
Performs a bit better than the 4080 Super, but also uses a bit more power.

It's like a... 4080 Ti? 4080 Super Duper?
 
Nvidia's monopoly was only possible because of the inflated perception of their products among the wider public.

What sort of revisionist history is this? The last time AMD was competitive with the top GPU from Nvidia was the Radeon HD 7970... 13 years ago...

Nvidia has a monopoly because AMD has been producing second-class graphics cards for over a decade.
 
If you really are a game(s) programmer then you know we've reached the end of 3D rasterization. Everyone - Nvidia, AMD, Sony, Microsoft, Epic, etc. - has told you this.



You're lying again.
Complete bull crap about the rasterization limit; or a game like Chernobylite wouldn't gain any performance in supported mGPU mode with two RTX 4090s.

Now that I can agree with.

There was a comment above about RT handling the draw distance. No, that is more of a bandwidth and throughput problem.
You cannot overcome that problem when the ROPs share those resources at the same render times. Hence, when Nvidia said they lowered inter-RT-core latency by half or more, I said it wouldn't matter for the end render, and I was correct.
 
Alright, I'll feed the troll. What would you recommend to me if I had $800 to $1,200 to spend?
Of course the 5070, because its performance exceeds the 4090!

You do these reviews, and we all read them, and you don't need to take users for naive simpletons, as Jensen does. You should boldly say that you will not receive any future cards for review if you do not present this wretchedness in a better light than it actually is.

And you definitely know the answer to your question, but you wrote it wrong ;)

The other baffling thing with RDNA 3 was the shaders. They doubled the shader count per CU for, apparently, zero benefit. The RT performance per CU isn't any better than RDNA 2, but it DOES bloat the die sizes for no reason.
It seems you are out of the loop, even though you are writing on this enthusiast forum!
If you read the C&C reviews, you would know that it was not the number of shaders that was doubled (AMD has repeatedly pointed this out) but the number of ALUs in them. And yes, they work and increase performance under very specific, known conditions. And yes, this has nothing to do with the hardware RT cores.
 
Complete bull crap about the rasterization limit; or a game like Chernobylite wouldn't gain any performance in supported mGPU mode with two RTX 4090s.
He's not talking about a limit on the performance of GPUs. Obviously you can just keep making bigger GPUs with more cores and more memory to keep calculating more polygons faster.

He's talking about the end of 3D rasterization as a means to improve how games look and work. We obviously reached the point years ago where adding more and more polygons to an asset is just not worth the diminishing returns. If you want proof, just look at The Last of Us on PS4 vs "PS5 Pro Enhanced." Looks the exact same despite a 10x increase in compute power. Just dumping more shaders into a GPU so the blade of grass can have 8000 polygons instead of just 5000 is silly.

Ray tracing actually improves how games look, which is why Nvidia is going all-in on ray tracing hardware and AI models to mimic it instead of improving their shader architecture.
 
NVIDIA's FE cooler uses five double-length heatpipes. The heatsink provides cooling not only for the GPU, but also for the memory chips and VRM circuitry.

This is a much simpler cooling design than on the RTX 5090, which used liquid metal and a vapor-chamber.

Isn't it basically the same?
 
Isn't it basically the same?
That's what I thought, and it's being reported by other reviewers: exact same cooler, vapor chamber, just no liquid metal.
 