
NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

I will get the 5090.
I have no doubt it will exceed my 4090 and my 3090.
I am just sad it'll cost $2000 on top of the cost of the new 15900k and the new motherboard for it.
 

RTX 5090: 32 GB GDDR7, 512-bit, 600 W = awesome!

RTX 5080: 16 GB GDDR7, 256-bit, 400 W TDP = P O S!

Oh my gosh..... are they still defending 16 GB of VRAM?

Don't they know that at 4K resolution, AAA PC games can use more than 20 GB of VRAM?

Why, Nvidia??
 
I have a hunch about the RTX 5080 as well. Just like when Lovelace came out, suddenly the RTX 3090 and 3090 Ti weren't good enough for DLSS 3.0 frame generation because they lacked "fourth-generation Tensor cores". Nvidia will offer a lot of shiny new AI-oriented tools with the Blackwell release that will run much faster on the new hardware, or will be locked specifically to Blackwell hardware, because older cards lack electrolytes... sorry, the newest-generation magical AI acceleration cores.

I think that's a pretty safe hunch. Their excuse for the 3000 and 2000 series not getting Frame generation was insufficient performance. By extension the same argument could be made anytime they add or improve specialized hardware or when there is a big enough increase in RT performance. In short, they can use any reason to lock features.
 
I can't wait for the 5090; that's going to be nuts. I love the 4090. I expect I'll have to pay three grand for it, though.
 
As NV is officially the only top-end player now, get used to it: only the x090 will see a decent uplift in raw performance.
All others will stay in their raster place (see 3060 vs 4060, and get the picture for 4080 vs 5080), while the new line gets DLSS X (synthetically better fps), which will make for the only difference in performance.

We might also see a new tier structure to better NV's $$$ bank account.
 
Because the 4070 Ti Super is hardly cutting it for flight sim VR usage, and with MS FS2024 two months away, I was really hoping that the 5080 would at least trade blows with the 4090 and be priced around 1200 bucks. I guess Ngreedia doesn't want us to buy its dies anymore, besides junk unusable for AI training. It's a sad state of affairs: AMD giving up on the high end, Intel's GPUs being the joke of the year(s), and Jensen not giving a F about us ordinary buyers with limited budgets.
Wtf are you on about? Nvidia has a nice 4090 waiting for your kidney.

'Ordinary buyers with limited budgets?' Lol. I don't think you realize what you're saying. We have all already conceded that idea by continuously buying high-end GPUs. There is nothing ordinary about it. You could buy a 4090; you just don't want to. Nvidia has you by the balls already.

I think that's a pretty safe hunch. Their excuse for the 3000 and 2000 series not getting Frame generation was insufficient performance. By extension the same argument could be made anytime they add or improve specialized hardware or when there is a big enough increase in RT performance. In short, they can use any reason to lock features.
It's either an artificially designed lock or a well-placed incentive, but yeah, that's the M.O., and it always has been... except now the actual hardware stagnates hard on top of it.
 
Definitely 5080 Ti with memory/cuda somewhere in between this gen lol.

Unlikely. IMHO they want to have two gens on the market at the same time to evaporate old-gen stock at high prices.

So it will be 4080 (320W, $1200), 5080(400W, $???), 4090 (450W, $1600), 5090(600W, $???).

We won't get a massive "free" performance increase like with the 10xx series.

If the rumors are true, the 5080 with 16 GB VRAM, a 256-bit memory bus and ~11K CUDA cores seems really castrated. And all this at a power level of at least 400 W; assuming AIBs will add 30-50 W on top of that, it doesn't look good. I think 4080 Super owners might be laughing right now.

600 W and 400 W: that's a 150 W (33%) and an 80 W (25%) increase compared to the 4090 and 4080. They'd better offer a performance uplift significantly higher than these percentages, or it will be a flop.
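Quick back-of-the-envelope check on those percentages (all TDP figures here are rumored numbers, not anything official):

```python
# Rumored TDPs in watts; none of these are confirmed by Nvidia.
tdp = {"4080": 320, "5080": 400, "4090": 450, "5090": 600}

for old, new in [("4090", "5090"), ("4080", "5080")]:
    delta = tdp[new] - tdp[old]
    pct = round(100 * delta / tdp[old])
    print(f"{old} -> {new}: +{delta} W ({pct}%)")
# 4090 -> 5090: +150 W (33%)
# 4080 -> 5080: +80 W (25%)
```

So to merely hold perf-per-watt steady, the 5090 and 5080 would need +33% and +25% performance respectively before any architectural gain even shows up.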

Don't worry, it will be "5 times faster than the previous generation"* on Nvidia's charts.


*- in Cyberpunk 2077 at 8k, FrameGen 2, DLSS 5, Meme GPU Cache v3.0, VRAM compression 5.0, RTX dirt road finding, when you look at the wall.
 
So it will be 4080 (320W, $1200), 5080(400W, $???), 4090 (450W, $1600), 5090(600W, $???).

We won't get a massive "free" performance increase like with the 10xx series.

The way they cut the RTX 5080 (if these rumours are correct, of course) would really make it hard for Nvidia to get their usual approx. 50% performance increase over the RTX 4080, but I think it won't matter: they will mostly show the performance increase with the RTX 5090, and of course all the performance increases from the new tensor core technology in "AI" areas, DLSS and Ray Tracing that will make comparison to the previous generation unfair.

And reviewers will be competing over who can declare raster performance in games more dead and outdated.

And we will get a price increase: even without the AI craze we would see a 50% price increase for a 50% performance increase (it won't matter that it's not across the board).

With the current market situation, where making too many gaming cards would actually eat into Nvidia's Data Center profitability (now the only thing really bringing in money), I predict Nvidia will try to profit from the people who will use these new cards for AI productivity and will be willing to fork over serious money for their tools, and will limit the actual volume of cards made. They can do both of these things with a single tweak: make the price ridiculously high.

So I don't think +100% is out of the question.
 
Definitely 5080 Ti with memory/cuda somewhere in between this gen lol.

Assuming these rumours are accurate, I'd be surprised to see an xx90 with the full memory bus/die; the 4090 was quite cut down. I don't think they're going to jump straight from 450 W to 600 W.

My guess: 500 W.

20,000 cores.
Of course; the binning for them just takes time. They'll start making them once they have enough chips that didn't make the cut for the 5090.
 
Not a chance. The last word is that it will be at CES, so early January, and that might only be the official announcement. I can't recall now.

Yeah, it looks like a January 2025 launch from the latest rumor.

Hopefully the 5090 will be >60% faster than the 4090, so 4K 240 Hz screens make more sense :D
 
RDNA 5 could be a real competitor since it will use a real MCM design with up to 3 chips on it, and it's aimed at high-end performance!
AMD has been aiming for the high end forever and has never come remotely close to achieving it. How do things change after decades of constant underdelivery? Doesn't compute at all.

There's no reason to even think of RDNA 5 as something that can change the market. We can only hope RDNA 4 won't be AMD's swan song.
 
With the current market situation, where making too many gaming cards would actually eat into Nvidia's Data Center profitability (now the only thing really bringing in money), I predict Nvidia will try to profit from the people who will use these new cards for AI productivity and will be willing to fork over serious money for their tools, and will limit the actual volume of cards made. They can do both of these things with a single tweak: make the price ridiculously high.

You (and others, actually) underestimate the amount of people willing to spend big money for LLMs and AI image models for mostly entertainment rather than "productivity".
There are many here: https://old.reddit.com/r/LocalLLaMA/
 
You (and others, actually) underestimate the amount of people willing to spend big money for LLMs and AI image models for mostly entertainment rather than "productivity".
There are many here: https://old.reddit.com/r/LocalLLaMA/

I bet there are many, but I think still insignificant compared to cryptominers buying cards by the truckload, or companies now ordering their AI accelerators by the truckload.
 
Seriously??
That 5080 should be sold as a 5070, considering the specs compared to the 5090. My old 3080 card has a 320-bit bus and 760 GB/s of bandwidth....
Seriously nGreedia??
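For anyone comparing bus widths: peak bandwidth is bus width times per-pin data rate, so a narrower bus can still come out ahead if the memory is faster. A rough sketch (the 28 Gbps GDDR7 figure is a rumor, not confirmed by Nvidia):

```python
def bandwidth_gb_s(bus_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_bits / 8 * pin_rate_gbps

print(bandwidth_gb_s(320, 19))   # RTX 3080, 19 Gbps GDDR6X -> 760.0 GB/s
print(bandwidth_gb_s(256, 28))   # rumored 5080, 28 Gbps GDDR7 -> 896.0 GB/s
```

So even on the narrower 256-bit bus, GDDR7 would push the rumored 5080 past the 3080's 760 GB/s; the stronger complaint is the 16 GB capacity, not the bandwidth.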
 
A 400 W TDP for a 256-bit mid-to-high-end GPU is not normal. In an average ATX case it will be hot and loud, and I'm talking about good air cooling here. CPU temps and noise will also be dramatically affected by that.

Seriously??
That 5080 should be sold as a 5070, considering the specs compared to the 5090. My old 3080 card has a 320-bit bus and 760 GB/s of bandwidth....
Seriously nGreedia??
Look at the Steam GPU market share: the RTX 4060 Ti is doing great despite those crap specs and performance. It's not about Nvidia; it's all about the people who are buying them!
 
I wonder how many people will have enough disposable income to buy one of these.
You underestimate stupidity.
Just like with phones, brand clothes and accessories, people just buy them on credit, with money they don't have.
 
That 5080 should be sold as a 5070, considering the specs compared to the 5090.
Yeah, and the 4090 is an xx80-tier GPU at best. The 4080 Super is a true 4070, the 4070 is a true 4060, the 4060 Ti is a true 4050, and the 4060 is true nonsense.
Why do better if it still won't have any trouble selling?
 
Nvidia keeps re-releasing the 3090 as the 4080 and now the 5080. Granted, the 4nm node brought more L2$ and double the clock speed, but that's a given every other gen. 400 W is strictly water cooling territory. Too bad it's not 3nm.
 
I didn't say it can't work fine at 12 V, but 2025 is just around the corner now, and it would be more eco-friendly at a 24 V level with the same power cables.
Not sure if this is a troll or just uninformed, but either way, no, it would not be more *eco*. 600 W is 600 W; pushing it over 24 V doesn't change anything. You may be thinking of heat, like with EV batteries, but that's not how GPUs work.
BTW
2 times no: you don't need a different mobo or a new PSU design, just minor changes in the PSU, that's all.
Oh sure, just introduce a new voltage into a system, what could POSSIBLY go wrong?

Except now you need to isolate the VRMs on the GPU so 24 V doesn't find its way onto the 12 V rail; right now they use a common 12 V ground. That will make GPUs more expensive. And you need a new connector, because otherwise you're going to have people plugging 24 V cables into 12 V cards and causing some fireworks. Not to mention your PSUs are now more expensive because, well, you still need the 12 V lines.

All this for what? To have some slightly cooler-running cables? Congrats. There's a reason we never swapped out the 5 V and 3.3 V lines for 12 V in the ATX standard... the juice ain't worth the squeeze.
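To put rough numbers on the "slightly cooler cables" point: for the same delivered power, doubling the voltage halves the current, and resistive loss in the cable scales with current squared (P = I²R). The 10 mΩ cable resistance below is an illustrative guess, not a measured value:

```python
def cable_loss_w(power_w: float, volts: float, cable_ohms: float = 0.010) -> float:
    """Resistive loss in the power cable: I = P / V, then P_loss = I^2 * R."""
    current = power_w / volts
    return current ** 2 * cable_ohms

print(cable_loss_w(600, 12))  # 50 A through the cable -> 25.0 W of heat
print(cable_loss_w(600, 24))  # 25 A through the cable -> 6.25 W of heat
```

Saving ~19 W of cable heat on a 600 W card is real but small, which is exactly why it's hard to justify a new connector and PSU ecosystem for it.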

Nvidia keeps re-releasing the 3090 as the 4080 and now the 5080. Granted, the 4nm node brought more L2$ and double the clock speed, but that's a given every other gen. 400 W is strictly water cooling territory. Too bad it's not 3nm.
400 W is not strictly water cooling. IDK what you're smoking, but the 3090 Ti pulled 525 W, peaked at over 600 W, and ran fine on its air cooler.
 
You underestimate stupidity.
Just like with phones, cars, or brand clothes and accessories, people just buy them on credit, with money they don't have.
I honestly forget these things because I don't use credit cards. You are right, though.
 
IDK what you're smoking, but the 3090 Ti pulled 525 W, peaked at over 600 W, and ran fine on its air cooler.
Some people are a little too sensitive, so fans running beyond 1K RPM and temperatures reaching 80 °C are a red-flag-compilation-level nuisance for them.

I personally play with my headphones on, so I can't hear it (neither can my neighbours and roommates); that's why fans must blow. However, I'm not the only valid metric here.
 
400 W is not strictly water cooling. IDK what you're smoking, but the 3090 Ti pulled 525 W, peaked at over 600 W, and ran fine on its air cooler.
That shit is loud, I mean really loud, in a closed average-sized case. Temperatures are a different matter entirely; they are not annoying while running hot.
 
You underestimate stupidity.
Just like with phones, cars, or brand clothes and accessories, people just buy them on credit, with money they don't have.
I feel like adding cars to the list is a bit much :D In some parts of the world, cars are a necessary expense on the same level as a home, and you need "stupid" people selling their older cars on the used market. New entry-level cars are a bit hard to get without credit unless you save money for a few years... but where I live you need a car to work, because public transport is just bad, very bad. I had a young coworker who lost his job because his used car broke down and he couldn't get another in a timely fashion. And that job isn't compatible with remote work. Young adults often get help from their parents, or they move to the continent, where cars are not mandatory.

Now, for GPUs: if computers/gaming are your main hobby, I feel like it's not that hard to put 1k aside every four years. In three months I've already saved 200€ towards my next tech purchase. I've met few people who buy the high end of every generation without having the means to do so; modest people tend to hold onto those items for a few years.

But credit in the EU and related places also seems to work differently than in the US: x3 or x4 payment plans require an authorization for the full price of the item, credit cards have a fairly low limit, and bank credits involve a ton of background checks.
 