Thursday, September 26th 2024

NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

Thanks to the renowned NVIDIA hardware leaker kopite7kimi on X, we are getting information about the final configurations of the first wave of NVIDIA's upcoming GeForce RTX 50 series "Blackwell" graphics cards. The two leaked GPUs are the GeForce RTX 5090 and RTX 5080, which now feature a more significant gap between the xx80 and xx90 SKUs. For starters, we have the highest-end GeForce RTX 5090. NVIDIA has decided to use the GB202-300-A1 die and has enabled 21,760 FP32 CUDA cores on this top-end model. Accompanying the massive 170 SM GPU configuration, the RTX 5090 carries 32 GB of GDDR7 memory on a 512-bit bus, with the GDDR7 running at 28 Gbps per pin. This translates to 1,792 GB/s of memory bandwidth. All of this fits within a 600 W TGP.
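
For those who want to check the math, the bandwidth figure follows directly from the rumored bus width and data rate; a minimal sketch of the arithmetic, using only the leaked numbers above:

# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate
bus_width_bits = 512      # rumored RTX 5090 memory bus
data_rate_gbps = 28       # rumored GDDR7 speed per pin, in Gbit/s
bandwidth_gb_s = bus_width_bits // 8 * data_rate_gbps
print(f"{bandwidth_gb_s} GB/s")   # prints: 1792 GB/s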

When it comes to the GeForce RTX 5080, NVIDIA has decided to further separate its xx80 and xx90 SKUs. The RTX 5080 carries 10,752 FP32 CUDA cores paired with 16 GB of GDDR7 memory on a 256-bit bus. With the GDDR7 also running at 28 Gbps, memory bandwidth is likewise halved at 896 GB/s. This SKU uses the GB203-400-A1 die, which is designed to run within a 400 W TGP envelope. For reference, the RTX 4090 has 68% more CUDA cores than the RTX 4080, while the rumored RTX 5090 has around 102% more CUDA cores than the rumored RTX 5080, which means NVIDIA is widening the separation between its top SKUs even further. We are curious to see at what price points NVIDIA places its upcoming GPUs, so that we can weigh the generational uplift as well as the widened gap between the xx80 and xx90 models.
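
The widening gap is easiest to see as a ratio; a small sketch comparing the shipping Ada core counts with the leaked Blackwell ones:

# CUDA core gap between the xx90 and xx80 SKUs, per generation
ada = {"RTX 4090": 16384, "RTX 4080": 9728}         # shipping Ada specifications
blackwell = {"RTX 5090": 21760, "RTX 5080": 10752}  # leaked Blackwell specifications

ada_gap = ada["RTX 4090"] / ada["RTX 4080"] - 1                    # ~0.68 -> 68% more cores
blackwell_gap = blackwell["RTX 5090"] / blackwell["RTX 5080"] - 1  # ~1.02 -> ~102% more cores
print(f"Ada gap: {ada_gap:.0%}, Blackwell gap: {blackwell_gap:.0%}")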
Sources: kopite7kimi (RTX 5090), kopite7kimi (RTX 5080)

181 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

#76
arni-gx
rtx 5090 32gb gddr7 512bit 600w = awesome!

rtx 5080 16gb gddr7 256bit tdp 400w = P O S!

Oh my gosh..... are they still defending 16gb vram?

Don't they know that at 4K resolution, VRAM usage in AAA PC games can be more than 20 GB?

why, nvidia??
Posted on Reply
#77
evernessince
BwazeI have a hunch about RTX 5080 as well. Just like when Lovelace came out, suddenly the RTX 3090 and 3090 Ti weren't good enough for DLSS 3.0 frame generation because they lacked "fourth-generation Tensor cores". Nvidia will offer a lot of new shiny AI-oriented tools with the Blackwell release that will run much faster on new hardware, or that will be specifically locked to Blackwell hardware - because older cards lack electrolytes, er, the newest generation of magical AI acceleration cores.
I think that's a pretty safe hunch. Their excuse for the 3000 and 2000 series not getting Frame generation was insufficient performance. By extension the same argument could be made anytime they add or improve specialized hardware or when there is a big enough increase in RT performance. In short, they can use any reason to lock features.
Posted on Reply
#78
InVasMani
Just waiting on that RTX5060Ti 128-bit GDDR6 32GB with 4704 shaders bomb to drop... :laugh:
Posted on Reply
#79
TheEndIsNear
I can't wait for the 5090; that's going to be nuts. I love the 4090. I expect I'll have to pay 3 grand for it, though.
Posted on Reply
#80
Dirt Chip
As NV is officially the only top-end player now, get used to it: only the x090 will see a decent uplift in raw performance.
All others will stay in their raster place (see 3060 vs 4060 and get the picture for 4080 vs 5080), with the new line getting dlssX (synthetically better fps), which will be the only difference in performance.

We might also see a new tier structure to better NV's $$$ bank account.
Posted on Reply
#81
Vayra86
RedelZaVednoBecause the 4070 Ti Super is hardly cutting it for flight sim VR usage, and with MS FS2024 two months away I was really hoping that the 5080 would at least trade blows with the 4090 and be priced around 1200 bucks. I guess Ngreedia doesn't want us to buy its dies anymore, besides junk unusable for AI learning. It's a sad state of affairs: AMD giving up on the high end, Intel's GPUs being a joke for a year (or years), and Jensen not giving a F about us ordinary buyers with limited budgets.
Wtf are you on about, Nvidia has a nice 4090 waiting for your kidney

'Ordinary buyers with limited budgets?' Lol. I think you don't realize what you are saying. We have all already conceded that idea by continuously buying high-end GPUs. There is nothing ordinary about it. You could buy a 4090. You just don't want to. Nvidia has you by the balls already.
evernessinceI think that's a pretty safe hunch. Their excuse for the 3000 and 2000 series not getting Frame generation was insufficient performance. By extension the same argument could be made anytime they add or improve specialized hardware or when there is a big enough increase in RT performance. In short, they can use any reason to lock features.
It's either an artificially designed lock or a well-placed incentive, but yeah, that's the M.O. and it always has been... except now the actual hardware stagnates hard on top of it.
Posted on Reply
#82
kawice
dgianstefaniDefinitely 5080 Ti with memory/cuda somewhere in between this gen lol.
Unlikely; IMHO they want to have two gens on the market at the same time to evaporate old-gen stock at high prices.

So it will be 4080 (320W, $1200), 5080(400W, $???), 4090 (450W, $1600), 5090(600W, $???).

We won't get a massive "free" performance increase like we did with the 10xx series.

If the rumors are true, the 5080 with 16 GB of VRAM, a 256-bit memory bus and ~11K CUDA cores seems really castrated. And all this running at a power level of at least 400 W - assuming AIBs will add 30-50 W on top of that, it doesn't look good. I think 4080 Super owners might be laughing right now.
N3utro600 W and 400 W, that's a 150 W (33%) and an 80 W (25%) increase compared to the 4090 and 4080. They'd better offer a performance uplift significantly higher than these percentages or it will be a flop
Don't worry, it will be "5 times faster than the previous generation"* on Nvidia's charts.


*- in Cyberpunk 2077 at 8k, FrameGen 2, DLSS 5, Meme GPU Cache v3.0, VRAM compression 5.0, RTX dirt road finding, when you look at the wall.
Posted on Reply
#83
Bwaze
kawiceSo it will be 4080 (320W, $1200), 5080(400W, $???), 4090 (450W, $1600), 5090(600W, $???).

We won't get a massive "free" performance increase like we did with the 10xx series.
The way they cut the RTX 5080 (if these rumours are correct, of course) would really make it hard for Nvidia to get their usual approx. 50% performance increase over the RTX 4080, but I think it won't matter - they will mostly show the performance increase with the RTX 5090, and of course all the gains from new tensor core technology in "AI" areas, DLSS and ray tracing, which will make comparisons to the previous generation unfair.

And reviewers will be competing over who can declare raster performance in games more dead and outdated.

And we will get a price increase - even without the AI craze we would see a 50% price increase for a 50% performance increase (it won't matter that it isn't across the board).

With the current market situation, where making too many gaming cards would actually eat into Nvidia's Data Center profitability - now the only thing really bringing in money - I predict Nvidia will try to profit from people who will use these new cards for AI productivity and are willing to fork over serious money for their tools, and will limit the actual volume of cards made - and they can do both of these things with a single tweak: make the price ridiculously high.

So I don't think +100% is out of the question.
Posted on Reply
#84
TheDeeGee
dgianstefaniDefinitely 5080 Ti with memory/cuda somewhere in between this gen lol.

Assuming these rumours are accurate. I'd be surprised to see an xx90 with full memory bus/die. 4090 was quite cut down. I don't think they're going to jump straight to 600 W from 450 W.

My guess 500 W.

20,000 cores.
Of course, the binning for them just takes time. They'll start making them once they have enough chips that didn't make the cut for the 5090.
Posted on Reply
#85
nguyen
64KNot a chance. The last word is that it will be CES, so early January, and that might only be the official announcement. I can't recall now.
Yeah, looks like a January 2025 launch based on the latest rumor.

Hopefully the 5090 will be >60% faster than the 4090, so 4K 240 Hz screens make more sense :D
Posted on Reply
#86
Beginner Macro Device
x4it3nRDNA 5 could be a real competitor since it will use a real MCM design with up to 3 chips on it, and it's aimed for High-End performance!
AMD have been aiming for high-end results forever, and they have never come remotely close to achieving it. How would things change after decades of constant underdelivery? Doesn't compute at all.

There's no reason to even think of RDNA5 as something that can change the market. We can only hope RDNA4 won't be AMD's swan song.
Posted on Reply
#87
Solid State Brain
BwazeWith the current market situation, where making too many gaming cards would actually eat into Nvidia's Data Center profitability - now the only thing really bringing in money - I predict Nvidia will try to profit from people who will use these new cards for AI productivity and are willing to fork over serious money for their tools, and will limit the actual volume of cards made - and they can do both of these things with a single tweak: make the price ridiculously high.
You (and others, actually) underestimate the number of people willing to spend big money on LLMs and AI image models mostly for entertainment rather than "productivity".
There are many here: old.reddit.com/r/LocalLLaMA/
Posted on Reply
#88
Bwaze
Solid State BrainYou (and others, actually) underestimate the number of people willing to spend big money on LLMs and AI image models mostly for entertainment rather than "productivity".
There are many here: old.reddit.com/r/LocalLLaMA/
I bet there are many, but I think it's still insignificant compared to cryptominers buying cards by the truckload, or companies now also ordering their AI accelerators by the truckload.
Posted on Reply
#89
Prima.Vera
Seriously??
That 5080 should be sold as a 5070, considering the specs compared to the 5090. My old 3080 card has a 320-bit bus and 760 GB/s of bandwidth....
Seriously nGreedia??
Posted on Reply
#90
Krit
400 W TDP for a 256-bit mid-to-high-end GPU - that's not normal. In an average ATX case it will be hot and loud, and I'm talking about good air cooling here. CPU temps and noise will also be dramatically affected by that.
Prima.VeraSeriously??
That 5080 should be sold as a 5070, considering the specs compared to the 5090. My old 3080 card has a 320-bit bus and 760 GB/s of bandwidth....
Seriously nGreedia??
Look at the Steam GPU market share - the RTX 4060 Ti is doing great for those crap specs and that performance. It's not about Nvidia, it's all about the people who are buying them!
Posted on Reply
#91
Prima.Vera
natr0nI wonder how many people will have so much disposable income to buy one of these.
You underestimate stupidity.
Just like with phones, brand clothes and accessories, people just buy them on credit, with money they don't have.
Posted on Reply
#92
Beginner Macro Device
Prima.VeraThat 5080 should be sold as a 5070, considering the specs compared to the 5090.
Yeah, and 4090 is an xx80 tier GPU at best. 4080 Super is a true 4070. 4070 is a true 4060. 4060 Ti is a true 4050. 4060 is true nonsense.
Why do better if it still won't have any trouble selling?
Posted on Reply
#93
N/A
Nvidia keeps repeating the 3090 as the 4080 and now the 5080. Granted, the 4 nm node brings a bigger L2$ and double the clock speed, but that's a given every other gen. 400 W is strictly water cooling territory. Too bad it's not 3 nm.
Posted on Reply
#94
TheinsanegamerN
pk67I didn't say it can't work fine at 12 V, but we have 2025 just around the corner now, and it would be more eco-friendly at a 24 V level with the same power cables.
Not sure if this is a troll or just uninformed, but either way, no, it would not be more *eco*. 600 W is 600 W. Pushing it over 24 V doesn't change anything. You may be thinking of heat, like with EV batteries, but that's not how GPUs work.
pk67BTW
2 times no - you don't need a different mobo or a new PSU design, just minor changes in the PSU - that's all.
Oh sure, just introduce a new voltage into a system, what could POSSIBLY go wrong?

Except now you need to isolate the VRMs on the GPU so 24 V doesn't find its way onto the 12 V rail; right now they use a common 12 V ground. That will make GPUs more expensive. And now you need a new connector, because otherwise you're gonna have people plugging 24 V cables into 12 V cards and causing some fireworks. Not to mention your PSUs are now more expensive because, well, you still need the 12 V lines.

All this for what? To have some slightly cooler-running cables? Congrats. There's a reason we never swapped out the 5 V and 3.3 V lines for 12 V on the ATX standard... the juice ain't worth the squeeze.
N/ANvidia keeps repeating the 3090 as the 4080 and now the 5080. Granted, the 4 nm node brings a bigger L2$ and double the clock speed, but that's a given every other gen. 400 W is strictly water cooling territory. Too bad it's not 3 nm.
400w is not strictly water cooling. IDK what you're smoking, but the 3090ti pulled 525w, peaked at over 600w, and ran fine on its air cooler.
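
To put some rough numbers on the 12 V vs. 24 V point above: total board power is unchanged, only the conduction loss in the cable shrinks, since it scales with the square of the current. A purely illustrative sketch - the ~10 mOhm round-trip cable resistance is an assumed figure, not a measurement:

# Rough cable-loss comparison for a 600 W card (illustrative numbers only)
power_w = 600.0
r_cable_ohm = 0.010  # assumed ~10 mOhm round-trip cable resistance (made-up figure)

for volts in (12.0, 24.0):
    current_a = power_w / volts            # I = P / V
    loss_w = current_a ** 2 * r_cable_ohm  # conduction loss in the cable: I^2 * R
    print(f"{volts:.0f} V: {current_a:.0f} A in the cable, ~{loss_w:.0f} W lost as heat")
# 12 V -> 50 A and ~25 W of cable heat; 24 V -> 25 A and ~6 W. The GPU itself still dissipates 600 W.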
Posted on Reply
#95
natr0n
Prima.VeraYou underestimate stupidity.
Just like with phones, cars, or brand clothes and accessories, people just buy them on credit, with money they don't have.
I honestly forget these things because I don't use credit cards. You are right, though.
Posted on Reply
#96
Beginner Macro Device
TheinsanegamerNIDK what you're smoking, but the 3090ti pulled 525w, peaked at over 600w, and ran fine on its air cooler.
Some people are a little too sensitive, so fans running beyond 1K RPM and temperatures reaching 80 °C are a red-flag-compilation-level nuisance for them.

I personally play with my headphones on, so I can't hear it (and neither can my neighbours and roommates) - that's why fans must blow. However, I'm not the only valid metric here.
Posted on Reply
#97
Krit
TheinsanegamerN400w is not strictly water cooling. IDK what you're smoking, but the 3090ti pulled 525w, peaked at over 600w, and ran fine on its air cooler.
That shit is loud, I mean really loud, in a closed, average-sized case. Temperatures are a different matter entirely - running hot isn't annoying the way noise is.
Posted on Reply
#98
dyonoctis
Prima.VeraYou underestimate stupidity.
Just like with phones, cars, or brand clothes and accessories, people just buy them on credit, with money they don't have.
I feel like adding cars to the list is a bit much :D In some parts of the world cars are a necessary expense on the same level as a home, and you need "stupid" people selling their older cars on the used market. New entry-level cars are a bit hard to get without credit unless you save money for a few years... but where I live you need a car to work, because public transport is just bad, very bad. I had a young coworker who lost his job because his used car broke down and he couldn't get another in a timely fashion. And that job isn't compatible with remote work. Young adults often get help from their parents, or they move to the continent, where cars are not mandatory.

Now, for GPUs: if computers/gaming are your main hobby, I feel like it's not that hard to put 1k aside every four years. In three months I've already saved 200€ towards my next tech purchase. I've met few people who buy the high end of every generation without having the means to do so; people of modest means tend to hold onto those items for a few years.

But credit in the EU and related places also seems to work differently than in the US: x3 or x4 payment plans require an authorization for the full price of the item, credit cards have a fairly low limit, and bank loans involve a ton of background checks.
Posted on Reply
#99
Mr. Perfect
potsdaman70 series again with 12 GB :mad::rolleyes:
And an 8GB 5060 incoming... :kookoo:
Posted on Reply
#100
Sandbo
Just so people know, the LLM community will grab this card like it's free
Posted on Reply