Thursday, September 26th 2024

NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

Thanks to the renowned NVIDIA hardware leaker kopite7kimi on X, we are getting information about the final configurations of the first wave of NVIDIA's upcoming GeForce RTX 50 series "Blackwell" graphics cards. The two leaked GPUs are the GeForce RTX 5090 and RTX 5080, which now show a more significant gap between the xx80 and xx90 SKUs. For starters, we have the highest-end GeForce RTX 5090. NVIDIA has decided to use the GB202-300-A1 die and enable 21,760 FP32 CUDA cores on this top-end model. Accompanying the massive 170 SM GPU configuration, the RTX 5090 carries 32 GB of GDDR7 memory on a 512-bit bus, with the GDDR7 running at 28 Gbps. This translates to 1,792 GB/s of memory bandwidth. All of this is confined to a 600 W TGP.
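As a sanity check on the leaked figures, peak GDDR bandwidth follows directly from bus width and per-pin data rate; a minimal sketch in Python:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(f"RTX 5090: {peak_bandwidth_gb_s(512, 28):.0f} GB/s")  # 1792 GB/s
print(f"RTX 5080: {peak_bandwidth_gb_s(256, 28):.0f} GB/s")  # 896 GB/s
```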

When it comes to the GeForce RTX 5080, NVIDIA has decided to further separate its xx80 and xx90 SKUs. The RTX 5080 has 10,752 FP32 CUDA cores paired with 16 GB of GDDR7 memory on a 256-bit bus. With the GDDR7 also running at 28 Gbps, the memory bandwidth is halved as well, at 896 GB/s. This SKU uses a GB203-400-A1 die, which is designed to run within a 400 W TGP power envelope. For reference, the RTX 4090 has 68% more CUDA cores than the RTX 4080. The rumored RTX 5090 has around 102% more CUDA cores than the rumored RTX 5080, which means that NVIDIA is separating its top SKUs even further. We are curious to see at what price points NVIDIA places its upcoming GPUs, so that we can compare generational updates as well as the widened gap between the xx80 and xx90 models.
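The gap percentages above come straight from the core counts; a quick worked check (the Ada figures are official specs, the Blackwell figures are the leaked ones):

```python
# Core-count gaps between the xx80 and xx90 SKUs, per the figures above.
cores = {
    "RTX 4090": 16384, "RTX 4080": 9728,   # Ada Lovelace, official
    "RTX 5090": 21760, "RTX 5080": 10752,  # Blackwell, leaked
}

print(f"Ada gap:       {cores['RTX 4090'] / cores['RTX 4080'] - 1:.0%}")  # 68%
print(f"Blackwell gap: {cores['RTX 5090'] / cores['RTX 5080'] - 1:.0%}")  # 102%
```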
Sources: kopite7kimi (RTX 5090), kopite7kimi (RTX 5080)

181 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

#101
Bwaze
SandboJust so people know, the LLM community will grab this card like it's free
Even if it's $4000 or more?

I hope I'm wrong, but we might be underestimating how much Nvidia doesn't need gaming anymore.
Posted on Reply
#102
Solid State Brain
Most amateurs who are serious about LLMs will probably keep purchasing multiple used RTX 3090s if the 5090 is $4000 or more. That would be power-inefficient and cumbersome, but relatively affordable. $4000 would also be too close to the RTX A6000 (Ampere), which has a 2-slot form factor, 48 GB of VRAM, a single PCIe power input, and a 300 W TDP, although at the same power level the 5090 will probably have more than twice the performance.
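For a rough sense of why 24 GB cards get stacked for local LLM work: the weights alone set a VRAM floor. A back-of-the-envelope sketch, where the model sizes and quantization levels are illustrative assumptions, not benchmarks:

```python
# Back-of-the-envelope VRAM floor for local LLM inference: the weights alone
# must fit in VRAM. Figures below are illustrative, not measured.

def weights_vram_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate GB needed for weights only; KV cache and activations add more."""
    return params_billions * bits_per_param / 8  # 1e9 params * (bits/8) bytes ~ GB

for name, params in [("13B model", 13), ("70B model", 70)]:
    print(f"{name}: ~{weights_vram_gb(params, 16):.0f} GB at FP16, "
          f"~{weights_vram_gb(params, 4):.0f} GB at 4-bit")
# A 70B model at FP16 (~140 GB) needs six 24 GB RTX 3090s; at 4-bit (~35 GB),
# two 3090s or a single 48 GB RTX A6000 will do.
```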
Posted on Reply
#103
SIGSEGV
KritLook at Steam GPU market share: the RTX 4060 Ti is doing great for those crap specs and performance. It's not about Nvidia, it's all about the people who are buying them!
Lmao. That's indeed ironic. They have crap specs and performance, but somehow these products act like strong magnets that keep people buying. Lmao.
Posted on Reply
#104
tfdsaf
As I've posted in my thread, the RTX 5090 is going to cost $2500 or more, while the 5080 is going to be around $1500, depending on what Nvidia thinks it can get away with. If they do sense an uprising over their absurdly expensive pricing, they might drop the 5080 to $1300 or so, but again, it's going to have to be a huge uprising, like with the "4080 12GB", which Nvidia later renamed the 4070 Ti.

If you want sane prices you must boycott the next generation of Nvidia GPUs, otherwise we are headed into a future where GPUs are a luxury only for the super rich and the rest of us are F'ed over.
Posted on Reply
#105
Prima.Vera
dyonoctisI feel like adding cars to the list is a bit much :D ...

But credit in the EU/related places also seems to work differently than in the US. x3 or x4 payment plans require an authorization for the full price of the item, credit cards have fairly low limits, and bank loans involve a ton of background checks
That's correct. I meant that people change their cars every 3 or 5 years, even if the cars run perfectly fine. Naturally, the comparison with video cards is off, since each generation actually brings at least a 50% improvement.
Regarding credit cards, I don't own one, but I can split the payment if I want with a debit one. Which I don't...
Mr. PerfectAnd an 8GB 5060 incoming... :kookoo:
Perfect for your 1024x600 resolution, using DLSS Ultra Performance!
Hey, pull out that CRT monitor and you'd have the best gameplay experience thanks to 0 ms display lag.
Win-Win :)
Posted on Reply
#106
pk67
TheinsanegamerNNot sure if this is a troll or just uninformed, but either way, no, it would not be more *eco*. 600W is 600W. Pushing it over 24V doesn't change anything. You may be thinking of heat, like with EV batteries, but that's not how GPUs work.

Oh sure, just introduce a new voltage into a system, what could POSSIBLY go wrong?

Except now you need to isolate the VRMs on the GPU so 24V doesn't find its way onto the 12V rail; right now they use a common 12V ground. That will make GPUs more expensive. And now you need a new connector, because otherwise you're gonna have people plugging 24V cables into 12V cards and causing some fireworks. Not to mention your PSUs now get more expensive because, well, you still need the 12V lines.

All this for what? To have some slightly cooler running cables? Congrats. There's a reason we never swapped out the 5V and 3.3V lines for 12V in the ATX standard... the juice ain't worth the squeeze.
Clearly you have no idea what you are talking about. 24V changes a lot in terms of efficiency.
The power supply is less susceptible to sagging, and overall efficiency is higher because a lot less energy is wasted in the power rail.
You don't have to isolate anything if only one rail is connected to the power section of the GPU.
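For what it's worth, the cable-loss half of this argument is plain Ohm's-law arithmetic; a minimal sketch, where the 10 mΩ round-trip cable resistance is an illustrative assumption:

```python
# Cable loss at a fixed power draw: I = P / V, P_loss = I^2 * R.
# Doubling the rail voltage halves the current and quarters the cable loss.
# The 10 milliohm round-trip cable resistance is an illustrative assumption.

CABLE_R_OHM = 0.010

for rail_v in (12, 24):
    current_a = 600 / rail_v                # amps drawn at a 600 W load
    loss_w = current_a ** 2 * CABLE_R_OHM   # watts dissipated in the cable
    print(f"{rail_v} V rail: {current_a:.0f} A, ~{loss_w:.1f} W lost in cables")
# 12 V: 50 A, ~25 W; 24 V: 25 A, ~6.2 W. Cooler cables, though the connector
# and VRM compatibility objections above are a separate question.
```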
Posted on Reply
#107
evernessince
x4it3nRDNA 5 could be a real competitor since it will use a real MCM design with up to 3 chips on it, and it's aimed at high-end performance!
RDNA 4's high-end chips were having issues, so they cancelled all of them and are going to sell mainstream GPUs instead to make more money, but RDNA 5 is supposed to fix all those issues and be a real competitor to the RTX 5090.
AMD might take the raster performance crown with a multi-GCD GPU, but I don't think that alone will inherently make them competitive. The problem for AMD is their software features and pricing.

AMD still needs to:

- Match CUDA. ROCm was a good step forward, but it has a long way to go to match CUDA's marketshare in the professional and enterprise space (where it's likely close to 100%)
- Improve their encoder's performance and quality to match Nvidia's
- Improve their upscaling to match DLSS
- Improve their VR support
- Ensure AI performance is good and that implementations in existing software are optimized. Currently AMD cards get about 60% of the inference performance of Nvidia cards when comparing an AMD and an Nvidia card of equal raster performance.

This would just be to catch up. I still don't think they could charge Nvidia prices if they did all of the above, because to do that they would have to start making innovative features of their own, which frankly has not happened under Lisa Su. This also assumes Nvidia doesn't announce any new software features with the 5000 series. Again, AMD may fall further behind.

AMD really needs to compete on pricing until it fixes all of the above. AMD likely retreated from the high end to focus on enterprise because those barriers matter less there. AMD doesn't have to worry much about CUDA there because an enterprise customer looking at buying thousands of GPUs is likely tailoring its software solution to the hardware.
Posted on Reply
#108
64K
tfdsafAs I've posted in my thread, the RTX 5090 is going to cost $2500 or more, while the 5080 is going to be around $1500, depending on what Nvidia thinks it can get away with. If they do sense an uprising over their absurdly expensive pricing, they might drop the 5080 to $1300 or so, but again, it's going to have to be a huge uprising, like with the "4080 12GB", which Nvidia later renamed the 4070 Ti.

If you want sane prices you must boycott the next generation of Nvidia GPUs, otherwise we are headed into a future where GPUs are a luxury only for the super rich and the rest of us are F'ed over.
Good lord. Stop with the hysteria. There will be no huge uprising against Nvidia. The days when PC gaming will only be affordable to the Warren Buffetts of the world are not coming either. PC gaming has gotten more expensive, even factoring in inflation, but it's not time for panic. We've weathered several mining crazes and a pandemic that bred scalpers causing absurd prices, and we will survive the AI craze as well. Nvidia isn't going to price themselves out of the gaming market. Huang is not a damn fool. We are talking billions and billions of dollars for them. Yes, the AI market is many times bigger, but it's not the entire market. AMD will still be making entry-level and midrange GPUs, and there's even a possibility that Intel will survive in the dGPU market. Software will continue to improve. You don't have to buy a 5080 or 5090 to still have a great experience with PC gaming unless you are one of the 4% gaming at 4K, but the other 96% will be fine.

Hell, even the argument that dGPU prices are driving the masses to consoles is questionable. From what I've heard, the PS5 Pro is $700, and the next-gen consoles may be even more expensive. Gaming is getting more expensive across the board.
Posted on Reply
#109
chrcoluk
I had a hunch the 5000 series would have no VRAM bump below the xx90 and would also bump up power draw, which is a reason I jumped on the discounted 4080 Super.
Posted on Reply
#110
dyonoctis
Prima.VeraThat's correct. I meant that people change their cars every 3 or 5 years, even if the cars run perfectly fine. Naturally, the comparison with video cards is off, since each generation actually brings at least a 50% improvement.
Regarding credit cards, I don't own one, but I can split the payment if I want with a debit one. Which I don't...
Ah yes. That became so mainstream that car dealerships in the EU made long-duration rental an option. You rent a car for 3 or 4 years, and at the end you give the car back to the dealer and rent a new one. You might eventually pay the full price of the car, but you never get ownership. I've noticed that there are a lot of Mercedes on the used market but few Toyotas, as if the latter's owners tend to hold on to their cars... :D

It's something that's creeping into the tech industry as well (you can rent your phone, console, or gaming PC), but as you said, at least you get something that's actually better (most of the time).
Posted on Reply
#111
pk67
64KHell, even the argument that dGPU prices are driving the masses to consoles is questionable. From what I've heard, the PS5 Pro is $700, and the next-gen consoles may be even more expensive. Gaming is getting more expensive across the board.
I would say yes, if you mean extreme gaming. Older titles at lower resolutions are cheaper to play, and at lower wattage. But most folks are fixated on the most demanding titles and engines.

edit
In the long run, today's newest titles will become cheaper to play too, but folks are not patient enough to wait 6-8 more years for future hardware generations.
Posted on Reply
#112
x4it3n
BwazePeople here are still expecting this to be a gaming card release.

Nope. No way. Nvidia didn't become the hottest stock on the market by selling gaming paraphernalia.

I expect Nvidia to market the RTX 5090, and to a lesser extent the RTX 5080, to the home AI acceleration crowd. Data center revenue is up more than 150% compared to last year and now represents the vast majority of Nvidia's income, thanks to the AI craze. "Gaming" cards will no longer be just "gaming", as is clear from the change of the RTX logo, which now carries the tagline "Powering Advanced AI".

And I don't believe they included "Powering Advanced AI" just because DLSS is technically what we already call AI (machine-learned upscaling), or because you could accelerate upcoming NPCs with ChatGPT-like conversational abilities.

These cards will be offered as "cheap" accelerators for smaller neural network servers and as tools for artists and content creators.

And their prices will reflect that they are tools for generating income, not playthings. I expect a 100% price hike, just like in the good old crypto-madness days.

But don't despair: lower-end RTX 40x0 cards will still be offered alongside Blackwell for all those who still like to play with their computers.
A 170 SM GPU and a 512-bit bus seem too good to be true for consumers, but I believe they will do it. It's not the full chip (192 SM), and GDDR7 chips can already go up to 40 Gbps... and Nvidia can definitely go for HBM3e for professionals too!
If they don't do it now, let's hope AMD will have a 9900 XTX that can compete with the RTX 5090/Ti, for competition's sake!
dyonoctisAh yes. That became so mainstream that car dealerships in the EU made long-duration rental an option. You rent a car for 3 or 4 years, and at the end you give the car back to the dealer and rent a new one. You might eventually pay the full price of the car, but you never get ownership. I've noticed that there are a lot of Mercedes on the used market but few Toyotas, as if the latter's owners tend to hold on to their cars... :D

It's something that's creeping into the tech industry as well (you can rent your phone, console, or gaming PC), but as you said, at least you get something that's actually better (most of the time).
The perk of buying (even if it's more expensive) is that you own it and are free to sell it to buy a new one, whereas with leasing you always pay for something you'll never own and never get any money back...
Posted on Reply
#113
Ruru
S.T.A.R.S.
potsdaman70 series again with 12GB :mad::rolleyes:
Hasn't the 4070 been the only 70-series SKU so far with a 192-bit bus?

670 was 256-bit, 2GB or 4GB
770 was a 680 rebrand
970 was "256-bit", "4GB"
1070 was 256-bit, 8GB
2070 was 256-bit, 8GB
3070 was 256-bit, 8GB
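For context, the bus width pins down the sensible VRAM options, since each GDDR chip hangs off a 32-bit channel; a quick sketch, assuming today's common 2 GB chip density and ignoring clamshell configurations:

```python
# Each GDDR chip occupies a 32-bit channel, so bus width fixes the chip count
# and (for a given per-chip density) the VRAM size. Assumes the common 2 GB
# GDDR6/GDDR7 chips; clamshell (two chips per channel) would double it.

CHIP_GB = 2

for bus_bits in (128, 192, 256, 384, 512):
    chips = bus_bits // 32
    print(f"{bus_bits:3d}-bit -> {chips} chips -> {chips * CHIP_GB} GB")
# 192-bit -> 6 chips -> 12 GB: exactly why a 192-bit xx70 ends up with 12 GB
# rather than the 16 GB a 256-bit card gets.
```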
Posted on Reply
#114
Krit
RuruHasn't the 4070 been the only 70-series SKU so far with a 192-bit bus?

670 was 256-bit, 2GB or 4GB
770 was a 680 rebrand
970 was "256-bit", "4GB"
1070 was 256-bit, 8GB
2070 was 256-bit, 8GB
3070 was 256-bit, 8GB
Looks like it's clear now: the RTX 4070 is a midrange GPU, not mid/high-end like it was before. Cost more, get less! That's the pride of Nvidia and their loyal slaves....

The GTX 470 was a true high-end 320-bit GPU, the same as the 448-bit GTX 260.

Even the stupid RTX 4070 Ti for $800 has GTX 1060 ($250) class specs.... People are buying them like hotcakes. :kookoo: When it's a $500 GPU at best.
Posted on Reply
#115
64K
RuruHasn't the 4070 been the only 70-series SKU so far with a 192-bit bus?

670 was 256-bit, 2GB or 4GB
770 was a 680 rebrand
970 was "256-bit", "4GB"
1070 was 256-bit, 8GB
2070 was 256-bit, 8GB
3070 was 256-bit, 8GB
It's because the 4070 wasn't really a xx70-class GPU. It wasn't a xx60-class GPU either going by memory bus width, but it was closer to a xx60; really it was a lower-midrange GPU and not a solid midrange xx70. The 4060 wasn't really a xx60-class GPU either; it was really an entry-level xx50-class GPU. Nvidia being deceptive with naming yet again. The first one that I can recall is the GTX 680 back in 2012; up until then the '8' signified high end, but the 680 was an upper-midrange Kepler GPU for a high-end price, and it sold really well, so here we are today with the naming BS. The real high-end Keplers were the GTX 780 and the 780 Ti. You gotta watch Huang. He's a tricky one, using naming to charge more from uninformed buyers. Well, it's actually the AIBs that charge more, but Nvidia sells them the GPUs, so they make more too.
Posted on Reply
#116
mechtech
If it's over $300 CAD it's out of my budget and I don't care lol
Posted on Reply
#117
RandallFlagg
mechtechIf it's over $300 CAD it's out of my budget and I don't care lol
Pretty much the same, maybe $350. For what these cards cost, I could buy two decent gaming laptops.
Posted on Reply
#118
Beginner Macro Device
64Kbut the 680 was an upper-midrange Kepler GPU for a high-end price, and it sold really well
Which couldn't have been the case if, say, the HD 7870 had sold for the same ~$300 but never trailed the GTX 680 in terms of performance, and the HD 7970 had basically murdered everything NV could offer. It all comes down to competition, or the lack thereof.
Posted on Reply
#119
x4it3n
KritLooks like it's clear now: the RTX 4070 is a midrange GPU, not mid/high-end like it was before. Cost more, get less! That's the pride of Nvidia and their loyal slaves....

The GTX 470 was a true high-end 320-bit GPU, the same as the 448-bit GTX 260.

Even the stupid RTX 4070 Ti for $800 has GTX 1060 ($250) class specs.... People are buying them like hotcakes. :kookoo: When it's a $500 GPU at best.
Agreed.
x50 cards have pretty much disappeared; Nvidia doesn't even display the desktop RTX 4050 in its RTX 40 lineup on its website lol
x60 GPUs are the new low-end
x70 GPUs are the new mainstream
x80 GPUs are the new x70 & x70 Ti, aka high-end
x90 GPUs are the new x80 Ti, or enthusiast
x90 Ti is pretty much a TITAN without the 2x VRAM increase
KritHe needs more new leather jackets, that's for sure!
Every 2 years he needs a new one because the old one has been improved by A.I., so he needs the new version! He's betting a lot on Blackleather!
Posted on Reply
#120
QUANTUMPHYSICS
How powerful a PSU will I need?

Will 1000W be enough for the 5090 and a Z890 board with a 15900K?
Posted on Reply
#121
Ruru
S.T.A.R.S.
KritEven the stupid RTX 4070 Ti for $800 has GTX 1060 ($250) class specs.... People are buying them like hotcakes. :kookoo: When it's a $500 GPU at best.
The funniest thing was when it was going to be released as the "RTX 4080 12GB" first. :D
x4it3nAgreed.

x90 GPUs are the new x80 Ti, or enthusiast
x90 Ti is pretty much a TITAN without the 2x VRAM increase
The x90 is the Titan and the x90 Ti is the Titan Black. ;) Remember, the first Titan didn't even come with the full die; hell, even the 780 Ti had the full die (but only 3GB of VRAM).

Though they did the same milking with the Titan X (Pascal) and the Titan Xp.
QUANTUMPHYSICSHow powerful a PSU will I need?

Will 1000W be enough for the 5090 and a Z890 board with a 15900K?
How fortunate that Seasonic just released a new 2200W unit. :rolleyes:
Posted on Reply
#122
pk67
QUANTUMPHYSICSHow powerful a PSU will I need?

Will 1000W be enough for the 5090 and a Z890 board with a 15900K?
If it had a 24V output rail, 1200W would be enough. But with the standard 12V rail, I guess 1500-1600W is a safe minimum, and a pair of thick cables, of course.
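For a rough idea of where such estimates come from, a back-of-the-envelope PSU sizing sketch; all component draws and the headroom factor are illustrative assumptions:

```python
# Rough PSU sizing: sum the worst-case component draws, then add headroom for
# transient spikes. All figures below are illustrative assumptions.

loads_w = {
    "GPU (rumored RTX 5090 TGP)": 600,
    "CPU (high-end desktop, raised power limits)": 250,
    "Board, RAM, drives, fans": 100,
}

sustained = sum(loads_w.values())
headroom = 1.4  # margin for transients; also keeps the PSU near its efficiency sweet spot
print(f"Sustained: ~{sustained} W -> suggested PSU: ~{sustained * headroom:.0f} W")
# ~950 W sustained -> ~1330 W suggested, roughly where this thread's
# 1200-1600 W guesses land.
```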
Posted on Reply
#123
Prima.Vera
Yes, just like the previous-gen card, this 5080 will be a complete disgrace if it's priced above $750.
Which it most likely will be....
Posted on Reply
#124
igormp
pk67If it had a 24V output rail, 1200W would be enough. But with the standard 12V rail, I guess 1500-1600W is a safe minimum, and a pair of thick cables, of course.
You should just accept that 24V won't become a thing in PCs anytime soon lol
Posted on Reply
#125
Ruru
S.T.A.R.S.
igormpYou should just accept that 24V won't become a thing in PCs anytime soon lol
Yeah, at least in the consumer market. Servers may be a different thing (I have no idea whether they already use it).
Posted on Reply