
NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Surface, Showing Larger SKU Segmentation

24 V would require an entirely different motherboard and PSU design. Not happening. Edit: also, the 3090 Ti already did 600 W on 12 V. It worked fine.
I didn't say it can't work fine at 12 V, but 2025 is just around the corner now and it would be more efficient at the 24 V level with the same power cables.
BTW
Two times no - you don't need a different mobo or a new PSU design - just minor changes in the PSU, that's all.
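For what it's worth, the arithmetic behind the 24 V argument is easy to check (a sketch: the 600 W draw and the cable resistance are assumed round numbers, not real specs):

```python
# For a fixed power draw, current halves at 24 V, and resistive cable
# loss scales with the square of the current (P_loss = I^2 * R), so the
# same cables waste a quarter of the power at double the rail voltage.

def cable_loss(power_w: float, rail_v: float, cable_r_ohm: float) -> float:
    """I^2 * R loss in the cable for a given load and rail voltage."""
    current_a = power_w / rail_v
    return current_a ** 2 * cable_r_ohm

POWER = 600.0    # GPU draw in watts (assumed)
R_CABLE = 0.005  # total cable resistance in ohms (assumed)

loss_12v = cable_loss(POWER, 12.0, R_CABLE)  # 50 A -> 12.5 W lost
loss_24v = cable_loss(POWER, 24.0, R_CABLE)  # 25 A -> 3.125 W lost

print(f"12 V: {POWER / 12:.0f} A, {loss_12v:.2f} W cable loss")
print(f"24 V: {POWER / 24:.0f} A, {loss_24v:.2f} W cable loss")
```

Whatever the assumed resistance, the loss ratio between 12 V and 24 V is exactly 4:1 at equal power.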
 
I don't believe it but if this rumor turns out to be true, 5080 will be a complete shitshow. The RTX 4090 has 16,384 shading units, so good luck matching its rasterization performance with only 10,752 FP32 CUDA cores paired with a 256-bit memory bus. This thing will be around 20%, maybe 30% faster than the 4080(S) at best. Who would want to buy this shit if it's not priced well under a grand?
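A quick back-of-the-envelope check of that core-count claim (a sketch: clocks and per-core throughput are assumptions, and real performance depends on far more than FP32 unit counts):

```python
# Ratio of rumored 5080 FP32 units to the 4090's, and the clock uplift
# the 5080 would need just to match the 4090's raw FP32 throughput,
# assuming equal work per core per clock (a simplification).

cores_4090 = 16384  # RTX 4090 FP32 shading units (from the post)
cores_5080 = 10752  # rumored RTX 5080 FP32 units (from the post)

ratio = cores_5080 / cores_4090
print(f"5080 has {ratio:.1%} of the 4090's FP32 units")  # 65.6%

clock_factor = cores_4090 / cores_5080
print(f"needs ~{clock_factor - 1:.0%} higher clocks to match")  # ~52%
```

A ~52% clock uplift in one generation is implausible, which is why the post expects the gap to be closed only partially.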
 
I don't believe it but if this rumor turns out to be true, 5080 will be a complete shitshow. The RTX 4090 has 16,384 shading units, so good luck matching its rasterization performance with only 10,752 FP32 CUDA cores paired with a 256-bit memory bus. This thing will be around 20%, maybe 30% faster than the 4080(S) at best. Who would want to buy this shit if it's not priced well under a grand?
Don't judge too fast. If you only care about pure rasterization performance, just consider Radeon. I wouldn't say it will be a shitshow, but if it's on a 4 nm node, it won't be a revolutionary performance breaker.
 
Don't judge too fast. If you only care about pure rasterization performance, just consider Radeon. I wouldn't say it will be a shitshow, but if it's on a 4 nm node, it won't be a revolutionary performance breaker.
What Radeon? RDNA4 will compete with the 4080(S) at best. As an owner of a 4070 TiS, I have no upgrade path besides a used 4090 or a 5090 for the next 3 years :(
 
I don't believe it but if this rumor turns out to be true, 5080 will be a complete shitshow. The RTX 4090 has 16,384 shading units, so good luck matching its rasterization performance with only 10,752 FP32 CUDA cores paired with a 256-bit memory bus. This thing will be around 20%, maybe 30% faster than the 4080(S) at best. Who would want to buy this shit if it's not priced well under a grand?

Well, remember that the original 4080 was almost a shitshow until Nvidia released the real specs over the rumored ones, and the $1,200 MSRP was the biggest joke of the entire Ada stack anyway, though there were other jokes as well. I'm hoping Nvidia leaves the BS behind with Blackwell but I suspect they won't.
 
What Radeon? RDNA4 will compete with the 4080(S) at best. As an owner of a 4070 TiS, I have no upgrade path besides a used 4090 or a 5090 for the next 3 years :(
Three years for a graphics card is more than half of its life :) Long time... Why do you need to upgrade so soon?
 
What Radeon? RDNA4 will compete with the 4080(S) at best. As an owner of a 4070 TiS, I have no upgrade path besides a used 4090 or a 5090 for the next 3 years :(
Try to look at this through rose-tinted glasses. That way you'll have more time to save up for an even more significant upgrade, like a 6080 (Ti/S) :laugh:
 
Three years for a graphics card is more than half of its life :) Long time... Why do you need to upgrade so soon?
Because the 4070 TiS is hardly cutting it for flight sim VR usage, and with MSFS 2024 two months away, I was really hoping that the 5080 would at least trade blows with the 4090 and be priced around 1,200 bucks. I guess Ngreedia doesn't want us to buy its dies anymore, besides junk unusable for AI training. It's a sad state of affairs: AMD giving up on the high end, Intel's GPUs being the joke of the year(s), and Jensen not giving a F about us ordinary buyers with limited budgets.
 
I'm hoping Nvidia leaves the BS behind with Blackwell but I suspect they won't.
That's not NVIDIA's fault that people are dumb and buy them like crazy maniacs. Look at the "Next-Gen GPUs: What Matters Most to You?" voting results. Something similar is going on there. :laugh:

 
Definitely a 5080 Ti with memory/CUDA counts somewhere in between this gen lol.

Assuming these rumours are accurate, I'd be surprised to see an xx90 with the full memory bus/die; the 4090 was quite cut down. I don't think they're going to jump straight from 450 W to 600 W.

My guess: 500 W.

20,000 cores.
Ti/Ti Super/Ultra/LE/Super Duper :laugh: There will likely be another couple of SKUs in between, given such a large gap between the 5090/80.
Can see the 5080 being priced the same as the 4080 @ $1,200, and likely $2-2.5k for the "mightyyyyyyyyyy" 5090 with such specs. 1.5 TB/s of VRAM bandwidth, though o_O
AMD better pull their finger out and give me a worthy upgrade to my 6800 for $300. It's almost 4 years since its release and there is nothing without spending $500+ for a meaningful upgrade. Stagnant :mad:
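That 1.5 TB/s figure is plausible from the rumored specs (a sketch: the 512-bit bus and 24 Gbps GDDR7 per-pin rate are rumored/assumed numbers, not confirmed):

```python
# Peak memory bandwidth in GB/s: bus width in bytes times the
# per-pin data rate in Gbps.

def mem_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(mem_bandwidth_gbs(512, 24.0))  # 1536.0 GB/s, i.e. ~1.5 TB/s
print(mem_bandwidth_gbs(256, 24.0))  # 768.0 GB/s for a 256-bit 5080
```

Note the rumored 256-bit 5080 would land at exactly half the 5090's bandwidth at the same memory speed.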
 
I didn't say it can't work fine at 12 V, but 2025 is just around the corner now and it would be more efficient at the 24 V level with the same power cables.
BTW
Two times no - you don't need a different mobo or a new PSU design - just minor changes in the PSU, that's all.
Here we go again. I'll suggest 48 V for the 5090, 24 V for the 5080, and 12 V for the rest. Of course the 5090 and 5080 will also support 12 V. It's so simple to add a new power rail to PSUs; it's just a new standard that will take a minimum of 10 years to reach the market, just in time for the RTX 5000 launch, so PK67 won't hear electricity noises from his fanless low-end GPU :roll:
 
Because the 4070 TiS is hardly cutting it for flight sim VR usage, and with MSFS 2024 two months away, I was really hoping that the 5080 would at least trade blows with the 4090 and be priced around 1,200 bucks. I guess Ngreedia doesn't want us to buy its dies anymore, besides junk unusable for AI training. It's a sad state of affairs: AMD giving up on the high end, Intel's GPUs being the joke of the year(s), and Jensen not giving a F about us ordinary buyers with limited budgets.
If you're not satisfied with the performance of your super duper GPU, you can always try Ren'Py games or just level up your Python skills.
 
And why is the power gap so small relative to the shader gap between these two SKUs?
Take AMD, for example. RX 6700 XT: 40 CUs; RX 6800: 60 CUs. Wattage? Almost the same. Why? The RX 6800 runs much lower clocks.
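The clocks-vs-width trade-off above can be sketched with the usual dynamic-power rule of thumb (all numbers here are illustrative assumptions, not measured values):

```python
# Dynamic power scales roughly as units * V^2 * f, and lowering the
# clock usually allows lowering the voltage too, so a wide, low-clocked
# GPU can land near the wattage of a narrower, high-clocked one.

def rel_dynamic_power(units: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power, P ~ N * V^2 * f (arbitrary units)."""
    return units * volts ** 2 * freq_ghz

narrow = rel_dynamic_power(units=40, volts=1.05, freq_ghz=2.5)  # 6700 XT-like
wide = rel_dynamic_power(units=60, volts=0.90, freq_ghz=2.0)    # 6800-like

print(f"narrow/high-clock: {narrow:.1f}")
print(f"wide/low-clock:    {wide:.1f}")  # similar despite 50% more CUs
```

With these assumed voltages and clocks, the two configurations land within roughly 15% of each other despite the 50% difference in unit count.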

If this leak is correct, then it's almost certain the RTX 5080 is indeed an overclocked 4080. Which is foul. Especially if for >$1,000 we get a <50% performance uplift (which is a given, thus it's foul).

I don't believe NVIDIA will stop shitposting, but a GPU like the 4070 TiS for <$500 and with >=16 GB VRAM would be much appreciated. Almost zero chance it happens before the RTX 7000 series, but wishing doesn't hurt.

Don't judge too fast.
He still has a point. GPU prices are horribly high, and there's no way they become reasonable. Anti-monopoly departments and other organisations will have a hard time proving NV guilty before the 2020s end. And even if they do, there's still no good answer. No good solution.
 
I don't believe it but if this rumor turns out to be true, 5080 will be a complete shitshow.
You'd better believe it; they literally tried the same strategy this generation. Originally the 4070 Ti was supposed to be a "4080", so why is this so hard to believe lol.
 
He still has a point. GPU prices are horribly high, and there's no way they become reasonable. Anti-monopoly departments and other organisations will have a hard time proving NV guilty before the 2020s end. And even if they do, there's still no good answer. No good solution.
Yes and no. It depends on whether you're patient or not. What makes prices horribly high is progress at an ultra-fast pace (despite Jensen trying to fool us, Moore's Law is dead, so progress must be dead too). A short practical lifespan makes these prices higher. After 4-5 years, a super duper GPU is an obsolete brick, even if it's still in brand-new condition.
But ultra-fast progress is what we want, isn't it? Just be more patient and you can find a used GPU at an affordable price, I'm sure.
 
There isn't a need to release a 5000 series as they're already the market leader. This shows me they're going to release it because they want to sell 4090 owners another $2,000+ GPU.

If the rumour about the power is true, it also says to me that whatever they've got isn't ready for prime time if it requires another jacking up of the power.
 
Knowing Nvidia, that's a whole 17 product classes' worth of space between the 5080 and 5090. It makes me wonder: why 5090? Why not just call it a Titan or something?
 
Knowing Nvidia, that's a whole 17 product classes' worth of space between the 5080 and 5090. It makes me wonder: why 5090? Why not just call it a Titan or something?
If it's a 4 nm node, it would be a Titan on jelly legs :laugh:
 
I wonder how many people will have so much disposable income to buy one of these.
 
There isn't a need to release a 5000 series as they're already the market leader. This shows me they're going to release it because they want to sell 4090 owners another $2,000+ GPU.

There is a big reason to release, though, besides just wanting to sell new GPUs. AMD will be releasing their next gen around the same time, and Nvidia detests being outdone; they know that if they aren't first, it means potential lost sales, even if it's not a whole lot.

I wonder how many people will have so much disposable income to buy one of these.

For the 5090, not a lot. Probably less than 1% of gamers, as usual with the xx90 GPUs. I have no idea what the appeal will be for prosumers, though. The AI craze is still a thing.
 
There is a big reason to release, though, besides just wanting to sell new GPUs. AMD will be releasing their next gen around the same time, and Nvidia detests being outdone; they know that if they aren't first, it means potential lost sales, even if it's not a whole lot.
AMD won't be competing with these, though.
 
Just be more patient and you can find a used GPU at an affordable price, I'm sure.
Previously, it took two generations for a top-tier GPU to fall so low in price that it was hard to justify the effort of selling it (a GTX 680, for example, cost about 100 dollars in 2018 when it turned six years old). Today, same-tier old 2080 Tis still sell for more than $250. Ampere cards are also still expensive. Ada Lovelace GPUs, even used, are horribly expensive. With upcoming GPUs being even more expensive, I'll have to abstain from upgrades till my very deathbed for good upgrades to become available on acceptable terms.

Progress is currently excruciatingly slow: previously, $300 GPUs made fun of $600 GPUs of the last gen (GTX 1060 vs GTX 980; HD 7870 vs HD 6970). Today, $300 GPUs match or only slightly beat yesterday's $300 GPUs. You can account for inflation and DLSS all you please; it doesn't change the fact that we need a full-scale price war, but we can't get one because there's literally not a single company to unload grass for NVIDIA to touch. Notice I'm not even saying "enough grass."

The 5080 will be a very expensive piece of rubbish. The 5090 will be even more out of reach. The 5070 downwards will be just a smidge better than their predecessors and won't cost noticeably less. We'll probably see some good news at the lowest end, where Intel and AMD still have their say, but anything more than casual gaming will be a millionaires' thing for a long while.
 
AMD won't be competing with these, though.

Not with the 5090, as they've already said, but with the 5080 I think they will. Why wouldn't they? AMD had a competitor for the 1080, 2080, 3080, and 4080. When AMD says high end, I think they're referring to the xx90 GPUs, which a lot of people call high end but some call enthusiast.
 
Nvidia can do near enough whatever they like and still sell the cards.
 
Just look at the transistor counts we're at these days. Gaining significantly more performance costs a lot. It's an exponential growth curve, where each generation's extra 15% of cores gets steeper to pay for. The only other way to gain is more clock speed, but again, more clocks carry the same kind of exponential penalty as more cores. It's why new products are slowing down. New nodes only fix this so much, and I get the feeling they can't keep pace with the demands of "generational" performance gains.
 