
NVIDIA GeForce RTX 5080 SUPER Could Feature 24 GB Memory, Increased Power Limits

That prehistoric 5090 is more efficient than the 9070 XT. In fact, even limited to the same power, it will still be a ton faster. Please, stop this anti-Nvidia madness.
The RTX 5090 is not as efficient as other RTX 50 series GPUs; even some (almost all) RTX 40 series GPUs have better efficiency. Like the RX 9070 XT, the RTX 5090 doesn't scale that well with power consumption because both are highly overclocked GPUs out of the box. The efficiency sweet spot is not optimal, and an undervolt is a big must-have.

Energy Efficiency calculations are based on measurements using Cyberpunk 2077, Stalker 2 and Spider-Man 2. We record power draw and FPS rate to calculate the energy efficiency of the graphics card as it operates.
[energy efficiency chart]


Even with the RTX 5080 being only 15% faster than the RTX 4080, it's still a way better deal than an RTX 5090.

The RTX 5090 is 52.5% faster at 4K but costs 140% more; in comparison it's a terrible deal as a gaming GPU, or the difference between "worst" and "the worst". :laugh:
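Just to show the arithmetic behind that, a rough Python sketch; the 52.5% and 140% figures are the ones quoted in this thread, while the board-power numbers (~575 W vs. ~360 W) are assumptions of mine, not review data:

```python
# Rough value/efficiency arithmetic for the 5090 vs. 5080 comparison above.
# Performance and price deltas are the ones quoted in this thread; board power is assumed.
cards = {
    "RTX 5080": {"perf": 100.0, "watts": 360, "price": 1.0},   # baseline
    "RTX 5090": {"perf": 152.5, "watts": 575, "price": 2.4},   # "52.5% faster", "costs 140% more"
}

for name, c in cards.items():
    perf_per_watt = c["perf"] / c["watts"]     # relative performance per watt
    perf_per_price = c["perf"] / c["price"]    # relative performance per price unit
    print(f"{name}: {perf_per_watt:.2f} perf/W, {perf_per_price:.0f} perf per price unit")
```

On those assumed numbers the two cards land close together on perf/W, but the 5090 drops to roughly 64 versus 100 on perf per price unit, which is where the "terrible deal" framing comes from.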
 
The RTX 5090 is not as efficient as other RTX 50 series GPUs; even some (almost all) RTX 40 series GPUs have better efficiency. Like the RX 9070 XT, the RTX 5090 doesn't scale that well with power consumption because both are highly overclocked GPUs out of the box. The efficiency sweet spot is not optimal, and an undervolt is a big must-have.
Still more efficient than anything the competition has. Yet you called it prehistoric, lol.

Nothing has better efficiency than the 5090. Limit it to 400 W and off you go; you top both the efficiency and the performance charts at the same time. Truly prehistoric card...
 
Yet you called it prehistoric, lol.
In 2025, 600 W is not acceptable to me. Can't you see that technology is going backwards, not forward? It's a downgrade from the RTX 4090 (450 W TDP) and the RTX 3090 (350 W TDP).

It's literally manhandled by prehistoric wisdom in a very dumb, primitive way: just raise the power slider to the max.
 
In 2025, 600 W is not acceptable to me. Can't you see that technology is going backwards, not forward? It's a downgrade from the RTX 4090 (450 W TDP) and the RTX 3090 (350 W TDP).
What stops you from... uhm, just limiting the card to 400 W? There you go, problem solved.
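If anyone wants to do that from a script rather than from Afterburner, a minimal sketch, assuming nvidia-smi is on the PATH and the shell has admin rights (the 400 W target and GPU index 0 are just placeholders):

```python
# Minimal sketch: cap GPU 0 to a 400 W board power limit via nvidia-smi.
# Needs administrator/root rights, and the driver only accepts values
# inside the range allowed by the card's BIOS.
import subprocess

subprocess.run(["nvidia-smi", "-i", "0", "-pl", "400"], check=True)
```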
 
What stops you from... uhm, just limiting the card to 400 W? There you go, problem solved.
This is actually a very good idea! You don't lose that many FPS if you undervolt at the same time. I've been running my 5080 at 250-300 W lately instead of 360.
 
This is actually a very good idea! You don't lose that many FPS if you undervolt at the same time. I've been running my 5080 at 250-300 W lately instead of 360.
Yeah, I'm running my 4090 at 320 W. Add in an undervolt and some memory OC for better performance.
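For what it's worth, a quick way to check that the card actually stays near the cap while gaming; a rough sketch assuming nvidia-smi is available (the one-minute window and GPU index 0 are arbitrary choices):

```python
# Sketch: poll GPU 0's board power once a second for a minute, then report average and peak.
import subprocess
import time

samples = []
for _ in range(60):
    out = subprocess.run(
        ["nvidia-smi", "-i", "0",
         "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    samples.append(float(out.stdout.strip()))
    time.sleep(1)

print(f"avg: {sum(samples) / len(samples):.0f} W, peak: {max(samples):.0f} W")
```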
 
Contract pricing is typically considerably higher than spot prices. You pay a premium for guaranteed supply of a specific SKU that has specific electrical characteristics. Consistency in price and product is the key. Contract pricing also includes engineering support and warranty, which the spot market doesn’t provide.
Makes complete and total sense, but with the current prices of GPUs, aren't they making it back a thousandfold already (I'm exaggerating with the thousandfold, but you get it)?
 
Check a TPU review from the 2025 test suite; it's the 3080 that's consistently ahead too. Nothing big, but it's consistent.
Yes, this is all within the margin of error at 1080p/1440p, usually coming from the selection of games in specific reviews. Nothing to lose one's head about.
The extra VRAM is nice, and unusual for Nvidia to be this generous.
They have been consistently slammed by the tech community for being stingy on VRAM. Even if we assume that Nvidia does not care about that, the next question is why offer more VRAM at all if the majority of the GPU market has already swallowed their cards with less VRAM? A dominant player has nobody left to impress, right?
 
A dominant player has nobody left to impress, right?
Owners of previous-gen cards. Say someone is rocking a 4070 Ti Super 16 GB, for example; the upgrade to a 5080 might be more tolerable when it comes with increased VRAM as well.
 
Owners of previous-gen cards. Say someone is rocking a 4070 Ti Super 16 GB, for example; the upgrade to a 5080 might be more tolerable when it comes with increased VRAM as well.
I doubt that; perhaps a few thousand purchases, since by then 4070 Ti owners will be close to the 6000-series launch, on a far better node and a new architecture.
 
Makes complete and total sense, but with the current prices of GPUs, aren't they making it back a thousandfold already (I'm exaggerating with the thousandfold, but you get it)?
Well yes, but don't think the cost to deliver a chip to you as part of a finished product is anywhere near what commodity memory costs.

In general, by the end of the assembly line, the component cost has doubled.
 
How about the Zuper Titanium or Ti variant?
 
While the extra VRAM is much appreciated, we also want faster cards for cheaper.

Always something with Nvidia. You either get this or that. Why can't we have both?

I guess they don't really need to try very hard as AMD keeps shooting themselves in the foot as well.
Can’t make it faster because it’s already using the maxed out GB203 die. They can’t add more cores without putting it on GB202, which is reserved for the 5090.
 
The performance of the 5080 is not good enough for the money they ask. There's too much of a performance difference between the 5080 and the 5090 this gen. The 5080 should have come with 24 GB in the first place, and it's not worth investing in one unless you have money to burn. There is absolutely no longevity in a 16 GB card in 2025, and as soon as the next-gen consoles get released, 16 GB cards will not be enough.

6-8GB - Budget basement cards for 1080p and old and retro gaming with obsolete grade consumer AI
12GB - Low end 1440p gaming and very low end consumer grade AI
16GB - Midrange 1440p+ gaming and low end consumer grade AI
24GB - High end 4K gaming, with texture mods, mid range consumer grade AI
32GB - Enthusiast 4K+ gaming with texture mods, high end consumer grade AI
48-64GB - Enthusiast 5K+ gaming with texture mods and very high end consumer grade AI

Haha, youthful idealism. Let me guess, you want a 64 GB gaming card to run on that crusty socket AM4 PC with 16GB of DDR4... at least it has a 5800X3D! :laugh:

The RTX 5090 is not as efficient as other RTX 50 series GPUs; even some (almost all) RTX 40 series GPUs have better efficiency. Like the RX 9070 XT, the RTX 5090 doesn't scale that well with power consumption because both are highly overclocked GPUs out of the box. The efficiency sweet spot is not optimal, and an undervolt is a big must-have.

Energy Efficiency calculations are based on measurements using Cyberpunk 2077, Stalker 2 and Spider-Man 2. We record power draw and FPS rate to calculate the energy efficiency of the graphics card as it operates.
[energy efficiency chart]


Even with the RTX 5080 being only 15% faster than the RTX 4080, it's still a way better deal than an RTX 5090.

The RTX 5090 is 52.5% faster at 4K but costs 140% more; in comparison it's a terrible deal as a gaming GPU, or the difference between "worst" and "the worst". :laugh:

It's 2x faster than the 9070 XT with 2x the VRAM, almost 3x the memory bandwidth and about 1.75x the power consumption of one. No big secret there. Try one of these cards for yourself and you'll see what I mean.
 
Usually Super cards have at least some changes at the hardware level, so are they going to skip that for the first time? Extra VRAM and more OC, that's a comedy. The same GPU, just with a different name.

The RTX 5080 is already a very weak upgrade over the original RTX 4080, only a +15% boost at 4K, and now this shit appears on the horizon. :laugh:



Even the RTX 4080 Super had some hardware changes: more cores and a huge 1.3% boost over the RTX 4080 at 4K.
This card only makes sense to me because I'm coming from an RTX 3090: I want a performance upgrade and features like frame generation without using FSR-FG spoofing; my PC is hooked up to a Denon 7.1.4 AVR, but there are HDMI audio problems on 30-series cards; and I don't want a VRAM downgrade since I game at 4K and play a lot of VR.

Also, I'm not paying RTX 4090 or 5090 pricing. Even used 4090 cards are too close to £2,000 for my liking.
 