
NVIDIA GeForce RTX 5080 SUPER Could Feature 24 GB Memory, Increased Power Limits

I respectfully disagree with some of the scaling presented here: I have a 9070XT, and I can crank 80% of my games to 1440p high/ultra and get 60 to triple-digit FPS (depending on the title, of course). It'll show a TBP of 450W and raise my room's temperature by 4-8°C, but hey, it'll crank out those frames! (And yes, that's without FSR/FG; I barely tolerate upscaling, and you'll never catch me dead using FG.) So how well 16GB holds up really depends on the GPU it's paired with: a 5060 16GB? Yeah, low-to-mid-range 1440p. A 5070Ti/9070XT? Absolutely high/ultra 1440p with no issues.

And all that while multistreaming AND playing VR doesn't hit 16GB (I've only ever maxed out around 14GB when really pushing it), so if 16GB really doesn't have longevity, something TERRIBLY WRONG has happened in the game industry.
As I said, 16GB is perfect for 1440p and some 4K depending on the game and upscaling etc.

16GB is not going to be enough when the next-gen consoles are released 18-24 months from now. That's a fact. Consoles today have access to about 12GB of graphics memory, and what do you see on the PC side? When running a modern game, a 12GB card will be 90-100% utilised! Following that logic, when consoles can access more VRAM, the games built to push it will also require more VRAM when ported to PC. This is what has happened every time a new console generation has been released since the PS3, so why would this time be any different? Why is that "terribly wrong"?

Maybe because at 3440x1440 I have never seen VRAM usage higher than 16GB, but I also don't use "mods".
Then 16GB is fine for YOUR use case, and that's fine, because luckily we can now get access to 16GB cards. But what you say upsets the "8GB of VRAM is all you ever need, and 8GB is fine for all modern games" crowd, which this forum is absolutely full of, yet YouTube is full of videos proving that 12GB is not enough now in some cases... 16GB IS fine for now; what I'm saying is that it is fine for casual gamers and e-sports types. It's not fine for those who mod games for better quality, and there are many people into that scene, more than I suspect you realise. And as many have pointed out, myself included, the 5080 24GB will come attached to a crap, underpowered GPU, will be VASTLY overpriced, and will be the only card with more than 16GB of VRAM for the next 2 years, and that's a problem!
 
But what you say upsets the "8GB of VRAM is all you ever need, and 8GB is fine for all modern games" crowd, which this forum is absolutely full of.
I haven't investigated it much, but 12GB for 1080p and 16GB for 1440p is probably a comfortable bare minimum now and for the next two years. 4K may be a different story, but even then, bumping the RTX 5080 to 24GB won't make it much faster except in a few specific scenarios, not in every case. If you plan to sit on a single GPU for six years, then yes, it will make a difference, but not over two years, that's for sure.
 
Many games have mods that will not work on a 16GB card,
A "mod" is, by definition, outside the design targets of any software or hardware made to work with the unmodded product.
Using them as a case is no different than dismissing all packaged coolers because they can't accommodate >7GHz CPUs (or whatever goes for high OCs these days). They exist, sure, but they are irrelevant when discussing the typical use case.

It all depends on consumers, and it's definitely the consumers' fault, not Nvidia's. If consumers allow this to happen now, then next time Nvidia will downgrade the RTX 60 Series even more.

Don't buy that piece of shit if you want better future performance/cost; it's very simple.
Blame the market?! Blasphemy!
</sarcasm>
 
Was it? I don't remember reading that from any of the tech press or from Nvidia themselves.
The main reason I don't buy class 80 cards from Nvidia is absurdly low VRAM capacity. They have destroyed this class of cards. Complete nonsense.

The 6800XT already had 16GB of VRAM in 2020. It has taken Nvidia 5 long years to decide to offer more than 16GB on class 80 cards. It feels like consumers need to beg for extra VRAM on a product in the $1,000 price class. That's why I showed them the middle finger long ago.

AMD's equivalent of a class 80 card, the 7900XTX, already offered 24GB of VRAM in 2022. So, not on a refresh, but from day one. As the 5080 Super is already maxed out on its die, the only thing they can offer is upgraded VRAM modules, from 16Gb (2GB) to 24Gb (3GB) per chip. I hope they don't charge two kidneys for 8GB of extra VRAM.
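
A back-of-the-envelope sketch of that module swap, in case the Gb vs GB figures look confusing (Python, pure arithmetic; the one-package-per-32-bit-channel wiring and the 16Gb/24Gb GDDR7 densities are the usual public figures, not anything confirmed for a 5080 Super specifically):

# VRAM capacity from bus width and per-package density (rough sketch)
BUS_WIDTH_BITS = 256      # 5080-class memory bus
CHANNEL_BITS = 32         # assumed: one GDDR7 package per 32-bit channel

packages = BUS_WIDTH_BITS // CHANNEL_BITS        # -> 8 packages on the board
for density_gbit in (16, 24):                    # 16Gb vs 24Gb dies
    total_gb = packages * density_gbit // 8      # gigabits -> gigabytes
    print(f"{density_gbit}Gb packages -> {total_gb} GB total")

# 16Gb packages -> 16 GB total
# 24Gb packages -> 24 GB total

So swapping the eight 2GB chips for 3GB chips is the only lever left once the die and the 256-bit bus are fixed.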
 
The vanilla 5080 is less than 15% slower than the 4090 in games, at half the price.
On W1zzard's test rig, he is seeing much less of a difference between the 5080 and 4090 than other reviewers, but I've looked at the specs and can't see why.

As for the half-the-price comment, I was about to comment on that. But when prices are still massively over MSRP, what's the point?
 
How about giving us a crippled GB202 instead of the GB203, like Nvidia did with the 4070 Ti Super?

How about returning to 320/352/384-bit buses like we used to get with Super/Ti variants of xx80-class GPUs?
It's shrinkflation everywhere I look, or it's Jensen vs. gamers :nutkick:
 
The main reason I don't buy class 80 cards from Nvidia is absurdly low VRAM capacity. They have destroyed this class of cards. Complete nonsense.
That's only on paper. The reality is that the RTX 5080 suffers far more from its weak GPU, which is only 15% faster than the midrange RTX 4080, for $1,200.

Where are you using more than 16GB of VRAM?
 
You will get more AI... but not more silicon. The trend is to give you less and less silicon on each new generation.
It's called pay more to get less.
 
On W1zzard's test rig, he is seeing much less of a difference between the 5080 and 4090 than other reviewers, but I've looked at the specs and can't see why.

As for the half-the-price comment, I was about to comment on that. But when prices are still massively over MSRP, what's the point?
Even at current street prices, the 5080 is half the price of a 4090. The 5080 isn't a cheap card, but the 4090 is egregiously expensive.
 
You will get more AI... but not more silicon. The trend is to give you less and less silicon on each new generation.
Yeah. DLSS is awesome and all, but not at the expense of less silicon. Nvidia wants us to effectively pay more for less with each new gen. Maybe it's time we all say FU to Jensen and move on to other hobbies. It looks like DIY PC is a dead man walking thanks to crypto and now the AI hype.
 
Maybe it's time we all say FU to Jensen and move on to other hobbies.
The average IQ is too low for that. :kookoo:

That's problem No. 1, and it's the main problem. Wait until the RTX 60 Series and we'll see even worse moves.
 
16GB is not going to be enough when the next-gen consoles are released 18-24 months from now. That's a fact. Consoles today have access to about 12GB of graphics memory, and what do you see on the PC side? When running a modern game, a 12GB card will be 90-100% utilised! Following that logic, when consoles can access more VRAM, the games built to push it will also require more VRAM when ported to PC. This is what has happened every time a new console generation has been released since the PS3, so why would this time be any different? Why is that "terribly wrong"?
The PS5 has 16GB of unified memory. I doubt 12GB of it is allocated to graphics, but I wouldn't be surprised either if Sony managed to optimize the heck out of the OS's own RAM consumption. There's still the game itself to take into account, though, and many mainstream titles routinely use more than 4GB of system RAM on PC.

And honestly, if they truly have 12GB just for graphics, then they really aren't making proper use of it, considering they have to use upscaling just to get 1080p, let alone 1440p, never mind 4K. Well, it's not like I care that much, since 99% of the games I play aren't console-native/ported games. PC games are... fairly optimized, depending on which studios you look at... and at the very least, VR games are an actual hurdle devs HAVE to optimize for, since upscaling makes VR unplayable, especially when streamed over Wi-Fi.

Yeah. DLSS is awesome and all, but not at the expense of less silicon. Nvidia wants us to effectively pay more for less with each new gen. Maybe it's time we all say FU to Jensen and move on to other hobbies. It looks like DIY PC is a dead man walking thanks to crypto and now the AI hype.
AMD literally exists. Intel needs better marketing but Xe3 is also on its way.


Don't declare PC gaming dead when alternatives exist.
 
Plus, the increase to 415 W TGP has real-world consequences. You pay more for a more power-hungry device for a minuscule performance boost, at a time when outside temperatures are regularly hitting +40°C (at least here in Europe) and we all try to keep our homes as cool as possible. Stupid, just stupid.
 
AMD literally exists.
True, but they're sadly not offering any upgrade path to someone like me with a 4070 Ti Super atm. I have a feeling the upper mid to lower high end just died on us. The 5080 should really be called a 5070 Ti, given the bus and core counts. The only OK options remain the super expensive 4090 and 5090. I'd expect to get at least 4090-level performance from a 5080, but no, not really. Even if the new Super variant comes close to it at 1440p and maybe 4K, it will lag behind at any higher resolution due to its bus limitations.
 
What's interesting to me is that both AMD and Nvidia seem to be going out of their way to offer only 128-, 256- and 512-bit memory interfaces. First time in a while that's happened, if ever.
 
Maybe because at 3440x1440 I have never seen VRAM usage higher than 16GB, but I also don't use "mods".
Requiring more than 16 GB of VRAM is not that rare nowadays. Multiple games benchmarked by TPU are nearing or have surpassed the 16 GB limit.
Example: Alan Wake 2 in 4K with everything maxed out (RT and PT included).

 
What's interesting to me is that both AMD and Nvidia seem to be going out of their way to offer only 128-, 256- and 512-bit memory interfaces. First time in a while that's happened, if ever.
It makes sense. 128-, 192- and 256-bit are cheap and easy to make, as I understand it, so why waste 512-bit memory interfaces on gaming plebs, the majority of whom are still gaming at 1080p, when you can sell them to datacenters? Well, you can sell 512-bit junk dies in the form of the 4090/5090 to enthusiast gamers with deep pockets while selling the good ones rebranded under A6000 labels for $5,000+ to enterprise and governments.
 
It makes sense. 128-, 192- and 256-bit are cheap and easy to make, as I understand it, so why waste 512-bit memory interfaces on gaming plebs, the majority of whom are still gaming at 1080p, when you can sell them to datacenters? Well, you can sell 512-bit junk dies in the form of the 4090/5090 to enthusiast gamers with deep pockets while selling the good ones rebranded under A6000 labels for $5,000+ to enterprise and governments.
But there are no 192-bit buses this gen from AMD and Nvidia. That's what's so weird. Nvidia especially was known for 192-, 320-, 384- and sometimes even 448-bit buses in the past. Not one of those this gen, even with the Supers, as they are using 3GB dies to offer the mid memory sizes on 128- and 256-bit only.
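
For what it's worth, here's roughly how those bus widths map to capacities with 2GB (16Gb) versus 3GB (24Gb) packages, using the same one-package-per-32-bit-channel assumption as the sketch earlier in the thread; the widths listed are just the ones being discussed here:

# Capacities reachable per bus width with 2GB vs 3GB memory packages (sketch)
CHANNEL_BITS = 32   # assumed: one package per 32-bit channel

for bus_bits in (128, 192, 256, 384, 512):
    packages = bus_bits // CHANNEL_BITS
    print(f"{bus_bits:>3}-bit: {packages * 2:>2} GB (2GB dies) or {packages * 3:>2} GB (3GB dies)")

# 128-bit:  8 GB (2GB dies) or 12 GB (3GB dies)
# 192-bit: 12 GB (2GB dies) or 18 GB (3GB dies)
# 256-bit: 16 GB (2GB dies) or 24 GB (3GB dies)
# 384-bit: 24 GB (2GB dies) or 36 GB (3GB dies)
# 512-bit: 32 GB (2GB dies) or 48 GB (3GB dies)

Which is presumably why 3GB dies on 128- and 256-bit buses are enough to hit the 12/24 GB mid sizes without bringing back the wider buses.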
 
True, but they're sadly not offering any upgrade path to someone like me with a 4070 Ti Super atm.

I have a feeling the upper mid to lower high end just died on us. The 5080 should really be called a 5070 Ti, given the bus and core counts. The only OK options remain the super expensive 4090 and 5090. I'd expect to get at least 4090-level performance from a 5080, but no, not really.
It'd be more accurate to say that Nvidia's gaming segment died on us. [Before anyone else mentions it, I'm not talking about sales.]
They released the waste of sand that is the 5050 as if it weren't subpar on every level; the 3060 beats the 5060 as soon as the VRAM ceiling is hit; the 5080 is a 4080 Ti; and Blackwell's IPC uplift is essentially margin of error. If anything came out of RTX 50, it's a massive AI uplift that nobody asked for and fast VRAM that no one who buys these cards will actually fully profit from. Those who benefit from it are studios who can't be bothered to optimize their games and use DLSS as a crutch for their garbage, and people buying RTX 50 are the actual suckers paying for it.

Case in point: Hell Is Us requires permanently enabled upscaling and a 4090 to get 4K@30. What the actual fuck?
 
But there are no 192-bit buses this gen from AMD and Nvidia. That's what's so weird. Nvidia especially was known for 192-, 320-, 384- and sometimes even 448-bit buses in the past. Not one of those this gen, even with the Supers, as they are using 3GB dies to offer the mid memory sizes on 128- and 256-bit only.
Here's your answer: "RTX A5000, which are connected using a 384-bit memory interface"; "NVIDIA RTX A3000: Memory Bus Width, 192 Bit". It looks like all these buses went to the enterprise market this time around. Gamers are only getting the leftovers nowadays.
 