
NVIDIA RTX 5000 Blackwell Memory Amounts Confirmed by Pre-Built PC Maker

I think for me personally, right now, the most foolish thing I could do is buy the 5080, which is in no way future-proof given the low VRAM amount and the narrow 256-bit bus. I'm hoping for a 5080 Ti with a 320-bit bus and 20 GB of VRAM, or for a 6080 with those specs, but that's just wishful thinking.
However, at the moment I see no reason to upgrade my 3080, since I can play absolutely every game right now without issue. Only vanity pushes me to upgrade: to match 100 fps to my 100 Hz monitor, up from the 50-60 fps I'm getting now at ultra details, including the overhyped RT...
I'm gonna get a 9070 XT. I'm on Linux, and AMD makes my life easier there with their kernel-integrated drivers. Not to mention, 7900 GRE/XT-level performance with upgraded RT, 200-300 W power draw, and 16 GB of VRAM for ~500 is the sweet spot. I don't need anything more expensive or beefier. The 5080 is just way excessive, and 16 GB of VRAM is really not enough at that level. And the 5090... pff... why even talk about it?
 
I'm gonna get a 9070 XT. I'm on Linux, and AMD makes my life easier there with their kernel-integrated drivers. Not to mention, 7900 GRE/XT-level performance with upgraded RT, 200-300 W power draw, and 16 GB of VRAM for ~500 is the sweet spot. I don't need anything more expensive or beefier.

Clever plan, but you could also consider the lower-tier cards: the RX 9060, RX 9060 XT, or RX 9070.
 
Clever plan, but you could also consider the lower-tier cards: the RX 9060, RX 9060 XT, or RX 9070.
I could, but coming from a 6750 XT, they wouldn't be much of an upgrade.

Also, I've bought too much PC hardware in recent years; I want to slow down a bit and plan something long-term for a change. :)
 
CP 2077 Phantom Liberty, a 2023 game, uses 18.3 GB of VRAM at 4K with PT and DLSS 3 frame gen, without texture mods from the community.
Alan Wake 2, a 2023 game: 17.8 GB.
Indiana Jones and the Great Circle has very high VRAM usage; even a 3080 10 GB can't run it at max settings at 1080p.
Avatar: Frontiers of Pandora with Unobtainium settings at 3440x1440 also uses 18-19 GB on a 4090.

For cards this expensive, 16 GB is outrageous and DOA.

Vote with your wallet.
VRAM allocation doesn't necessarily mean the game really needs 18.3 GB of VRAM in use; I think for gaming alone it's going to be fine.

They're not giving more than 16 GB of VRAM because they don't want people running LLMs, AI art generators, or local AI training on these cards. Those capacities are reserved for the Quadro-class cards, which cost I-don't-even-want-to-know-how-much these days. They obviously want to sell those at a premium, but if there were 32-64 GB cards at reasonable prices, companies would buy them in bulk for AI training, and we'd be back to where we were when everyone was mining on GPUs in their basements...
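As a rough sketch of why those capacities matter: the VRAM needed just to hold a model's weights is roughly parameter count times bytes per parameter. The parameter counts and quantization widths below are illustrative assumptions, not specs from any product.

# Back-of-envelope: VRAM needed just to hold LLM weights.
# Parameter counts and quantization widths are illustrative assumptions;
# real usage also needs room for the KV cache, activations, and overhead.

def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate VRAM (GB) for model weights alone."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

for params in (7, 13, 70):
    for label, bpp in (("FP16", 2.0), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{weights_vram_gb(params, bpp):.1f} GB")

On this napkin math, even a 4-bit 70B model (~33 GB) blows past 16 GB but would fit on a 32-64 GB card, which is exactly the territory being reserved for the professional parts.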
 
VRAM allocation

No one talks about VRAM allocation. Enough with this inappropriate excuse.

They're not giving more than 16 GB of VRAM because they don't want people running LLMs, AI art generators, or local AI training on these cards

but if there were 32-64 GB cards


There is a very substantial difference between the number 16 and the number 32, even more so between the number 16 and the number 64.
We know they're not giving more because they're cutting corners and because of greed.
 
CP 2077 Phantom Liberty, a 2023 game, uses 18.3 GB of VRAM at 4K with PT and DLSS 3 frame gen, without texture mods from the community.
Alan Wake 2, a 2023 game: 17.8 GB.
Indiana Jones and the Great Circle has very high VRAM usage; even a 3080 10 GB can't run it at max settings at 1080p.
Avatar: Frontiers of Pandora with Unobtainium settings at 3440x1440 also uses 18-19 GB on a 4090.

For cards this expensive, 16 GB is outrageous and DOA.

Vote with your wallet.
I played ALL those games you mentioned at 3440x1440 on my 10 GB 3080, and I had ZERO issues maintaining 60+ fps at max details.
Again, stop being brainwashed by all those stupid YouTube videos and crappy articles. Just test the game yourself.
And again, for the millionth time: game engines cache almost all available VRAM, but that doesn't mean they actually use it all.
Stop believing all this propaganda and just verify it yourself, if possible.
Actually, TPU itself has very good benchmarks where all those VRAM-hungry games are tested. It's all here.
 
I keep saying the same thing over and over.
The RTX 5080 should have been a 320-bit card with 20 GB of VRAM.
This 256-bit/16 GB 5080 scam pulled by Nvidia is actually the real 5070, while the 5070 Ti is actually a 5060 Ti for the price of a 5080.

Weird, since historically the xx80 cards usually have a 256-bit bus; the 3080 is more the outlier (because RX 6000 was too close in performance, or Samsung 8 nm yields were just terrible, or both):

GTX 980 - 256-bit bus
GTX 1080 - 256-bit bus
RTX 2080/2080 Super - 256-bit bus
RTX 3080 - 320-bit bus
RTX 4080 - 256-bit bus
RTX 5080 - 256-bit bus
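For what it's worth, bus width alone doesn't set performance; peak bandwidth is bus width times per-pin data rate. A minimal sketch (the per-pin rates are the commonly cited ones; the 5080's 30 Gbps GDDR7 is the rumored figure, not confirmed):

# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.

def bandwidth_gb_s(bus_bits: int, rate_gbps: float) -> float:
    return bus_bits / 8 * rate_gbps

cards = [
    ("RTX 3080 (320-bit GDDR6X @ 19 Gbps)", 320, 19.0),
    ("RTX 4080 (256-bit GDDR6X @ 22.4 Gbps)", 256, 22.4),
    ("RTX 5080 (256-bit GDDR7 @ 30 Gbps, rumored)", 256, 30.0),
]
for name, bits, rate in cards:
    print(f"{name}: {bandwidth_gb_s(bits, rate):.0f} GB/s")

So a 256-bit GDDR7 card would actually out-bandwidth the 320-bit 3080 (960 vs. 760 GB/s); the real complaint about 256 bits is capacity, since eight 32-bit channels with 2 GB (16 Gb) modules cap the card at 16 GB.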
 
No one talks about VRAM allocation. Enough with this inappropriate excuse.
VRAM allocation is exactly what we are talking about:
[attached chart: VRAM allocation compared on 8 GB, 12 GB, and 24 GB cards]

You can clearly see that 8 GB is indeed not enough here anymore, but there is no difference between a 12 GB card and a 24 GB card, which means that while the game allocates 18 GB when available, it does not in fact need more than 12 GB.
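If anyone wants to check the number on their own card while playing, here's a minimal sketch using NVML via the nvidia-ml-py package. Caveat: like the overlays, NVML reports allocated memory, not the working set, so the real tell is still whether frametimes fall apart.

import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    for _ in range(10):  # sample once per second for ten seconds
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"allocated {mem.used / 1024**3:5.1f} GB of {mem.total / 1024**3:.1f} GB")
        time.sleep(1.0)
finally:
    nvmlShutdown()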
 
This could drive more users to the 5070 Ti than the 5080, depending of course on the price. Even though it's GDDR7, 16 GB seems more 'appropriate' for the 5070 Ti.
 
A 5060 Ti card on a 300+ W GB203 die? Your thinking is really amusingly naive. Or maybe you've seen too much clickbait on YouTube telling you a 4080 was really a 4060 Ti.
Was the 7900 XTX supposed to be a 7600 XT, then?
I'm not watching YouTube junk videos, but it seems you are, since you're so brainwashed.
It doesn't matter. You can make the leather jacket guy even richer, while I enjoy my games for another 2 or 3 years and spend my money on a nice holiday or something, instead of wasting it on overpriced hardware.
 
A 5060 Ti card on a 300+ W GB203 die? Your thinking is really amusingly naive. Or maybe you've seen too much clickbait on YouTube telling you a 4080 was really a 4060 Ti.
Was the 7900 XTX supposed to be a 7600 XT, then?

The RX 7900 XTX is really an RX 7800 XTX at best. At worst, it trades blows with the RTX 4070 and RTX 4070 Ti, so calling the "4080" a 4060 Ti is plausible.
Forgot the 4080 12 GB that fast?
 
Geeze Jensen,

You could at least give your 5070 Ti Super 20 GB, same for the 5080 Ti, and 24 GB for the 5080 Ti Super. You know it takes a lot for me to start hating someone, but you, Jensen, I hate, a lot.
 
Geeze Jensen,

You could at least give your 5070 Ti Super 20 GB, same for the 5080 Ti, and 24 GB for the 5080 Ti Super. You know it takes a lot for me to start hating someone, but you, Jensen, I hate, a lot.
The message is clear: you either buy a 5090, or you're a barefoot peasant and should suffer the consequences of your choice. I'm not a fan of any company, but this makes Nvidia quite unsympathetic in my eyes. I'm not saying that 16 GB isn't enough for mainstream gaming, but if you fork out the cash for a 5080, you should be getting a tad more for a bit of longevity, imo.
 
CP 2077 Phantom Liberty, a 2023 game, uses 18.3 GB of VRAM at 4K with PT and DLSS 3 frame gen, without texture mods from the community.
Alan Wake 2, a 2023 game: 17.8 GB.
Indiana Jones and the Great Circle has very high VRAM usage; even a 3080 10 GB can't run it at max settings at 1080p.
Avatar: Frontiers of Pandora with Unobtainium settings at 3440x1440 also uses 18-19 GB on a 4090.

For cards this expensive, 16 GB is outrageous and DOA.

Vote with your wallet.
I choose you, 5060 Ti 16 GB! Is my vote valid?
 
I would have hoped for a bit more critical thinking here. Both have "Super" suffixes, so they're just as likely to be typos for 4070 Ti Super and 4080 Super, which means that nothing has been confirmed.
 
Well, that's terrible news, but as expected. I can barely fit Star Wars Outlaws with the secret "Outlaw" settings at 2K into 16 GB of VRAM.
Via the DRAMeXchange website, prices are sub-$3 per 8 Gb.
I also found that "With current-gen GDDR6 expected to get cheaper by 8-13%, while next-gen GDDR7 likely to become cheaper by max 5%. But don't expect graphics cards makers like Nvidia and AMD to give that benefit to the consumers. For two reasons. They (Nvidia) want to keep their high profit margins and two, they have likely signed a long time deal with VRAM makers. Means price decrease of graphics cards will only follow the usual trends, not anything unusual. The reason for all this is simple, lesser demand of computers, mobiles, notebooks and datacenters worldwide. This has ensured that the prices go down everywhere."
Conclusion: vote with your wallet, 2025 edition.
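Taking that quoted sub-$3 per 8 Gb (i.e., per 1 GB) spot price at face value, here's a rough sketch of what the memory itself costs. The price constant is an assumption from the quoted figure, and it ignores GDDR7 premiums, contract pricing, and board costs:

PRICE_PER_GB_USD = 3.0  # assumption based on the quoted DRAMeXchange figure

for capacity_gb in (16, 20, 24, 32):
    print(f"{capacity_gb} GB of VRAM: ~${capacity_gb * PRICE_PER_GB_USD:.0f}")

On this napkin math, going from 16 GB to 24 GB is roughly $24 of memory, which is why people read the segmentation as deliberate rather than cost-driven.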
 
The message is clear: you either buy a 5090, or you're a barefoot peasant and should suffer the consequences of your choice. I'm not a fan of any company, but this makes Nvidia quite unsympathetic in my eyes. I'm not saying that 16 GB isn't enough for mainstream gaming, but if you fork out the cash for a 5080, you should be getting a tad more for a bit of longevity, imo.

I get it, though: Nvidia doesn't want companies using GeForce cards for professional work and instead wants them to buy insanely high-margin professional cards. It just sucks that gamers get the short end of the stick over it. A lack of any real competition is probably a factor, but a really distant second.
 
It's not about charity, it's about playing fair with the group that made them who they are today. Renaming an x070 card to an x080 is beyond callous, especially when they doubled the price in just a couple of years.
The problem is that almost nobody calls them out. The so-called tech influencers keep praising the company (naturally, because of the big paychecks they receive for good reviews), while unbiased tech sites don't emphasize those practices enough, afraid they'll be cut off from free review samples or something...
How the heck is the 4080 just a 4070 when it's just as fast as the 7900 XTX in raster and faster in RT? That doesn't make sense, man.
 
How the heck is the 4080 just a 4070 when it's just as fast as the 7900 XTX in raster and faster in RT? That doesn't make sense, man.

Cuz the 7900 XTX is the real RX 6800 replacement, duh... even though it uses a ton of silicon...

Honestly, the only issue with the 4080 was its price... Even 50% more expensive than the 3080 would have been a massive improvement over the 70% it ended up being. It did allow the 7900 XTX, the real 6800 successor, to be priced like a 6900 XT though, lol...
 