
NVIDIA GeForce RTX 5080 SUPER Could Feature 24 GB Memory, Increased Power Limits

Nvidia doesn't regulate anything. Nvidia CANNOT stop Intel or AMD from launching a GPU that roflstomps the 5090 in performance for $500.
Of course, because Nvidia knows that technically they can't do it.
 
AMD is also doing nothing, just following in Nvidia's footsteps and copying its behavior. Look at the RTX 5060 8GB vs the RX 9060 XT 8GB: how do you think AMD will sell that shit?
Nvidia has a near monopoly and a stranglehold on the market. AMD follows as a business decision: why even try at the high end, like they did with RDNA3, when people will buy Nvidia no matter what? Intel is barely in the dGPU market, and they could be investing even more than AMD if they cared to compete.
But the 9060 XT 8GB is mostly going to end up in OEM systems, according to Hardware Unboxed. Still, selling an 8GB card at all was a dumb move, as everyone criticized it well before launch, while the 5060 8GB gets defended as a "budget" card.

Why can't they? What's stopping them?
Wafer costs, manufacturing costs, shipping costs, the cost to even develop such a card to compete with a $3000+ card. Oh wait, only team green gets defense for any of those reasons.
 
After seeing the supposed Super refresh, is this card really for gamers? I would think the card is more for content creation and AI vs gaming. Don't get me wrong, it can game, but it should be the same as the regular 5080, the VRAM really only improves features other than gaming.
 
After seeing the supposed Super refresh, is this card really for gamers? I would think the card is more for content creation and AI vs gaming. Don't get me wrong, it can game, but it should be the same as the regular 5080, the VRAM really only improves features other than gaming.
Yeah it's mostly for AI. Good move from nvidia, separate the gaming from the productivity lines like that, more gaming gpus for us gamers :clap:
 
Man, come on, AMD didn't even release a card faster than their last-gen 7900 XTX. It's not technology that is stopping them.
No, look at IPC and power consumption: Nvidia is ahead! There's no high end because of a failure in production, if I'm not mistaken.
 
Yeah it's mostly for AI. Good move from nvidia, separate the gaming from the productivity lines like that, more gaming gpus for us gamers :clap:
I agree, I'm just worried about how the pricing of the more AI-centered GPUs will affect the gaming GPUs.
 
Even at current street prices, the 5080 is half the price of a 4090. The 5080 isn't a cheap card, but the 4090 is egregiously expensive.
Well, in the UK at least, you can buy 4080s for a lot less than original MSRP, and with a 5-year warranty.
 
If the 5080 Super is really just a power limit OC+24GB, it's yet another generation of disappointing super cards from Nvidia.

Would really like to upgrade from my 6800XT, but Nvidia and AMD aren't making it easy. Not going to upgrade until I can get ~2x the performance for ~$700.
 
The 5080 should have come with 24GB in the first place, and it's not worth investing in one unless you have money to burn.
Agree, a cut-down 5090 with 24GB/384-bit bus. Just like the rumoured 5090DD but with fewer shaders etc.
 
Copied from the overly smart Google AI.

AI Overview

The exact cost for manufacturers to purchase 16GB of GDDR7 memory is difficult to determine because it is a new technology with limited public market data. However, general pricing information for GDDR7 and related data can provide an estimated cost:
  • GDDR7 is currently a premium technology: As a new and advanced type of memory, GDDR7 is likely more expensive than previous versions, such as GDDR6, due to development and manufacturing costs.
  • A 3GB GDDR7 module was reportedly available for approximately $10 in China in June 2025. This suggests a rough cost of around $3.33 per GB. Based on this, 16GB might cost approximately $53 ($3.33/GB x 16GB). However, this is a single instance, and the actual cost for a manufacturer buying in bulk could differ significantly.
  • GDDR7 initially carries a 20-30% premium compared to GDDR6, according to a June 2024 report. If GDDR6 8Gb modules were available for roughly $2.30 per GB in September 2024, as one source indicates, then GDDR7, with a premium, might be around $2.76 to $2.99 per GB. This would put the cost for 16GB of GDDR7 at approximately $44 to $48.
  • Industry estimates suggest GDDR7 is likely at most $5 to $6 per GB, potentially leading to an approximate cost of $80 to $96 for 16GB.
  • The cost of GDDR7 will likely decrease over time, as the technology matures and production volumes increase.
In summary, based on the limited information available, the manufacturer cost for 16GB of GDDR7 memory is likely between approximately $44 and $96, but it could be higher or lower depending on various factors such as bulk purchasing agreements and the specific time of purchase.

*******

AI Overview

Based on the provided search results, here's an overview of the approximate cost of 16GB of GDDR6 memory for manufacturers:
  • Spot Market Prices:
    • As of September 30th, 2024, GDDR6 8Gb (1GB) spot pricing was approximately $2.30. This would suggest a price of approximately $37 ($2.30 x 16) for 16GB. However, another source suggests an 8GB GDDR6 module costs approximately $18. This would mean a 16GB module costs approximately $36 ($18 x 2). These are general market estimates, and manufacturers may have contracts with suppliers that influence their purchase price.
    • Back in June 2023, 8GB of GDDR6 was approximately $27 on the spot market.
  • Manufacturer Purchase Prices:
    • Large buyers like NVIDIA, Intel, and AMD negotiate directly with memory producers (SK Hynix, Micron, Samsung). Their actual purchase price is likely much lower than the spot prices.
    • One source from March 2025 mentioned Samsung's 20Gbps 16Gb GDDR6 modules being approximately $8 a piece on AliExpress. This is just an AliExpress price and may not reflect the cost for manufacturers.
Important Considerations:
  • Spot vs. Contract Prices: Spot market prices reflect current market conditions but can differ significantly from the long-term contracts manufacturers have with suppliers.
  • Production Volume: The cost to manufacture GDDR6 memory chips depends heavily on the production volume.
  • Additional Costs: The cost of the memory itself is only one part of the total cost for manufacturers. Other factors include:
    • Design and development costs
    • Chip design tools
    • Manufacturing facilities and utilities
    • Reference board designs and driver development
    • Packaging costs
In summary, while spot prices suggest that 16GB of GDDR6 memory could be approximately $36 to $37, the actual cost for manufacturers with large-volume contracts is likely lower and influenced by various other factors beyond just the memory itself.
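To make the arithmetic in the two overviews above easier to check, here's a minimal Python sketch that turns the quoted per-GB data points into 16GB cost estimates. All of the inputs are the rough spot/retail figures quoted above, not real bulk contract prices, so treat the outputs as ballpark numbers only.

```python
# Back-of-the-envelope 16GB VRAM cost estimates, using only the rough
# spot/retail data points quoted above (not real bulk contract prices).

CAPACITY_GB = 16

# GDDR6: ~$2.30/GB spot (Sep 2024)
gddr6_per_gb = 2.30

# GDDR7: estimated from the figures quoted above
gddr7_per_gb = {
    "GDDR6 + 20% premium":         gddr6_per_gb * 1.20,  # ~$2.76/GB
    "GDDR6 + 30% premium":         gddr6_per_gb * 1.30,  # ~$2.99/GB
    "3GB module at ~$10 (retail)": 10 / 3,               # ~$3.33/GB
    "industry estimate, low":      5.00,
    "industry estimate, high":     6.00,
}

print(f"GDDR6, 16GB: ~${gddr6_per_gb * CAPACITY_GB:.0f}")
for label, per_gb in gddr7_per_gb.items():
    print(f"GDDR7 ({label}), 16GB: ~${per_gb * CAPACITY_GB:.0f}")
```

That reproduces the ~$37 GDDR6 figure and the roughly $44-96 GDDR7 spread quoted above; actual contract prices at Nvidia-scale volumes could land well outside this range.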
 
I want to see a 5060 Super so badly.

The 5060's core count bump over the 4060, coupled with the extra bandwidth that GDDR7 brings to the table means that the 5060 would actually be goddamn AMAZING if it just had 12GB VRAM.

C'mon Nvidia. ONE olive branch, that's all the masses are asking for. Yes, 24Gbit GDDR7 modules cost almost double what a 16Gbit module costs, but you only need four of them for a 5060. Put the price up to $329 and it's the 3060 successor that the 4060 failed to be.
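For what it's worth, the module math on a hypothetical 12GB 5060 is easy to sketch. The per-module prices below are purely illustrative assumptions; the only anchor is the "24Gbit costs almost double a 16Gbit module" claim above, and the 128-bit bus with one 32-bit channel per package is the standard layout.

```python
# Memory-config arithmetic for a hypothetical 12GB RTX 5060.
# Per-module prices are illustrative assumptions, not quotes.

BUS_WIDTH_BITS = 128             # RTX 5060 memory bus
BITS_PER_PACKAGE = 32            # each GDDR7 package sits on a 32-bit channel
modules = BUS_WIDTH_BITS // BITS_PER_PACKAGE   # -> 4 packages

capacity_16gbit = modules * 2    # 8 GB with 2GB (16Gbit) modules, as shipped
capacity_24gbit = modules * 3    # 12 GB with 3GB (24Gbit) modules

PRICE_16GBIT = 7.0               # assumed $/module, illustrative only
PRICE_24GBIT = PRICE_16GBIT * 2  # "almost double"

extra_cost = modules * (PRICE_24GBIT - PRICE_16GBIT)
print(f"{modules} packages: {capacity_16gbit} GB -> {capacity_24gbit} GB")
print(f"Estimated extra memory cost per card: ~${extra_cost:.0f}")
```

Under those hypothetical prices the BOM delta is on the order of $30 per card, which lines up with the suggested bump from $299 to $329.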
 
And yet... the 6800 XT (and even the 6900 XT) are overall slower than a 3080 10GB, 5 years later, even in VRAM-heavy titles. The 5080 is NOT a 4K GPU; the situation will be the same as with the 3080: the core will bend the knee before it hits the VRAM limit, and when it does, DLSS will keep the VRAM usage at bay. Mentioning the 6800 XT and 7900 XTX as examples isn't giving you any credit, either. While a 3080 10GB is perfectly capable of 60 fps in newer titles at 1440p with DLSS, the 6800 XT isn't. FSR 3 is just unusable, while the GPU is not up to the task at native. Next year it's the 7900 XTX's turn.

In addition - nvidia and amd handle textures and vram differently. AMD cards usually need more vram at the same settings than an equivalent nvidia gpu.

On the topic: if the core remains the same and the only differences are the VRAM and clocks, the 5080 Super provides nothing more and will last just as long as the vanilla 5080. Playing a game at 15 fps instead of 5 makes no difference.
Last time I looked at HUB charts, the 6800XT and 3080 were neck and neck in 1440p, with the RDNA2 card being a few fps ahead, on average, but nothing big. I don't use upscalers with games I play at the moment, so it doesn't matter to me, but if it helps 3080 owners deal with some modern titles, good for them. I am pleased.

I don't think Nvidia is able to brainwash us all into thinking that upscalers are a panacea for the low-VRAM problem. Even recent Transformer model optimizations saving up to 20% of VRAM will not be able to hide the issue with basic hardware.

Hence, they will release Super cards with proper VRAM capacity. Finally! If they had thought they could get away with the philosophy of low VRAM + DLSS = gamer happiness, they would never have decided to make such a move with the Super cards. The message about low VRAM from the tech community is loud and clear, and it applies to some AMD cards too. A 24GB 5080 Super will certainly be more interesting for AI workloads and better offloading of LLMs to the larger VRAM capacity. So, not only gamers will benefit.

They cannot be selling a $1,000 card with 16GB in 2026. It's not decent anymore. Upscaling software is not there to do compensation magic for insufficient VRAM; it is there to make the card excel where needed. A growing number of Nvidia card buyers are also coming to appreciate this basic idea.

New consoles will also need to upgrade their VRAM so that the hardware stays relevant until the next generation in ~2035. If new consoles can come with at least 24GB of VRAM, I cannot see any reason to keep excusing Nvidia for offering only 16GB on the 80 series in 2026.

Man, come on, AMD didn't even release a card faster than their last-gen 7900 XTX. It's not technology that is stopping them.
In this case it is technology because it turned out that Navi '41' with 9 compute chiplets needed more time to develop and work properly, so it wasn't ready for RDNA4 release.

Would really like to upgrade from my 6800XT, but Nvidia and AMD aren't making it easy. Not going to upgrade until I can get ~2x the performance for ~$700.
You will need to wait then.
 
If by that you mean
No, I said what I meant already.
Last time I looked at HUB charts, the 6800XT and 3080 were neck and neck in 1440p, with the RDNA2 card being a few fps ahead, on average, but nothing big.
Check a TPU review with the 2025 test suite; there it's the 3080 that's consistently ahead. Nothing big, but it's consistent.
I don't use upscalers with games I play at the moment, so it doesn't matter to me, but if it helps 3080 owners deal with some modern titles, good for them. I am pleased.
Not using upscalers is a personal choice that doesn't undo the overall positive that upscaling has been, and continues to be, for RTX cards. In a 3080 vs 6800XT scenario, upscaling is definitely in the 3080's favour if and when a user chooses to deploy it, with roughly the same rasterization performance to start with.
Even recent Transformer model optimizations saving up to 20% of VRAM will not be able to hide the issue with basic hardware.
And I certainly wouldn't claim that reducing just the model's footprint by 20% will save a given "low VRAM" card that's up against its limit, but it's nice they're refining it anyway. Talking about issues with basic hardware... Ampere stands well ahead of RDNA2, whose weak RT performance and lack of dedicated Tensor/Matrix hardware acceleration will increasingly let it down as time goes on. Now I'm not trying to say 6800XT buyers necessarily made the wrong choice; I hope they made the right choice for themselves because they needed that VRAM at the time. But buying it and having this conversation in 2025? I'm very glad I chose the 3080, as it was fantastic at launch and its forward-looking architecture has only become more relevant and better leveraged over time.
 
In summary, while spot prices suggest that 16GB of GDDR6 memory could be approximately $36 to $37, the actual cost for manufacturers with large-volume contracts is likely lower and influenced by various other factors beyond just the memory itself.
Contract pricing is typically considerably higher than spot prices. You pay a premium for guaranteed supply of a specific SKU that has specific electrical characteristics. Consistency in price and product is the key. Contract pricing also includes engineering support and warranty, which the spot market doesn’t provide.
 
The extra VRAM is nice, and it is unusual for Nvidia to be this generous. However, it is clear by now that the RTX 5000 series is mostly not a worthy upgrade if one is using a similar RTX 4000 card, with the exception of the RTX 5090, which has its own problems like high power draw and the risk of melting its single power connector. Regardless of the bump in specs, a card like the RTX 5080 is not going to become a gaming champ. In today's context, the RTX 5080 is mostly suited for 1440p gaming, and any Super upgrade will not change this.
 
I'm really hoping that Nvidia will release a full-die, 24,576-shading-core RTX 5090 Super with 48GB, using the 3GB modules, while keeping the same $2,000 MSRP.
 
In this case it is technology because it turned out that Navi '41' with 9 compute chiplets needed more time to develop and work properly, so it wasn't ready for RDNA4 release.
That's not a technological limitation there, they just didn't invest as much time and workforce as they needed to have it ready on time, and/or they released RDNA4 too early. Your whole point can act as a justification for why Nvidia can and should command a premium. They released, on time, cards faster than the 9070 XT, apparently a monumental feat considering no one else managed it. That should cost something, no?
 
A 256-bit bus is too narrow for a 5080 Super. It needs a nicer 384-bit bus like the 4090's.
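On the bus-width point, capacity and bus width are separable now that 3GB (24Gbit) modules exist, which is exactly how the rumoured 24GB 5080 Super and the 48GB 5090 Super wished for above would work. A small sketch, with an illustrative config list (clamshell designs excluded):

```python
# VRAM capacity as a function of bus width and module density.
# Each GDDR6/GDDR7 package occupies a 32-bit channel, so (ignoring
# clamshell designs): capacity = (bus_width / 32) * GB_per_module.

def vram_gb(bus_width_bits: int, gb_per_module: int) -> int:
    return (bus_width_bits // 32) * gb_per_module

# Illustrative configurations discussed in this thread
configs = [
    ("RTX 5080 (2GB modules)",        256, 2),  # 16 GB today
    ("RTX 5080 Super (3GB modules)",  256, 3),  # 24 GB rumour, same bus
    ("384-bit card (2GB modules)",    384, 2),  # 24 GB the 4090 way
    ("RTX 5090 (2GB modules)",        512, 2),  # 32 GB
    ("RTX 5090 Super? (3GB modules)", 512, 3),  # 48 GB
]

for name, bus_bits, density in configs:
    print(f"{name}: {bus_bits}-bit -> {vram_gb(bus_bits, density)} GB")
```

So 24GB does not strictly require a 384-bit bus; 3GB modules get there on 256-bit, although the wider bus would also bring more bandwidth.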
 
That's not a technological limitation there
It is. To beat the RTX 5090, AMD also needs a 512-bit bus and even more power consumption, something like 750-800W, which is completely absurd to even think about. Even 600 watts on the RTX 5090 is just ridiculous, caveman-style campfire stuff, not even a stove. :kookoo: And you pay 2700€ for that prehistoric shit, which will not be usable in summer.
 
It is. To beat the RTX 5090, AMD also needs a 512-bit bus and even more power consumption, something like 750-800W, which is completely absurd to even think about. Even 600 watts on the RTX 5090 is just ridiculous, caveman-style campfire stuff, not even a stove. :kookoo: And you pay 2700€ for that prehistoric shit, which will not be usable in summer.
That prehistoric 5090 is more efficient than the 9070 XT. In fact, even limited to the same power, it will still be a ton faster. Please, stop this anti-Nvidia madness.
 
I think a higher power limit isn't a selling point anymore.
 