
NVIDIA's GeForce RTX 5080 SUPER Gains 24 GB GDDR7, Keeps 10,752 CUDA Cores

So wait for the 'Super Ti', that is.
Meh.
 
I'd be more interested in a "5060 Super" with 12 gigs of VRAM (4x 3GB chips across the same 128-bit bus) for $299. Would actually be a not terrible deal.
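
For anyone checking the math: GDDR chips each hang off a 32-bit slice of the bus, so a 128-bit bus hosts exactly four chips, and 4x 3GB modules gives 12GB. A minimal sketch of that arithmetic (the 28 Gbps per-pin rate is an assumed, illustrative figure, not a confirmed spec for any such card):

```python
# Rough GDDR capacity/bandwidth arithmetic for a hypothetical "5060 Super".
# Assumption: each GDDR chip sits on a 32-bit interface (standard for GDDR).

BITS_PER_CHIP = 32

def vram_config(bus_width_bits: int, gb_per_module: int, gbps_per_pin: float):
    """Return (chip count, capacity in GB, bandwidth in GB/s)."""
    chips = bus_width_bits // BITS_PER_CHIP
    capacity_gb = chips * gb_per_module
    # bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8
    return chips, capacity_gb, bandwidth_gbs

# 128-bit bus, 3GB GDDR7 modules, assumed 28 Gbps data rate
chips, cap, bw = vram_config(128, 3, 28.0)
print(f"{chips} chips -> {cap} GB, {bw:.0f} GB/s")  # 4 chips -> 12 GB, 448 GB/s
```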
 
Still not very attractive to me. Better to wait for the 6xxx series. I want a 3 micron process with the latest HDMI standard this time. HBM memory would be much better than those GDDR7 RAM pieces.
 
Still not very attractive to me. Better to wait for the 6xxx series. I want a 3 micron process with the latest HDMI standard this time. HBM memory would be much better than those GDDR7 RAM pieces.
FYI, it's nanometers, not microns.
 
Rip-off; I paid $3000 for that exact card this weekend at my local Micro Center. (Which is still a rip-off, but $700 less of one...)


That seems to be coming to an end. My local Micro Center has 67 5090s on the shelves at the moment.
The link was for Canada and the prices seemed pretty good for them.

Diablo IV doesn't actually need that much VRAM. My VRAM usage was at 22GB when I was playing on my 3090 and 14GB on my 4080, with the same maxed-out settings in both instances.

The game is like a greedy little treasure goblin; it will take as much as it can get away with. Hit it hard enough and it will explode, and you get your VRAM back! :)
I hear you there. I've read that in numerous situations, some games just reserve GBs and don't really need them. However, in my eyes, it runs smoother if everything is loaded into VRAM. If I got it, use it! I bought it for a reason. I wish I had 100GB of VRAM to load the whole damn game. :p
 
However, in my eyes, it runs smoother if everything is loaded into VRAM. If I got it, use it! I bought it for a reason. I wish I had 100GB of VRAM to load the whole damn game.
No, it doesn't work that way. Game data goes through the CPU and system RAM first, so if you have a RAM disk the size of the game and run it from there, it will be perfect.
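
Worth noting on the allocation-vs-need point above: monitoring tools report VRAM that games have *allocated*, not the working set they actively touch each frame, which is why the same game can "use" 22GB on a 3090 and run identically in 14GB on a 4080. If you want to watch the allocated number yourself, here is a minimal sketch using NVIDIA's NVML Python bindings (assumes the `nvidia-ml-py` package is installed; `pynvml` is its import name):

```python
# Minimal sketch: report total/used/free VRAM via NVML.
# "used" means memory *allocated* across all processes, not the
# actively-needed working set.
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    gib = 1024 ** 3
    print(f"total:            {info.total / gib:5.1f} GiB")
    print(f"used (allocated): {info.used / gib:5.1f} GiB")
    print(f"free:             {info.free / gib:5.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```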
 
5080 SUPER Gains 24 GB GDDR7
So it begins!

Let there be 3GB memory modules :peace:
 
RX 9070 XT <3, nothing else to add.

Gonna be interesting to see 5080 super perf and price, but nvidia will be kilometers away from reality for perf/€.
 
RX 9070 XT <3, nothing else to add.

Gonna be interesting to see 5080 super perf and price, but nvidia will be kilometers away from reality for perf/€.
The 5080 Super has the same awful chip and clockspeeds as the vanilla card...

I still think it's a false economy to buy a 16GB card in mid-2025. 12GB should be the minimum for all but the lowest range, 16GB should be midrange, and 24+ should be the high end, and let's face it, the 9070 is AMD's high-end card. Unfortunately, AMD seem to lack the expertise and knowledge of how to use GDDR7 and its 3GB modules, so 16GB is their limit.

16GB is fine for maybe the next year or so, but not much beyond that when you game at 4K, especially when the next-gen consoles hit in two years' time. A 24GB card has a chance to be good for at least 3-4 years - perfect for people who do not want to spend $2000+ every 18 months. This is why the 5080 failed and now needs a "super" refresh to be relevant in today's market.

Remember that AMD will charge $50 for 8GB, so it's not expensive, yet it provides the consumer real value. That extra 8GB for $50 will be the difference between a stuttering slideshow and a fluid gaming experience, especially in a year's time.
 
Knowing nGreedia, the 5080Super will be $2000 and made of unobtanium.
XD!

In all seriousness though, the price will stay the same.
Officially that is ;)

The 5080 Super has the same awful chip and clockspeeds as the vanilla card...
That's a bummer

12GB should be the minimum for all but the lowest range, 16GB should be midrange, and 24+ should be the high end
<3

Unfortunately, AMD seem to lack the expertise and knowledge of how to use GDDR7 and its 3GB modules, so 16GB is their limit.
UDNA says hold my beer ;)

16GB is fine for maybe the next year or so, but not much beyond that when you game at 4K, especially when the next-gen consoles hit in two years' time. A 24GB card has a chance to be good for at least 3-4 years - perfect for people who do not want to spend $2000+ every 18 months. This is why the 5080 failed and now needs a "super" refresh to be relevant in today's market.
This^

Remember that AMD will charge $50 for 8GB, so it's not expensive, yet it provides the consumer real value. That extra 8GB for $50 will be the difference between a stuttering slideshow and a fluid gaming experience, especially in a year's time.
And we are grateful for it (Saluting face emoji)
 
Very sad if true, Nvidia is relentlessly milking gamers for everything they have left.
You mean silly gamers parting with their money for e-peen??

Spending one's own money on non-essential items is one's own decision.
 
Imagine living in a world where you spend $1400-$2000+ on a GPU and think you got a good deal :laugh:
 
Just waiting for the leather jacket's reaction to all the backlash fixing to hit the fan....

I'll bring the popcorn and enjoy the show......

It's sad how one man can affect ......
 
The 4090 still says hello (16,384 CUDA cores, 1.01 TB/s memory bandwidth).
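
That 1.01 TB/s figure is straight bus arithmetic: a 384-bit bus of GDDR6X at the 4090's 21 Gbps per-pin rate, divided by 8 bits per byte. A one-line sanity check:

```python
# RTX 4090 memory bandwidth: 384-bit bus x 21 Gbps GDDR6X / 8 bits per byte
bus_bits, gbps_per_pin = 384, 21
print(bus_bits * gbps_per_pin / 8)  # 1008.0 GB/s, i.e. ~1.01 TB/s
```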
Gamers Nexus' N-company curated-reviews drama:

Illusions of Progress vs. the Reality of Technology Marketing

Not all online tech industry propagandists represent a credible approach. Generations of enthusiasts endlessly debate "upscaling" and "ray tracing," while glossing over critical issues:
- declining performance gains
- generational stagnation
- fake innovation
- fake pixels
- overhyped, artifact-ridden interpolated frames
- noisy, blurry, performance-crushing "ray tracing"
- halo products two tiers above as points of reference
- exponential stratification of the product line
- hallucinated benchmarks
- post-truth reviews
- upselling practices
- inflated prices
- buyer's remorse!
- engineered scarcity
- planned obsolescence
- e-waste
- artificially constrained hardware memory

And when substantive arguments run dry, the ultimate card is played: Hi-Fi FOMO—fear of missing out on "the highest quality," stoked by corporate marketing narratives.

The alleged superiority of artificially generated fake pixels, artifact-ridden interpolated frames, or noisy, blurry "ray tracing" (yes, N*, we’re looking at you) is not a breakthrough, but a product of algorithms designed for controlled marketing scenarios. These are not innovations—they’re illusions in glossy packaging, perfectly synchronized with shareholder goals that prioritize profit over real technological progress. In this spectacle, marketing fog replaces substance: instead of sustainable development or universal accessibility, we get another "must-have" that will land in the rubbish within two years.

Where are the real numbers? Where are the performance tests under long-term use, not on fresh-out-of-the-factory prototypes? Where are the comparisons of energy consumption, repair costs, and environmental impact? As long as the market believes keynote slides instead of demanding transparency, corporations will keep selling us fog in rainbow packaging.


Here's a synthesized critique of **NVIDIA's GPU stratification strategy**, drawing from industry analyses and user frustrations:

---

### 1. **Generational Stagnation in Mid-Tier GPUs**
NVIDIA's mid-tier GPUs (e.g., RTX 4060/4060 Ti) show minimal performance gains over previous generations, breaking the historical trend of "new xx60 = old xx70" performance parity. For example:
- The **RTX 4060** is **25% slower** than the RTX 3070 at 1440p, despite being marketed as a next-gen upgrade.
- The **RTX 4060 Ti** barely outperforms the RTX 3060 Ti while costing $400, a poor value proposition compared to discounted older models like the RTX 3070 Ti.
This stagnation contrasts sharply with higher-tier models (e.g., the RTX 4080's 15% gain over the RTX 3090), creating a **skewed stratification** that prioritizes premium segments.

---

### 2. **Exponential Pricing Gaps with Diminishing Returns**
NVIDIA’s product tiers exhibit **exponential price hikes** for marginal performance improvements:
- The RTX 5090 ($3,680) costs **3.6x more** than the RTX 5080 ($1,449) but offers only ~10% higher performance .
- Mid-tier cards like the RTX 5070 ($549) deliver performance comparable to older high-end models (e.g., RTX 4070 Ti Super), yet lack meaningful innovation to justify pricing .
Critics argue this stratification exploits enthusiast demand while neglecting cost-conscious gamers .
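
To put the diminishing-returns claim in concrete terms, here are this post's own figures run through a simple perf-per-dollar comparison (prices and the relative-performance number are the ones quoted above, so treat them as the post's assumptions rather than verified benchmarks):

```python
# Perf-per-dollar using the figures quoted in this post (assumptions,
# not verified benchmarks); the RTX 5080 is the 1.0x baseline.
cards = {
    "RTX 5080": {"price": 1449, "relative_perf": 1.00},
    "RTX 5090": {"price": 3680, "relative_perf": 1.35},  # ~35% faster at 4K
}

base = cards["RTX 5080"]["relative_perf"] / cards["RTX 5080"]["price"]
for name, c in cards.items():
    ppd = c["relative_perf"] / c["price"]
    print(f"{name}: {ppd * 1000:.2f} perf per $1000 ({ppd / base:.0%} of the 5080)")
```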

---

### 3. **Artificial Hardware Constraints**
NVIDIA deliberately limits mid-tier GPUs to upsell higher tiers:
- The **RTX 4060** uses a narrow 128-bit memory bus and only 8GB VRAM, crippling performance at 1440p and above .
- **RTX 50-series** GPUs feature drastic cuts in CUDA cores and memory bandwidth between tiers (e.g., RTX 5090: 21,760 cores vs. RTX 5060: 3,840 cores), creating artificial performance cliffs .
This “planned inadequacy” forces users to pay premiums for usable specs .
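
For a sense of scale on those cliffs, just divide the core counts named above (performance does not scale linearly with cores, so this is a rough proxy only; the 5080 count is the one from the article headline):

```python
# Rough tier-gap proxy: CUDA core counts as a share of the flagship.
# Cores are not a linear performance predictor; illustrative only.
cores = {"RTX 5090": 21760, "RTX 5080": 10752, "RTX 5060": 3840}
flagship = cores["RTX 5090"]
for name, n in cores.items():
    print(f"{name}: {n:,} cores ({n / flagship:.0%} of the flagship)")
```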

---

### 4. **Segmentation via Software Gimmicks**
NVIDIA increasingly relies on software features (e.g., DLSS, ray tracing) to mask hardware deficiencies in lower tiers:
- **DLSS 4** is marketed as a necessity for playable frame rates on mid-tier GPUs, despite introducing artifacts and blurry upscaling.
- Ray tracing performance on cards like the RTX 4060 is described as "noisy, blurry, and performance-crushing," yet heavily promoted to justify stratification.

---

### 5. **Market Manipulation Tactics**
- **Engineered scarcity**: NVIDIA limits supply of high-demand GPUs (e.g., RTX 5090) to maintain inflated prices, while mid-tier models like the RTX 5060 suffer from poor availability and volatile pricing.
- **Halo product anchoring**: Ultra-premium models (e.g., RTX 5090) set unrealistic performance benchmarks, making mid-tier GPUs appear "adequate" by comparison despite their shortcomings.

---

### 6. **Environmental and Consumer Impact**
Critics highlight how stratification fuels **planned obsolescence**:
- Mid-tier GPUs become obsolete faster due to constrained specs, accelerating e-waste.
- Buyers face **buyer's remorse** as NVIDIA's rapid generational turnover (e.g., RTX 40 → 50 series) devalues older models prematurely.

---

### Industry Context and Competition
NVIDIA's strategy is enabled by **weak competition**: AMD's inability to challenge high-end GPUs allows NVIDIA to prioritize margins over innovation. However, Intel's Arc GPUs (e.g., A750) now offer better value in mid-tier segments, pressuring NVIDIA to rethink its approach.

For deeper insights, refer to critiques of NVIDIA's RTX 40/50 series and historical performance analyses.

Result: a 2025 survey found 68% of GPU buyers regretted purchases within 6 months, citing mismatched expectations vs. marketing hype.
 
It's going to have a SUPER price tag as well...
 
It's going to have a SUPER price tag as well...
Well AMD charge $50 for 8GB of VRAM, so that's more like a $15 cost to them... But yeah, this is nGreedia... It will probably cost another $200 - and will finally be the 5080 that should have been released in the first place!
 
Why is this refresh so early? The 5080 was only just released.
 