
NVIDIA Tunes GeForce RTX 5080 GDDR7 Memory to 32 Gbps, RTX 5070 Launches at CES

AleksandarK

News Editor
NVIDIA is gearing up for an exciting showcase at CES 2025, where its CEO, Jensen Huang, will take the stage and, hopefully, talk about future "Blackwell" products. According to Wccftech's sources, the anticipated GeForce RTX 5090, RTX 5080, and RTX 5070 graphics cards should arrive at CES 2025 in January. The flagship RTX 5090 is rumored to come equipped with 32 GB of GDDR7 memory running at 28 Gbps. Meanwhile, the RTX 5080 looks very interesting with reports of its impressive 16 GB of GDDR7 memory running at 32 Gbps. This comes after earlier reports suggested the RTX 5080 would feature 28 Gbps GDDR7 memory. The newest rumors, however, suggest we are in for a surprise: the massive gap in compute cores between the RTX 5090 and RTX 5080 will be filled... with faster memory.

The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory. This card aims to deliver solid performance for gamers who want high-quality graphics without breaking the bank, targeting the mid-range segment. We are very curious about the pricing of these models and how they will fit into the current market. As anticipation builds for CES 2025, we are eager to see how these innovations will impact gaming experiences and creative workflows in the coming year. Stay tuned for more updates as the event approaches!
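As a back-of-the-envelope check on what these data rates mean, peak memory bandwidth is simply the per-pin data rate multiplied by the bus width. A minimal sketch, assuming the commonly rumored bus widths (512-bit for the RTX 5090, 256-bit for the RTX 5080, 192-bit for the RTX 5070) and a 28 Gbps rate on the RTX 5070; none of these assumptions are confirmed in the report:

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate (Gbps) x bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

# Figures below combine the article's rumored data rates with ASSUMED bus widths.
print(peak_bandwidth_gb_s(28, 512))  # RTX 5090: 1792.0 GB/s
print(peak_bandwidth_gb_s(32, 256))  # RTX 5080: 1024.0 GB/s
print(peak_bandwidth_gb_s(28, 192))  # RTX 5070: 672.0 GB/s
```

If the 256-bit figure holds, the jump from 28 to 32 Gbps alone would lift the RTX 5080 from 896 to 1024 GB/s.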



View at TechPowerUp Main Site | Source
 
Gimped right out the gate. Now I am just curious to see the clown show prices accompanied by these crippled cards.
 
Gimped right out the gate. Now I am just curious to see the clown show prices accompanied by these crippled cards.

I think die shrinking is going to produce less and less actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it. And unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?)

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, IMO. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
 
RTX 5080 looks very interesting with reports of its impressive 16 GB of GDDR7 memory running at 32 Gbps
Epic fail

5080 DOA lol

 
Gimped right out the gate. Now I am just curious to see the clown show prices accompanied by these crippled cards.

I expect $1,100 - $1,500

Another "4080 12GB" fiasco if the specs are true.

As anticipation builds for CES 2025, we are eager to see how these innovations will impact gaming experiences and creative workflows in the coming year

If prices just continue to increase for the same or similar level of performance, the vast majority of people are not going to be able to enjoy these "innovations".

less a clown world and more that I think die shrinking is going to produce less and less actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it. And unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?)

This is why chiplets need to be a thing in the GPU space.

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, IMO. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)

I very much doubt the limits of chiplet design will be exhausted by 2035. There are just so many options that dis-aggregation of a chip allows, particularly as interconnect and die-stacking technology evolves and enables more creative ways of doing so. There are a lot of innovations happening outside of that as well, including backside power delivery, gate-all-around transistors, stacking, interconnect innovation, and different substrate materials (like glass, for example), etc. People have been saying we'd stop shrinking chips time and time again, but the industry keeps proving them wrong. This is a combination of human ingenuity and the massive amount of growth we continue to see.

Nvidia has a gross margin of 75%. Part of the price increase is due to rising costs, but most of it is simply because Nvidia can charge that much and get away with it. They are virtually uncontested outside of AI (88% market share beats even Bell System at 85%, which was broken up as a monopoly), and in AI, competitors have to play catch-up to a company known to employ anti-competitive practices to lock customers in or coerce vendors (or else they'll pull your allotment, in typical Nvidia fashion, if you don't bend to their will).
 
I think die shrinking is going to produce less and less actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it. And unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?)

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, IMO. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)

No, the joke is that the 5070 has 6,400 shaders; my 3070 Ti has 6,144.

This thing will only have what, maybe 15% more performance than mine with the new memory? And then it will only have 12 GB. Also, Frame Generation and DLSS do not count as a rasterization performance increase; just stop already, just stop. This pandering to NVIDIA is just idiotic: if you don't buy the top card, you are essentially screwing yourself. There is no good card to be had.

It's all so tiresome.
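For what it's worth, the raw shader-count delta quoted above is easy to check; a quick sketch using the figures from the post (the 5070 count is a rumor, not an official spec):

```python
# Relative shader-count increase, using the numbers quoted in the post above.
# The RTX 5070 figure is rumored, not official.
rtx_5070_shaders = 6400
rtx_3070_ti_shaders = 6144
increase = (rtx_5070_shaders - rtx_3070_ti_shaders) / rtx_3070_ti_shaders
print(f"{increase:.1%}")  # about 4.2% more shader units on paper
```

So on shader count alone the uplift is tiny; any real generational gain would have to come from per-core changes, clocks, and the faster memory.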
 
I’m very curious to see if Nvidia skips the mid range below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
 
No, the joke is that the 5070 has 6,400 shaders; my 3070 Ti has 6,144.

This thing will only have what, maybe 15% more performance than mine with the new memory? And then it will only have 12 GB. Also, Frame Generation and DLSS do not count as a rasterization performance increase; just stop already, just stop. This pandering to NVIDIA is just idiotic: if you don't buy the top card, you are essentially screwing yourself. There is no good card to be had.

It's all so tiresome.

Well, I am not pandering; my work laptop is an AMD 7840U, my desktop is a Ryzen X3D with a 7900 XT, and both my Steam Decks run AMD APUs.

I just like trying to think of different things, is all. I have enough of a backlog that I am set for ten years, so I don't give a damn really. I'm set.
 
The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory.

No thanks. There has to be a 5070 Ti with 16 GB now; they wouldn't dare go back from the 4070 Ti Super, or they'll have to cancel it and rename it the 5060 Ti at the last minute, like the 4080 12GB. 12 GB was already not enough at 4K in some games when the 4070 Ti was released.
 
Yeah, 12 GB on a 5070 sounds pretty dumb, I have to admit.
 
The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory. This card aims to deliver solid performance for gamers who want high-quality graphics without breaking the bank, targeting the mid-range segment.
$700 is the new "budget" mid-range. Seriously, I wish NV would FO. Can't wait until the AI bubble bursts, as it's only going to keep getting worse for the average gaming consumer. Honestly, the one company I would be happy to see go bust; their arrogance knows no bounds.
 
I mean what happens in 2035 when we can't get anymore die shrinks and chiplet designs have been maxed out, what product you going to sell? (assuming AI wasn't a thing)

Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT, then tell Alexa to get to work.
 
Willing to bet money that the 5070 will be $1K or more, the 5080 $1.5K or more, and the 5090 $2K. I just don't see these cards being any cheaper, especially with AIB variants.
 
RTX 5080 looks very interesting with reports of its impressive 16 GB of GDDR7 memory running at 32 Gbps.
RTX 5050 Ti (better known as RTX 5080) appears to be an arguable SKU and the 16 GB VRAM buffer, probably impressive as of five years ago, only contributes to relatively acceptable 1440p experience. GDDR7 at 32 Gbps helps this thing to achieve a bare minimum of comfort in the games of yesterday thanks to a puny 256-bit bus.

Fixed. No need to thank me.
 
I’m very curious to see if Nvidia skips the mid range below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
The gap between the 4070 and 4080 is much greater than in previous generations. I would be very surprised if the 5070 actually manages to match the 4080.
 
The 5080 having only 16 GB of VRAM and the 5070 having 12 GB is criminal.

Nvidia needs to stop gimping their GPUs with pitiable amounts of VRAM. They obviously love doing it because they don't want people using their consumer cards instead of enterprise cards to train AI, but it's a complete joke at this point.
 
So, what are we thinking, ladies and gents? 650 bucks for a 12 gig 5070 I reckon? Really feel that mid-range pricing for those not wanting to break the bank, not gonna lie.
I mean, who knows, maybe NV goes full Pascal and surprises us with good value. I can huff some sweet copium, right?
 
maybe NV goes full Pascal and surprises us with good value
For that to happen, Intel must stop smoking their E-cores and come up with GPUs that make an absolute mockery of NVIDIA's, let's say, "offerings."

This is at least possible, unlike AMD doing something right.
 
5090, 5060ti and 5050 looking mighty good.
 
@Beginner Macro Device
As long as we are making risky bets, I am throwing all my chips on the table for Moore Threads becoming a true-bred NV competitor after being bought out by… I dunno, Xiaomi?
In other words - I too like science fiction and cocaine.
 
Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT, then tell Alexa to get to work.

100%. Jensen already said at a big tech event this past summer that an on-site, off-grid AI helped them develop Blackwell and Hopper, and that it gets better and better over time. Combine that with those factories in China that are almost all robots, with only about 5% human workers, and in some ways we are already there. I don't think a single human touches the GPU in ASUS factories, for example; I remember reading that a long time ago. So yeah, we are already there.
 
I think die shrinking is going to produce less and less actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it. And unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?)

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, IMO. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
Oh dude, you say this in the face of a 1.5x shader deficit between the 5080 and the 5090? There's just no excuse. Even Nvidia is saying they can make larger GPUs just fine. They just don't want to, because then they can command the maximum price for not only the x90 but the rest of the stack too.

Also you might want to double check the avg margin on Nvidia products.

All we see here is, indeed, a clown world of marketing and supposed 'leaks' that really serve only and exclusively for Nvidia to test the waters with the consumer base. Not to give us 'the best product,' but to see how low they can go before the outrage bleeds into their sales numbers (remember the '4080 12GB'?). And then we have gullible people saying life is hard for them. With all due respect... what the fuck? You're watching a monopolist act like one and then finding excuses for them. With customers and thought processes like that, we're all doomed indeed. You just answered, in a nutshell, why Nvidia has been able to get away with this for decades.
 
Oh dude, you say this in the face of a 1.5x shader deficit between the 5080 and the 5090? There's just no excuse. Even Nvidia is saying they can make larger GPUs just fine. They just don't want to, because then they can command the maximum price for not only the x90 but the rest of the stack too.

Also you might want to double check the avg margin on Nvidia products.

Won't the cores themselves be more powerful, though? Jensen did say in-house AI helped to develop Blackwell. I think the 5000 series cards across the line are probably going to impress people, honestly; we will see early next year anyway. Am I buying one? No, I am content with my backlog of games and my 7900 XT. It will be interesting to see all this play out over the next couple of years, though, as AI helps the designs.
 