Wednesday, October 9th 2024

NVIDIA Tunes GeForce RTX 5080 GDDR7 Memory to 32 Gbps, RTX 5070 Launches at CES

NVIDIA is gearing up for an exciting showcase at CES 2025, where its CEO, Jensen Huang, will take the stage and, hopefully, talk about future "Blackwell" products. According to Wccftech's sources, the anticipated GeForce RTX 5090, RTX 5080, and RTX 5070 graphics cards should arrive at CES 2025 in January. The flagship RTX 5090 is rumored to come equipped with 32 GB of GDDR7 memory running at 28 Gbps. Meanwhile, the RTX 5080 looks very interesting, with reports of its impressive 16 GB of GDDR7 memory running at 32 Gbps. This comes after earlier reports suggested that the RTX 5080 would feature 28 Gbps GDDR7 memory. However, the newest rumors suggest that we are in for a surprise, as the massive gap between the RTX 5090's and RTX 5080's compute core counts will be filled... with faster memory.
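For context, peak memory bandwidth is just the per-pin data rate multiplied by the bus width. Here is a minimal sketch of that arithmetic, assuming the widely rumored 512-bit bus for the RTX 5090 and 256-bit bus for the RTX 5080; neither bus width is confirmed:

```python
# Peak memory bandwidth in GB/s = data rate (Gbps per pin) * bus width (bits) / 8.
# The Blackwell bus widths below are rumors, not confirmed specifications.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(28, 512))  # RTX 5090 (rumored): 1792.0 GB/s
print(bandwidth_gb_s(32, 256))  # RTX 5080 (rumored): 1024.0 GB/s
print(bandwidth_gb_s(21, 384))  # RTX 4090 (actual):  1008.0 GB/s
```

If those bus widths hold, the 32 Gbps memory would put the RTX 5080 just past the RTX 4090's 1,008 GB/s, while the RTX 5090 would sit far ahead of both.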

The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory. This card aims to deliver solid performance for gamers who want high-quality graphics without breaking the bank, targeting the mid-range segment. We are very curious about the pricing of these models and how they will fit into the current market. As anticipation builds for CES 2025, we are eager to see how these innovations will impact gaming experiences and creative workflows in the coming year. Stay tuned for more updates as the event approaches!
Sources: Wccftech, via VideoCardz

112 Comments on NVIDIA Tunes GeForce RTX 5080 GDDR7 Memory to 32 Gbps, RTX 5070 Launches at CES

#1
Legacy-ZA
Gimped right out of the gate. Now I am just curious to see the clown show prices accompanying these crippled cards.
#2
Space Lynx
Astronaut
Legacy-ZA: Gimped right out of the gate. Now I am just curious to see the clown show prices accompanying these crippled cards.
I think die shrinking is going to produce fewer and fewer actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it, unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?).

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, imo. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
#3
Lifeless222
RTX 5080 looks very interesting with reports of its impressive 16 GB of GDDR7 memory running at 32 Gbps
Epic fail

5080 DOA lol

#4
evernessince
Legacy-ZA: Gimped right out of the gate. Now I am just curious to see the clown show prices accompanying these crippled cards.
I expect $1,100 - $1,500

Another "4080 12GB" fiasco if the specs are true.
AleksandarK: As anticipation builds for CES 2025, we are eager to see how these innovations will impact gaming experiences and creative workflows in the coming year
If prices just continue to increase for the same or similar level of performance, the vast majority of people are not going to be able to enjoy these "innovations".
Space Lynx: less a clown world and more that I think die shrinking is going to produce fewer and fewer actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it, unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?)
This is why chiplets need to be a thing in the GPU space.
Space Lynx: I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, imo. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
I very much doubt the limits of chiplet design will be exhausted by 2035. There are just so many options that dis-aggregation of a chip allows, particularly as interconnect and die-stacking technology evolves and enables more creative ways of doing so. There are a lot of innovations happening outside of that as well, including backside power delivery, gate-all-around transistors, stacking, interconnect innovation, and different substrate materials (like glass, for example). People have said we'd stop shrinking chips time and time again, but the industry keeps proving them wrong. This is a combination of human ingenuity and the massive amount of growth we continue to see.

Nvidia has a gross margin of 75%. Part of the price increase is due to increased costs, but most of it is simply because Nvidia can charge that much and get away with it. They are virtually uncontested outside of AI (88% market share beats even the Bell System's 85%, and that company was broken up as a monopoly), and in AI, competitors have to play catch-up to a company known to employ anti-competitive practices to lock customers in or coerce vendors (bend to their will, or they'll pull your allotment in typical Nvidia fashion).
#5
Legacy-ZA
Space Lynx: I think die shrinking is going to produce fewer and fewer actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it, unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?).

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, imo. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
No, the joke is that the 5070 has 6,400 shaders; my 3070 Ti has 6,144.

This thing will only have what, maybe 15% more performance than mine with the new memory? Then it will only have 12 GB. Also, Frame Generation and DLSS do not count as a rasterization performance increase; just stop already, just stop. This pandering to nVidia is just idiotic: if you don't buy the top card, you are essentially screwing yourself, as there is no good card to be had.

It's all so tiresome.
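The napkin math behind that complaint is worth spelling out. A rough sketch, assuming the rumored 6,400-shader figure plus a 192-bit bus and 28 Gbps GDDR7 for the RTX 5070 (none of which NVIDIA has confirmed) against the RTX 3070 Ti's actual specs:

```python
# Rumored RTX 5070 vs. actual RTX 3070 Ti, rough napkin math.
# The 5070's shader count, 192-bit bus, and 28 Gbps GDDR7 are all unconfirmed rumors.
shaders_5070, shaders_3070ti = 6400, 6144
bw_5070 = 28 * 192 / 8    # rumored GDDR7 config -> 672.0 GB/s
bw_3070ti = 19 * 256 / 8  # actual GDDR6X config -> 608.0 GB/s

print(f"Shader uplift:    {shaders_5070 / shaders_3070ti - 1:.1%}")  # ~4.2%
print(f"Bandwidth uplift: {bw_5070 / bw_3070ti - 1:.1%}")            # ~10.5%
```

On raw counts alone, anything beyond a roughly 10% gain would have to come from per-shader architectural and clock improvements.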
#6
Daven
I’m very curious to see if Nvidia skips the mid-range below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
#7
Space Lynx
Astronaut
Legacy-ZA: No, the joke is that the 5070 has 6,400 shaders; my 3070 Ti has 6,144.

This thing will only have what, maybe 15% more performance than mine with the new memory? Then it will only have 12 GB. Also, Frame Generation and DLSS do not count as a rasterization performance increase; just stop already, just stop. This pandering to nVidia is just idiotic: if you don't buy the top card, you are essentially screwing yourself, as there is no good card to be had.

It's all so tiresome.
Well, I am not pandering; my work laptop is an AMD 7840U, my desktop is a Ryzen X3D and a 7900 XT, and both my Steam Decks run AMD APUs.

I just like trying to think of different things, is all. I have enough of a backlog that I am set for ten years, so I don't really give a damn.
#8
Tigerfox
The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory.
No thanks. There has to be a 5070 Ti with 16 GB now; they wouldn't dare to go back from the 4070 Ti Super, or they will have to cancel it and rename it the 5060 Ti last minute, like the 4080 12GB. 12 GB was already not enough at 4K in some games when the 4070 Ti was released.
#9
SirB
What a joke. I have a 3060 12GB and am never going back to less than 12 GB, so no 5060 8GB for me. It will be overpriced anyway. 8 GB is not enough, IMO.
#10
Space Lynx
Astronaut
Yeah, 12 GB on a 5070 sounds pretty dumb, I have to admit.
#11
Marcus L
AleksandarK: The more budget-friendly RTX 5070 is also set for a CES debut, featuring 12 GB of memory. This card aims to deliver solid performance for gamers who want high-quality graphics without breaking the bank, targeting the mid-range segment.
$700 is the new "budget" mid-range. Seriously, I wish NV would FO; I can't wait until the AI shit bubble bursts, as it's only going to keep getting worse for the average gaming consumer. Honestly, it's the one company I would be happy to see go bust; their arrogance knows no bounds.
#12
FreedomEclipse
~Technological Technocrat~
Space Lynx: I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT and then tell Alexa to get to work.
#13
Hxx
Willing to bet money that the 5070 will be $1K or more, the 5080 $1.5K or more, and the 5090 $2K. I just don’t see these cards being any cheaper, especially with AIB variants.
#14
Beginner Macro Device
AleksandarK: RTX 5080 looks very interesting with reports of its impressive 16 GB of GDDR7 memory running at 32 Gbps.
The RTX 5050 Ti (better known as the RTX 5080) appears to be a questionable SKU, and the 16 GB VRAM buffer, probably impressive five years ago, only contributes to a relatively acceptable 1440p experience. GDDR7 at 32 Gbps helps this thing achieve a bare minimum of comfort in the games of yesterday, thanks to a puny 256-bit bus.

Fixed. No need to thank me.
#15
AnotherReader
Daven: I’m very curious to see if Nvidia skips the mid-range below the 5070.

As for performance,

5070 = 4080
5080 = 4090
5090 = 30% higher than 4090
The gap between the 4070 and 4080 is much greater than in previous generations. I would be very surprised if the 5070 actually manages to match the 4080.
#16
persondb
The 5080 having only 16 GB of VRAM and the 5070 having 12 GB is criminal.

Nvidia needs to stop gimping their GPUs with pitiful amounts of VRAM. They obviously love to do that because they do not want people using their consumer cards instead of enterprise cards to train AI, but it's a complete joke at this point.
#17
Onasi
So, what are we thinking, ladies and gents? 650 bucks for a 12 gig 5070 I reckon? Really feel that mid-range pricing for those not wanting to break the bank, not gonna lie.
I mean, who knows, maybe NV goes full Pascal and surprises us with good value. I can huff some sweet copium, right?
#18
Beginner Macro Device
Onasi: maybe NV goes full Pascal and surprises us with good value
For that to happen, Intel must stop smoking their E-cores and come up with GPUs that make absolute fun of NVIDIA's, let's say, "offerings."

This is at least possible, unlike AMD doing something right.
#19
Dristun
5090, 5060 Ti, and 5050 looking mighty good.
#20
Onasi
@Beginner Macro Device
As long as we are making risky bets, I am throwing all my chips on the table for Moore Threads becoming a true-bred NV competitor after being bought out by… I dunno, Xiaomi?
In other words - I too like science fiction and cocaine.
#22
Space Lynx
Astronaut
FreedomEclipse: Jensen is probably going to leave it to AI to design a GPU.

They are going to feed a load of blueprints into an off-grid, custom-built ChatGPT and then tell Alexa to get to work.
100%. Jensen already said at a big tech event this past summer that an on-site, off-grid AI helped them develop Blackwell and Hopper, and it gets better and better over time. Combine that with those factories in China that are all robots and only like 5% human workers, and in some ways we are already there. I don't think a single human touches the GPU in ASUS factories, for example; I remember reading that a long time ago.
#23
Vayra86
Space Lynx: I think die shrinking is going to produce fewer and fewer actual gains, so they have to milk each release as much as possible. Personally, I think there is a limit to physics, and we are approaching it, unless we move to a new non-silicon-based advanced computing chip (not sure if IBM is still working on a light-based carbon nanotube chip or not?).

I could be wrong and it's just Nvidia being an asshole; probably a little of both, though, imo. I mean, what happens in 2035 when we can't get any more die shrinks and chiplet designs have been maxed out? What product are you going to sell? (Assuming AI wasn't a thing.)
Oh dude, you say this in the face of a 1.5x shader deficit between the 5080 and the 5090? There's just no excuse. Even Nvidia is saying they can make larger GPUs just fine. They just don't want to, because then they can command the maximum price not only for the x90 but for the rest of the stack too.

Also, you might want to double-check the average margin on Nvidia products.

All we see here is, indeed, a clown world of marketing and supposed 'leaks' that serve only and exclusively for Nvidia to test the waters with the consumer base. Not to give us 'the best product', but to see how low they can go before the outrage bleeds into their sales numbers (remember the '4080 12GB'?). And then we have gullible idiots saying life is hard for them. With all due respect... what the fuck? You're seeing a monopolist act like one and then finding excuses for them. With customers and thought processes like that, we're all doomed indeed. You just answered, in a nutshell, why Nvidia has gotten away with doing this for decades.
#24
Space Lynx
Astronaut
Vayra86: Oh dude, you say this in the face of a 1.5x shader deficit between the 5080 and the 5090? There's just no excuse. Even Nvidia is saying they can make larger GPUs just fine. They just don't want to, because then they can command the maximum price not only for the x90 but for the rest of the stack too.

Also, you might want to double-check the average margin on Nvidia products.
Won't the cores themselves be more powerful, though? Jensen did say in-house AI helped to develop Blackwell. I think the 5000-series cards across the line are probably going to impress people, honestly; we will see early next year anyway. Am I buying one? No, I am content with my backlog of games and my 7900 XT. It will be interesting to see all this play out over the next couple of years, though, as AI helps the designs.
#25
Zazigalka
Legacy-ZA: No, the joke is that the 5070 has 6,400 shaders; my 3070 Ti has 6,144.

This thing will only have what, maybe 15% more performance than mine with the new memory?
The 4070 has 5,888, and it's already 15% faster than the 3070 Ti without GDDR7 or a clock speed increase.
BTW, that 3070 Ti is already dead at anything higher than 1080p, and 12 GB will soon go the same way.
If the 5070 has 12 GB, I'm not touching it. If these leaks are true, it smells of another mid-cycle refresh, similar to the 20/40 Super series.
Space Lynx: Won't the cores themselves be more powerful, though? Jensen did say in-house AI helped to develop Blackwell.
Wasn't that mentioned in the context of utilizing die space to its fullest? From what I understood, this method was just a way of coping with the diminishing returns of node shrinks.
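Taking the ~15% figure quoted above at face value, the 4070 example can be turned into a rough estimate of Ada's per-shader gain over Ampere; a minimal sketch (the 15% is the comment's figure, not a measurement here):

```python
# RTX 4070 (5,888 shaders) vs. RTX 3070 Ti (6,144 shaders).
# perf_ratio is the ~15% gap quoted in the comment above, not a measured value.
shaders_4070, shaders_3070ti = 5888, 6144
perf_ratio = 1.15
per_shader_gain = perf_ratio / (shaders_4070 / shaders_3070ti)
print(f"~{per_shader_gain - 1:.0%} more performance per shader")  # ~20%
```

In other words, Ada delivered on the order of 20% more performance per shader than Ampere in this matchup, largely on the back of much higher clock speeds.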