
NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

Given the fact that we no longer have massive node shrinks to give us massive generational perf/W improvements, coupled with the fact that GPUs are becoming ever wider, sadly these power numbers are only going to increase until or unless we switch to the replacement for silicon.
 
If they want to try to balance compactness, maybe this time the xx80 cards will be dual slot instead of being as big as a tank.
 
My god, the pearl clutching. "muh power, muh TDP!" Funny, I remember the SAME type of people whining and crying about Fermi's power use. "300w is insane, a GPU shouldn't pull more than 150w". Now it's switched to "Waah 600w is too much, 350w is the limit".

Just....buy a smaller GPU and stop crying? Just go get a 6750xt or 4060ti and revel in the power efficiency!
If they want to try to balance compactness, maybe this time the xx80 cards will be dual slot instead of being as big as a tank.
So the high end should be artificially gimped so you can have a dual-slot cooler? Because the only way they are doing this is to dramatically under-clock the thing out of the box or cut down the GPU die size, at which point you have a xx7x card with a xx8x name and price, which I seem to remember people didn't like with the 4000 series.

Too bad they cannot make a 250 watt version do the same output as the 600 watt version; now that would be an actual milestone. The original roadmap they had back in the 1000 series days said they would look into increasing performance while reducing power. Then they threw that into the garbage.
The 4000 series is LIGHTYEARS ahead of the 1000 series in perf/W. Just because they didn't artificially limit their GPUs to 1080 Ti performance doesn't mean there haven't been major efficiency improvements.

Yeah but you tend to not run them for hours.
So if you dont want the heat, buy a lower TDP card then? The point is, its not a risk, nor is it going to electrocute your cat or burn down your house.
 
As long as the 600W GPUs can communicate with my furnace to balance heating duties for 8 months of the year, I'll be happy :p. I'll just sit the exhaust near one of the air returns as a pre-heater :p.
 
That "20% over Ada" figure has largely been conjecture from rumors and from extrapolating as little architectural improvement as possible because "it's still on TSMC N4".

The RTX 4080 is already 2x faster than the 6750 XT; it's just three times as pricey. Generational uplift should bring cheaper cards at this performance level at the bare minimum.

The 3000 > 4000 series transition didn't move performance per dollar at all until the 4070s. Depending on where the AMD 8000 series ends up on performance segments, I expect Nvidia GPUs to be the same or worse value next gen relative to price. We'll likely end up with a 5060 card at 4070 performance for $500-550 or more.
 
I would like chip designers to continue putting the OC headroom in the boost clock so I don't have to spend hours diddling with various tools to do what GPU engineers can do better than me. They have advanced degrees in electrical engineering, mathematics, physics, etc., plus years of experience as paid professionals, not dilettantes. I don't. And I already paid them when I bought my card.
Meanwhile, I spent hours troubleshooting stuttering, all due to the boost algorithm thinking it's a good idea to drop the clock mid game.
 
The 3000 > 4000 series transition didn't move performance per dollar at all until the 4070s. Depending on where the AMD 8000 series ends up on performance segments, I expect Nvidia GPUs to be the same or worse value next gen relative to price. We'll likely end up with a 5060 card at 4070 performance for $500-550 or more.

Yes, Ada has a pricing problem compared to Ampere, at least at official prices and for SKUs other than the 4090. But I don't think the 5060 will be $550, no.
 
I see the argument that 600W is manageable and sure, it is manageable. I think the concern is more that the competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation brings a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.
 
My PSU came with a 600w plug, I am ok :laugh:
 
"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
 
"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
It could mean that we'll need more info from Nvidia about this, or it could mean that Nvidia no longer has any interest in entry-level to lower-midrange GPUs because the profit margins are too low for them, and they will surrender that segment of the market to AMD and Intel. We'll have to wait and see.
 
Ignoring games: if the 5090 delivers the same performance jump as the switch from the 1080 Ti to the 3090, then I see no problem. Compute-wise, that was 350 W in one card instead of 4 or 5 x 250 W 1080 Tis to do the same kind of rendering job. Absolute no-brainer. I'm only bloody terrified of those connectors and all the calamities that follow new power plugs. For that very reason I completely skipped the 4000 series and am waiting for the new stuff to come out with a new PSU with all the bells and whistles + a 5090. Looking back at the previous switch, a 4-5x uplift for 250 W more is something I can live with.
 
"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
Why would they be gone? You just get a tinier slice of silicon.
 
Why would they be gone? You just get a tinier slice of silicon.
Power requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
 
Power requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.

Will they remain on the old TSMC 4N process, or will they prefer to wait for something better - TSMC N3 or N2?
At least they still have quite a significant FPS lead over the old Radeon RX 7900 XTX, so why even bother releasing useless products?

If efficiency (work done per watt) keeps increasing, as it did significantly from RTX 30xx to RTX 40xx, the power limits don't bother me.

but... 600w for a GPU is a bit scary... it should have an "Electric Hazard" sticker on it...

It's not scary at all.

It won't be scary if it has a functional power connector.

I see pictures of heavy cards bent; sagging is quite an issue.
Broken, cracked PCBs with molten connectors will become the norm.
I choose to avoid that, though :D

Solution - water cooling.
 
At least they still have quite a significant FPS lead over the old Radeon RX 7900 XTX, so why even bother releasing useless products?
Because few people care about $1000+ graphics cards. What you see here in the forum is not representative of the general public.
 
but... 600w for a GPU is a bit scary... it should have an "Electric Hazard" sticker on it...
They're not going to release a 600+ watt graphics card for regular consumers like us. I don't even think something like that could be properly cooled without resorting to some kind of large exotic water cooling solution, a large custom case to go with it, and the expected AC cost of keeping the ambient room temp down to reasonable levels.
Also, something like that for rendering would suck because you would only feasibly be able to have one of those things in the same case without going all-out on a full custom system, cooling, wall outlets, etc., plus a beefy 2000-watt PSU and a decent UPS with surge protection. That's easily around $10k+ right there if you DIY.
Like I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.
 
Like I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.

Only if they can extract higher transistor density and significantly improve the FPS count over the current offerings.
Otherwise, there will be little to no incentive for anyone to throw another large sum of money at it. For 10-15-20% more FPS it won't be worth it.
 
Power requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
Erm... the 4060 pulls 110w under load. The sub-150w cards haven't gone anywhere. :slap:
Because few people care about $1000+ graphics cards. What you see here in the forum is not representative of the general public.
Enough people care that, so far as Steam is concerned, more people buy 4090s than the entirety of the $1000 GPUs AMD and Nvidia have seen fit to release since the late 2000s.

But you know, not many people buy this type of card. Few bought the 8800 GTX Ultra or the GTX 590 or the 1080 Ti; those are just impossible to find today because nobody bought them, right?
They're not going to release a 600+ watt graphics card for regular consumers like us. I don't even think something like that could be properly cooled without resorting to some kind of large exotic water cooling solution, a large custom case to go with it, and the expected AC cost of keeping the ambient room temp down to reasonable levels.
Also, something like that for rendering would suck because you would only feasibly be able to have one of those things in the same case without going all-out on a full custom system, cooling, wall outlets, etc., plus a beefy 2000-watt PSU and a decent UPS with surge protection. That's easily around $10k+ right there if you DIY.
Like I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.
Ahem... the 3090 Ti exists, and it peaks at over 600 watts. Its sustained draw at stock is over 500 watts.
 
"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(

The 4060, compared to the 1060 6GB:

- uses less power, 110 W vs. 120 W
- delivers more 1080p fps at time of review (96 vs. 83) in W1zz's reviews
- costs less after inflation, $300 vs. $323
- delivers 2.4x the fps at 1080p in today's games (a reasonable 3-gen uplift, ~34%/gen)

I bought the 1060 6GB as my first PC gaming GPU and the 4060 is a well-matched successor. I'll bet the 5060 will be a little higher power but will still deliver more fps/W.
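A quick back-of-the-envelope check on that ~34%/gen figure (my own arithmetic, just compounding the measured 2.4x uplift over three generations, not taken from the reviews):

```python
# Sanity check: does ~34% per generation compound to the quoted 2.4x
# total uplift over three generations (1060 -> 2060 -> 3060 -> 4060)?
total_uplift = 2.4
generations = 3

per_gen = total_uplift ** (1 / generations)
print(f"Per-generation uplift: {per_gen:.3f}x (~{(per_gen - 1) * 100:.0f}%)")  # ~1.339x, ~34%
print(f"Compounded over {generations} gens: {per_gen ** generations:.2f}x")    # ~2.40x
```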
 
Erm... the 4060 pulls 110w under load. The sub-150w cards haven't gone anywhere. :slap:
Good point.

Enough people care that, so far as Steam is concerned, more people buy 4090s than the entirety of the $1000 GPUs AMD and Nvidia have seen fit to release since the late 2000s.

But you know, not many people buy this type of card. Few bought the 8800 GTX Ultra or the GTX 590 or the 1080 Ti; those are just impossible to find today because nobody bought them, right?
Nvidia halo cards are a different matter altogether. They kind of exist within their own market space. People will buy them regardless of their price.
 
I went almost 6 years with my 1080 Ti in my main rig, and my 4090 looks to do the same. I went through more CPUs: 1055, 8370, 2700X, and now 5950X.
 
Power requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
Ampere increased them even in the midrange, but Ada didn't, really. The 4060 gets by on 130W. The 1060 does the same.

I think what we are seeing is that the range of performance between the x60 and the x80 Ti/x90 has increased quite a bit. The TDP is stretched upward mostly at the higher end.

So, you really do get a tinier slice of silicon. The x60 is less card today than it was, relatively, 3-6 generations back.

I see the argument that 600W is manageable and sure, it is manageable. I think the concern is more that the competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation brings a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.
Power grids have bigger issues, like EVs and solar. It's precisely NOT usage but power input that destroys network stability.
 
Conveniently, new AAA games generally aren't very good. As games, that is--they're fairly impressive in terms of graphical fidelity, but of course the rate of improvement there has slowed to a crawl over the last decade, too.

If we ever reach a point where the newest GPUs threaten to max out the power of the average domestic circuit, it won't be especially difficult to opt out.
 