
NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

ARF

Joined
Jan 28, 2020
Messages
4,103 (2.59/day)
Location
Ex-usa | slava the trolls
I see the argument that 600W is manageable and sure, it is manageable. I think the concern is more that competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation brings a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.

I think not, because many houses would then face a real fire hazard. You would need to build new city and building infrastructure to handle that.
The trend now is actually the opposite. You buy more energy-efficient TVs, refrigerators, washing machines... the average light bulb went from 100 watts down to 9 watts or so.

Conveniently, new AAA games generally aren't very good. As games, that is--they're fairly impressive in terms of graphical fidelity, but of course the rate of improvement there has slowed to a crawl over the last decade, too.
If we ever reach a point where the newest GPUs threaten to max out the power of the average domestic circuit, it won't be especially difficult to opt out.

The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphical realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
 
Joined
Jun 30, 2017
Messages
50 (0.02/day)
The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphical realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
The thing is that games (mostly game engines) nowadays just dump everything onto the GPU and let it figure out how to do things. Development is not optimized or tailored to a specific performance threshold... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells well enough... Nvidia makes money off this because they can "tailor" the drivers to specific games to achieve performance.

Does that give Nvidia the power to cripple a game if they want? ... oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there would be no need for that kind of GPU power draw in the first place.
 
Joined
May 16, 2023
Messages
45 (0.12/day)
The coolers are just oversized; I doubt they'll actually try to push 600W. Just look at the 4080's cooler: it's probably rated for over 500W, but in truth the 4080 uses about as much power as a 3070 Ti, which is insane for the amount of horsepower it provides. The cooler on my Gigabyte 4080 Eagle is so oversized that the fans don't even run half the time I'm in-game.
 
Joined
Sep 17, 2014
Messages
21,214 (5.98/day)
Location
The Washing Machine
Processor i7 8700K 4.6 GHz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
The thing is that games (mostly game engines) nowadays just dump everything onto the GPU and let it figure out how to do things. Development is not optimized or tailored to a specific performance threshold... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells well enough... Nvidia makes money off this because they can "tailor" the drivers to specific games to achieve performance.

Does that give Nvidia the power to cripple a game if they want? ... oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there would be no need for that kind of GPU power draw in the first place.
The games don't need that hardware at all to sell; that's the most ironic thing here. The biggest sales cannons are the most immersive worlds. The bigger scope of games is where it's at now: we have fast storage, we have strong CPUs... none of this requires a killer GPU.

And that IS really where games should seek to impress. It is the world you enter that matters; how it looks is secondary at best. RT and more real-time brute forcing are the only way to 'undo' the efficiency gains of raster technologies and keep selling fools the dream that graphics really do keep getting 'better', while the biggest advances are not RT but all sorts of other engine advancements.
 
Joined
Jan 14, 2019
Messages
10,145 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The thing is that games (mostly game engines) nowadays just dump everything onto the GPU and let it figure out how to do things. Development is not optimized or tailored to a specific performance threshold... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells well enough... Nvidia makes money off this because they can "tailor" the drivers to specific games to achieve performance.

Does that give Nvidia the power to cripple a game if they want? ... oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there would be no need for that kind of GPU power draw in the first place.
Games don't need that amount of performance at all. We just all want shiny ultra RT graphics at 4K running at 120 FPS minimum. Yet, here I am, playing Hogwarts Legacy on a 6500 XT on low graphics and having tons of fun. ;)
 
Joined
Dec 5, 2017
Messages
155 (0.07/day)
I think not, because many houses would then face a real fire hazard. You would need to build new city and building infrastructure to handle that.
The trend now is actually the opposite. You buy more energy-efficient TVs, refrigerators, washing machines... the average light bulb went from 100 watts down to 9 watts or so.



The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphical realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
At 2000W this is true to some extent. I can't speak for other parts of the world, but any wiring here can handle close to 1800W; I have a 1600W space heater...
And while I don't quite agree with the last part, I do like your spirit. I sure hope Crysis 4 comes with a worthy campaign and the same graphical ambition, and doesn't water anything down. The idea of TAA ruining the vegetation like it does in all modern games scares me, though.
Have you played the Crysis 3 remaster with RT, though? It's not a massive difference, but it sure is more visually immersive.
 
Joined
Apr 21, 2022
Messages
57 (0.07/day)
According to the NEC, which is the electrical code most of the USA follows, cord-and-plug-connected equipment is not allowed to exceed 80% of the circuit's rating.

15A 120V (1800W) circuits can't exceed 12A (1440W)
20A 120V (2400W) circuits can't exceed 16A (1920W)

An 80+ Titanium rated PSU is at worst 90% efficient at 100% load.

A 15A 120V circuit limits an 80+ Titanium PSU to 1296W of output (1440W * 0.9 = 1296W)
A 20A 120V circuit limits an 80+ Titanium PSU to 1728W of output (1920W * 0.9 = 1728W)

The maximum power a PSU can output drops as its efficiency drops. You also have to account for ignorance or neglect in the average home. Most people don't replace a loose outlet. Even some of my tight-fitting receptacles get uncomfortably hot when drawing 12A consistently from a space heater.

I guess what I'm trying to say is I don't see top-end GPUs getting much more power hungry. 600W is probably going to be the max they pull at stock, since they need to account for people using 2 GPUs. I don't mean 600W continuously, just 600W spikes.
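To make the arithmetic explicit, here's a minimal sketch of the calculation above in Python (the 80% continuous-load rule and the 90% efficiency figure come from the post; the function name and layout are just for illustration):

```python
# Sketch of the NEC continuous-load math from the post above.
# The 80% rule and 90% efficiency figure are from the post;
# the helper name is illustrative, not from any real code.

def max_psu_output_w(breaker_amps: float, volts: float = 120.0,
                     efficiency: float = 0.90) -> float:
    """Max continuous DC output a PSU can deliver on a given circuit."""
    circuit_limit_w = breaker_amps * volts       # e.g. 15 A * 120 V = 1800 W
    continuous_limit_w = circuit_limit_w * 0.80  # NEC 80% rule -> 1440 W
    return continuous_limit_w * efficiency       # wall power -> DC output

for amps in (15, 20):
    print(f"{amps} A / 120 V circuit: "
          f"{max_psu_output_w(amps):.0f} W max PSU output")
# 15 A / 120 V circuit: 1296 W max PSU output
# 20 A / 120 V circuit: 1728 W max PSU output
```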
 
Joined
Feb 18, 2005
Messages
5,352 (0.76/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
According to the NEC, which is the electrical code most of the USA follows, cord-and-plug-connected equipment is not allowed to exceed 80% of the circuit's rating.

15A 120V (1800W) circuits can't exceed 12A (1440W)
20A 120V (2400W) circuits can't exceed 16A (1920W)

An 80+ Titanium rated PSU is at worst 90% efficient at 100% load.

A 15A 120V circuit limits an 80+ Titanium PSU to 1296W of output (1440W * 0.9 = 1296W)
A 20A 120V circuit limits an 80+ Titanium PSU to 1728W of output (1920W * 0.9 = 1728W)

The maximum power a PSU can output drops as its efficiency drops. You also have to account for ignorance or neglect in the average home. Most people don't replace a loose outlet. Even some of my tight-fitting receptacles get uncomfortably hot when drawing 12A consistently from a space heater.

I guess what I'm trying to say is I don't see top-end GPUs getting much more power hungry. 600W is probably going to be the max they pull at stock, since they need to account for people using 2 GPUs. I don't mean 600W continuously, just 600W spikes.
This is a USA-only problem because your country is dumb and uses 120V instead of 240V, requiring double the amperage for the same wattage.
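For what it's worth, the halved-amperage claim is plain Ohm's-law arithmetic; a quick sketch, using a purely hypothetical 1000W total system draw:

```python
# Current drawn from the wall for the same wattage at 120 V vs 240 V.
# The 1000 W figure is a made-up example, not from any measured system.
def amps(watts: float, volts: float) -> float:
    return watts / volts  # I = P / V

system_draw_w = 1000.0
print(f"{amps(system_draw_w, 120):.1f} A at 120 V")  # 8.3 A
print(f"{amps(system_draw_w, 240):.1f} A at 240 V")  # 4.2 A -- half the current
```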
 
Joined
Apr 21, 2022
Messages
57 (0.07/day)
This is a USA-only problem because your country is dumb and uses 120V instead of 240V, requiring double the amperage for the same wattage.
It's North America that uses 120V, not just the USA. There are other countries outside North America, such as Brazil and Saudi Arabia, that use 120V. Then there's Japan at 100V. Besides, I doubt anyone wants a 2000W space heater next to their desk in the middle of summer.
 