Wednesday, September 21st 2022

NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management

NVIDIA detailed the design of its GeForce RTX 4090 Founders Edition graphics card in a briefing with us today.
While the new Founders Edition looks very similar to the RTX 3090 Ti Founders Edition, NVIDIA says it has made several changes to its design. The metal outer frame now comes with a more pronounced gunmetal tinge. The heatsink array underneath has been redesigned to improve airflow between the two fans.

The fan itself has been updated, too. NVIDIA says it tested as many as 50 new fan designs before settling on this one, which offers up to 20% higher airflow than the fan in the RTX 3090 Ti. Besides the heatsink and fans, the company also redesigned the vapor-chamber plate that pulls heat from the GPU and surrounding memory chips, as well as the base plate that draws heat from the VRM components. The heatpipe layout has also been improved for better heat distribution.
The card uses the same compact PCB design as the GeForce 30 Series, which allows airflow to pass through the card. The RTX 4090 draws power from a single 12+4-pin PCIe Gen 5 ATX 12VHPWR connector that's capable of pulling up to 600 W over a single cable. NVIDIA made it clear that existing power supplies will work fine through the use of adapter cables; an ATX 3.0 / PCI-Express 5.0 power supply is not required for Ada Lovelace GPUs.

While the RTX 4090 operates at 450 W by default, the power delivery capability allows you to increase the power limit up to 600 W for overclocking. The card features a 23-phase VRM (20 phases for the GPU + 3 for the memory). NVIDIA claims it has re-architected its VRM to offer superior transient voltage management, specifically to minimize the kind of spikes or excursions we've seen with previous-gen high-end graphics cards such as the RTX 3090 Ti and the Radeon RX 6900 Series. These spikes often caused spontaneous shutdowns on some power supplies, even ones with a much higher wattage rating than required.
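To put the phase counts in perspective, a back-of-the-envelope estimate shows what each GPU phase must deliver at the two power limits. The core voltage and conversion efficiency below are illustrative ballpark assumptions, not NVIDIA-provided figures:

```python
# Rough VRM load estimate. The ~1.0 V core voltage and 90% conversion
# efficiency are assumptions typical of modern GPUs, not NVIDIA specs.

def per_phase_current(board_power_w, core_voltage_v=1.0,
                      efficiency=0.9, phases=20):
    """Average current each GPU VRM phase supplies, in amps."""
    core_power = board_power_w * efficiency      # power reaching the core rail
    total_current = core_power / core_voltage_v  # I = P / V
    return total_current / phases

print(per_phase_current(450))  # at the 450 W default limit
print(per_phase_current(600))  # at the 600 W overclocking ceiling
```

Under these assumptions each of the 20 GPU phases handles on the order of 20 A at stock and closer to 27 A at the raised limit, which is why high-end cards spread the load over so many phases.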
In the chart above, NVIDIA shows how current spikes are mitigated by its new VRM design, which uses a PID controller feedback loop. While the RTX 3090 bounces up and down, the RTX 4090 stays relatively stable and simply follows the general trend of the current loading pattern, thanks to a 10x improvement in power-management response time. It's also worth pointing out that the peak current spike on the RTX 3090 is higher than on the RTX 4090, even though the RTX 3090 is rated 100 W lower (350 W vs 450 W).
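The general idea of PID-based smoothing can be sketched in a few lines. This is a toy discrete-time model, not NVIDIA's controller: the gains and the load step are made up for illustration, and the point is only that a feedback loop ramps the supplied current toward the demand instead of letting it jump instantly:

```python
# Toy sketch of a PID feedback loop damping a load transient.
# Gains (kp, ki, kd) and the demand waveform are illustrative only.

def simulate(demand, kp=0.4, ki=0.1, kd=0.05):
    """Discrete PID loop tracking a current-demand waveform (amps)."""
    out, integral, prev_err, current = [], 0.0, 0.0, 0.0
    for target in demand:
        err = target - current          # how far we are from demand
        integral += err                 # accumulated error (I term)
        derivative = err - prev_err     # rate of change (D term)
        current += kp * err + ki * integral + kd * derivative
        prev_err = err
        out.append(current)
    return out

# A sudden 0 A -> 40 A load step: the controlled current approaches
# the target over several steps rather than jumping instantaneously.
step = [0.0] * 5 + [40.0] * 20
response = simulate(step)
```

With these (arbitrary) gains the response reaches the first controlled step at roughly half the demand and settles near 40 A after some ringing; a real regulator would be tuned far more aggressively, which is where the claimed 10x faster response time comes in.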

This should come as big news for those with older PSUs who are counting on the in-box adapter bundled with every Founders Edition, which converts three 8-pin PCIe power connectors into the 12+4-pin ATX 12VHPWR connector; they won't have to wait for newer-generation ATX 3.0 + PCIe Gen 5 compliant PSUs. This doesn't necessarily mean, however, that custom-design graphics cards will meet the same transient-response standards.
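A quick sanity check on the adapter's power budget, using the standard PCI Express connector ratings rather than any NVIDIA-provided numbers (each 8-pin PCIe connector is rated for 150 W, and the x16 slot itself supplies up to 75 W):

```python
# Power budget of the bundled 3x 8-pin -> 12VHPWR adapter, using the
# standard PCIe CEM connector ratings (assumptions, not NVIDIA figures).

PCIE_8PIN_W = 150   # rating of one 8-pin PCIe power connector
SLOT_W = 75         # power available from the PCIe x16 slot

def adapter_budget(num_8pin=3):
    """Total board power available with the adapter, in watts."""
    return num_8pin * PCIE_8PIN_W + SLOT_W

print(adapter_budget())  # 525 W
```

That 525 W comfortably covers the card's 450 W default limit, though it falls short of the 600 W ceiling a native 12VHPWR cable can signal, which fits the framing that the adapter is a stopgap for existing PSUs rather than a full replacement.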

A special focus was put on memory temperature, which has been decreased by 10°C. In collaboration with Micron, NVIDIA selected memory chips fabricated on a new process, and all chips have a 16 Gbit density, which means the full complement fits on one side of the PCB, where it can be cooled more effectively.

The GeForce RTX 4080 16 GB Founders Edition shares the same cooler design as the RTX 4090, despite being a notch lower in positioning.
Just as a reminder: the GeForce RTX 4090 will be available on shelves on October 12th, for $1,600. NVIDIA's new flagship graphics card uses the 4 nm AD102 "Ada Lovelace" GPU and comes with 24 GB of GDDR6X memory from Micron on a 384-bit bus. The GPU packs 16,384 cores, 512 TMUs, and 192 ROPs.

77 Comments on NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management

#51
PapaTaipei
Are they intentionally using a non-conventional fan size so that when it breaks, the whole card goes in the garbage can?
Posted on Reply
#52
trsttte
Chomiq: It's weird that they're hiding the holes for support brackets that are located at the end of the GPU. The 30 series had them (covered with screws); this one seems to have them too, but they're covered up with something.

Seeing how thick this cooler is it sure could use some support.

Although I only saw one case from Silverstone that supported it:
Those support holes are a standard-ish way to support PCIe cards on workstations and servers. Instead of the rather silly vertical supports, it would be cool if more case manufacturers incorporated solutions that use this more "standard" and proper way of doing it, like Silverstone did.

PapaTaipei: Are they intentionally using a non-conventional fan size so that when it breaks, the whole card goes in the garbage can?
I don't know what you mean; other than the Asus Noctua one-offs, aren't all graphics cards using more or less random fan sizes mounted in different ways?
Posted on Reply
#53
80-watt Hamster
trsttte: I don't know what you mean; other than the Asus Noctua one-offs, aren't all graphics cards using more or less random fan sizes mounted in different ways?
Pretty much. There are a few common-ish configs for fan diameter and mounting pattern, but the odds of being able to just go out and buy replacement fans for a given card are basically nil.
Posted on Reply
#54
Bomby569
“Moore’s Law’s dead,” Huang said, referring to the standard that the number of transistors on a chip doubles every two years. “And the ability for Moore’s Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It’s completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past.”

www.marketwatch.com/story/moores-laws-dead-nvidia-ceo-jensen-says-in-justifying-gaming-card-price-hike-11663798618


want cheap cards, go get a real job you low lifes
Posted on Reply
#55
progste
I hate Moore's "law", people give it too much importance.
Posted on Reply
#56
Bomby569
Wasn't Moore's law created with CPUs in mind? Those seem to have kept decent pricing. Huang is a master of saying stupid shit; remember "the more you buy, the more you save".
Posted on Reply
#57
progste
Well yes, it's just a projection he made about the advance of manufacturing nodes, but it's constantly used by people like Huang to say "it's not crap, just buy it".
Posted on Reply
#58
Bomby569
He really deserves for this 4000-series pricing to blow up in his face. It will never happen, I'm sure, but man did he deserve it.
Posted on Reply
#59
Chrispy_
It still has the really irritating power connector in a really dumb location.
Posted on Reply
#60
80-watt Hamster
Bomby569: Wasn't Moore's law created with CPUs in mind? Those seem to have kept decent pricing. Huang is a master of saying stupid shit; remember "the more you buy, the more you save".
Moore's Law applies to the transistor density of any integrated circuit or microchip.
Posted on Reply
#61
CyberCT
I'll wait for reviews of the 4090 vs 4080 16GB, and see what ATI will offer .... but if it really is double the rasterization performance of a 3080ti, and noticeably faster than 4080 16GB, I'm buying one. Finally there's a GPU that can drive the high end VR headsets at a solid frame rate of 120fps. If I didn't care about VR, I'd be content with my 3080ti for the next 2 years.
Posted on Reply
#62
pavle
Chrispy_: It still has the really irritating power connector in a really dumb location.
Well, it's coming from a very irritating and dumb company of people, yet with lots and lots of customers, go ponder on that one. :)
Posted on Reply
#63
Kickus_assius
N3utro: They're intentionally pricing high to prevent the 3xxx prices from falling. When the 3xxx is gone they might drop current prices by a lot.
Also, is there not inflation in the US vs. when the 3xxx series was launched? And we can actually buy at MSRP this time, whereas last time prices skyrocketed way above MSRP.

In Canada, our prices for everything have increased, so it stands to reason the prices of 4xxx series would be higher, not to mention higher prices being charged by TSMC for the fab process to their customers.
Posted on Reply
#64
trsttte
Bomby569: “Moore’s Law’s dead,” Huang said, referring to the standard that the number of transistors on a chip doubles every two years. “And the ability for Moore’s Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It’s completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past.”
Yeah right, thank god there's still some very small competition at least, if they don't deliver someone else will
Posted on Reply
#65
mechtech
ir_cow: Looks exactly the same. I don't see the "redesign". Good luck with AMD drivers :)
Use the default MS windows driver ;) lol
Posted on Reply
#66
chrcoluk
Profiteering on the 16 gig 4080: if the VRAM really cost Nvidia that much, it would mean they're giving away the PCB, cooling, and GPU, haha. But at least they uprate the cooler and spec of the chip to compensate for the big price gap.

Interestingly, it looks like the encoder block is getting a beefy jump in spec, and I wonder how much attention that will get from the media (no mention in the topic title). Maybe a recognition that NVENC is still really inefficient, hence the AV1 support?

I didn't really have much intention of upgrading this gen, even though my 10 gig of VRAM feels like it's already a bottleneck, but the NVENC change might make me reconsider. We'll see how this plays out in the market.
Posted on Reply
#67
Pepamami
Dayum, look at this big chungus. We need a new ATX form factor now.
ir_cow: Good luck with AMD drivers :)
U can download them from amd.com, u need separate drivers for MB and VGA.
For Linux distros it's usually already built in.
Posted on Reply
#68
zoom314
Nice card; too bad the drivers for 7 Pro x64 are missing. Not everyone has the money for 10 or knows how to make the OS work like 7, and 12 is out to replace both 10 and 11, OTB...

Some 3080Ti's will do for me.
Posted on Reply
#69
Zach_01
FE cards are at least 3-slot, and (air-cooled) AIBs will range from 2.5 up to 4+ slots; most of them are around 3.5 slots, at least for the 4090.
What did we really expect with 450 W?
My entire PC peaks (PSU out) at around 490 W... lol


Posted on Reply
#70
KarymidoN
I really hope AMD comes out with at least some power/price/performance-viable options; my 1070 is overworked and I'd hate to give Ngreedia more money right now.
Posted on Reply
#71
c2DDragon
GunShot: *sighs*

Are we, REALLY, still saying this in late 2022?!

*NVIDIA says (in the joker's voice) "wait until they get a LOAD of me:"

Last driver update - GRD(?) 516.94


Open Issues

- [RTX 30 series] PC monitor may not wake from display sleep when GPU is also connected to an HDMI 2.1 TV, and the TV is powered off.
- Toggling HDR on and off in-game causes game stability issues when non-native resolution is used.
- Videos played back in Microsoft Edge may appear green if NVIDIA Image Scaling is enabled upon resuming from hibernate or booting with fastboot.
- [DirectX 12] Shadowplay recordings may appear over exposed when Use HDR is enabled from the Windows display settings.
- Monitor may briefly flicker on waking from display sleep if DSR/DLDSR is enabled.
- [RTX 30 series] Lower performance in Minecraft Java Edition.
- External display may not be detected when connected via USB-C on certain Razer notebooks.
- [Forza Horizon 5] Rainbow like artifacts in game after driver update.
- [Reallusion Hub] App will crash when launched on PC using a CPU with 32+ logical processors
- On rare occasion, video playback in browser may result in bugcheck code: 0x116
- [Marvel Spider-Man Remastered] Cutscene edges may appear blurry when DLSS is enabled on an ultra-wide monitor

And a few more extra issues were added to this embarrassing, huge, growing column much later (see NVIDIA's forum), and... NVIDIA has chosen to LIE about fixing outstanding issues that go all the way back to Ampere's release date, marking them as fixed (the LIE) just to remove them from the list, e.g. the HDMI eARC issues, etc. Oddly, no Ampere "professional reviewers" ever mentioned any of these serious flaws in their early, glorifying Ampere reviews. Weird! /s

So, please... pump the brakes with that fanboyism! :shadedshu:
AMD FRTC (www.amd.com/en/technologies/frtc) has never worked for me since I got my 6700 XT.
Today, with the latest beta 22.9.1, it still doesn't work; FPS still go to the moon instead of being locked. I have to use Radeon Chill to lock FPS, and I don't like how it acts: FPS drop the exact moment you stop moving your mouse and go back up to the set maximum with just a move. So I just set the same minimum & maximum FPS xD
I tried RivaTuner Statistics Server to lock the FPS instead of using the drivers, but it just randomly added weird stutters.
Posted on Reply
#72
Totally
64K: This is comical but the price isn't unexpected.

Gamers, we don't need the 4090 (with the exception of a very, very few). Leave it to the people that need it for work purposes.
Why even market it to gamers? Why not as a prosumer card, e.g. Titan 'ABC', or you know, A PROFESSIONAL card, e.g. Quadro ####? There's no trying to spin that angle.
Posted on Reply
#73
cvaldes
Totally: Why even market it to gamers? Why not as a prosumer card, e.g. Titan 'ABC', or you know, A PROFESSIONAL card, e.g. Quadro ####? There's no trying to spin that angle.
I give up. Gross margins? Mindshare? Bragging rights?

I think the 3090 shows up with 1% usage on the latest Steam survey so clearly there is some consumer interest.

And if NVIDIA slapped Titan on the box, some people would still buy it primarily for playing games.
Posted on Reply
#74
Totally
cvaldes: I give up. Gross margins? Mindshare? Bragging rights?

I think the 3090 shows up with 1% usage on the latest Steam survey so clearly there is some consumer interest.

And if NVIDIA slapped Titan on the box, some people would still buy it primarily for playing games.
That still doesn't explain away all the daft "derrr, iTz nOt foR gAMerZ!" comments when gamers balk at the price. So is it or is it not for gamers?
Posted on Reply