Monday, June 26th 2023
More Pictures of NVIDIA's Cinder Block-sized RTX 4090 Ti Cooler Surface
Back in January, we got our first look at the cinder block-like 4-slot cooling solution of NVIDIA's upcoming flagship graphics card (called either the RTX 4090 Ti or the TITAN Ada). "ExperteVallah" on Twitter scored additional pictures of the cooler. Its design pushes the heat dissipation surface out to the entire thickness of the cooler, ventilated along its entire length.
The card's PCB isn't conventional: rather than sitting perpendicular to the plane of the motherboard like any other add-in card, it lies along the plane of the motherboard, with additional breakaway daughter cards interfacing with the sole 12VHPWR power connector and the PCIe slot. This slender, ruler-shaped PCB spans the entire length of the card without getting in the way of its heat dissipation surfaces. That length accommodates the large AD102 ASIC, which is probably maxed out (with all 144 of its SMs enabled), twelve GDDR6X memory chips (possibly faster 23 Gbps ones), and a mammoth VRM that nearly maxes out the 600 W continuous power delivery design limit of the 12VHPWR connector.
Sources:
ExperteVallah (Twitter), Hassan Mujtaba (Twitter), VideoCardz
145 Comments on More Pictures of NVIDIA's Cinder Block-sized RTX 4090 Ti Cooler Surface
Jokes aside, it went outta control 350 W ago. And they just can't stop making efforts to produce even hotter stuff.
Also reduced cost to manufacture (pretty sure this is true) and targets the same group of people that are going to buy this, rich fanbois.
But then realized that was one slot too high up the stack 'Will it drop like a brick'
Confirmed
For the life of me I don't know why Nvidia will not let the AIB partners design the cooler. Obviously they know what they are doing better than Nvidia; the proof is in the pudding.
The industry has ALWAYS pushed for bigger chips. The GPUs of the past were not limited to 350/200/75 W because nobody wanted bigger stuff; it was because the process nodes of the time couldn't handle such large chips with tens of billions of transistors running at multiple GHz.
When the GTX 480 came out, you had people lamenting that the GPU was too hot and 300 watts was just too much power for a GPU and that the industry was going too far, etc. Now we have people lamenting that we didn't limit ourselves to that same 300 W barrier. That's what I use mine for; it's way worse when you have an old case with side fans. Plus side, in winter I can keep my house cold, and a few hours of gaming is sufficient to raise the room temp 8-10 °F.
It only really becomes an issue in the summer when it's 90 °F outside, but then I'm usually doing things outside anyway, so meh. Nvidia's coolers for Ampere and Ada seem to work very well; they're quiet, keep temps decent, etc. Sure, you can do better, but they're a LONG way from the Fermi coolers of 2009.
And that ties in with the very real overengineered coolers, with wasted materials and money.
This is not all a product of healthy evolution of GPUs in my opinion, more a trend to make these products seem more premium and justify the hike in Nvidia's GPU die prices. Won't even go into talking about the pricing curve vs the performance curve, or the massive downgrade in performance per dollar.
I agree that GPU tuning has gotten out of hand, but by the same token, if you could easily raise the GPU clock and OC another 15% out of your card, then you can just as easily lower the voltage and UV your GPU too. Getting better coolers that don't sound like jet engines is somehow a Bad Thing (tm)? I love my 6800 XT cooler; it stays under 40 dBA at full load when pulling near 300 watts. Compared to the "good" coolers from the mid 2010s, I'll take these overbuilt monsters any day.
I still remember the 2080 Ti dual fan screaming like a nightmare.
I prefer 3+ slot coolers to the atrocious ones of the past.
Do you fully use your GPU? I use it at 100% load most of the time. My 3090, even at 100%, wasn't producing as much heat and was silent as well.
I had to change my case, moved the PC's radiator from front to top, put in extra fans, but nothing helped. It is just big, and even at 70 it can produce more heat than a smaller card that runs at 90-100 degrees.
Now I guess they are "crappy".
I understand them, they want their GPUs to be competitive, but doubling the wattage every decade is a complete clown fiesta. Especially considering how much these cards consume in low loads like media playback, hitting dozens of watts.
Time will pass, and video cards of <200 W TDP will cease to exist. This is what the "oh come on, it's fine" attitude is doing to the industry.
Tossing more cores at an iGPU will only be a bigger and bigger waste of silicon until shared system memory bandwidth significantly improves.