
NVIDIA's Next-Gen Reference Cooler Costs $150 By Itself, to Feature in Three SKUs

btarunr

Editor & Senior Moderator
Pictures of alleged next-generation GeForce "Ampere" graphics cards emerged over the weekend, which many of our readers found hard to believe. The card features a dual-fan cooling solution in which one of the two fans sits on the reverse side of the card, blowing air outward from the cooling solution, while the PCB extends two-thirds the length of the card. Since then, there have been several fan-made 3D renders of the card. NVIDIA is not happy with the leak, and has started an investigation into two of the contractors responsible for manufacturing Founders Edition (reference design) GeForce graphics cards, Foxconn and BYD (Build Your Dreams), according to a report by Igor's Lab.

According to the report, the cooling solution, which looks a lot more overengineered than the company's RTX 20-series Founders Edition cooler, costs a hefty USD 150, or roughly the price of a 280 mm AIO CLC. It wouldn't surprise us if Asetek's RadCard costs less. The cooler consists of several interconnected heatsink elements with the PCB in the middle. Igor's Lab reports that the card is estimated to be 21.9 cm in length. Given its cost, NVIDIA is reserving this cooler for only the top three SKUs in the lineup: the TITAN RTX successor, the RTX 2080 Ti successor, and the RTX 2080/SUPER successor.



All three will use the same cooling solution, and a common PCB design codenamed PG132. Further, all three cards will be based on a common ASIC, codenamed "GA102," with varying hardware specs. The "SKU10" (TITAN RTX successor) could ditch the TITAN brand to carry the model name "GeForce RTX 3090," max out the 384-bit wide memory bus of the GA102 ASIC, and feature a whopping 24 GB of GDDR6X memory, with 350 W typical board power.

The next SKU, SKU20, the RTX 2080 Ti successor, is cut down from SKU10. It will feature 11 GB of GDDR6X memory across a 352-bit wide memory interface and have a 320 W typical board power rating. This board will likely carry the RTX 3080 Ti branding. Lastly, there's SKU30, which is cut down further: it features 10 GB of GDDR6X memory across a 320-bit wide memory interface and bears the RTX 3080 model number, succeeding the RTX 2080 / RTX 2080 Super.
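As a rough sanity check (our own illustration, not from the report), the rumored capacities line up with the rumored bus widths if you assume one GDDR6/GDDR6X chip per 32-bit memory channel, with 2 GB chips on SKU10 and 1 GB chips on the others:

```python
def channels(bus_width_bits: int) -> int:
    """Number of 32-bit memory channels on a GDDR6-style bus."""
    return bus_width_bits // 32

# SKU10: 384-bit bus, assumed 2 GB chips -> 24 GB
assert channels(384) * 2 == 24
# SKU20: 352-bit bus, assumed 1 GB chips -> 11 GB
assert channels(352) * 1 == 11
# SKU30: 320-bit bus, assumed 1 GB chips -> 10 GB
assert channels(320) * 1 == 10
```

The chip-per-channel mapping is an assumption for illustration; the leak itself only names the capacities and bus widths.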

When launched, "Ampere" could be the first implementation of the new GDDR6X memory standard, which could come with data-rates above even the 16 Gbps of today's GDDR6, likely in the 18-20 Gbps range, if not more. Lesser SKUs could use current-gen GDDR6 memory at data-rates of up to 16 Gbps.
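To put those rumored data rates in perspective, peak memory bandwidth is simply the per-pin rate times the bus width, divided by 8 bits per byte. A minimal sketch (the 19 Gbps figure is just a midpoint of the rumored 18-20 Gbps range, not a confirmed spec):

```python
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate x bus width / 8 bits-per-byte."""
    return data_rate_gbps * bus_width_bits / 8

# Today's 16 Gbps GDDR6 on a 384-bit bus:
print(bandwidth_gbs(16, 384))  # 768.0 GB/s
# Hypothetical 19 Gbps GDDR6X on the same bus:
print(bandwidth_gbs(19, 384))  # 912.0 GB/s
```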

 
I don't buy that it costs $150/pc in volume manufacturing in China; that is just ridiculous. The cost is much, much lower.

$150 is so much money in manufacturing that they could have made some kind of water cooling instead.

As an example, I can buy the "Cooler Master MasterLiquid Lite 120" 120 mm CPU water cooler at $50 retail, and that even includes 25% VAT.

Without VAT that is $40, and then the retailer and Cooler Master both make some kind of profit on that, so maybe it costs $20-25 in manufacturing, or even lower.

Then imagine nVidia paying $150 in manufacturing cost for air cooling — no f**king way. nVidia is a lot of things, but stupid is not one of them when it comes to money.
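The commenter's VAT arithmetic, spelled out (the retail price and VAT rate are the commenter's figures, not confirmed manufacturing costs):

```python
retail_incl_vat = 50.0  # USD, commenter's quoted retail price incl. VAT
vat_rate = 0.25         # 25% VAT

# Dividing by (1 + rate) backs the VAT out of a tax-inclusive price.
ex_vat = retail_incl_vat / (1 + vat_rate)
print(ex_vat)  # 40.0
```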
 
How are they gonna sell that 3080 for $1000 then? The Gigarays gig only works once, I think :p
 
Seems the card price will be $800 at least.
 
While the design looks cool, I am not sure about the cooling capability of this cooler. Great for those who like to exhibit their rig, but I'd prefer they spend the money on a better cooling solution rather than better looks.
 
So it's probably better to just wait for AIBs to make their own traditional coolers.
 
Looks like Pascal all over again.

Not complaining!
 
It's as long as a low-end graphics card. I know, it's the 7 nm node. And if it's true, it looks like the power delivery/VRM etc. is detachable, like two separate PCBs.

More complexity, like two PCBs, means more fragile solder joints.
 
Still not convinced that any of this is real, because of the simple reason that it doesn't help NVIDIA in any way shape or form.

Two PCBs? More expensive than one, and now NVIDIA needs to design and test a separate reference PCB for partners.
Unnecessarily convoluted and expensive cooler? Why do they need it, given that the Turing Founders Edition cooler is perfectly capable?

I wouldn't put it past NVIDIA to do something weird and wacky, but this isn't weird and wacky, it's just stupid and expensive.
 
If it's true, maybe NVIDIA is going with the old saying "new is always better"? A lot of people think that's the case: new better graphics, new better cooler, and so on.
 
This still isn't an official render but this version raises two more drawbacks.

  1. It's very clear that the front fan needs to be a radial fan. An ordinary axial fan without the outer ring would be a better choice than the one Nvidia has picked, which effectively prevents the blade tips from acting as a pseudo-radial blower.

  2. With the PCIe plugs on the end of the card connected via that daughterboard, half of the effective cooling from the rear fan is blocked.

I mean, it didn't look like the best design to start off with but if this rendition is accurate then it's even worse than I thought.
 
Unnecessarily convoluted and expensive cooler? Why do they need it, given that the Turing Founders Edition cooler is perfectly capable?
Have you seen any RTX 2060 teardown? Turing FE coolers are also overengineered and complicated AF. It's what nVidia does. They're eye-catchers.
 
Have you seen any RTX 2060 teardown? Turing FE coolers are also overengineered and complicated AF. It's what nVidia does. They're eye-catchers.

Turing coolers aren't $150 overengineered.
 
Turing coolers aren't $150 overengineered.
Yeah, you're right about that. Still, nVidia apparently likes form over function. Don't get me wrong; the Turing FE coolers do work, but so did the blower-style coolers, and they were probably cheaper.
 
Hm, I sincerely hope it cools better than the reference cooler on the Titans, and preferably does it quietly.
 
$150 of marketing BS more like.
 
How are they gonna sell that 3080 for $1000 then? The Gigarays gig only works once, I think :p

Why not? Now you get TWENTY Gigarays!

And they just work!
 
Seems that the new NVIDIA cards will be able to set your house on fire, like in the old times of the GTX 480.

Yeah, 320 W does scream Fermi and Vega to me too. Curious about the idea behind that: NVIDIA's 250 W TDP for the top end was almost becoming a fixed thing, and now they shrink the node and still need to push power up radically?

Either the perf jump is massive, or Nvidia is running out of ideas. Surely they won't do an RTG now....?
 
So, it's either blowing hot air from the GPU right into the intake of the CPU air cooler, or, if it's an intake, directly competing with the CPU fans for cool air. Hmmm... wonder how that is going to turn out.
 