
NVIDIA's Next-Gen Reference Cooler Costs $150 By Itself, to Feature in Three SKUs

long as a low-end graphics card. I know it's the 7 nm node. And if it's true, I see the power delivery (VRM etc.) is detachable, like 2 separate PCBs.

More complexity, like 2 PCBs, means more fragility in the solder joints.

This idea of a detachable VRM PCB seems to serve nothing but fancy looks. With all this high current passing through fragile solder points, I wonder how long the lifespan of these cards will be...
 
Pretty much this. I'll take an EVGA card, thank you very much!
Too bad Sapphire doesn't make Nvidia cards. They've been great for years :)

This still isn't an official render, but this version raises two more drawbacks.

  1. It's very clear that the front fan needs to be a radial fan. An ordinary axial fan without the outer ring would be a better choice than the one Nvidia has picked, whose ring effectively prevents the blade tips from acting as a pseudo-radial blower.

  2. With the PCIe plugs on the end of the card connected via that daughterboard, half of the effective cooling from the rear fan is blocked.

I mean, it didn't look like the best design to start off with, but if this rendition is accurate then it's even worse than I thought.
PCIe plugs could be wired to the board like in the GTX 1060 and RTX 2060...?
 
Yeah, 320 W does scream Fermi and Vega to me too. Curious about the idea behind that: Nvidia's 250 W TDP for the top end had almost become a fixed thing, and now they shrink the node and still need to push this up radically?

Either the perf jump is massive, or Nvidia is running out of ideas. Surely they won't do an RTG now...?
Seems to me like someone did some back-of-the-cigarette-packet maths on their competition and came up short against the Xbox Series X specs.
Now they're dragging out the bigger chips, going with as much cooling and power as possible, and shooting for the max clocks they can get.
Someone's worried.
 
Expensive, overengineered boards and cooling for the affluent.
Better stick to AIBs.
 
Seems to me like someone did some back-of-the-cigarette-packet maths on their competition and came up short against the Xbox Series X specs.
Now they're dragging out the bigger chips, going with as much cooling and power as possible, and shooting for the max clocks they can get.
Someone's worried.

Imagine if the GPU table finally turns in AMD's favor, now of all times.

Turing might have been the writing on the wall.... won't say told you so... but.... :P
 
Wouldn't be surprised with Foxconn lol. These leaks can do some work on their share prices.
 
I was dreaming of upgrading my EVGA 1080 Hybrid to one of those new 3080s ...
If the cooler itself is $150, then my 1080 with AIO, bought in the middle of the crypto craze for $600, looks like a bargain.
If AMD comes up with something good to compete with Nvidia (an RDNA2 card that ties with the 3080), they will price it accordingly, so having an upper-tier GPU is no longer for me. Back to the XX60 class, I guess; probably a 4060 in ... 2023?
 
Imagine if the GPU table finally turns in AMD's favor, now of all times.

Turing might have been the writing on the wall.... won't say told you so... but.... :P
I am not sure it will, but I am at least becoming more sure that the next year will be interesting from a GPU PoV; it's all rumours, though.
But it is looking like some will have a good reason to get the Phillips screwdriver out.
 
Yeah, 320 W does scream Fermi and Vega to me too. Curious about the idea behind that: Nvidia's 250 W TDP for the top end had almost become a fixed thing, and now they shrink the node and still need to push this up radically?

Either the perf jump is massive, or Nvidia is running out of ideas. Surely they won't do an RTG now...?

These power draws are one rumour amongst a sea of rumours, not corroborated by anything else. In particular, the rumoured CUDA core configurations have GA102 at only 15% more cores than TU102. Coupled with the 7 nm shrink, that means Ampere should draw far less power than Turing.

Further, NVIDIA has historically been extremely hesitant to go above 300 W TBP, because that's the maximum that can be supplied by an 8+6 pin connector combination. The only reason they'd want to blow that budget is for performance, and I don't see any reason for them to worry about that.
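
To make that 300 W ceiling concrete, here's a minimal back-of-the-envelope sketch in Python; the per-connector limits are the standard PCIe spec values, and the rest is just addition:

# PCIe power budget per the spec: the x16 slot supplies up to 75 W,
# a 6-pin auxiliary connector 75 W, and an 8-pin 150 W.
PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

budget_8_plus_6 = PCIE_SLOT_W + EIGHT_PIN_W + SIX_PIN_W  # 300 W
budget_8_plus_8 = PCIE_SLOT_W + 2 * EIGHT_PIN_W          # 375 W

# The rumoured 320 W exceeds the 8+6 budget, so it would force 8+8 pin.
print(budget_8_plus_6, budget_8_plus_8)  # -> 300 375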

Seems to me like someone did some back-of-the-cigarette-packet maths on their competition and came up short against the Xbox Series X specs.

LOL, no. Console paper specs don't mean s**t. Especially considering RDNA2 is the first time AMD is doing HW accelerated ray-tracing.
 
These power draws are one rumour amongst a sea of rumours, not corroborated by anything else. In particular, the rumoured CUDA core configurations have GA102 at only 15% more cores than TU102. Coupled with the 7 nm shrink, that means Ampere should draw far less power than Turing.

Further, NVIDIA has historically been extremely hesitant to go above 300 W TBP, because that's the maximum that can be supplied by an 8+6 pin connector combination. The only reason they'd want to blow that budget is for performance, and I don't see any reason for them to worry about that.



LOL, no. Console paper specs don't mean s**t. Especially considering RDNA2 is the first time AMD is doing HW accelerated ray-tracing.

You're right. I got carried away, but what if...
 
I was dreaming of upgrading my EVGA 1080 Hybrid to one of those new 3080s ...
If the cooler itself is $150, then my 1080 with AIO, bought in the middle of the crypto craze for $600, looks like a bargain.
If AMD comes up with something good to compete with Nvidia (an RDNA2 card that ties with the 3080), they will price it accordingly, so having an upper-tier GPU is no longer for me. Back to the XX60 class, I guess; probably a 4060 in ... 2023?
I'd wait for AIB models with custom coolers if I were you and an upgrade is in sight.
 
$150 for the cooler, plus the cost of the PCB, plus the extra chips, plus the memory, plus what the manufacturer of the card will make, plus what the retailer will make, plus what Nvidia charges for the GPU (and whatever it charges carries a 60% or more profit margin).

I guess it will cost $499 :laugh:
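
Just to show how those pluses stack up, a toy bill-of-materials sketch in Python; every number except the rumoured $150 cooler is a made-up placeholder:

# Toy BOM stack-up. Only the $150 cooler comes from the rumour; the other
# costs and the markups are hypothetical, purely to show the compounding.
cooler = 150   # rumoured cooler cost
pcb = 30       # hypothetical
chips = 40     # hypothetical: VRM and misc components
memory = 80    # hypothetical GDDR6 cost
gpu_die = 120  # hypothetical price Nvidia charges the board partner

bom = cooler + pcb + chips + memory + gpu_die  # 420
shelf = bom * 1.10 * 1.15  # hypothetical AIB and retailer markups
print(round(shelf))        # -> 531, already past a joking $499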
 
Seems the card price will be $800 at least.
Ahhhhhh, and to think all the 1080 Ti and 2080/2080 Ti cards were above $1000 for me ... oh woes ...

Alright, benchmarks will not matter ... I will go RDNA2 once I want to get rid of my 1070 (which is bound to happen before RDNA2 cards are available ... oh well, maybe a second-hand RX 5700 XT).

$150 for that cooler? Heck, even aftermarket coolers are cheaper than that (and would probably work just as well ...). Jen-Hsun really needs to stop coming up with BS to grab extra money just to sponsor his next leather jacket ...
 
Is this supposed to be some pre-emptive marketing BS to justify another price hike?

US$150 for 2x 90-100 mm fans, a vapor chamber (?) over the GPU, a separate heatsink for the VRAM I presume, and another for the VRM? My ass...
 
These power draws are one rumour amongst a sea of rumours, not corroborated by anything else. In particular, the rumoured CUDA core configurations have GA102 at only 15% more cores than TU102. Coupled with the 7 nm shrink, that means Ampere should draw far less power than Turing.

Further, NVIDIA has historically been extremely hesitant to go above 300 W TBP, because that's the maximum that can be supplied by an 8+6 pin connector combination. The only reason they'd want to blow that budget is for performance, and I don't see any reason for them to worry about that.



LOL, no. Console paper specs don't mean s**t. Especially considering RDNA2 is the first time AMD is doing HW accelerated ray-tracing.
Paper specs? What are you smoking? The Xbox Series X is in hands; it defo isn't a unicorn.

We will see, eh.
 
Paper specs? What are you smoking? The Xbox Series X is in hands; it defo isn't a unicorn.

Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
 
Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
The Xbox 360 GPU was pretty powerful in 2005 when the console launched...?
 
They'll probably charge 2K just because it has a fan on the back.

Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware
and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.

Very wrong indeed.

The Xbox 360 GPU was pretty powerful in 2005 when the console launched...?
Both the 360 and the original Xbox had GPUs more capable at launch than anything on the desktop market, but it was merely months before the PC had something better. So not truly a win, but they were more powerful for a moment in time.
 
Both the 360 and the original Xbox had GPUs more capable at launch than anything on the desktop market, but it was merely months before the PC had something better. So not truly a win, but they were more powerful for a moment in time.
I totally forgot the OG Xbox; yeah, it had a hybrid between the GeForce 3 and 4.
 
Too bad Sapphire doesn't make Nvidia cards. They've been great for years :)


PCIe plugs could be wired to the board like in the GTX 1060 and RTX 2060...?
In the other thread I assumed the PCIe plugs would logically go where the NVLink fingers are, thus avoiding silly wires like we had on the 2060 and this insane daughterboard shown in the render.

Wires would obviously be better for airflow on the rear fan, but I was just going off the exploded-diagram render that this thread is discussing.
 
In the other thread I assumed the PCIe plugs would logically go where the NVLink fingers are, thus avoiding silly wires like we had on the 2060 and this insane daughterboard shown in the render.

Wires would obviously be better for airflow on the rear fan, but I was just going off the exploded-diagram render that this thread is discussing.
Totally missed the plugs on that daughterboard, my bad.
 
Yes... non-final hardware in the hands of "influencers" (aka Xbox fanboys/girls) who have no idea what performance numbers mean comparatively, but are happy to regurgitate the ones Microsoft supplies them with ad infinitum.

Console hardware has never been better than high-end PC hardware and it never will be, because it's always a generation (at best) behind the latest and greatest PC hardware.
You're getting a bit off there, and wrong. I'm talking about devs, not influencers, and you said paper specs; the specs are set in stone now, so that's not paper. And non-final? As if they're going to respin its chip before releasing... nah.
It doesn't matter if it's better or worse; it's competing for some people's money with whatever comes out, and the main point is that it does well.
So competition is turning up this year.

How competitive they all are is yet to be decided, but I sure as shit am not paying £150 for no air cooler; I'll say that much on topic.
 