
NVIDIA GTX 1080-successor a Rather Hot Chip, Reference Cooler Has Dual-Fans

btarunr

Editor & Senior Moderator
The GeForce GTX 1080 set high standards for efficiency. Launched as a high-end product that was faster than any other client-segment graphics card at the time, the GTX 1080 made do with a single 8-pin PCIe power connector and had a TDP of just 180 W. The reference-design PCB, accordingly, had a rather simple VRM setup. The alleged GTX 1080-successor, called either GTX 1180 or GTX 2080 depending on who you ask, could deviate from that ideology of extreme efficiency. There were telltale signs of this departure in the first bare PCB shots.

The PCB pictures revealed preparation for an unusually strong VRM design for an NVIDIA reference board. The board draws power from a combination of 6-pin and 8-pin PCIe power connectors and features a 10+2 phase setup, with up to 10 vGPU and 2 vMem phases. The size of the pads for the ASIC and the provision for no more than 8 memory chips confirm that the board is meant for the GTX 1080-successor. Adding to the theory that this board runs unusually hot is an article by Chinese publication Benchlife.info, which mentions that the reference-design (Founders Edition) cooling solution does away with the single lateral blower and instead features a strong aluminium fin-stack heatsink ventilated by two top-flow fans (like most custom-design cards). Given that NVIDIA avoided such a design even for big-chip cards such as the GTX 1080 Ti FE and the TITAN V, the GTX 1080-successor is proving to be an interesting card to look forward to. But then, what if this is the fabled GTX 1180+ / GTX 2080+, slated for late September?
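For a rough sense of why the 6-pin + 8-pin combination fuels the "hot chip" speculation, here is a back-of-the-envelope sketch of the board-power ceilings involved, assuming only the standard PCI-SIG per-connector limits (75 W from the x16 slot, 75 W from a 6-pin, 150 W from an 8-pin); the actual TDP could land well below these ceilings.

```python
# Back-of-the-envelope PCIe power budgets (PCI-SIG spec limits, not measured draw).
PCIE_SLOT_W = 75   # x16 slot can deliver up to 75 W
SIX_PIN_W = 75     # 6-pin PCIe connector: 75 W
EIGHT_PIN_W = 150  # 8-pin PCIe connector: 150 W

# GTX 1080 reference board: slot + one 8-pin, against its 180 W TDP.
gtx_1080_ceiling = PCIE_SLOT_W + EIGHT_PIN_W               # 225 W
# Rumored successor board: slot + 6-pin + 8-pin.
successor_ceiling = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W

print(f"GTX 1080 (slot + 8-pin): {gtx_1080_ceiling} W ceiling vs. 180 W TDP")
print(f"Rumored successor (slot + 6-pin + 8-pin): {successor_ceiling} W ceiling")
```

In other words, the extra 6-pin raises the spec ceiling from 225 W to 300 W, leaving headroom for transient spikes even if the TDP itself only rises modestly.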



View at TechPowerUp Main Site
 
I doubt it. If Titan V runs at similar power to Titan Xp, then there is no way these new cards are going to be "hot, or hungry". My guess is that the dual-fan reference design is to entice people to buy the Founders Edition cards before the AIB cards drop.
 
Regardless of why, the fact they are not using the ancient blower style anymore is a step in the right direction.
 
I doubt it. If Titan V runs at similar power to Titan Xp, then there is no way these new cards are going to be "hot, or hungry". My guess is that the dual-fan reference design is to entice people to buy the Founders Edition cards before the AIB cards drop.
Remember that Titan V gets a lot of its power efficiency from HBM2, which uses roughly a third of the power GDDR5 needs for the same performance, and it doesn't have a significantly larger core count. So the HBM2 efficiency savings cover the increase in compute cores.

...which means Titan V and Titan Xp perf/watt are roughly the same.
 
Hotter and hungrier, just to stay ahead of the competition by a few points in benchmarks.
The way I see it, better efficiency and lower power consumption wins; let's see which uses the most at the wall.
 
Or they're just not doing a Founders Edition for this series. NVIDIA has had plenty of time to stockpile chips and send them to AIBs, so there's no delay between launch date and AIB availability. Alternatively, the card described could be an AIB design (e.g. Asus, MSI, or Gigabyte) that NVIDIA bulk-ordered to carry the Founders Edition badge, as an informal apology for the GeForce Partner Program.
 
Hotter and hungrier, just to stay ahead of the competition by a few points in benchmarks.
Their competition is a 300W+ card that delivers the performance of a 200W GTX 1080.

I think they just want to sell more FE cards; this time they're offering a solution similar to AIB coolers, and they'll probably offer a blower version as well. As for the "hot and hungry": a single 8-pin works fairly well for 150-180W cards, but once you go near 200W you'd better have that extra 6-pin, since peak power spikes are always higher than average power consumption. It's going to be a 1080 Ti with GDDR6 and a slightly more efficient process, 10-15% faster, with a TDP probably halfway between the 1080 and the 1080 Ti.
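To put numbers on that guess: the GTX 1080's TDP is 180 W and the GTX 1080 Ti's is 250 W, so "halfway between" works out to roughly 215 W. A minimal sketch of why that pushes past a single 8-pin, treating transient spikes as a simple percentage margin over TDP (the 20% figure is an illustrative assumption, not a measured value):

```python
# Rough check of the "TDP halfway between 1080 and 1080 Ti" guess.
GTX_1080_TDP_W = 180
GTX_1080_TI_TDP_W = 250
SLOT_PLUS_8PIN_W = 75 + 150   # 225 W spec ceiling for slot + single 8-pin

estimated_tdp = (GTX_1080_TDP_W + GTX_1080_TI_TDP_W) / 2   # ~215 W
spike_margin = 1.20            # assumed 20% transient headroom, purely illustrative
estimated_peak = estimated_tdp * spike_margin              # ~258 W

print(f"Estimated TDP: {estimated_tdp:.0f} W, estimated peak: {estimated_peak:.0f} W")
print(f"Slot + single 8-pin ceiling: {SLOT_PLUS_8PIN_W} W -> the extra 6-pin makes sense")
```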
 
That's an MSI card in the pic, but to be honest it would be awesome to have something like this on the factory card. Those coolers are the best both performance- and noise-wise.
 
If this is true, and it's running hot, then it looks like nVidia is struggling to innovate, and is instead relying on overclocking the chip to get performance...

nVidia has had a long time with no pressure on them to make this "new" GPU, so this is rather telling, if true.
 
NVIDIA is not struggling to innovate. They just don't really care at this point as they can milk some more money out of old Pascal with minimal effort. I mean, really, why would they throw in millions if they can just stretch Pascal a bit and call it a day. So what if it's hot, it'll be faster than anything one can buy for gaming and that's enough for some. I mean, whoever is buying the fastest Ferrari, petrol consumption is the last thing they care about. It's not much different here. All this efficiency is all nice and fancy, but if you have the fastest card in the world, would you really care? I know I wouldn't.
 
NVIDIA is not struggling to innovate. They just don't really care at this point as they can milk some more money out of old Pascal with minimal effort. I mean, really, why would they throw in millions if they can just stretch Pascal a bit and call it a day. So what if it's hot, it'll be faster than anything one can buy for gaming and that's enough for some. I mean, whoever is buying the fastest Ferrari, petrol consumption is the last thing they care about. It's not much different here. All this efficiency is all nice and fancy, but if you have the fastest card in the world, would you really care? I know I wouldn't.
I agree; it's basically the same thing Intel did with their CPUs. As for the two-fan setup, I too think it might be for the more powerful version coming out later in the year.
 
This reminds me of when TT "industry sources" said the 1180 & 1170 were going to be released in July
 
If a blower could cool Fermi, it could cool anything; it's just a question of design, cost, or something else.
 
If this is true, and it's running hot, then it looks like nVidia is struggling to innovate, and is instead relying on overclocking the chip to get performance...

nVidia has had a long time with no pressure on them to make this "new" GPU, so this is rather telling, if true.

Nobody knows for sure what's in the chip yet. I don't see the OC as the issue. It all depends on what they've put in the hardware. By all accounts (metaphorically speaking) Nvidia are bringing a higher degree of compute back to their chips. And that's a hotty right there.
 
Hotter and hungrier, just to stay ahead of the competition by a few points in benchmarks.
The way I see it, better efficiency and lower power consumption wins; let's see which uses the most at the wall.

AMD can't match Pascal in either efficiency or raw pixel crunching. I'd say there's some room for Nvidia to up the power draw while still keeping its leadership.
I tend to be rather laid back when it comes to high-end cards because I don't buy them. I buy mid-range and the power draw/efficiency is much better in that segment regardless of what happens at the top. Obviously ymmv.

So, does it mean hope for AMD? :D
Your competition raking in cash because you underperform never equals hope.
AMD's hope is that Zen can make them enough cash to fund their GPU game, just like their GPU game kept them afloat during the Bulldozer days. So there is hope for AMD, but it has nothing to do with this announcement right here.
 
They are going with the same strategy they had with the second generation of Kepler-based GPUs: larger dies with higher frequencies and no tangible advancement in power efficiency. They can't rely on new nodes every time.
 
Why does the wording in the OP sound like it was written by an nVidia employee working in the marketing department?

So, does it mean hope for AMD? :D
Elaborate on "hope for AMD"; I might have missed the problems they've had recently, looking at their stock.

NVIDIA is not struggling to innovate. They just don't really care at this point as they can milk some more money out of old Pascal with minimal effort. I mean, really, why would they throw in millions if they can just stretch Pascal a bit and call it a day.
First, it's billions, and second, because developing GPUs takes many months and there are likely numerous projects at nVidia that have never been seen by consumers.

It was obvious that Pascal had thermal headroom. It was also obvious that a 20-25% gain from an architecture bump on the same process node would not come for free.
 
This is allegedly leaked info by an nVidia employee:

 
That might be an option as well but I have a G-Sync monitor so...
Well, I guess competition is good for everyone, even if you buy NVIDIA exclusively. Look what Intel did after Ryzen. They stole the MOAR CORES tactics.
 
Well, I guess competition is good for everyone, even if you buy NVIDIA exclusively. Look what Intel did after Ryzen. They stole the MOAR CORES tactics.
Competition is always good for the technology world and consumers in general... unfortunately, sometimes companies that should be competing arrange prices instead, and the customer has to suffer for it.
 
This is allegedly leaked info by an nVidia employee:

Makes sense to me. The Tensor cores in Volta are not things NVIDIA wants to waste fab space on for gamers. Turing always made sense considering the delay between Pascal and now. NVIDIA pushing RTX also fits NVIDIA's modus operandi.
 
They are going with the same strategy they had with the second generation of Kepler-based GPUs: larger dies with higher frequencies and no tangible advancement in power efficiency. They can't rely on new nodes every time.
RTX alone would disagree with that assertion. But it is true that besides tensors and RTX, not many improvements have been advertised for Volta, so you're probably not far off base.
 
Hey, at least their top card isn't using LC yet, unlike...
 