
NVIDIA Fixes RTX 2080 Ti & RTX 2080 Power Consumption. Tested. Better, But not Good Enough

W1zzard

While conducting our first reviews of NVIDIA's new GeForce RTX 2080 and RTX 2080 Ti, we noticed surprisingly high non-gaming power consumption from NVIDIA's latest flagship cards. At the time, we reached out to NVIDIA, who confirmed that this was a known issue that would be fixed in an upcoming driver.

Today, the company released version 411.70 of its GeForce graphics driver, which, besides adding Game Ready support for new titles, includes the promised fix for the RTX 2080 & RTX 2080 Ti.

We gave this new version a quick spin, using our standard graphics card power consumption testing methodology, to check how much things have improved.



As you can see, single-monitor idle power consumption is much better now, bringing it down to acceptable levels. Even though the numbers are not as low as Pascal's, the improvement is substantial, reaching idle values similar to AMD's Vega. Blu-ray power is improved a little bit. Multi-monitor power consumption, which was really terrible, hasn't seen any improvement at all. This could turn into a deal-breaker for many semi-professional users looking to use Turing not just for gaming, but for productivity with multiple monitors. An extra power draw of more than 40 W over Pascal will quickly add up to real dollars for PCs that run all day, even though they're not used for gaming most of the time.
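As a rough back-of-the-envelope illustration (the daily uptime and electricity price below are assumptions chosen for the example, not measured figures), a constant 40 W of extra draw adds up roughly like this:

```python
# Back-of-the-envelope yearly cost of a constant extra power draw.
# Assumed values: 40 W extra draw, 12 h/day uptime, $0.15 per kWh.
extra_watts = 40
hours_per_day = 12
price_per_kwh = 0.15

kwh_per_year = extra_watts / 1000 * hours_per_day * 365   # ~175 kWh
cost_per_year = kwh_per_year * price_per_kwh               # ~$26

print(f"{kwh_per_year:.0f} kWh per year, roughly ${cost_per_year:.2f} per year")
```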



We also tested gaming power consumption and Furmark, for completeness. Nothing to report here: Turing is still the most power-efficient architecture on the planet (for gaming power).



The table above shows monitored clocks and voltages for the non-gaming power states, and it looks like NVIDIA did several things. First, the memory frequency in single-monitor idle and Blu-ray playback has been reduced by 50%, which definitely helps with power. Second, for the RTX 2080, the idle voltages have also been lowered slightly, to bring them in line with the idle voltages of the RTX 2080 Ti. I'm sure there are additional under-the-hood, internal improvements to power management that are not visible to any monitoring.
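For readers who want to reproduce the clock and power readings on their own card, a minimal sketch along these lines should work, assuming the pynvml Python bindings for NVML are installed (NVML does not expose core voltage, so a tool like GPU-Z is still needed for that):

```python
# Minimal sketch: read current GPU/memory clocks and board power through NVML.
# Assumes the pynvml package (nvidia-ml-py) and an NVIDIA driver are installed.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

gpu_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000    # NVML reports milliwatts

print(f"GPU clock: {gpu_mhz} MHz, memory clock: {mem_mhz} MHz, board power: {power_w:.1f} W")

pynvml.nvmlShutdown()
```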

Let's just hope that multi-monitor idle power gets addressed soon, too.

View at TechPowerUp Main Site
 
Strange!!! No one talks about the power consumption of the new NVIDIA card. Nearly Vega 64 level.
 
The RTX 2080 Ti eats the same as a GTX 970 at idle... What do you mean, "not good enough"?? Might as well sell that "useless" monster card to me & get one of those discounted GT 1030s that no one wants, if you think the idle power draw is "bad". TBH, it's still better off than that AMD flop, the Vega 64, which unfortunately is still getting its ass kicked by the "useless" full-fledged TU102 card in most gaming benches.
 
Probably because it's twice as fast, with the addition of Tensor and RT cores, while still on 12 nm, and it still consumes less.
 
The RTX 2080 Ti eats the same as a GTX 970 at idle... What do you mean, "not good enough"?? Might as well sell that "useless" monster card to me & get one of those discounted GT 1030s that no one wants, if you think the idle power draw is "bad".
Talking about gaming consumption, which isn't good.
 
Talking about gaming consumption, which isn't good.
2080TI is 45% faster than Vega 64, while consuming 5-10% less power. What are you talking about?
 
Strange!!! No one talks about the power consumption of the new NVIDIA card. Nearly Vega 64 level.


RT and Tensor cores aren't being taxed yet. 1/3 of the die isn't being utilized yet.

It'd be interesting to see how it reacts, or if there is any impact. Does power usage go up, or does it clock down?
 
2080TI is 45% faster than Vega 64, while consuming 5-10% less power. What are you talking about?
50% is half of 100%.

[Relative performance chart, 3840x2160]



2080 is 45% faster than V64, 78/54=1.444x
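Spelling out the arithmetic from the chart's normalized 4K scores quoted in this thread (Vega 64 at 54%, RTX 2080 at 78%, and assuming the RTX 2080 Ti is the 100% baseline):

```python
# Speedup implied by normalized relative-performance scores (4K values quoted above).
vega_64, rtx_2080, rtx_2080_ti = 54, 78, 100

print(f"RTX 2080    vs Vega 64: {rtx_2080 / vega_64 - 1:.0%} faster")     # ~44% faster
print(f"RTX 2080 Ti vs Vega 64: {rtx_2080_ti / vega_64 - 1:.0%} faster")  # ~85% faster
```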
 
Cost of RTX :confused:

Although you're not taking into account the increased performance.
By this logic, Vega 64 looks much better. More performance with 20W to 25W more heat.
 
What puzzles me is the fact that NVIDIA could get a bunch of extra performance out of the GTX 1080 Ti at the same power consumption as the GTX 980 Ti. Now they're just pumping more watts into these RTX cards to get even the smallest of performance gains over the last generation.
 
This could turn into a deal-breaker for many semi-professional users looking to use Turing not just for gaming, but for productivity with multiple monitors.

No.

An extra 30 W is almost nothing in the long term. People who shell out at the very least $500 for a GPU, plus the cost of an extra monitor, couldn't care less about this extra wattage.
 
What puzzles me is the fact that NVIDIA could get a bunch of extra performance out of the GTX 1080 Ti at the same power consumption as the GTX 980 Ti. Now they're just pumping more watts into these RTX cards to get even the smallest of performance gains over the last generation.

Could lithography be a major factor?

980 Ti: 28 nm
1080 Ti: 16 nm = 0.57x shrink
2080 Ti: 12 nm = 0.75x shrink
 
Strange!!! No one talks about the power consumption of the new NVIDIA card. Nearly Vega 64 level.

HERP DERP GTX 2080 CONSUMES AS MUCH POWER AS VEGA LET'S JUST IGNORE THAT IT'S TWICE AS FAST

What puzzles me is the fact that NVIDIA could get a bunch of extra performance out of the GTX 1080 Ti at the same power consumption as the GTX 980 Ti. Now they're just pumping more watts into these RTX cards to get even the smallest of performance gains over the last generation.

HERP DERP A 35% PERFORMANCE GAIN OVER LAST GENERATION IS SMALL
 
You need to learn how to read a graph. The 2080 TI is 85% faster than Vega 64.
Oh yeah, you're right. That's even better!
 
The multi-monitor power draw is still pretty bad compared to the last generation. I have a large main monitor and a smaller one to the side, just like their testing setup. Drawing that much at idle is not great.
 
HERP DERP A 35% PERFORMANCE GAIN OVER LAST GENERATION IS SMALL
If that were the gain from the 2080 to the 2080 Ti, that would be impressive. As-is, 35% isn't bad, but I'm not actually impressed like I was with the 1080. Add in the cost and the regression in power usage, and I think you'll see why people aren't jumping on the bandwagon, even if it isn't an objectively bad card.

Also, that all-caps herp derp is a *fantastic* argument.
 
I am not impressed by the power use. RTX looks like something from a generation or two older compared to Pascal, not to mention comparing the GTX 1080 Ti against the RTX 2080 Ti. Idle is not so bad, but then it goes downhill. Twice as much in Blu-ray and multi-monitor is laughable.

Just another good reason to stay on Pascal instead of these overpriced RTX cards. I will stick to my GTX 1080 Ti for sure.
 
Strange!!! No one talks about the power consumption of the new NVIDIA card. Nearly Vega 64 level.

How FUD works 101.

From the charts above, multi-monitor:
Vega - 17W
2080Ti - 58W

But mentioning Vega "in a certain context" did its job.

So, yeah, nearly like Vega, if you meant 3 times Vega, or almost 4 times Vega.

PS
And I don't believe the spin to be accidental, sorry.
 
Strange!!! No one talks about the power consumption of the new NVIDIA card. Nearly Vega 64 level.
If you think power consumption alone was the problem, you completely missed the conversations in the past.

The problem with AMD's power consumption was that Vega 64, for instance, drew WAY more power (and ran way hotter) than a 1080, hell, more than a 1080 Ti, and only performed at 1080 levels some of the time.

That is not an issue here, as the 2080 Ti is drawing less power than a Vega 64 while curb-stomping it performance-wise. Now, if AMD came out with a magic GPU that performed at 2080 Ti level but only pulled 2080 power, then NVIDIA would be criticized for being so inefficient by comparison.

No.

An extra 30 W is almost nothing in the long term. People who shell out at the very least $500 for a GPU, plus the cost of an extra monitor, couldn't care less about this extra wattage.
Yes, cost isn't the issue; lower OC headroom and extra heat are the problems.
 
If you're concerned about power draw during Blu-ray playback, you're probably not in the market for an RTX 2080 (Ti).

BTW... you guys realize Perf/Watt is still up 18%, right?
 
How FUD works 101.

From the charts above, multi-monitor:
Vega - 17W
2080Ti - 58W

But mentioning Vega "in a certain context" did its job.

So, yeah, nearly like Vega, if you meant 3 times Vega, or almost 4 times Vega.

PS
And I don't believe the spin to be accidental, sorry.
I was talking about single monitor, isn't that clear enough from the text?
 
I was talking about single monitor, isn't that clear enough from the text?
Of course it is.
To me.
Is it clear to IceShroom? A rhetorical question.
 
(and ran way hotter) than a 1080,

I still don't understand why this information is going around. Compare both reference cards to each other and they stop at damn near the exact same temperatures, because they both have hard-locked temperature limits enabled.

Vega 64's temperature limit is 85 °C, the GTX 1080's is 83 °C. Is 2 °C really so much higher that it's causing that much of a problem?

Vega 64 temperatures at different power/performance presets:

[temperature chart]

GTX 1080 Founders Edition temperatures, stock and overclocked:

[temperature chart]
 