Thursday, September 27th 2018

NVIDIA Fixes RTX 2080 Ti & RTX 2080 Power Consumption. Tested. Better, But Not Good Enough

While conducting our first reviews of NVIDIA's new GeForce RTX 2080 and RTX 2080 Ti, we noticed surprisingly high non-gaming power consumption from NVIDIA's latest flagship cards. Back then, we reached out to NVIDIA, who confirmed that this was a known issue that would be fixed in an upcoming driver.

Today the company released version 411.70 of their GeForce graphics driver, which, besides adding Game Ready support for new titles, includes the promised fix for the RTX 2080 & RTX 2080 Ti.

We gave this new version a quick spin, using our standard graphics card power consumption testing methodology, to check how much things have improved.

As you can see, single-monitor idle power consumption is much better now, down to acceptable levels. Even though the numbers are not as low as Pascal's, the improvement is substantial, reaching idle values similar to AMD's Vega. Blu-ray power is improved a little bit. Multi-monitor power consumption, which was really terrible, hasn't seen any improvement at all. This could turn into a deal breaker for many semi-professional users looking at using Turing not just for gaming, but also for productivity with multiple monitors. An extra power draw of more than 40 W over Pascal will quickly add up to real dollars for PCs that run all day, even though they're not used for gaming most of the time.
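To put a rough number on that, here's a quick back-of-the-envelope sketch; the 40 W figure is from our charts, while the electricity rate and daily runtime are illustrative assumptions:

```python
# Back-of-the-envelope: yearly cost of ~40 W extra idle draw over Pascal.
# The rate and runtime are illustrative assumptions, not measurements.
extra_watts = 40        # extra multi-monitor idle draw vs. Pascal (from our charts)
hours_per_day = 12      # assumed duty cycle of an all-day workstation
usd_per_kwh = 0.12      # assumed electricity rate

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * usd_per_kwh:.2f}/year")
# 175 kWh/year -> $21.02/year under these assumptions
```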
We also tested gaming power consumption and Furmark, for completeness. Nothing to report here; Turing is still the most power-efficient architecture on the planet (for gaming power).
The table above shows monitored clocks and voltages for the non-gaming power states, and it looks like NVIDIA did several things. First, the memory frequency in single-monitor idle and Blu-ray has been reduced by 50%, which definitely helps with power. Second, for the RTX 2080, the idle voltages have also been lowered slightly, to bring them in line with the idle voltages of the RTX 2080 Ti. I'm sure there are additional under-the-hood improvements to power management, internal ones that are not visible to any monitoring.
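For readers who want to sanity-check the new idle behavior on their own card, a minimal sketch like the one below polls nvidia-smi (bundled with the driver) for core clock, memory clock, and board power; the query fields are standard nvidia-smi options, and exact readings will vary by card and monitor configuration:

```python
# Poll idle clocks and board power via nvidia-smi (ships with the GeForce driver).
import subprocess
import time

QUERY = "clocks.gr,clocks.mem,power.draw"

for _ in range(5):
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    # Example output: "300 MHz, 405 MHz, 20.50 W" -- values vary by card
    print(result.stdout.strip())
    time.sleep(2)
```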

Let's just hope that multi-monitor idle power gets addressed soon, too.

53 Comments on NVIDIA Fixes RTX 2080 Ti & RTX 2080 Power Consumption. Tested. Better, But Not Good Enough

#1
IceShroom
Strange!!! No one talks about the power consumption of the new NVIDIA cards. Nearly Vega 64 level.
#2
Tsukiyomi91
The RTX 2080 Ti eats the same as a GTX 970 at idle... What do you mean, "not good enough"?? Might as well sell that "useless" monster card to me and get one of those discounted GT 1030s that no one wants, if you think the idle power draw is "bad". TBH, it's still better off than that AMD flop, the Vega 64, which is unfortunately still getting its ass kicked by the "useless" full-fledged TU102 card in most gaming benches.
#3
cucker tarlson
Prolly cause it's twice as fast, with the addition of Tensor and RT cores, while still on 12 nm, and still consumes less.
#4
R0H1T
IceShroom: Strange!!! No one talks about the power consumption of the new NVIDIA cards. Nearly Vega 64 level.
Cost of RTX :confused:

Although you're not taking into account the increased performance.
#5
T4C Fantasy
CPU & GPU DB Maintainer
Tsukiyomi91: The RTX 2080 Ti eats the same as a GTX 970 at idle... What do you mean, "not good enough"?? Might as well sell that "useless" monster card to me and get one of those discounted GT 1030s that no one wants, if you think the idle power draw is "bad".
Talking about gaming consumption, which isn't good.
#6
ZeppMan217
T4C Fantasy: Talking about gaming consumption, which isn't good.
The 2080 Ti is 45% faster than Vega 64, while consuming 5-10% less power. What are you talking about?
#7
Xzibit
IceShroom: Strange!!! No one talks about the power consumption of the new NVIDIA cards. Nearly Vega 64 level.
The RT and Tensor cores aren't being taxed yet. A third of the die isn't being utilized yet.

It'd be interesting to see how it reacts, or whether there's any impact. Does power usage go up, or does it clock down?
#8
cucker tarlson
ZeppMan217: The 2080 Ti is 45% faster than Vega 64, while consuming 5-10% less power. What are you talking about?
50% is half of 100%.

The 2080 is 45% faster than the V64: 78/54 = 1.444x.
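Spelled out, with the 2080 Ti normalized to 100% (a minimal sketch; the 54 and 78 are the chart values quoted above, and the 100 baseline is an assumption consistent with the 85% figure mentioned later in the thread):

```python
# Relative performance from the summary chart, 2080 Ti normalized to 100%
# (the 100 baseline is an assumption, not a figure from this thread).
vega64, rtx2080, rtx2080ti = 54, 78, 100

print(f"2080 vs Vega 64:    {rtx2080 / vega64 - 1:.1%}")    # ~44.4% faster
print(f"2080 Ti vs Vega 64: {rtx2080ti / vega64 - 1:.1%}")  # ~85.2% faster
```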
#9
IceShroom
R0H1T: Cost of RTX :confused:

Although you're not taking into account the increased performance.
By this logic, Vega 64 looks much better. More performance with 20W to 25W more heat.
#10
LDNL
What puzzles me is the fact that NVIDIA could get a bunch of performance out of the GTX 1080 Ti for the same power consumption as the GTX 980 Ti. Now they're just pumping more watts into these RTX cards to get even the smallest of performance gains over the last generation.
#11
robb
ZeppMan217: The 2080 Ti is 45% faster than Vega 64, while consuming 5-10% less power. What are you talking about?
You need to learn how to read a graph. The 2080 Ti is 85% faster than Vega 64.
#12
birdie
This could turn into a deal breaker for many semi-professional users looking at using Turing, not just for gaming, but productivity with multiple monitors.
No.

An extra 30 W is almost nothing in the long term. People who shell out at the very least $500 for a GPU, plus an extra monitor, couldn't care less about this extra wattage.
#13
mouacyk
LDNL: What puzzles me is the fact that NVIDIA could get a bunch of performance out of the GTX 1080 Ti for the same power consumption as the GTX 980 Ti. Now they're just pumping more watts into these RTX cards to get even the smallest of performance gains over the last generation.
Could lithography be a major factor?

980 Ti: 28 nm
1080 Ti: 16 nm = 0.57x shrink
2080 Ti: 12 nm = 0.75x shrink
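Those shrink factors are just ratios of the nominal node names; note these are marketing labels (TSMC's "12nm" is a refined 16nm process, so the real scaling is smaller than the names suggest). A minimal sketch of the math:

```python
# Ratios of the nominal node names quoted above (marketing labels, not
# actual transistor dimensions).
nodes = [("980 Ti", 28), ("1080 Ti", 16), ("2080 Ti", 12)]

for (prev, prev_nm), (cur, cur_nm) in zip(nodes, nodes[1:]):
    print(f"{prev} -> {cur}: {cur_nm}/{prev_nm} = {cur_nm / prev_nm:.2f}x")
# 980 Ti -> 1080 Ti: 16/28 = 0.57x
# 1080 Ti -> 2080 Ti: 12/16 = 0.75x
```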
#14
ZeppMan217
robb: You need to learn how to read a graph. The 2080 Ti is 85% faster than Vega 64.
Oh yeah, you're right. That's even better!
#15
Andromos
The multi-monitor power draw is still pretty bad compared to last generation. I have a large main monitor and a smaller one to the side, just like their testing setup. Drawing that much at idle is not great.
#16
bubbleawsome
Assimilator: HERP DERP A 35% PERFORMANCE GAIN OVER LAST GENERATION IS SMALL
If that were the gain from the 2080 to the 2080 Ti, that would be impressive. As-is, 35% isn't bad, but I'm not actually impressed like I was with the 1080. Add in the cost and the regression in power usage, and I think you'll see why people aren't jumping on the bandwagon, even if it isn't an objectively bad card.

Also that all caps herp derp is a *fantastic* argument
#17
Tomgang
I am not impressed by the power use. RTX looks like something from a gen or two older compared to Pascal, not to mention GTX 1080 Ti vs. RTX 2080 Ti. Idle is not so bad, but then it goes downhill. Twice as much in Blu-ray, and multi-monitor is laughable.

Just another good reason to stay on Pascal instead of these overpriced RTX cards. I will stick to my GTX 1080 Ti for sure.
#18
medi01
IceShroom: Strange!!! No one talks about the power consumption of the new NVIDIA cards. Nearly Vega 64 level.
How FUD works 101.

From the charts above, multi-monitor:
Vega - 17 W
2080 Ti - 58 W

But mentioning Vega "in certain context" did its job.

So, yeah, nearly like Vega, if you meant 3 times Vega, or almost 4 times Vega.

PS
And I don't believe the spin to be accidental, sorry.
#19
TheinsanegamerN
IceShroom: Strange!!! No one talks about the power consumption of the new NVIDIA cards. Nearly Vega 64 level.
If you think power consumption alone was the problem, you completely missed the conversations in the past.

The problem with AMD's power consumption was that Vega 64, for instance, drew WAY more power (and ran way hotter) than a 1080, hell, more than a 1080 Ti, and only performed at 1080 levels some of the time.

That is not an issue here, as the 2080 Ti is drawing less power than a Vega 64 while curb-stomping it performance-wise. Now, if AMD came out with a magic GPU that performed at 2080 Ti level but only pulled 2080 power, then NVIDIA would be criticized for being so inefficient by comparison.
birdie: No.

An extra 30 W is almost nothing in the long term. People who shell out at the very least $500 for a GPU, plus an extra monitor, couldn't care less about this extra wattage.
Yes, cost isn't the issue; lower OC headroom and extra heat are the problems.
#20
Blueberries
If you're concerned about power draw during Blu-ray playback, you're probably not in the market for an RTX 2080 (Ti).

BTW... you guys realize perf/watt is still up 18%, right?
#21
W1zzard
medi01: How FUD works 101.

From the charts above, multi-monitor:
Vega - 17 W
2080 Ti - 58 W

But mentioning Vega "in certain context" did its job.

So, yeah, nearly like Vega, if you meant 3 times Vega, or almost 4 times Vega.

PS
And I don't believe the spin to be accidental, sorry.
I was talking about single-monitor; isn't that clear enough from the text?
#22
medi01
W1zzard: I was talking about single-monitor; isn't that clear enough from the text?
Of course it is.
To me.
Is it clear to IceShroom? A rhetorical question.
#23
Fouquin
TheinsanegamerN: (and ran way hotter) than a 1080,
I still don't understand why this information is going around. Compare both reference cards to each other and they stop at damn near the exact same temperatures because they both have hard locked temperature limits enabled.

Vega 64's temperature limit is 85°C, GTX 1080's temperature limit is 83°C. Is 2°C really so much higher that it's causing that much of a problem?

[Chart: Vega 64 temperatures at different power/performance presets]

[Chart: GTX 1080 Founders Edition temperatures, stock and overclocked]
#24
cucker tarlson
Fouquin: I still don't understand why this information is going around. Compare both reference cards to each other and they stop at damn near the exact same temperatures because they both have hard locked temperature limits enabled.

Vega 64's temperature limit is 85°C, GTX 1080's temperature limit is 83°C. Is 2°C really so much higher that it's causing that much of a problem?

[Chart: Vega 64 temperatures at different power/performance presets]

[Chart: GTX 1080 Founders Edition temperatures, stock and overclocked]
lol, but at what RPM? I had a 1080 FE, and I can attest that stock it runs at 50% fan and is fairly quiet; it's at 70%+ that it starts to sound annoying. The temps argument is usually, if not always, about the noise: low temps = the ability to turn the fan down, high temps = the need for noisy cooling. 32-34 dB is quiet, 37 dB is quite manageable, 40 is quite loud for most people, 45 is noisy as hell.
#25
Fouquin
cucker tarlson: The temps argument is usually, if not always, about the noise
The original statement was that the card runs significantly hotter, not louder. There was no mention of noise. You're deflecting the argument at your convenience to another attribute that was not mentioned anywhere in the above comments. I won't argue against your point, because you are not wrong that the acoustics of the Vega 64 are higher, but that doesn't change the maximum temperature limit of the card, which was the point of discussion. Regardless of fan noise, the card is limited to 85°C out of the box.