
NVIDIA Fixes RTX 2080 Ti & RTX 2080 Power Consumption. Tested. Better, But Not Good Enough

Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I still don't understand why this information is going around. Compare both reference cards to each other and they stop at damn near the exact same temperatures, because they both have hard-locked temperature limits.

Vega 64's temperature limit is 85 °C; the GTX 1080's is 83 °C. Is 2 °C really so much higher that it's causing that much of a problem?

Vega 64 temperatures at different power/performance presets:

(From here)

GTX 1080 Founders Edition temperatures, stock and overclocked:

(From here)
lol, but at what RPM? I had a 1080 FE, and I can attest that at stock it runs at 50% fan and is fairly quiet; it's above 70% that it starts to sound annoying. The temps argument is usually, if not always, about the noise: low temps mean you can turn the fan down, high temps mean you need noisy cooling. 32-34 dB is quiet, 37 dB is quite manageable, 40 dB is quite loud for most people, and 45 dB is noisy as hell.
 
Joined
May 30, 2015
Messages
1,873 (0.58/day)
Location
Seattle, WA
The temps argument is usually, if not always, about the noise

The original statement was that the card runs significantly hotter, not louder. There was no mention of noise. You're deflecting the argument, at your convenience, to another attribute that was not mentioned anywhere in the comments above. I won't argue against your point, because you are not wrong that the acoustics of the Vega 64 are worse, but that doesn't change the maximum temperature limit of the card, which was the point of discussion. Regardless of fan noise, the card is limited to 85 °C out of the box.
 
Joined
Nov 24, 2017
Messages
853 (0.36/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
How FUD works 101.

From the charts above, multi-monitor:
Vega - 17 W
2080 Ti - 58 W

But mentioning Vega "in a certain context" did its job.

So, yeah, "nearly like Vega" only if you meant nearly 3.5 times Vega (58 W vs. 17 W).

PS
And I don't believe the spin to be accidental, sorry.
I was talking about average gaming power consumption. I didn't look carefully at the idle numbers, as I assumed Nvidia is "efficient" at gaming, so they might sip next to no power with multiple monitors.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.98/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
That fix came quickly, good show for NVIDIA.

@W1zzard Did you try testing the power consumption using different screen resolutions? I think the cards will show differences there and I suggest using 2K and 4K video modes since they're the most common.
 

no_

New Member
Joined
Sep 28, 2018
Messages
1 (0.00/day)
If your past results showed 11 W idle for the 1080 Ti, then your methodology is misleading, intentionally or otherwise.

The 10 series has a widely documented problem with idle power consumption (which you guys, being "journalists", should already know), popularly believed to stem from mixed refresh rates (for example, one or more 60 Hz monitors alongside one or more higher-refresh monitors such as 144 Hz) causing the card to not actually idle when it should, resulting in power draw right around ... guess what ... 40 watts.

However, contrary to that popular belief, the problem doesn't come directly from the mixed refresh rates, but from the mix of which ports are used on the card, with DisplayPort in particular being the culprit. If you only use HDMI and/or DVI, the card will actually idle ... but the moment you use DisplayPort in a multi-monitor setup, because you're "supposed" to use DisplayPort for higher refresh rates, it won't. Don't believe me? Telling yourself "yeah, I could totally prove you wrong if I had the hardware tools for measuring at-the-wall power, but I'm too lazy to do that right now"? Good news: you don't need hardware tools. All you need is Afterburner or Precision, whichever tool you prefer that can show the current clocks (to see whether the card is actually in an idle state). Core clock around 400 MHz? Idle. 1000+ MHz? Not idle. Comprende?

Plug your multiple monitors into HDMI and/or DVI and NOT DisplayPort, and watch what happens to the clock speed in Afterburner/Precision: WHAT THE FUCK, THE CARD IS ACTUALLY IDLING NOW ..... Or, if you only have one HDMI port and no DVI because you have an FE, plug the higher-refresh monitor into HDMI and the lower-refresh one(s) into DisplayPort: WHAT THE FUCK, THE CARD IS ACTUALLY IDLING NOW.

You're welcome.

Oh, and before you say "hurr durr, my higher-refresh monitor won't work on HDMI because it's not 2.0" ..... how sure are you that it's not 2.0? Try it anyway, go into the NVIDIA Control Panel, and see what's there. Wait ... do you mean ... that people on the internet (like NVIDIA and OEMs) would go on the internet and just tell lies ...?

You're welcome, again.
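For anyone who wants to check this without eyeballing Afterburner, here is a minimal sketch that polls the current graphics clock through nvidia-smi. It assumes nvidia-smi is on your PATH, and the 400 MHz cutoff is the rough figure from the post above, not an official threshold:

```python
# Poll the GPU core clock to see whether the card is really in its idle state.
# Assumes the NVIDIA driver's nvidia-smi utility is installed and on PATH.
import subprocess
import time

def core_clock_mhz(gpu: int = 0) -> int:
    """Return the current graphics clock of the given GPU in MHz."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu}",
         "--query-gpu=clocks.gr",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip())

# Sample for ~20 seconds; swap monitor cables between runs and compare.
for _ in range(10):
    mhz = core_clock_mhz()
    state = "idle" if mhz <= 400 else "NOT idle"
    print(f"core clock: {mhz} MHz ({state})")
    time.sleep(2)
```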
 
Joined
Mar 24, 2012
Messages
528 (0.12/day)
No.

An extra 30 W is almost nothing in the long term. People who shell out at least $500 for a GPU, plus more for an extra monitor, couldn't care less about this extra wattage.

Haha, I remember back in 2010 when they said using a GTX 480 would make your bill skyrocket by the end of the month compared to a 5870, for a minimal performance increase. Some people ended up doing extensive calculations about it, and concluded that even over a year the difference in electricity cost between the 5870 and the 480 was very small (I don't remember the exact value, but it was something like $10-$15 more for the GTX 480 system). Now people complain that Turing draws just a bit more than Pascal and say they were happy choosing Pascal over Turing.
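For scale, here is a minimal sketch of that back-of-the-envelope electricity math, applied to the extra 30 W under discussion in this thread; the hours-per-day and price-per-kWh figures are assumptions, not values from the original discussions:

```python
# Rough annual cost of a GPU drawing an extra 30 W while gaming.
extra_watts = 30
hours_per_day = 3      # assumed daily gaming time
usd_per_kwh = 0.13     # assumed electricity rate

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * usd_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# ~32.9 kWh/year -> about $4.27/year with these assumptions
```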
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
I like efficiency just as much as the next guy, but the argument that running these cards costs meaningfully more than running a more efficient one is ridiculous, unless you are buying them by the pallet for a mining farm or something, in which case that little bit of extra power is multiplied across a high number of cards under constant load. While the (still) high multi-monitor power consumption is a little unsettling, in the real world it hardly matters.

Pascal was a bit of an outlier. Due to a lack of competition, NVIDIA simply tweaked Maxwell and, along with a die shrink, increased both performance and efficiency; the power draw was superb. Now, with Turing, we have really big dies thanks to all the RT hardware (a big die means inefficiency) and more shaders/CUDA cores/whatever they're called now.

Next time competition heats up, especially in the high-performance segment, nobody will give a flying whale turd about power draw; the focus will be on getting the best performance, because that's what wins sales. While we like efficiency, almost nobody will buy a lower-performing product because it's more efficient. We need FPS!
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
The original statement was that the card runs significantly hotter, not louder. There was no mention of noise. You're deflecting the argument, at your convenience, to another attribute that was not mentioned anywhere in the comments above. I won't argue against your point, because you are not wrong that the acoustics of the Vega 64 are worse, but that doesn't change the maximum temperature limit of the card, which was the point of discussion. Regardless of fan noise, the card is limited to 85 °C out of the box.
Wrong.

The original point was: "drew WAY more power (and ran way hotter)". You deflected it into a point about the temperature limit, while it was about gaming temps and power all along. Also, you can keep arguing that temps and noise can be discussed as separate issues, but not with me anymore. If one card runs at 83 °C at 50% fan while the other sits at 85 °C at 90%, we have a clear temperature winner. At 90% fan my GTX 1080 FE stayed at 70 °C with a 2063/11000 OC, IIRC; I only had it for a couple of months.
 
Joined
May 30, 2015
Messages
1,873 (0.58/day)
Location
Seattle, WA
Wrong.

The original point was: "drew WAY more power (and ran way hotter)". You deflected it into a point about the temperature limit, while it was about gaming temps and power all along.

I quite intentionally addressed exactly what I quoted from that post, as it was the only part I disagreed with. You introduced an entirely different variable, in this case fan noise, as if it were an argument against what I said about the cards having a thermal limit.

If one card runs at 83 °C at 50% fan while the other sits at 85 °C at 90%, we have a clear temperature winner. At 90% fan my GTX 1080 FE stayed at 70 °C with a 2063/11000 OC, IIRC; I only had it for a couple of months.

You appear to be throwing out unsubstantiated numbers as if they were fact, attempting to support them with subjective anecdotes, and then declaring a "winner" based on that information. If you read TPU's very own review of the Vega 64 (which I linked in my first post), you'll notice that the 'Fan Noise' page says, "AMD has limited the fan's maximum speed to ensure it doesn't get even noisier, even when the card climbs up to 85°C, its maximum temperature level."

This rather pointedly dismisses your outlandish "90% fan speed" statement, and again reinforces my statement that the cards have a maximum thermal limit a mere 2 °C above the GTX 1080's, not dependent on fan speed, high or low.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,731 (3.41/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
Meh, in my eyes if you force a fan to be quiet, but also have a temperature limit that can't be surpassed... there's only one thing left to do: throttle the card. Not good.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
What puzzles me is the fact that NVIDIA could get a bunch of extra performance out of the GTX 1080 Ti for the same power consumption as the GTX 980 Ti. Now they're just pumping more watts into these RTX cards to get even the smallest of performance gains over the last generation.
No, they could not. The 1080 Ti, like the rest of the Pascal line, hits a pretty solid wall somewhere between 2000 and 2100 MHz, and increasing power consumption brings diminishing returns from there. Yes, the retail cards are voltage-limited, but extreme overclockers have shown clearly that scaling voltage up from there immediately destroys efficiency. It was much the same for the 980 Ti and Maxwell, and so far Turing seems to be in the same boat as Pascal, with maybe 50-100 MHz more to be had. All of this is defined by the process as well as the architecture.

Could lithography be a major factor?
980 TI: 28nm
1080 TI: 16nm = 0.57x shrink
2080 TI: 12nm = 0.75x shrink
Lithography is definitely a factor. So is architecture; Nvidia has said a lot of Pascal's transistors went to increasing clock speed, and the same remains true for Turing.
While 28 nm to 16 nm is a real node shrink, 16 nm to 12 nm actually is not. 12 nm is more optimized (mainly for power), but it is basically the same process.

The multi-monitor power draw is still pretty bad compared to last generation. I have a large main monitor and a smaller one to the side, just like their testing setup. Drawing that much at idle is not great.
I have two 1440p monitors (and a 4K TV that is connected half the time), and my 2080 so far does not go below 50 W even with the latest 411.70 drivers. It is not the first time Nvidia has fucked up the multi-monitor idle modes, and it will not be the last. It sounds like something they should be able to fix in the drivers, though.

If your past results showed 11 W idle for the 1080 Ti, then your methodology is misleading, intentionally or otherwise.

The 10 series has a widely documented problem with idle power consumption (which you guys, being "journalists", should already know), popularly believed to stem from mixed refresh rates (for example, one or more 60 Hz monitors alongside one or more higher-refresh monitors such as 144 Hz) causing the card to not actually idle when it should, resulting in power draw right around ... guess what ... 40 watts.

However, contrary to that popular belief, the problem doesn't come directly from the mixed refresh rates, but from the mix of which ports are used on the card, with DisplayPort in particular being the culprit. If you only use HDMI and/or DVI, the card will actually idle ... but the moment you use DisplayPort in a multi-monitor setup, because you're "supposed" to use DisplayPort for higher refresh rates, it won't. Don't believe me? Telling yourself "yeah, I could totally prove you wrong if I had the hardware tools for measuring at-the-wall power, but I'm too lazy to do that right now"? Good news: you don't need hardware tools. All you need is Afterburner or Precision, whichever tool you prefer that can show the current clocks (to see whether the card is actually in an idle state). Core clock around 400 MHz? Idle. 1000+ MHz? Not idle. Comprende?
Interesting. I was seeing that exact problem with the 1080 Ti for a while, but it was resolved at some point in the drivers. The 2080, on the other hand, still seems to run into it.

@W1zzard, what monitors did you have in multi-monitor testing and which ports? :)
Also, have you re-tested the Vega 64, or are you using older results? About a month and a couple of driver releases after launch, my Vega 64 started to idle at some incredibly low frequency, around 32 MHz, with 1-2 monitors, and reported single-digit power consumption at that.

This rather pointedly dismisses your outlandish "90% fan speed" statement, and again reinforces my statement that the cards have a maximum thermal limit a mere 2 °C above the GTX 1080's, not dependent on fan speed, high or low.
Cards do have a thermal limit, but that is usually at 100-105 °C.
83, 85, or 90 °C are the usual suspects for the maximum temperature you see, because that is the target the fan controller is configured to hold.
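To make that concrete, here is a toy sketch of a target-holding fan controller; every number in it is an illustrative assumption, not any vendor's actual firmware value. Once the fan duty cycle is capped, the only lever left is dropping clocks, which is the throttling case mentioned above:

```python
# Toy model of a GPU fan controller that tries to hold a target temperature.
TARGET_C = 85                # configured target, e.g. Vega 64's 85 °C
FAN_MIN, FAN_MAX = 25, 100   # duty-cycle limits (%); a capped FAN_MAX is
                             # exactly why a "quiet" card can end up throttling

def fan_duty(temp_c: float, gain: float = 8.0) -> float:
    """Proportional response: start ramping 10 °C below the target."""
    error = temp_c - (TARGET_C - 10)
    return max(FAN_MIN, min(FAN_MAX, FAN_MIN + gain * error))

# Below target the fan loafs; at/above target it pegs at FAN_MAX, and from
# there the card's only remaining option is to drop clocks.
for t in (65, 75, 83, 85, 90):
    print(f"{t} °C -> {fan_duty(t):.0f}% fan")
```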
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
18,932 (2.85/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Line6 UX1 + some headphones, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
VR HMD Acer Mixed Reality Headset
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
No.

An extra 30 W is almost nothing in the long term. People who shell out at least $500 for a GPU, plus more for an extra monitor, couldn't care less about this extra wattage.

If it had competition, I'd definitely take a card pulling 30 W less at idle.
 
Joined
Nov 5, 2014
Messages
714 (0.21/day)
So they released a new driver to combat idle power draw, and with multiple screens (as most people buying a $1k+ card have) it's even worse at idle than the old driver. Why even bother releasing it? lol
 
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
If your past results showed 11 W idle for the 1080 Ti, then your methodology is misleading, intentionally or otherwise.

The 10 series has a widely documented problem with idle power consumption (which you guys, being "journalists", should already know), popularly believed to stem from mixed refresh rates (for example, one or more 60 Hz monitors alongside one or more higher-refresh monitors such as 144 Hz) causing the card to not actually idle when it should, resulting in power draw right around ... guess what ... 40 watts.

However, contrary to that popular belief, the problem doesn't come directly from the mixed refresh rates, but from the mix of which ports are used on the card, with DisplayPort in particular being the culprit. If you only use HDMI and/or DVI, the card will actually idle ... but the moment you use DisplayPort in a multi-monitor setup, because you're "supposed" to use DisplayPort for higher refresh rates, it won't. Don't believe me? Telling yourself "yeah, I could totally prove you wrong if I had the hardware tools for measuring at-the-wall power, but I'm too lazy to do that right now"? Good news: you don't need hardware tools. All you need is Afterburner or Precision, whichever tool you prefer that can show the current clocks (to see whether the card is actually in an idle state). Core clock around 400 MHz? Idle. 1000+ MHz? Not idle. Comprende?

Plug your multiple monitors into HDMI and/or DVI and NOT DisplayPort, and watch what happens to the clock speed in Afterburner/Precision: WHAT THE FUCK, THE CARD IS ACTUALLY IDLING NOW ..... Or, if you only have one HDMI port and no DVI because you have an FE, plug the higher-refresh monitor into HDMI and the lower-refresh one(s) into DisplayPort: WHAT THE FUCK, THE CARD IS ACTUALLY IDLING NOW.

You're welcome.

Oh, and before you say "hurr durr, my higher-refresh monitor won't work on HDMI because it's not 2.0" ..... how sure are you that it's not 2.0? Try it anyway, go into the NVIDIA Control Panel, and see what's there. Wait ... do you mean ... that people on the internet (like NVIDIA and OEMs) would go on the internet and just tell lies ...?

You're welcome, again.

I have a GTX 1070 with two monitors connected via DP and HDMI (different resolutions, same refresh rates), and my idle core clock is 135 MHz. 400 MHz is absolutely not idle in any way, shape, or form.

I'm still trying to decipher what this claimed bug is from your rant... are you saying there is an issue when using monitors with different refresh rates AND one of them is plugged into DP? Or is it only when more than one monitor is used and one of them is on DP? The only reference to this I can find online is from 2016; can you maybe give me some recent links so I can do further research?
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Two and three connected monitors behave differently, sometimes considerably so.
Not @no_, but I guess I can answer that: different refresh rates, with one monitor plugged into DP (which is often enough a requirement for high refresh rates).

That power bug link you found is related, but as far as I remember it was not the exact bug, as there were still problems after the fix for it.

I honestly haven't found a good source for this either. It only occasionally comes up in conversations on other topics: mixed refresh rate monitors, high refresh rate monitors, sometimes driver updates, power consumption/tweaking threads, etc. For what it's worth, I have not seen this in a long time with Nvidia 10-series GPUs. It used to be a problem, but not any more.
 
Joined
Feb 18, 2017
Messages
688 (0.26/day)
Strange!!! No one talks about the power consumption of the new Nvidia card. Nearly Vega 64 level.
Of course it offers plenty more performance. However, it's strange that when an AMD card does this, it gets the treatment from NV users, but when their card does the same, they keep quiet. :)

The 2080 Ti is 45% faster than Vega 64 while consuming 5-10% less power. What are you talking about?


You know the Vega 64 is capable of a huge undervolt? You don't even have to do it yourself; there are preset BIOSes. The most efficient one lowers consumption by nearly 100 W.
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I quite intentionally addressed exactly what I quoted from that post, as it was the only part I disagreed with. You introduced an entirely different variable, in this case fan noise, as if it were an argument against what I said about the cards having a thermal limit.



You appear to be throwing out unsubstantiated numbers as if they were fact, attempting to support them with subjective anecdotes, and then declaring a "winner" based on that information. If you read TPU's very own review of the Vega 64 (which I linked in my first post), you'll notice that the 'Fan Noise' page says, "AMD has limited the fan's maximum speed to ensure it doesn't get even noisier, even when the card climbs up to 85°C, its maximum temperature level."

This rather pointedly dismisses your outlandish "90% fan speed" statement, and again reinforces my statement that the cards have a maximum thermal limit a mere 2 °C above the GTX 1080's, not dependent on fan speed, high or low.
Limited to what? 45 dBA.

You know the Vega 64 is capable of a huge undervolt? You don't even have to do it yourself; there are preset BIOSes. The most efficient one lowers consumption by nearly 100 W.

And it drops performance to sub-1070 Ti levels.
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
Why should they, when perf/W is the best it has ever been?

These power numbers are not really that bad for desktop usage anyway. Where it really matters is the mobile space, and I kind of doubt we will see RTX cards in laptops anytime soon. Maybe some TU106 variant will make it, but then Nvidia will have a dilemma over how to market it. Pascal was all about aligning naming between desktop and laptop parts, so will they call the RTX 2070 their highest mobile card, or go back to the laptop variant being a tier lower than the desktop one?
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Of course it offers plenty more performance. However, it's strange that when an AMD card does this, it gets the treatment from NV users, but when their card does the same, they keep quiet. :)
Now, imagine these two situations:
1. Card A consumes 20 units of power and has 20 units of performance. Card B consumes 20 units of power and has 35 units of performance.
2. Card A consumes 20 units of power and has 20 units of performance. Card B consumes 12 units of power and has 20 units of performance.
In which situation and for which card is power consumption a problem?
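Reducing the two scenarios to performance per watt makes the point explicit; a trivial sketch using the arbitrary "units" from above:

```python
# Performance per watt for the two hypothetical situations above.
scenarios = {
    "Situation 1, Card A": (20, 20),  # (power units, performance units)
    "Situation 1, Card B": (20, 35),
    "Situation 2, Card A": (20, 20),
    "Situation 2, Card B": (12, 20),
}
for name, (power, perf) in scenarios.items():
    print(f"{name}: {perf / power:.2f} performance per unit of power")
# Card A delivers 1.00 both times; Card B delivers 1.75 and 1.67.
# Power draw is only a "problem" for the card that does less with it.
```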
You know the Vega 64 is capable of a huge undervolt? You don't even have to do it yourself; there are preset BIOSes. The most efficient one lowers consumption by nearly 100 W.
No, it is not. You can do a minor undervolt from stock settings, but it won't help much with anything. All the "undervolt" articles are really about overclocking and include raising the power limit to 150%.
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
it's strange that when an AMD card does this, it gets the treatment from NV users, but when their card does the same, they keep quiet. :)

Yes, because people are going to complain about the most power-efficient card out there. Vega can't even beat the 28 nm, 600 mm² GM200 with GDDR5 in efficiency.
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
Now, imagine these two situations:
1. Card A consumes 20 units of power and has 20 units of performance. Card B consumes 20 units of power and has 35 units of performance.
2. Card A consumes 20 units of power and has 20 units of performance. Card B consumes 12 units of power and has 20 units of performance.
In which situation and for which card is power consumption a problem?
No, it is not. You can do a minor undervolt from stock settings, but it won't help much with anything. All the "undervolt" articles are really about overclocking and include raising the power limit to 150%.

Hmh. Why do I remember UV first being done on Vega to work around an OC bug in the early days? I think the best approach for Vega is not to OC the GPU at all, but to UV it and OC the HBM2 frequency. But all this is off-topic anyway, so sorry, and carry on with the Turing power talk.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Hmh. Why do I remember UV first being done on Vega to work around an OC bug in the early days? I think the best approach for Vega is not to OC the GPU at all, but to UV it and OC the HBM2 frequency. But all this is off-topic anyway, so sorry, and carry on with the Turing power talk.
OC bugs were different, and there were several. If I remember correctly, in the initial couple of versions of the tools the memory voltage slider actually set the GPU voltage, some P-states were tricky, etc.
Undervolting is a real thing, as the GPU is (arguably) overvolted, but it does not really do much at stock settings; you get about 20-30 MHz from undervolting while leaving everything else the same.
An undervolt is definitely a requirement for overclocking Vega, as OC potential without one is very limited. Even at a 150% power limit you will not see a considerable boost, and probably not even the spec boost clock, without undervolting first. And this does not reduce power consumption either.
 
Joined
Jun 28, 2018
Messages
299 (0.14/day)
Strange!!! No one talks about the power consumption of the new Nvidia card. Nearly Vega 64 level.

Yeah, a card that consumes less power in gaming than a Vega 64 and offers 85% more performance... what a joke Nvidia is.

C'mon people, this trend of defending the "underdog" at all costs is getting pretty ridiculous at this point.
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
As far as efficiency goes, the 2080 Ti is pretty friggin' sweet: 750 mm², runs 4K like my 1080 does 1440p (awesome), and packs RT and tensor cores. It's the usefulness of those extra features that is in doubt, not the card's efficiency. Then again, a $700 or $1,000 card is not for those who want something ordinary. I just wish they made a $600 variant of the 2080 with the RT and tensor hardware gone; I'd still buy the $699 full chip, but a lot of people would be happier.
 