
GPU Clock decreases as temperature goes up

Joined
Oct 20, 2017
Messages
6 (0.00/day)
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
Hello fine ladies and gentlemen. This is my first post in this forum :)

So straight to the problem... About a month ago I bought an MSI Lightning Z 1080 Ti. It was a great card, but its RGB LEDs were malfunctioning so I had to send it back. I then decided to buy a Gigabyte Aorus GTX 1080 Ti Xtreme Edition since the Lightning Z wasn't available anymore.
The problem I have with my Gigabyte Aorus is that the GPU clock keeps decreasing as the temperature goes up. At first (when running a game), Afterburner shows the GPU clock as 1987 MHz; then the temperature reaches 58°C and the clock drops to 1974 MHz. The temperature then climbs to 63°C and the clock goes down again, to 1949 MHz. When the temperature reaches 71°C, the clock decreases to 1932 MHz.

I know this is probably NVIDIA's GPU Boost at work, but why didn't I have this problem with my Lightning Z 1080 Ti? On my previous Lightning Z 1080 Ti, regardless of the temperature, the GPU clock was a constant 1962 MHz (as stated on this page). Sometimes the temperature got as high as 66°C, yet it still held 1962 MHz. But my Aorus GTX 1080 Ti keeps decreasing the GPU clock as the temperature goes up. Am I missing something? Is there a way to prevent this using Afterburner or something?

My second question: why can't I reach these clock profiles with my Gigabyte Aorus GTX 1080 Ti Xtreme? I never see 2025 MHz or 1993 MHz on the GPU clock.

Thanks in advance.
 
Joined
Nov 18, 2010
Messages
7,124 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
It should start to decrease from 45°C, and that's exactly what it's doing. Read up first...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,029 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
GPU clock decreasing as temperature increases is a feature of GPU Boost 3.0. As far as I know there is no way to change its behavior or disable it.


Lightning Z
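
For illustration, here's a minimal Python sketch of what temperature-dependent boost bins look like, using the clocks and temperatures reported in the first post as example thresholds (these numbers are the OP's observations, not NVIDIA's actual algorithm, which also weighs power and voltage limits):

def boost_clock_mhz(temp_c):
    # Illustrative bins taken from the clocks/temps reported above.
    bins = [(58, 1987), (63, 1974), (71, 1949)]  # (below this temp in C, run this clock in MHz)
    for limit, clock in bins:
        if temp_c < limit:
            return clock
    return 1932  # observed clock at 71°C and above

for t in (50, 60, 65, 72):
    print(t, "C ->", boost_clock_mhz(t), "MHz")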
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
@W1zzard
Yes there is. Cool it better, either by improving case ventilation/cooling or by ramping up the fans on the graphics card.

EDIT:
I have the regular AORUS GTX 1080 Ti, and there is another way: manually bump up the clocks. That also changes how boost behaves. It'll keep higher clocks unless the temperatures get so high that it decreases the clocks regardless...
 
Last edited:
Joined
Oct 20, 2017
Messages
6 (0.00/day)
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
GPU clock decreasing as temperature increases is a feature of GPU Boost 3.0. As far as I know there is no way to change its behavior or disable it.


Lightning Z

Yes, I know it's GPU Boost 3.0, but I think on my Aorus GTX 1080 Ti Xtreme it reduces the GPU clock more aggressively than my previous Lightning Z did. For example, the Lightning Z held a GPU clock of 1962 MHz while running at ~65-66°C, but on my Aorus GTX 1080 Ti, GPU Boost drops the clock to 1949 MHz at the same temperature. Why is that?

Also, do you have the same graph for the Aorus GTX 1080 Ti Xtreme Edition? That could shed more light on the subject.

@W1zzard
Yes there is. Cool it better, either by improving case ventilation/cooling or by ramping up the fans on the graphics card.

EDIT:
I have the regular AORUS GTX 1080 Ti, and there is another way: manually bump up the clocks. That also changes how boost behaves. It'll keep higher clocks unless the temperatures get so high that it decreases the clocks regardless...

I tried changing the fan curve in Afterburner to keep my AORUS GTX 1080 Ti from going above 66°C, but no luck. Even at 100 percent fan speed, the GPU temperature reaches 71°C in FurMark tests. Room temperature was ~20°C.
 
Last edited by a moderator:

sneekypeet

Retired Super Moderator
Joined
Apr 12, 2006
Messages
29,409 (4.47/day)
System Name EVA-01
Processor Intel i7 13700K
Motherboard Asus ROG Maximus Z690 HERO EVA Edition
Cooling ASUS ROG Ryujin III 360 with Noctua Industrial Fans
Memory Patriot Viper Elite RGB 96GB @ 6000MHz.
Video Card(s) Asus ROG Strix GeForce RTX 3090 24GB OC EVA Edition
Storage Addlink S95 M.2 PCIe GEN 4x4 2TB
Display(s) Asus ROG SWIFT OLED PG42UQ
Case Thermaltake Core P3 TG
Audio Device(s) Realtek on board > Sony Receiver > Cerwin Vegas
Power Supply be quiet DARK POWER PRO 12 1500W
Mouse ROG STRIX Impact Electro Punk
Keyboard ROG STRIX Scope TKL Electro Punk
Software Windows 11
Joined
Jan 8, 2017
Messages
8,925 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Yes, I know it's GPU Boost 3.0, but I think on my Aorus GTX 1080 Ti Xtreme it reduces the GPU clock more aggressively than my previous Lightning Z did. For example, the Lightning Z held a GPU clock of 1962 MHz while running at ~65-66°C, but on my Aorus GTX 1080 Ti, GPU Boost drops the clock to 1949 MHz at the same temperature. Why is that?

Turbo Boost is very fine-grained (13 MHz per step), so much so that differences in silicon quality can make two cards of the same model reach slightly different clocks. Your card is fine; OC it if you want more.
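
As a quick sanity check (my own arithmetic, not anything from a spec sheet), the clocks quoted in this thread do roughly line up with ~13 MHz bins:

# Differences between the clocks quoted in this thread, expressed in ~13 MHz bins.
clocks = [1987, 1974, 1962, 1949, 1932]  # MHz
for hi, lo in zip(clocks, clocks[1:]):
    print(f"{hi} -> {lo}: {hi - lo} MHz, about {(hi - lo) / 13:.1f} bin(s)")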
 
Joined
Oct 20, 2017
Messages
6 (0.00/day)
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
Turbo Boost is very fine-grained (13 MHz per step), so much so that differences in silicon quality can make two cards of the same model reach slightly different clocks. Your card is fine; OC it if you want more.

You mean GPU Boost, right? Because Turbo Boost is Intel's technology for CPUs.
But anyway... yeah, I think compared to my previous card, the Lightning Z, the silicon lottery plays a role here too. I guess right now the only way to maintain high clock rates is to provide better cooling for the card. Though as I said, I've already tried a 100% fan curve at ~65-66°C, but the temperature still climbs to 71°C and thus the GPU clock decreases even more (more than the Lightning Z at the same temperature). Shame really :(
 
Joined
Sep 17, 2014
Messages
20,902 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
You can try giving the core clock a small bump and see how stable that is. In PrecisionX you can also set clocks per boost bin and make your own curve.

Don't expect miracles; GPU Boost 3.0 is already well tuned as it is, and pretty locked down.
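
To show the idea behind a per-bin curve, here's a minimal sketch with made-up numbers (this is not the actual PrecisionX or Afterburner API, just an illustration of what a curve of voltage points with individual clock targets means):

# Made-up voltage/clock points standing in for a boost curve; the curve editors
# let you move each point individually instead of applying one flat offset.
vf_curve = {0.900: 1848, 0.950: 1911, 1.000: 1962, 1.050: 2000}  # volts -> MHz

def flat_offset(curve, offset_mhz):
    # A flat offset shifts every point by the same amount.
    return {v: clk + offset_mhz for v, clk in curve.items()}

print(flat_offset(vf_curve, 25))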
 
Joined
Jul 2, 2008
Messages
8,069 (1.40/day)
Location
Hillsboro, OR
System Name Main/DC
Processor i7-3770K/i7-2600K
Motherboard MSI Z77A-GD55/GA-P67A-UD4-B3
Cooling Phanteks PH-TC14CS/H80
Memory Crucial Ballistix Sport 16GB (2 x 8GB) LP /4GB Kingston DDR3 1600
Video Card(s) Asus GTX 660 Ti/MSI HD7770
Storage Crucial MX100 256GB/120GB Samsung 830 & Seagate 2TB(died)
Display(s) Asus 24' LED/Samsung SyncMaster B1940
Case P100/Antec P280 It's huge!
Audio Device(s) on board
Power Supply SeaSonic SS-660XP2/Seasonic SS-760XP2
Software Win 7 Home Premium 64 Bit
the GPU temperature reaches 71°C in FurMark tests
Are you only seeing this problem with FurMark? Is the card able to stay cool when running something like Heaven? You really should stay away from FurMark.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Hello fine ladies and gentlemen. This is my first post in this forum :)

So straight to the problem... About a month ago I bought an MSI Lightning Z 1080 Ti. It was a great card, but its RGB LEDs were malfunctioning so I had to send it back. I then decided to buy a Gigabyte Aorus GTX 1080 Ti Xtreme Edition since the Lightning Z wasn't available anymore.
The problem I have with my Gigabyte Aorus is that the GPU clock keeps decreasing as the temperature goes up. At first (when running a game), Afterburner shows the GPU clock as 1987 MHz; then the temperature reaches 58°C and the clock drops to 1974 MHz. The temperature then climbs to 63°C and the clock goes down again, to 1949 MHz. When the temperature reaches 71°C, the clock decreases to 1932 MHz.

I know this is probably NVIDIA's GPU Boost at work, but why didn't I have this problem with my Lightning Z 1080 Ti? On my previous Lightning Z 1080 Ti, regardless of the temperature, the GPU clock was a constant 1962 MHz (as stated on this page). Sometimes the temperature got as high as 66°C, yet it still held 1962 MHz. But my Aorus GTX 1080 Ti keeps decreasing the GPU clock as the temperature goes up. Am I missing something? Is there a way to prevent this using Afterburner or something?

My second question: why can't I reach these clock profiles with my Gigabyte Aorus GTX 1080 Ti Xtreme? I never see 2025 MHz or 1993 MHz on the GPU clock.

Thanks in advance.

Different cards, different coolers. GPUs depend on ample, unobstructed airflow and on ambient temperature, i.e. room air temperature. The cooler the room air, the better GPUs run.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.24/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Different cards, different coolers. GPUs depend on ample, unobstructed airflow and on ambient temperature, i.e. room air temperature. The cooler the room air, the better GPUs run.

That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's, which isn't surprising considering Gigabyte's crappy quality.

But at the same time, Jayztwocents just did a video where he had two identical cards, same manufacturer and model, and they boosted to different clocks. So my theory is that ASIC quality plays some role in it as well.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's, which isn't surprising considering Gigabyte's crappy quality.

But at the same time, Jayztwocents just did a video where he had two identical cards, same manufacturer and model, and they boosted to different clocks. So my theory is that ASIC quality plays some role in it as well.

That's another big factor too.

It's interesting; in a 7770 thread I stated that not all cards are created equal.
 
Joined
Sep 17, 2014
Messages
20,902 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Realistically we're talking about less than 2-3% deviation in clocks between top-end cards from different manufacturers. Worst case, you'll see 4-5%. Honestly, even if you count in the silicon lottery, this is pretty neat. Real-life performance of all these cards is remarkably close; at a typical res of 1440p or 4K that's what, 3-5 FPS.
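
A back-of-the-envelope check, assuming (roughly) that frame rate scales linearly with core clock and a hypothetical 100 FPS baseline (neither figure is a measurement, just an illustration):

# Ballpark FPS cost of a small clock deficit, assuming near-linear scaling
# of frame rate with core clock (a simplification).
baseline_fps = 100  # hypothetical frame rate
for deviation in (0.02, 0.03, 0.05):
    print(f"{deviation:.0%} lower clock -> roughly {baseline_fps * deviation:.1f} FPS lost")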
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,267 (2.52/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
Haaaaaa... my card doesn't downclock at higher temps unless I allow it to pass 65°C. It stays at the 2012 MHz I have it set to hit.

Honestly, as others have said, keep it cool. Gigabyte might have keyed their profiles more toward cooling than MSI does.
 
Joined
Apr 30, 2011
Messages
2,651 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Stop using FurMark to check clocks or thermals. Use a benchmark such as 3DMark or Heaven. AMD and NVIDIA added protection checks to their drivers years ago to limit the bashing FurMark gives their GPUs.
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxilary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
The point is, GPU boosting technologies still suck. All they do is prevent proper overclocking out of the box.

It's good when it's there, but it makes your day horrible when it's not needed and gets in the way.

Btw, Gigabyte's cards are pretty weak sauce on average when it comes to overclocking/thermals, because getting a good GPU die from them is pretty rare IMO. They can slap a fancy cooling solution on it, but that doesn't make the processor itself better quality.

I haven't noticed much difference between the OC, G1, and AORUS cards, while EVGA's segments tend to actually do something.
 
Last edited:
Joined
Jul 2, 2008
Messages
8,069 (1.40/day)
Location
Hillsboro, OR
System Name Main/DC
Processor i7-3770K/i7-2600K
Motherboard MSI Z77A-GD55/GA-P67A-UD4-B3
Cooling Phanteks PH-TC14CS/H80
Memory Crucial Ballistix Sport 16GB (2 x 8GB) LP /4GB Kingston DDR3 1600
Video Card(s) Asus GTX 660 Ti/MSI HD7770
Storage Crucial MX100 256GB/120GB Samsung 830 & Seagate 2TB(died)
Display(s) Asus 24' LED/Samsung SyncMaster B1940
Case P100/Antec P280 It's huge!
Audio Device(s) on board
Power Supply SeaSonic SS-660XP2/Seasonic SS-760XP2
Software Win 7 Home Premium 64 Bit
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Oh geez, I guess I didn't catch that in the second post.

The point still remains that he said he was seeing it in games first... then that morsel came in on top. :)
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The point is, GPU boosting technologies still suck. All they do is prevent proper overclocking out of the box.

It's good when it's there, but it makes your day horrible when it's not needed and gets in the way.

Btw, Gigabyte's cards are pretty weak sauce on average when it comes to overclocking/thermals, because getting a good GPU die from them is pretty rare IMO. They can slap a fancy cooling solution on it, but that doesn't make the processor itself better quality.

I haven't noticed much difference between the OC, G1, and AORUS cards, while EVGA's segments tend to actually do something.

I prefer my cards to run at full throttle at all times, or to specify on my own which profile to use when a program launches; variable clock rates tend to cause some screwy issues.
 
Joined
Feb 18, 2012
Messages
2,715 (0.61/day)
System Name MSI GP76
Processor intel i7 11800h
Cooling 2 laptop fans
Memory 32gb of 3000mhz DDR4
Video Card(s) Nvidia 3070
Storage x2 PNY 8tb cs2130 m.2 SSD--16tb of space
Display(s) 17.3" IPS 1920x1080 240Hz
Power Supply 280w laptop power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 with win 7 shell
Benchmark Scores Good enough for me
If you download and install Afterburner, you can undervolt the graphics card so it runs at a lower voltage, which means lower temps.
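
A rough sketch of why undervolting helps, using the usual dynamic-power approximation P ∝ f·V² (the voltages below are illustrative assumptions, not measurements of this card):

# Dynamic power scales roughly with frequency * voltage^2, so a modest
# undervolt at the same clock cuts power draw (and therefore heat) noticeably.
def relative_power(freq_mhz, volts, ref_freq=1974.0, ref_volts=1.062):
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

print(f"1974 MHz at 1.000 V: about {relative_power(1974, 1.000):.0%} of stock power")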
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Not really... it just does some of it for you.

Not to mention GPU Boost 3.0 does it so well that it almost reaches what cards can achieve with a manual overclock. I wonder if NVIDIA will be able to push it even further and really squeeze every MHz out of the box, basically making manual overclocking unnecessary. I'd actually quite like that, because I'm lazy and don't want to be bothered installing 3rd-party stuff to achieve the same results. :D
 
Joined
Oct 20, 2017
Messages
6 (0.00/day)
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
OK, so I ran some tests to find out how I can counter GPU Boost's habit of decreasing the GPU clock at higher temperatures.
First and foremost, obviously, was temperature. I couldn't change the fact that GPU Boost drops the GPU clock from 1974 MHz to 1949 MHz at 63°C; however, I did manage to raise the fan speed enough to keep the GPU under 63°C and thus hold a stable 1974 MHz. This was achievable in games and benchmarks, but not in stress tests (for example the Firestrike Ultra stress test).
The second factor was the power target. By monitoring power consumption in Afterburner I found a sweet spot for the power target offset in 4K tests, which was 140%. Anything beyond that was pointless, since power consumption never went higher than this. Of course, this required more fan speed to keep the GPU cool and prevent the clock from dropping below 1974 MHz. I should also mention that this power target isn't necessary in most games at 1080p; there, something around 100% or 110% was enough to maintain 1974 MHz. More demanding games like Ghost Recon Wildlands were the exception, of course. I hope Gigabyte raises the low default power target in a future BIOS update. (See the quick watts calculation after the third point below.)
And third was voltage, where I did notice some limitations at the stock offset, but I didn't tamper with it because I didn't know what a safe offset would be. Again, this limit showed up in Afterburner at 4K and not at 1080p. Plus, I'd already reached a stable 1974 MHz as I said, so I didn't feel the need to touch it.
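
For the power target point above, here's a quick sanity calculation of what those percentages mean in watts, assuming (hypothetically) a 250 W default board power limit, which is the reference GTX 1080 Ti figure; the Aorus Xtreme's actual default may be higher:

# Hypothetical example: translate power target percentages into watt ceilings,
# assuming a 250 W default limit (reference GTX 1080 Ti figure).
default_limit_w = 250
for target in (1.00, 1.10, 1.40):
    print(f"{target:.0%} power target -> {default_limit_w * target:.0f} W ceiling")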

So overall, given the Aorus's price compared to my previous Lightning Z, I can say I'm pretty happy with my card at the moment. It's true that it doesn't run as cool as the Lightning Z (71°C at full load on the Aorus versus a maximum of 66°C on the Lightning Z), but as @Vayra86 said, the difference between these clock rates (for example 1974 MHz, 1963 MHz, or even 1949 MHz) is just 1 or 2 percent max and not even noticeable in in-game FPS. Maybe at 4K this can lead to a difference of 1 or 2 FPS at most.


You can try giving the core clock a small bump and see how stable that is. In PrecisionX you can also set clocks per boost bin and make your own curve.

Don't expect miracles; GPU Boost 3.0 is already well tuned as it is, and pretty locked down.

Well, my goal was to maintain high clock rates out of the box, without overclocking, because as I said (and you said it too), the difference between these GPU clock rates isn't that big. But I confess, seeing the clock rate go up and down in games can sometimes make you a bit sensitive or edgy :)

Are you only seeing this problem with FurMark? Is the card able to stay cool when running something like Heaven? You really should stay away from FurMark.

Well, I should have mentioned that this problem only occurs in the FurMark stress test (Firestrike Ultra) and not in games. I haven't tried Heaven, since games are OK. Maybe you or others can report your temperatures in the Firestrike Ultra stress test? That would be helpful.

That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's, which isn't surprising considering Gigabyte's crappy quality.

I think it's true that Gigabyte's quality is actually crappy compared to, let's say, MSI. But my last-gen card was a Gigabyte GTX 980 Ti Xtreme Edition, which I think was one of the finest cards 1-2 years ago. So manufacturers kind of have their own bumpy road in terms of product quality, I think.
 