
GPU Clock decreases as temperature goes up

Joined
Oct 20, 2017
Messages
6 (0.11/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#1
Hello, fine ladies and gentlemen. This is my first post on this forum :)

So straight to the problem... About a month ago I bought an MSI Lightning Z 1080 Ti. It was a great card, but its RGB LEDs were malfunctioning, so I had to send it back. I then decided to buy a Gigabyte Aorus GTX 1080 Ti Xtreme Edition, since the Lightning Z wasn't available anymore.
The problem I have with the Gigabyte Aorus is that the GPU clock drops steadily as the temperature rises. At first (when running a game), Afterburner shows the GPU clock at 1987 MHz; once the temperature reaches 58 °C, the clock drops to 1974 MHz. At 63 °C it falls to 1949 MHz, and when the temperature reaches 71 °C, it's down to 1932 MHz.
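
In case it helps, this is the kind of logging I mean. Just a rough sketch using the pynvml NVML bindings (I used Afterburner for the actual numbers above, so treat this as illustrative only):
Code:
# Rough sketch: log GPU core clock against temperature once per second.
# Assumes the pynvml bindings are installed (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    for _ in range(120):
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # C
        print(f"{temp} C -> {clock} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()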

I know this is probably NVIDIA's GPU Boost at work, but why didn't I have this problem with my Lightning Z 1080 Ti? On the Lightning Z, regardless of temperature, the GPU clock was a constant 1962 MHz (as stated on this page). Sometimes the temperature got as high as 66 °C and it still held 1962 MHz. But my Aorus GTX 1080 Ti keeps lowering the GPU clock as the temperature goes up. Am I missing something? Is there a way to prevent this using Afterburner or something?

My second question... Why can't I reach those clock profiles with my Gigabyte Aorus GTX 1080 Ti Xtreme? I've never seen 2025 MHz or 1993 MHz on the GPU clock.

Thanks in advance.
 
Joined
Nov 18, 2010
Messages
3,831 (1.48/day)
Likes
2,220
Location
Rīga, Latvia
System Name HELLSTAR
Processor Intel 5820K @ 4.6GHz
Motherboard Gigabyte GA-X99-UD3
Cooling Custom Loop. 360+240 rads.
Memory 4x8GB Corsair Vengeance LPX 3200MHz 15-17-17-35
Video Card(s) ASUS 1080 Ti FE + water block
Storage Optane 32GB + Samsung 950Pro 256GB NVMe + 750 EVO 500GB
Display(s) Philips PHL BDM3270
Case Phanteks Enthoo Evolv ATX Tempered Glass
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer Deathstalker
Software Windows 10 insider
#2
It should start decreasing from 45 °C, and that's exactly what it's doing. Read up on GPU Boost first...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,026 (3.43/day)
Likes
17,893
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#3
GPU clock decreasing as temperature increases is a feature of GPU Boost 3.0. As far as I know, there is no way to change its behavior or disable it.


Lightning Z
 
Joined
Oct 2, 2004
Messages
12,351 (2.56/day)
Likes
5,809
Location
Europe\Slovenia
System Name Dark Silence 2
Processor Intel Core i7 5820K @ 4.5 GHz (1.15V)
Motherboard MSI X99A Gaming 7
Cooling Cooler Master Nepton 120XL
Memory 32 GB DDR4 Kingston HyperX Fury 2400 MHz @ 2666 MHz 15-15-15-32 1T (1.25V)
Video Card(s) AORUS GeForce GTX 1080Ti 11GB (1950/11000 OC Mode)
Storage Samsung 850 Pro 2TB SSD (3D V-NAND)
Display(s) ASUS VG248QE 144Hz 1ms (DisplayPort)
Case Corsair Carbide 330R Titanium
Audio Device(s) Creative Sound BlasterX AE-5 + Altec Lansing MX5021 (HiFi capacitors and OPAMP upgrade)
Power Supply BeQuiet! Dark Power Pro 11 750W
Mouse Logitech G502 Proteus Spectrum
Keyboard Cherry Stream XT Black
Software Windows 10 Pro 64-bit (Fall Creators Update)
#4
@W1zzard
Yes there is. Cool it better, either by improving case ventilation/cooling or by ramping up the fans on the graphics card.

EDIT:
I have the regular AORUS GTX 1080 Ti, and there is another way: manually bump up the clocks. That also changes how boost behaves. It'll hold higher clocks unless temperatures climb so high that it drops them regardless...
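
If you want to verify the fans are actually ramping, you can also read the fan duty cycle and temperature straight from the driver. A minimal sketch using the pynvml bindings (untested on the AORUS specifically, illustrative only):
Code:
# Illustrative only: read current fan duty cycle and GPU temperature via NVML.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # percent of max duty cycle
temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # C
print(f"fan at {fan}% with GPU at {temp} C")
pynvml.nvmlShutdown()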
 
Joined
Oct 20, 2017
Messages
6 (0.11/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#5
GPU clock decreasing as temperature increases is a feature of GPU Boost 3.0. As far as I know, there is no way to change its behavior or disable it.


Lightning Z
Yes, I know it's GPU Boost 3.0, but I think my Aorus GTX 1080 Ti Xtreme reduces the GPU clock more aggressively than my previous Lightning Z did. For example, the Lightning Z held a GPU clock of 1962 MHz while working at ~65-66 °C, but my Aorus GTX 1080 Ti drops to 1949 MHz at the same temperature. Why is that?

Also, do you have the same graph for the Aorus GTX 1080 Ti Xtreme Edition? That could clarify things further.

@W1zzard
Yes there is. Cool it better, either by improving case ventilation/cooling or by ramping up the fans on the graphics card.

EDIT:
I have the regular AORUS GTX 1080 Ti, and there is another way: manually bump up the clocks. That also changes how boost behaves. It'll hold higher clocks unless temperatures climb so high that it drops them regardless...
I tried changing the fan curve in Afterburner to keep my AORUS GTX 1080 Ti from going above 66 °C, but no luck. Even with the fans spinning at 100 percent, the GPU reaches 71 °C in FurMark tests. Room temperature was ~20 °C.
 

sneekypeet

Unpaid Babysitter
Staff member
Joined
Apr 12, 2006
Messages
24,613 (5.77/day)
Likes
9,340
System Name His
Processor Intel i7 5930K
Motherboard Asus Maximus V Extreme
Cooling Thermalright True Spirit 140 BW Rev.A
Memory Corsair Dominator Platinum ROG
Video Card(s) nVidia GTX 1080 SLI with EVGA Hybrid coolers.
Storage Samsung 960 Pro / Crucial MX300 750GB / Seagate 1TB Spinner
Display(s) Sony 43" 4K 60hz
Case SilverStone Temjin TJ11
Audio Device(s) Realtek on board > Sony receiver > Cerwin Vega's
Power Supply Thermaltake TPSG 1050W
Mouse Always Changing
Keyboard Always Changing
Software Windows 10 Pro 64
#6
Joined
Jan 8, 2017
Messages
1,672 (4.96/day)
Likes
884
System Name Good enough
Processor AMD FX-6300 - 4.5 Ghz
Motherboard ASRock 970M Pro3
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - 4x4GB A-DATA 1866 Mhz (OC)
Video Card(s) ASUS GTX 1060 Turbo 6GB ~ 2139 Mhz / 9.4 Gbps
Storage 1x Samsung 850 EVO 250GB , 1x 1 Tb Seagate something or other
Display(s) 1080p TV
Case Zalman R1
Power Supply 500W
#7
Yes, I know it's GPU Boost 3.0, but I think my Aorus GTX 1080 Ti Xtreme reduces the GPU clock more aggressively than my previous Lightning Z did. For example, the Lightning Z held a GPU clock of 1962 MHz while working at ~65-66 °C, but my Aorus GTX 1080 Ti drops to 1949 MHz at the same temperature. Why is that?
Turbo Boost is very fine-grained (13 MHz per step), so much so that differences in silicon quality can make two cards of the same model reach slightly different clocks. Your card is fine; overclock it if you want more.
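
To put those steps in perspective, here's a quick back-of-the-envelope using the clocks reported in this thread (numbers are illustrative only):
Code:
# GPU Boost moves in roughly 13 MHz steps ("bins"); the clocks below are
# the ones reported earlier in this thread, used purely for illustration.
BIN_MHZ = 13

def bins_apart(a_mhz, b_mhz):
    return abs(a_mhz - b_mhz) / BIN_MHZ

print(bins_apart(1962, 1949))  # Lightning Z vs. Aorus at ~65 C: 1.0 bin
print(bins_apart(1987, 1932))  # Aorus cold vs. Aorus at 71 C: ~4.2 bins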
 
Joined
Oct 20, 2017
Messages
6 (0.11/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#8
Turbo Boost is very fine-grained (13 MHz per step), so much so that differences in silicon quality can make two cards of the same model reach slightly different clocks. Your card is fine; overclock it if you want more.
You mean GPU Boost, right? Because Turbo Boost is Intel's technology for CPUs.
But anyway... yeah, I think compared to my previous card, the Lightning Z, the silicon lottery plays a role here too. I guess right now the only way to maintain high clock rates is to provide better cooling for the card. Though as I said, I've already tried a 100% fan curve at ~65-66 °C, but the temperature still climbs to 71 °C and thus the GPU clock drops even more (more than the Lightning Z at the same temperature). Shame really :(
 
Joined
Sep 17, 2014
Messages
3,496 (2.96/day)
Likes
2,732
Location
Duiven, Netherlands
Processor i5 3570k / 4.4Ghz @ 1.26v
Motherboard Gigabyte Z77X-D3H
Cooling Gelid Tranquillo Rev 2
Memory Corsair Vengeance 1600CL9 2x4GB
Video Card(s) MSI GTX 1080 GamingX @ 2100/5500
Storage Samsung 830 256gb SSD + Crucial BX100 250gb + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define R4
Power Supply EVGA G2 750w
Mouse Logitech G502 Proteus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
#9
You can try to give the core clock a small bump and see how stable that is. In PrecisionX you can also set clocks per boost bin and make your own curve.

Don't expect miracles; GPU Boost 3.0 is already finely tuned as it is, and pretty locked down.
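
To illustrate what "clocks per boost bin" means: PrecisionX (and Afterburner's curve editor) expose the voltage/frequency curve as individual points you can drag. The numbers below are invented, purely to show the idea:
Code:
# Hypothetical voltage (mV) -> clock (MHz) points; a real curve has many
# more bins, and these values are invented for illustration only.
vf_curve = {800: 1670, 900: 1810, 1000: 1911, 1050: 1962, 1093: 2012}

def flat_offset(curve, offset_mhz):
    """The 'small bump': shift every bin up by the same core offset."""
    return {mv: mhz + offset_mhz for mv, mhz in curve.items()}

print(flat_offset(vf_curve, 25))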
 
Joined
Jul 2, 2008
Messages
6,268 (1.82/day)
Likes
8,168
Location
Hillsboro, OR
System Name Main/DC
Processor i7-3770K/i7-2600K
Motherboard MSI Z77A-GD55/GA-P67A-UD4-B3
Cooling Phanteks PH-TC14CS/H80
Memory Crucial Ballistix Sport 16GB (2 x 8GB) LP /4GB Kingston DDR3 1600
Video Card(s) Asus GTX 660 Ti/MSI HD7770
Storage Crucial MX100 256GB/120GB Samsung 830 & Seagate 2TB(died)
Display(s) Asus 24" LED/Samsung SyncMaster B1940
Case P100/Antec P280 It's huge!
Audio Device(s) on board
Power Supply SeaSonic SS-660XP2/Seasonic SS-760XP2
Software Win 7 Home Premium 64 Bit
#10
the GPU reaches 71 °C in FurMark tests
Are you only seeing this problem with FurMark? Is the card able to stay cool running something like Heaven? You really should stay away from FurMark.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,194 (5.03/day)
Likes
4,798
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#11
Hello, fine ladies and gentlemen. This is my first post on this forum :)

So straight to the problem... About a month ago I bought an MSI Lightning Z 1080 Ti. It was a great card, but its RGB LEDs were malfunctioning, so I had to send it back. I then decided to buy a Gigabyte Aorus GTX 1080 Ti Xtreme Edition, since the Lightning Z wasn't available anymore.
The problem I have with the Gigabyte Aorus is that the GPU clock drops steadily as the temperature rises. At first (when running a game), Afterburner shows the GPU clock at 1987 MHz; once the temperature reaches 58 °C, the clock drops to 1974 MHz. At 63 °C it falls to 1949 MHz, and when the temperature reaches 71 °C, it's down to 1932 MHz.

I know this is probably NVIDIA's GPU Boost at work, but why didn't I have this problem with my Lightning Z 1080 Ti? On the Lightning Z, regardless of temperature, the GPU clock was a constant 1962 MHz (as stated on this page). Sometimes the temperature got as high as 66 °C and it still held 1962 MHz. But my Aorus GTX 1080 Ti keeps lowering the GPU clock as the temperature goes up. Am I missing something? Is there a way to prevent this using Afterburner or something?

My second question... Why can't I reach those clock profiles with my Gigabyte Aorus GTX 1080 Ti Xtreme? I've never seen 2025 MHz or 1993 MHz on the GPU clock.

Thanks in advance.
Different cards, different coolers. GPUs depend on ample, unobstructed airflow and on ambient temperature, i.e. room air temperature. The cooler the room air, the better GPUs run.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
24,274 (5.51/day)
Likes
10,361
Location
Indiana, USA
Processor Intel Core i7 4790K@4.6GHz
Motherboard AsRock Z97 Extreme6
Cooling Corsair H100i
Memory 32GB Corsair DDR3-1866 9-10-9-27
Video Card(s) ASUS GTX960 STRIX @ 1500/1900
Storage 480GB Crucial MX200 + 2TB Seagate Solid State Hybrid Drive with 128GB OCZ Synapse SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Corsair 650D Black
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
#12
Different cards, different coolers. GPUs depend on ample, unobstructed airflow and on ambient temperature, i.e. room air temperature. The cooler the room air, the better GPUs run.
That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's. Which isn't surprising, considering Gigabyte's crappy quality.

But at the same time, JayzTwoCents just did a video about how he had two identical cards, same manufacturer and model, and they boosted to different clocks. So my theory is that ASIC quality plays some role in it as well.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,194 (5.03/day)
Likes
4,798
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#13
That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's. Which isn't surprising, considering Gigabyte's crappy quality.

But at the same time, JayzTwoCents just did a video about how he had two identical cards, same manufacturer and model, and they boosted to different clocks. So my theory is that ASIC quality plays some role in it as well.
That's another big factor too.

It's interesting; on a 7770 thread I stated that not all cards are created equal.
 
Joined
Sep 17, 2014
Messages
3,496 (2.96/day)
Likes
2,732
Location
Duiven, Netherlands
Processor i5 3570k / 4.4Ghz @ 1.26v
Motherboard Gigabyte Z77X-D3H
Cooling Gelid Tranquillo Rev 2
Memory Corsair Vengeance 1600CL9 2x4GB
Video Card(s) MSI GTX 1080 GamingX @ 2100/5500
Storage Samsung 830 256gb SSD + Crucial BX100 250gb + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define R4
Power Supply EVGA G2 750w
Mouse Logitech G502 Proteus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
#14
Realistically we're talking about less than 2-3% deviation in clocks between top-end cards from different manufacturers. Worst case, you'll see 4-5%. Honestly, even if you factor in the silicon lottery, that's pretty tight. Real-life performance of all these cards is remarkably close; at a typical resolution of 1440p or 4K that's what, 3-5 FPS.
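
Back-of-the-envelope, assuming FPS scales at most 1:1 with core clock (in practice it scales worse, so these are upper bounds):
Code:
# Worst-case bound: pretend FPS scales 1:1 with the core clock deficit.
for fps in (60, 100):
    for deviation in (0.03, 0.05):
        print(f"{fps} FPS, {deviation:.0%} lower clock -> at most {fps * deviation:.1f} FPS lost")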
 
Joined
Mar 26, 2014
Messages
5,162 (3.80/day)
Likes
3,839
Location
Washington, USA
System Name Volt
Processor i7-4790k
Motherboard Gigabyte GA-Z97X Gaming 5
Cooling NZXT Kraken X60
Memory G.Skill Ripjaws 4x8GB
Video Card(s) MSI GTX1080 Ti Gaming X
Storage 250GB SSD / 3x1TB + 2x2TB HDD
Display(s) 3x AOC 2425W + ViewSonic VA2855Smh + 2x Dell E178FP
Case Phanteks Enthoo Pro
Audio Device(s) LucidSound LS30
Power Supply Rosewill Fortress 750w
Mouse G.Skill MX780 RGB
Keyboard G.Skill KM780 RGB (Brown switches)
Software Windows 10 Professional
Benchmark Scores Technical term is PEBCAK issue, which stands for Problem Exists Between Chair And Keyboard
#15
Haaaaaa... my card doesn't downclock at higher temps unless I allow it to pass 65 °C. It stays at the 2012 MHz I have it set to hit.

Honestly, as others have said, keep it cool. Gigabyte might key their boost profiles more heavily on cooling than MSI does.
 
Joined
Apr 30, 2011
Messages
802 (0.33/day)
Likes
736
Location
Greece
Processor AMD FX-8350 4GHz@1.3V
Motherboard Gigabyte GA-970A UD3 Rev3.0
Cooling Zalman CNPS5X Performa
Memory 2*4GB Patriot Venom RED DDR3 1600MHz CL9
Video Card(s) Sapphire 7950 3GB
Storage Sandisk SSD 120GB, 2 Samsung F1 & F3 (1TB)
Display(s) LG IPS235
Case Zalman Neo Z9 Black
Audio Device(s) Via 7.1 onboard
Power Supply OCZ Z550
Mouse Zalman ZM-M401R
Keyboard Logitech K120
Software Win 7 sp1 64bit
Benchmark Scores CB R15 64bit: single core 95p, multicore 626p, OpenGL 91.61 WPrime 1.55 (8 cores): 9.0 secs
#16
Stop using FurMark to check clocks or thermals. Use a benchmark such as 3DMark or Heaven. AMD and NVIDIA added safety checks to their drivers years ago to limit the bashing FurMark gives their GPUs.
 
Joined
Dec 31, 2009
Messages
11,471 (3.95/day)
Likes
6,245
Location
Ohio
System Name Daily Driver
Processor 7900X 4.5GHz 10c/10t 1.15V.
Motherboard ASUS Prime X299 Deluxe
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) EVGA GTX 1080 FTW3
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) Yamakasi 27" 2560x1440 IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#17
While true, the OP never mentioned FurMark??? He said "game".
 
Joined
Jan 13, 2016
Messages
353 (0.51/day)
Likes
269
Location
Behind you
System Name Warranty Void/Silicon Sadness
Processor AMD Ryzen 1600 @ 3.7Ghz 1.235v
Motherboard AsRock AB350M Pro4 Rev 1.0
Cooling Deepcool Captain EX 240 White powaah
Memory Corsair Vengeance LED DDR4 3200MHz 2x8GB
Video Card(s) Gigabyte GeForce GTX 460 1GB @ 875MHz Core
Storage ADATA SP550 250GB
Display(s) AOC G2778VQ Freesync
Case Fractal Design Define Mini C TG
Audio Device(s) Onboard - Realtek ALC889 - 7.1
Power Supply Corsair CX550M 550W
Mouse A4Tech Bloody V3/SteelSeries Rival 100
Keyboard Microsoft Comfort 3000 Curve/Asus Sagaris GK100
Software Windows 10 Pro x64
#18
Point is, GPU boosting technologies still suck. All they do is prevent proper overclocking out of the box.

It's good when it's there, but it makes your day horrible when it's not needed and gets in the way.

Btw, Gigabyte's cards are pretty weak sauce on average when it comes to overclocking/thermals, because getting a good GPU die from them is pretty rare IMO. They can slap a fancy cooling solution on it, but that doesn't make the chip itself better quality.

I haven't noticed much difference between OC, G1, and AORUS cards, while EVGA's segments actually tend to do something.
 
Joined
Dec 31, 2009
Messages
11,471 (3.95/day)
Likes
6,245
Location
Ohio
System Name Daily Driver
Processor 7900X 4.5GHz 10c/10t 1.15V.
Motherboard ASUS Prime X299 Deluxe
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) EVGA GTX 1080 FTW3
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) Yamakasi 27" 2560x1440 IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#19
Not really... it just does some of it for you.
 
Joined
Jul 2, 2008
Messages
6,268 (1.82/day)
Likes
8,168
Location
Hillsboro, OR
System Name Main/DC
Processor i7-3770K/i7-2600K
Motherboard MSI Z77A-GD55/GA-P67A-UD4-B3
Cooling Phanteks PH-TC14CS/H80
Memory Crucial Ballistix Sport 16GB (2 x 8GB) LP /4GB Kingston DDR3 1600
Video Card(s) Asus GTX 660 Ti/MSI HD7770
Storage Crucial MX100 256GB/120GB Samsung 830 & Seagate 2TB(died)
Display(s) Asus 24" LED/Samsung SyncMaster B1940
Case P100/Antec P280 It's huge!
Audio Device(s) on board
Power Supply SeaSonic SS-660XP2/Seasonic SS-760XP2
Software Win 7 Home Premium 64 Bit
#20
Joined
Dec 31, 2009
Messages
11,471 (3.95/day)
Likes
6,245
Location
Ohio
System Name Daily Driver
Processor 7900X 4.5GHz 10c/10t 1.15V.
Motherboard ASUS Prime X299 Deluxe
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) EVGA GTX 1080 FTW3
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) Yamakasi 27" 2560x1440 IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#21
Oh geez, I guess I didn't catch that in the second post.

The point still remains: he said he was seeing it in games first... then that morsel came in on top. :)
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,194 (5.03/day)
Likes
4,798
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#22
Point is, GPU boosting technologies still suck. All they do is prevent proper overclocking out of the box.

It's good when it's there, but it makes your day horrible when it's not needed and gets in the way.

Btw, Gigabyte's cards are pretty weak sauce on average when it comes to overclocking/thermals, because getting a good GPU die from them is pretty rare IMO. They can slap a fancy cooling solution on it, but that doesn't make the chip itself better quality.

I haven't noticed much difference between OC, G1, and AORUS cards, while EVGA's segments actually tend to do something.
I prefer my cards operating at full throttle at all times, or specifying on my own which profile to use when a program launches; variable clock rates tend to cause some screwy issues.
 
Joined
Feb 18, 2012
Messages
1,007 (0.47/day)
Likes
891
System Name Eurocom Tornado F5 laptop
Processor i7 7700k at 4.6ghz
Cooling 2 laptop fans
Memory 16gb of 2400mhz DDR4
Video Card(s) Nvidia 1070
Storage 2tb 2.5 inch HD, 1tb SSD m.2
Display(s) 15.6inch 120hz screen
Power Supply 230w power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 Pro with win 7 shell
Benchmark Scores Very high for a laptop with 1070 and desktop cpu
#23
If you download and install Afterburner, you can undervolt the graphics card to run at a lower voltage, which means lower temps.
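
Conceptually, undervolting in the curve editor means capping the voltage/frequency curve at a chosen point, so the card never requests more voltage than that. A sketch of the idea with invented numbers (not from a real card):
Code:
# Invented voltage (mV) -> clock (MHz) points, for illustration only.
vf_curve = {800: 1670, 900: 1810, 1000: 1911, 1050: 1962, 1093: 2012}

def undervolt(curve, cap_mv):
    """Cap the curve: every point above cap_mv runs at cap_mv's clock."""
    cap_clock = curve[max(mv for mv in curve if mv <= cap_mv)]
    return {mv: min(mhz, cap_clock) for mv, mhz in curve.items()}

# Hold ~1911 MHz at 1000 mV instead of boosting to 2012 MHz at 1093 mV.
print(undervolt(vf_curve, 1000))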
 
Joined
Oct 2, 2004
Messages
12,351 (2.56/day)
Likes
5,809
Location
Europe\Slovenia
System Name Dark Silence 2
Processor Intel Core i7 5820K @ 4.5 GHz (1.15V)
Motherboard MSI X99A Gaming 7
Cooling Cooler Master Nepton 120XL
Memory 32 GB DDR4 Kingston HyperX Fury 2400 MHz @ 2666 MHz 15-15-15-32 1T (1.25V)
Video Card(s) AORUS GeForce GTX 1080Ti 11GB (1950/11000 OC Mode)
Storage Samsung 850 Pro 2TB SSD (3D V-NAND)
Display(s) ASUS VG248QE 144Hz 1ms (DisplayPort)
Case Corsair Carbide 330R Titanium
Audio Device(s) Creative Sound BlasterX AE-5 + Altec Lansing MX5021 (HiFi capacitors and OPAMP upgrade)
Power Supply BeQuiet! Dark Power Pro 11 750W
Mouse Logitech G502 Proteus Spectrum
Keyboard Cherry Stream XT Black
Software Windows 10 Pro 64-bit (Fall Creators Update)
#24
Not really... it just does some of it for you.
Not to mention GPU Boost 3.0 does it so well it almost reaches what cards can achieve with a manual overclock. I wonder if NVIDIA will be able to push it even further and really squeeze every MHz out of the box, basically making manual overclocking unnecessary. I'd actually quite like that, because I'm lazy and don't want to be bothered with installing third-party stuff to achieve the same results. :D
 
Joined
Oct 20, 2017
Messages
6 (0.11/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Ripjaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#25
OK, so I ran some tests to find out how I can counter GPU Boost's habit of decreasing the GPU clock at higher temperatures.
First and foremost, obviously, was temperature. I couldn't change the fact that GPU Boost drops the clock from 1974 MHz to 1949 MHz at 63 °C; however, I did manage to raise the fan speed enough to keep the GPU under 63 °C and thus play at a stable 1974 MHz. This was achievable in games and benchmarks, but not in stress tests (for example the Fire Strike Ultra stress test).
The second factor was the power target. By monitoring power consumption through Afterburner, I found a sweet spot for the power target offset in 4K tests: 140%. Anything more than that was pointless, since power consumption never went higher. Of course, this required more fan speed to keep the GPU cool and keep the clock from dropping below 1974 MHz. I should also mention that this power target isn't necessary in most games at 1080p; there, around 100% or 110% was enough to maintain 1974 MHz. More demanding games like Ghost Recon Wildlands were the exception, of course. I hope Gigabyte raises the low default power target in a future BIOS update.
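
For reference, the power figures can also be read straight from the driver instead of Afterburner. A rough, illustrative sketch with the pynvml bindings (my actual monitoring was done in Afterburner):
Code:
# Illustrative only: compare current board power draw to the enforced limit.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0           # mW -> W
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # mW -> W
print(f"{draw_w:.0f} W of {limit_w:.0f} W ({100 * draw_w / limit_w:.0f}% of power target)")
pynvml.nvmlShutdown()
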
And third was voltage, where I noticed some limiting at the stock offset but didn't tamper with it, because I didn't know what a safe offset would be. Again, Afterburner showed this limit at 4K and not at 1080p. Plus, I'd already reached a stable 1974 MHz as I said, so I didn't feel the need to touch it.

So overall, given the Aorus's price compared to my previous Lightning Z, I'm pretty happy with the card at the moment. It's true it doesn't run as cool as the Lightning Z (71 °C at full load on the Aorus versus a maximum of 66 °C on the Lightning Z), but as @Vayra86 said, the difference between these clock rates (for example 1974 MHz, 1962 MHz, or even 1949 MHz) is just 1-2 percent at most and not even noticeable in in-game FPS. Maybe at 4K it amounts to 1-2 FPS at most.


You can try to give the core clock a small bump and see how stable that is. In PrecisionX you can also set clocks per boost bin and make your own curve.

Don't expect miracles; GPU Boost 3.0 is already finely tuned as it is, and pretty locked down.
Well, my goal was to maintain high clock rates out of the box, without overclocking. Because as I said, and you said it too, these GPU clock differences don't amount to much. But I confess, watching clock rates go up and down in games can sometimes make you a bit edgy :)

Are you only seeing this problem with FurMark? Is the card able to stay cool running something like Heaven? You really should stay away from FurMark.
Well, I should've mentioned that this problem only occurs in stress tests (FurMark, and the Fire Strike Ultra stress test) and not in games. I haven't tried Heaven, since games are OK. Maybe you or others can report your temperatures in the Fire Strike Ultra stress test? That would be helpful.

That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's. Which isn't surprising, considering Gigabyte's crappy quality.
I think it's true that Gigabyte's quality is crappy compared to, say, MSI's. But my last-gen card was a Gigabyte GTX 980 Ti Xtreme Edition, which I think was one of the finest cards 1-2 years ago. So manufacturers kind of have their own bumpy roads in terms of product quality, I think.