
GPU Clock decreases as temperature goes up

Joined
Oct 20, 2017
Messages
6 (0.01/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Riphaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#1
Hello fine ladies and gentlemen. This is my first post on this forum :)

So straight to the problem... About a month ago I bought an MSI Lightning Z 1080 Ti. It was a great card, but its RGB LEDs were malfunctioning so I had to send it back. I then decided to buy a Gigabyte Aorus GTX 1080 Ti Xtreme Edition since the Lightning Z wasn't available anymore.
The problem I have with the Gigabyte Aorus is that the GPU clock keeps decreasing as the temperature goes up. At first (when running a game), Afterburner shows the GPU clock as 1987 MHz; then the temperature reaches 58°C and the clock drops to 1974 MHz. The temperature climbs to 63°C and the clock goes down again, to 1949 MHz. When the temperature reaches 71°C, the clock falls to 1932 MHz.

I know this is probably NVIDIA's GPU Boost at work, but why didn't I have this problem with my Lightning Z 1080 Ti? On my previous Lightning Z 1080 Ti, the GPU clock was a constant 1962 MHz regardless of temperature (as stated in this page). Sometimes the temperature got as high as 66°C and it still held 1962 MHz. But my Aorus GTX 1080 Ti keeps lowering its GPU clock as the temperature goes up. Am I missing something? Is there a way to prevent this using Afterburner or something?

My second question... Why can't I reach these clock profiles with my Gigabyte Aorus GTX 1080 Ti Xtreme? I never see 2025 MHz or 1993 MHz on the GPU clock.

Thanks in advance.
 
Joined
Nov 18, 2010
Messages
4,134 (1.40/day)
Likes
2,431
Location
Rīga, Latvia
System Name HELLSTAR
Processor Intel 5960X @ 4.4GHz
Motherboard Gigabyte GA-X99-UD3
Cooling Custom Loop. 360+240 rads.
Memory 4x8GB Corsair Vengeance LPX 2966MHz 16-17-17-35
Video Card(s) ASUS 1080 Ti FE + water block
Storage Optane 900P + Samsung 950Pro 256GB NVMe + 750 EVO 500GB
Display(s) Philips PHL BDM3270
Case Phanteks Enthoo Evolv ATX Tempered Glass
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer Deathstalker
Software Windows 10 insider
#2
It should start to decrease from 45°C, and that's exactly what it's doing. Read up first...
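To illustrate the stepped behaviour, here is a rough Python sketch of a temperature-keyed boost table, built purely from the clocks and temperatures reported in the first post (illustrative only, not NVIDIA's actual internal tables):

# Illustrative only: approximates the boost behaviour described in post #1.
# Thresholds/clocks are the OP's readings, not NVIDIA's real GPU Boost tables.
BOOST_TABLE = [      # (temperature ceiling in deg C, boost clock in MHz)
    (58, 1987),
    (63, 1974),
    (71, 1949),
    (999, 1932),     # anything hotter stays at the lowest observed bin
]

def boost_clock(temp_c):
    """Return the boost clock held below each temperature ceiling."""
    for ceiling, clock in BOOST_TABLE:
        if temp_c < ceiling:
            return clock
    return BOOST_TABLE[-1][1]

for t in (45, 58, 63, 71, 80):
    print(t, "C ->", boost_clock(t), "MHz")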
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
18,530 (3.48/day)
Likes
20,507
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
#3
GPU clock decreasing as temperature increases is a feature of GPU Boost 3.0. As far as I know there is no way to change its behavior or disable it.


Lightning Z
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (2.66/day)
Likes
6,920
#4
@W1zzard
Yes there is. Cool it better, either by improving case ventilation/cooling or by ramping up the fans on the graphics card.

EDIT:
I have the regular AORUS GTX 1080 Ti and there is another way: manually bump up the clocks. That also changes how boost behaves. It'll hold higher clocks unless temperatures get so high that it drops them regardless...
 
Last edited:
Likes: ww_ea
Joined
Oct 20, 2017
Messages
6 (0.01/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Riphaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#5
GPU clock decreasing as temperature increases is a feature of GPU Boost 3.0. As far as I know there is no way to change its behavior or disable it.


Lightning Z
Yes, I know it's GPU Boost 3.0, but I think on my Aorus GTX 1080 Ti Xtreme it reduces the GPU clock more aggressively than my previous Lightning Z did. For example, the Lightning Z held a GPU clock of 1962 MHz while running at ~65-66°C, but on my Aorus GTX 1080 Ti, GPU Boost drops the clock to 1949 MHz at the same temperature. Why is that?

Also, do you have the same graph for the Aorus GTX 1080 Ti Xtreme Edition? That could shed more light on the subject.

@W1zzard
Yes there is. Cool it better, either by improving case ventilation/cooling or by ramping up the fans on the graphics card.

EDIT:
I have the regular AORUS GTX 1080 Ti and there is another way: manually bump up the clocks. That also changes how boost behaves. It'll hold higher clocks unless temperatures get so high that it drops them regardless...
I tried changing the fan curve in Afterburner to keep my AORUS GTX 1080 Ti from going above 66°C, but no luck. Even at 100 percent fan speed, the GPU temperature reaches 71°C in FurMark tests. Room temperature was ~20°C.
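For anyone who wants to log this instead of eyeballing Afterburner, here is a minimal sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py / pynvml package, assuming it is installed) that samples the core clock and temperature once per second while a game or benchmark runs:

# Minimal NVML logger: prints GPU temperature and core clock once per second.
# Assumes the nvidia-ml-py (pynvml) package and an NVIDIA driver are installed.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # deg C
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)    # MHz
        print(time.strftime("%H:%M:%S"), f"{temp:3d} C", f"{clock:4d} MHz")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

Running "nvidia-smi --query-gpu=temperature.gpu,clocks.gr --format=csv -l 1" gives roughly the same readout from the command line.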
 
Last edited by a moderator:

sneekypeet

Super Moderator
Staff member
Joined
Apr 12, 2006
Messages
25,703 (5.55/day)
Likes
10,835
System Name His
Processor Intel i9 7920X
Motherboard Asus Prime X299 Deluxe
Cooling Corsair H150i PRO RGB
Memory G.Skill TridentZ RGB 32GB @ 3600MHz
Video Card(s) nVidia GTX 1080ti SLI with EVGA Hybrid coolers.
Storage Samsung 960 Pro / Crucial MX300 750GB / Seagate 1TB Spinner
Display(s) Sony 43" 4K 60hz
Case Cooler Master Cosmos C700P (Inverted Layout)
Audio Device(s) Realtek on board > Sony Receiver > Cerwin Vega's
Power Supply Thermaltake TPSG 1050W
Mouse Always Changing
Keyboard Always Changing
Software Windows 10 Pro 64
#6
Likes: ww_ea
Joined
Jan 8, 2017
Messages
3,484 (4.93/day)
Likes
2,587
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) ASUS GTX 1060 Turbo 6GB ~ 2139 Mhz / 9.4 Gbps
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
#7
Yes, I know it's GPU Boost 3.0, but I think on my Aorus GTX 1080 Ti Xtreme it reduces the GPU clock more aggressively than my previous Lightning Z did. For example, the Lightning Z held a GPU clock of 1962 MHz while running at ~65-66°C, but on my Aorus GTX 1080 Ti, GPU Boost drops the clock to 1949 MHz at the same temperature. Why is that?
Turbo Boost is very fine-grained (13 MHz per step), so much so that differences in silicon quality can make two cards of the same model reach slightly different clocks. Your card is fine; if you want more, overclock it.
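As a quick sanity check of the bin idea, the drops reported in the first post land in the ballpark of one to two ~13 MHz bins each (treating 13 MHz as an approximate step size, per the post above):

# Rough check: how many ~13 MHz boost bins separate the clocks reported in post #1?
BIN_MHZ = 13                       # approximate step size, per the post above
clocks = [1987, 1974, 1949, 1932]  # MHz, as observed by the OP

for hi, lo in zip(clocks, clocks[1:]):
    drop = hi - lo
    print(f"{hi} -> {lo} MHz: drop of {drop} MHz (~{drop / BIN_MHZ:.1f} bins)")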
 
Joined
Oct 20, 2017
Messages
6 (0.01/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Riphaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#8
Turbo Boost is very fine grained (13mhz per step), so much that differences in silicon quality can make 2 cards of the same model reach slightly different clocks. Your card is fine , if you want more OC.
You mean GPU boost right? Cause Turbo Boost is Intel's technology in CPUs.
But anyway...yeah I think compared to my previous card which was Lightning Z, somehow silicon lottery plays a role in here too. I guess right now the only way to maintain high clock rates is to provide better cooling for the card. Though as I said, I've already tried %100 fan carve at ~65c-66c but again, the temperature goes up until 71 and thus, GPU clock decreases even more (more than Lightning Z at the same temperature). Shame really :(
 
Joined
Sep 17, 2014
Messages
7,016 (4.53/day)
Likes
5,900
Location
Duiven, Netherlands
Processor i7 8700k 4.8Ghz @ 1.31v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
#9
You can try giving the core clock a small bump and see how stable that is. In PrecisionX you can also set clocks per boost bin and make your own curve.

Don't expect miracles; GPU Boost 3.0 is already fine-tuned as it is, and pretty locked down.
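"Set clocks per boost bin" just means editing the voltage/frequency curve point by point (Afterburner has a similar curve editor under Ctrl+F). A purely illustrative sketch of the idea, with made-up voltage/clock points rather than values read from any real card:

# Illustrative voltage/frequency curve: each point maps a voltage step to a target clock.
# The numbers below are invented for illustration; a real curve is read from the card.
vf_curve = [          # (millivolts, MHz)
    (800, 1759),
    (900, 1873),
    (1000, 1949),
    (1050, 1987),
    (1093, 2012),
]

def clock_at(voltage_mv):
    """Highest clock whose curve point does not exceed the given voltage."""
    allowed = [clk for mv, clk in vf_curve if mv <= voltage_mv]
    return max(allowed) if allowed else 0

print(clock_at(1093))  # 2012 -> full curve available
print(clock_at(1000))  # 1949 -> capping voltage at 1000 mV (a simple "undervolt")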
 
Joined
Jul 2, 2008
Messages
6,934 (1.82/day)
Likes
9,423
Location
Hillsboro, OR
System Name Main/DC
Processor i7-3770K/i7-2600K
Motherboard MSI Z77A-GD55/GA-P67A-UD4-B3
Cooling Phanteks PH-TC14CS/H80
Memory Crucial Ballistix Sport 16GB (2 x 8GB) LP /4GB Kingston DDR3 1600
Video Card(s) Asus GTX 660 Ti/MSI HD7770
Storage Crucial MX100 256GB/120GB Samsung 830 & Seagate 2TB(died)
Display(s) Asus 24' LED/Samsung SyncMaster B1940
Case P100/Antec P280 It's huge!
Audio Device(s) on board
Power Supply SeaSonic SS-660XP2/Seasonic SS-760XP2
Software Win 7 Home Premiun 64 Bit
#10
GPU temperature reaches 71°C in FurMark tests
Are you only seeing this problem with FurMark? Does the card stay cool when running something like Heaven? You really should stay away from FurMark.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
23,835 (5.70/day)
Likes
7,726
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#11
Hello fine ladies and gentlemen. This is my first post on this forum :)

So straight to the problem... About a month ago I bought an MSI Lightning Z 1080 Ti. It was a great card, but its RGB LEDs were malfunctioning so I had to send it back. I then decided to buy a Gigabyte Aorus GTX 1080 Ti Xtreme Edition since the Lightning Z wasn't available anymore.
The problem I have with the Gigabyte Aorus is that the GPU clock keeps decreasing as the temperature goes up. At first (when running a game), Afterburner shows the GPU clock as 1987 MHz; then the temperature reaches 58°C and the clock drops to 1974 MHz. The temperature climbs to 63°C and the clock goes down again, to 1949 MHz. When the temperature reaches 71°C, the clock falls to 1932 MHz.

I know this is probably NVIDIA's GPU Boost at work, but why didn't I have this problem with my Lightning Z 1080 Ti? On my previous Lightning Z 1080 Ti, the GPU clock was a constant 1962 MHz regardless of temperature (as stated in this page). Sometimes the temperature got as high as 66°C and it still held 1962 MHz. But my Aorus GTX 1080 Ti keeps lowering its GPU clock as the temperature goes up. Am I missing something? Is there a way to prevent this using Afterburner or something?

My second question... Why can't I reach these clock profiles with my Gigabyte Aorus GTX 1080 Ti Xtreme? I never see 2025 MHz or 1993 MHz on the GPU clock.

Thanks in advance.
Different cards, different coolers. GPUs depend on ample, unobstructed airflow and on ambient temperature, which is the room air temperature. The cooler the room, the better GPUs run.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
25,668 (5.38/day)
Likes
12,031
Location
Indiana, USA
Processor Intel Core i7 8700K@4.8GHz(Quick and dirty)
Motherboard AsRock Z370 Taichi
Cooling Corsair H110i GTX w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB Crucial MX500 + 2TB Seagate Solid State Hybrid Drive with 480GB MX200 SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
#12
Different cards, different coolers. GPUs depend on ample, unobstructed airflow and on ambient temperature, which is the room air temperature. The cooler the room, the better GPUs run.
That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's. Which isn't surprising considering Gigabyte's crappy quality.

But at the same time, JayzTwoCents just did a video about how he had two identical cards, same manufacturer and model, and they boosted to different clocks. So my theory is that ASIC quality plays some role in it as well.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
23,835 (5.70/day)
Likes
7,726
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#13
That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's. Which isn't surprising considering Gigabyte's crappy quality.

But at the same time, JayzTwoCents just did a video about how he had two identical cards, same manufacturer and model, and they boosted to different clocks. So my theory is that ASIC quality plays some role in it as well.
That's another big factor too.

It's interesting; in a 7770 thread I stated that not all cards are created equal.
 
Joined
Sep 17, 2014
Messages
7,016 (4.53/day)
Likes
5,900
Location
Duiven, Netherlands
Processor i7 8700k 4.8Ghz @ 1.31v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
#14
Realistically we're talking about less than 2-3% deviation in clocks between top-end cards from different manufacturers. Worst case, you'll see 4-5%. Honestly, even accounting for the silicon lottery, that's pretty neat. Real-life performance of all these cards is remarkably close; at a typical res of 1440p or 4K that's what, 3-5 FPS.
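Back-of-the-envelope, assuming (optimistically) that frame rate scales linearly with core clock and taking a hypothetical 90 FPS baseline:

# Rough FPS impact of a few percent clock deviation, assuming linear scaling.
base_clock = 1987  # MHz
base_fps = 90.0    # hypothetical 1440p baseline

for clock in (1974, 1949, 1932):
    fps = base_fps * clock / base_clock
    print(f"{clock} MHz: ~{fps:.1f} FPS ({fps - base_fps:+.1f})")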
 
Joined
Mar 26, 2014
Messages
5,750 (3.33/day)
Likes
4,281
Location
Washington, USA
System Name Volt
Processor i7-4790k
Motherboard Gigabyte GA-Z97X Gaming 5
Cooling NZXT Kraken X60
Memory G.Skill Ripjaws 4x8GB
Video Card(s) MSI GTX1080 Ti Gaming X
Storage 250GB SSD / 2x1TB + 2x2TB HDD
Display(s) 3x AOC 2425W + ViewSonic VA2855Smh
Case Nanoxia Deep Silence 6
Audio Device(s) LucidSound LS30
Power Supply Rosewill Fortress 750w
Mouse Logitech G602
Keyboard G.Skill KM780 RGB (Brown switches)
Software Windows 10 Professional
Benchmark Scores Technical term is PEBCAK issue, which stands for Problem Exists Between Chair And Keyboard
#15
Haaaaaa... my card doesn't downclock at higher temps unless I let it pass 65°C. It stays at the 2012 MHz I have it set to.

Honestly, as others have said, keep it cool. Gigabyte might key their boost profiles more on cooling than MSI does.
 
Joined
Apr 30, 2011
Messages
1,088 (0.39/day)
Likes
1,082
Location
Greece
Processor AMD FX-8350 4GHz@1.3V
Motherboard Gigabyte GA-970A UD3 Rev3.0
Cooling Zalman CNPS5X Performa
Memory 2*4GB Patriot Venom RED DDR3 1600MHz CL9
Video Card(s) XFX RX580 GTS 4GB
Storage Sandisk SSD 120GB, 2 Samsung F1 & F3 (1TB)
Display(s) LG IPS235
Case Zalman Neo Z9 Black
Audio Device(s) Via 7.1 onboard
Power Supply OCZ Z550
Mouse Zalman ZM-M401R
Keyboard Trust GXT280
Software Win 7 sp1 64bit
Benchmark Scores CB R15 64bit: single core 99p, multicore 647p WPrime 1.55 (8 cores): 9.0 secs
#16
Stop using FurMark to check clocks or thermals. Use a benchmark such as 3DMark or Heaven. AMD and NVIDIA added protection to their drivers years ago to limit the bashing FurMark gives their GPUs.
 
Joined
Dec 31, 2009
Messages
12,823 (3.92/day)
Likes
7,252
Location
Ohio
System Name Daily Driver
Processor 7960X 4.5GHz 16c/16t 1.17V
Motherboard MSI XPower Gaming Titanium
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) NVIDIA RTX 2080 Ti FE
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) 27" Acer Predator 2560x1440 144hz IPS + 27" Acer 2560x1440 75Hz IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#17
While true, the OP never mentioned FurMark??? He said "game".
 
Last edited:
Joined
Jan 13, 2016
Messages
528 (0.49/day)
Likes
428
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 2600 @ 4Ghz
Motherboard MSI X470 Gaming Plus - Rev 1.0 A30 / Asus X470-I Strix
Cooling Deepcool Captain EX 240 White powaah + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Corsair Vengeance LED DDR4 3200MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) Palit JetStream GTX 1070 - Micron die
Storage Crucial MX300 525GB/MX500/ADATA SP550 250GB
Display(s) LG 29UM69G-B
Case Fractal Design Meshify TG/ Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse A4Tech Bloody V3/SteelSeries Rival 100/Logitech G302
Keyboard Microsoft Comfort 3000 Curve/ CM Masterkeys Pro M
Software Windows 10 Pro x64 - LTSB
#18
Point is, GPU boosting technologies still suck. All they do is prevent proper overclocking out of the box.

It's good when it's there, but it makes your day horrible when it's not needed and gets in the way.

Btw, Gigabyte's cards are pretty weak sauce on average when it comes to overclocking/thermals, because getting a good GPU die from them is pretty rare IMO. They can slap a fancy cooling solution on it; that doesn't make the processor itself better quality.

I haven't noticed much difference between the OC, G1 and AORUS cards, while EVGA's tiers do tend to mean something.
 
Last edited:
Joined
Dec 31, 2009
Messages
12,823 (3.92/day)
Likes
7,252
Location
Ohio
System Name Daily Driver
Processor 7960X 4.5GHz 16c/16t 1.17V
Motherboard MSI XPower Gaming Titanium
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) NVIDIA RTX 2080 Ti FE
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) 27" Acer Predator 2560x1440 144hz IPS + 27" Acer 2560x1440 75Hz IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#19
Not really... it just does some of it for you.
 
Joined
Jul 2, 2008
Messages
6,934 (1.82/day)
Likes
9,423
Location
Hillsboro, OR
System Name Main/DC
Processor i7-3770K/i7-2600K
Motherboard MSI Z77A-GD55/GA-P67A-UD4-B3
Cooling Phanteks PH-TC14CS/H80
Memory Crucial Ballistix Sport 16GB (2 x 8GB) LP /4GB Kingston DDR3 1600
Video Card(s) Asus GTX 660 Ti/MSI HD7770
Storage Crucial MX100 256GB/120GB Samsung 830 & Seagate 2TB(died)
Display(s) Asus 24' LED/Samsung SyncMaster B1940
Case P100/Antec P280 It's huge!
Audio Device(s) on board
Power Supply SeaSonic SS-660XP2/Seasonic SS-760XP2
Software Win 7 Home Premiun 64 Bit
#20
Joined
Dec 31, 2009
Messages
12,823 (3.92/day)
Likes
7,252
Location
Ohio
System Name Daily Driver
Processor 7960X 4.5GHz 16c/16t 1.17V
Motherboard MSI XPower Gaming Titanium
Cooling MCR320 + Kuplos Kryos NEXT CPU block
Memory GSkill Trident Z 4x8 GB DDR4 3600 MHz CL16
Video Card(s) NVIDIA RTX 2080 Ti FE
Storage 512GB Patriot Hellfire, 512GB OCZ RD400, 640GB Caviar Black, 2TB Caviar Green
Display(s) 27" Acer Predator 2560x1440 144hz IPS + 27" Acer 2560x1440 75Hz IPS
Case Thermaltake P5
Power Supply EVGA 750W Supernova G2
Benchmark Scores Faster than most of you! Bet on it! :)
#21
Oh geez, I guess I didn't catch that in the second post.

The point still remains that he said he was seeing it in games first... then that morsel came in on top. :)
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
23,835 (5.70/day)
Likes
7,726
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#22
Point is, GPU boosting technologies still suck. All they do is prevent proper overclocking out of the box.

It's good when it's there, but it makes your day horrible when it's not needed and gets in the way.

Btw, Gigabyte's cards are pretty weak sauce on average when it comes to overclocking/thermals, because getting a good GPU die from them is pretty rare IMO. They can slap a fancy cooling solution on it; that doesn't make the processor itself better quality.

I haven't noticed much difference between the OC, G1 and AORUS cards, while EVGA's tiers do tend to mean something.
I prefer my cards operating at full throttle at all times, or specifying on my own which profile to use when a program launches; variable clock rates tend to cause some screwy issues.
 
Joined
Feb 18, 2012
Messages
1,429 (0.57/day)
Likes
1,120
System Name MSI GT75 Titan
Processor i7-8850H
Cooling 2 laptop fans
Memory 32gb of 2666mhz DDR4
Video Card(s) Nvidia 1080
Storage Samsung 860 4tb SSD, x2 Samsung 970 2tb nvme
Display(s) 17.3" IPS 1920x1080 120Hz
Power Supply 330w power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 with win 7 shell
Benchmark Scores High enough for me
#23
If you download and install Afterburner, you can undervolt the graphics card so it runs at a lower voltage, which means lower temps.
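The reason that helps: dynamic power scales roughly with frequency times voltage squared, so even a small voltage cut removes a noticeable chunk of heat. A rough sketch of the relationship (illustrative numbers, not measured from any card):

# Rough dynamic-power estimate: P is roughly proportional to f * V^2.
def relative_power(freq_mhz, volts, ref_freq=1987.0, ref_volts=1.093):
    """Power relative to a reference clock/voltage operating point."""
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

undervolted = relative_power(1949, 1.000)  # slightly lower clock at a lower voltage
print(f"undervolted point draws roughly {undervolted:.0%} of stock power")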
 
Joined
Oct 2, 2004
Messages
13,791 (2.66/day)
Likes
6,920
#24
Not really... it just does some of it for you.
Not to mention GPU Boost 3.0 does it so well that it almost reaches what cards can achieve with a manual overclock. I wonder if NVIDIA will be able to push it even further and really squeeze every last MHz out of the box, basically making manual overclocking unnecessary. I'd actually quite like that, because I'm lazy and don't want the hassle of installing 3rd-party stuff to achieve the same results. :D
 
Joined
Oct 20, 2017
Messages
6 (0.01/day)
Likes
0
Processor Intel Core i7 6700K
Motherboard Asus MAXIMUS VIII Ranger
Cooling Noctua NH-D15
Memory Riphaws 2x8GB DDR4
Video Card(s) Gigabyte Aorus GTX 1080 Ti Xtreme Edition
Storage Samsung SSD Pro 850
Display(s) Asus MX239H
Power Supply CoolerMaster 850+
Software Windows 10 x64 15063.632
#25
OK, so I ran some tests to find out how I can counter GPU Boost's habit of lowering the GPU clock at higher temperatures.
First and foremost, obviously, was temperature. I couldn't change the fact that GPU Boost drops the GPU clock from 1974 MHz to 1949 MHz at 63°C; however, I did manage to raise the fan speed enough to keep the GPU under 63°C and therefore hold a stable 1974 MHz. This was achievable in games and benchmarks, but not in stress tests (for example the Fire Strike Ultra stress test).
The second factor was the power target. By monitoring power consumption through Afterburner I found a sweet spot for the power target offset in 4K tests at 140%. Anything more was pointless, since power consumption never went higher than that. Of course this required more fan speed to keep the GPU cool and stop the clock from falling below 1974 MHz. I should also mention that this power target isn't necessary in most games at 1080p; there, somewhere around 100% or 110% was enough to maintain 1974 MHz. More demanding games like Ghost Recon Wildlands were the exception, of course. I hope Gigabyte raises the low default power target in a future BIOS update. (There's a rough watt conversion below.)
And third was voltage, where I noticed some limitations at the stock offset, but I didn't tamper with it because I didn't know what a safe offset would be. Again, Afterburner showed this limitation at 4K and not at 1080p. Plus, I had already reached a stable 1974 MHz as I said, so I didn't feel the need to touch it.
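For reference, the power target slider is just a percentage of the card's default board power limit, so the 140% figure translates roughly as below (assuming the 250 W reference GTX 1080 Ti limit; the Aorus Xtreme's own BIOS limit may differ):

# Convert a power-target percentage into watts.
# 250 W is the reference GTX 1080 Ti board power; the Aorus Xtreme BIOS may allow more.
DEFAULT_LIMIT_W = 250

for target_pct in (100, 110, 140):
    print(f"{target_pct}% power target -> {DEFAULT_LIMIT_W * target_pct / 100:.0f} W")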

So overall, given the Aorus's price compared to my previous Lightning Z, I'm pretty happy with the card at the moment. It's true it doesn't run as cool as the Lightning Z (71°C at full load on the Aorus versus a maximum of 66°C on the Lightning Z), but as @Vayra86 said, the difference between these clock rates (for example 1974 MHz, 1963 MHz or even 1949 MHz) is just 1-2 percent at most and not even noticeable in in-game FPS. Maybe at 4K it can lead to a difference of 1 or 2 FPS at most.


You can try giving the core clock a small bump and see how stable that is. In PrecisionX you can also set clocks per boost bin and make your own curve.

Don't expect miracles; GPU Boost 3.0 is already fine-tuned as it is, and pretty locked down.
Well, my goal was to maintain high clock rates out of the box, without overclocking, because as I said (and you said too) these GPU clock differences don't amount to much. But I confess, watching the clock bounce up and down in games can sometimes make you a bit edgy :)

Are you only seeing this problem with FurMark? Does the card stay cool when running something like Heaven? You really should stay away from FurMark.
Well, I should have mentioned that this problem only occurs in the FurMark stress test (Fire Strike Ultra) and not in games. I haven't tried Heaven since games are OK. Maybe you or others can report your temperatures in the Fire Strike Ultra stress test? That would be helpful.

That definitely plays a part. It sounds like the Gigabyte card's cooling isn't performing as well as the MSI card's. Which isn't surprising considering Gigabyte's crappy quality.
I think it's true that Gigabyte's quality is crappier than, say, MSI's. But my last-gen card was a Gigabyte GTX 980 Ti Xtreme Edition, which I think was one of the finest cards a year or two ago. So manufacturers kind of have their own bumpy roads when it comes to product quality, I think.
 