
GPU-Z readings are strange - bug?

Joined
Apr 3, 2019
Messages
25 (0.50/day)
Location
London, UK
System Name Custom Water-cooled Server / Desktop
Processor Dual Intel Xeon E5465 / i5-6600k
Motherboard Asus Z8NA-D6 / Maximus VIII Ranger
Cooling Custom built: DangerDen MC-TDX / H100i
Memory 24GB DDR3 / 16GB DDR4-3200
Video Card(s) GTX470 / RTX2080
Storage SSD + 1TB HDD / SSD + M.2
Display(s) BenQ GW2765 1440p
Case CoolerMaster Full Tower / Cosmos
Power Supply PC Power & Cooling 750w (both)
Software VMware ESXi 6.7 / Windows 10
I have an EVGA RTX 2080 XC Gaming and have been using it for Folding, and I'm getting some strange results - is it Precision X1 or GPU-Z that is correct?

With a 1920 MHz OC set in X1, why does GPU-Z's main screen show the same GPU Clock as stock?



And if I reduce the OC to 1635 MHz, why does it show clocks lower than stock?



Am I reading this wrong, or does anyone have any ideas?
Thanks.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
26,163 (5.31/day)
Location
Indiana, USA
Processor Intel Core i7 8700K@4.8GHz(Quick and dirty)
Motherboard AsRock Z370 Taichi
Cooling Corsair H110i GTX w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB Crucial MX500 + 2TB Seagate Solid State Hybrid Drive with 480GB MX200 SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
Apparently X1 is giving you the maximum boost clock, and I think 1920 is actually the stock maximum boost for that card, not an OC. GPU-Z, meanwhile, is showing you the "rated" boost clock, not the maximum. I think that is pretty standard for GPU-Z; it always shows a lower boost clock on the first tab than what my cards actually boost to.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
19,053 (3.47/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
It's normal that way. The rated boost that you see listed in GPU-Z and in manufacturer specifications is a very conservative value; the actual boost will almost always be higher. But there is no way to read/detect the actual boost without fully loading the card, and it varies, too.
 
Thanks for the replies - I guess this is something new for me and maybe I didn't ask the right question above. I understand your point that it is Boost and not OC, but why are the clock speeds different on the two screens (main and sensors)?
[Screenshot: 2080-1920.jpg]

[Screenshot: 2080-1635.jpg]
 

newtekie1

Semi-Retired Folder
Thanks for the replies - I guess this is something new for me and maybe I didn't ask the right question above. I understand your point that it is Boost and not OC, but why are the clock speeds different on the two screens (main and sensors)?
We both answered that question. The "rated" boost listed on the first tab is a conservative number; the actual boost is almost always higher. AFAIK, this is just the way NVIDIA does their ratings and how GPU Boost works.
 
Joined
Dec 31, 2009
Messages
14,245 (4.15/day)
One is the real-time clock speed with boost applied (Sensors tab); the other is the minimum rated boost for the card (the manufacturer specification, like W1zzard said), a static value read from the card.
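As an aside, you can see both kinds of value outside GPU-Z as well. A sketch using NVIDIA's nvidia-smi tool (assuming a reasonably recent NVIDIA driver is installed):

```shell
# Current graphics clock - changes with load, like GPU-Z's Sensors tab
nvidia-smi --query-gpu=clocks.gr --format=csv

# Maximum supported graphics clock - a static value read from the card
nvidia-smi --query-gpu=clocks.max.gr --format=csv

# Full clock report, including current, max, and SM/memory clocks
nvidia-smi -q -d CLOCK
```

Run the first query while the card is Folding and you should see it sitting well above the rated boost printed on the spec sheet.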
 
Joined
Jan 8, 2017
Messages
4,040 (4.68/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
The boost figure is just an "average" or "likely" figure that NVIDIA decided to list. It doesn't mean anything in particular.
 
Thanks for the replies - I think I understand it now.

I removed EVGA Precision X1 and installed MSI Afterburner. Now it makes more sense, having run the OC Scanner and played with the sliders.
It has been a while since I last had a GeForce card...
 