
GPU-Z readings are strange - bug?

Joined
Apr 3, 2019
Messages
44 (0.02/day)
Location
London, UK
System Name Custom Water-cooled Server / Desktop
Processor Dual Intel Xeon E5465 / i5-11400
Motherboard Asus Z8NA-D6 / Prime H570-plus
Cooling Custom built: DangerDen MC-TDX / H100i
Memory 24GB DDR3 / 16GB DDR4-3200
Video Card(s) GTX470 / RTX2080
Storage SSD + 1TB HDD / P5-Plus + BX500
Display(s) BenQ GW2765 1440p
Case CoolerMaster Full Tower / Cosmos
Power Supply PC Power & Cooling 750w (both)
Software Ubuntu / Windows 11
I have the EVGA RTX 2080 XC Gaming and have been using it for Folding, and I am getting some strange results. Is it X1 or GPU-Z that is correct?

With a 1920 MHz OC set in X1, why does GPU-Z show the same clocks as stock for GPU Clock on the main screen?



If I reduce the OC to 1635 MHz, it then shows clocks that are lower than stock?



Am I reading this wrong, or does anyone have any ideas?
Thanks.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Apparently X1 is showing you the card's maximum boost clock, and I think 1920 MHz is actually the stock maximum boost for that card, not an OC, while GPU-Z is showing you the "standard" rated boost clock, not the maximum. That is pretty standard for GPU-Z; it always shows a lower boost clock on the first tab than what my cards actually boost to.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
It's normal that way. The rated boost you see listed in GPU-Z and in manufacturer specifications is a very conservative value; the actual boost will almost always be higher, but there is no way to read/detect it without fully loading the card, and it varies too.
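For anyone who wants to see the two kinds of numbers side by side, here is a minimal sketch using the NVML Python bindings (pynvml, from the nvidia-ml-py package — an assumption, not a tool anyone in this thread used). Note that NVML's "max clock" is not exactly the rated boost figure GPU-Z reads from the VBIOS, but it illustrates the same static-versus-live distinction:

```python
# Minimal sketch using the NVML Python bindings (pip install nvidia-ml-py).
# Prints the driver's static maximum graphics clock next to the live clock,
# i.e. the same "static vs. real-time" split GPU-Z's two tabs show.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Static limit reported by the driver -- does not change with load.
max_clock = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
# Instantaneous graphics clock -- varies with load, power, and temperature.
live_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)

print(f"Max graphics clock:     {max_clock} MHz")
print(f"Current graphics clock: {live_clock} MHz")

pynvml.nvmlShutdown()
```

Run it once at idle and once with Folding active and the live number will move while the static one stays put.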
 
Joined
Apr 3, 2019
Messages
44 (0.02/day)
Location
London, UK
System Name Custom Water-cooled Server / Desktop
Processor Dual Intel Xeon E5465 / i5-11400
Motherboard Asus Z8NA-D6 / Prime H570-plus
Cooling Custom built: DangerDen MC-TDX / H100i
Memory 24GB DDR3 / 16GB DDR4-3200
Video Card(s) GTX470 / RTX2080
Storage SSD + 1TB HDD / P5-Plus + BX500
Display(s) BenQ GW2765 1440p
Case CoolerMaster Full Tower / Cosmos
Power Supply PC Power & Cooling 750w (both)
Software Ubuntu / Windows 11
Thanks for the replies - I guess this is something new for me, and maybe I didn't ask the right question above. I understand your point that it is boost and not an OC, but why are the clock speeds different on the two screens (main and sensors)?
[Attachment: 2080-1920.jpg]
[Attachment: 2080-1635.jpg]
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
[Quoting the previous post:] "Thanks for the replies - I guess this is something new for me, and maybe I didn't ask the right question above. I understand your point that it is boost and not an OC, but why are the clock speeds different on the two screens (main and sensors)?"
[Attachments: 2080-1920.jpg, 2080-1635.jpg]

We both answered that question: the "rated" boost listed on the first tab is a conservative number; the actual boost is almost always higher. AFAIK, this is just the way nVidia does their ratings and how GPU Boost works.
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
One is a REAL-TIME clock speed with boost applied (Sensors tab); the other is the minimum rated boost for the card (the manufacturer specification, as W1zzard said), a static value read from the card.
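To see the real-time-versus-static point in practice, a small polling sketch (same pynvml assumption as above) can sample the live clock repeatedly; the live reading drifts with temperature and power headroom while the static value never changes:

```python
# Polling sketch (same pynvml assumption as above): sample the live graphics
# clock once a second. Run a load (Folding, a game, a benchmark) in parallel
# and watch the live value drift while the static value never moves.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
static_max = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)

for _ in range(10):
    live = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"live: {live:4d} MHz   (static max: {static_max} MHz)")
    time.sleep(1)  # live reading varies with temperature and power headroom

pynvml.nvmlShutdown()
```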
 
Joined
Jan 8, 2017
Messages
8,944 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
The boost figure is just an "average" or "likely" figure that Nvidia decided they should list. It doesn't mean anything in particular.
 
Joined
Apr 3, 2019
Messages
44 (0.02/day)
Location
London, UK
System Name Custom Water-cooled Server / Desktop
Processor Dual Intel Xeon E5465 / i5-11400
Motherboard Asus Z8NA-D6 / Prime H570-plus
Cooling Custom built: DangerDen MC-TDX / H100i
Memory 24GB DDR3 / 16GB DDR4-3200
Video Card(s) GTX470 / RTX2080
Storage SSD + 1TB HDD / P5-Plus + BX500
Display(s) BenQ GW2765 1440p
Case CoolerMaster Full Tower / Cosmos
Power Supply PC Power & Cooling 750w (both)
Software Ubuntu / Windows 11
Thanks for the replies - I think I understand it now.

I removed EVGA Precision X1 and installed MSI Afterburner. Now it makes more sense, having run the OC Scanner and played with the sliders.
It has been a while since I had a GeForce card...
 