
GPU-Z "validation" easy to fool?

Z999z3mystorys

New Member
Joined
Jan 11, 2009
Messages
5 (0.00/day)
It seems that it's quite easy to fool GPU-Z into thinking that you've gotten much higher clock speeds than you really have.

I think it's best not to say how this is done, but is this going to be fixed? There are impossibly high clocks recorded with it now, which makes comparing GPU-Z validation records pointless. I just set a new record for my card by "cheating" this way, and it was way too easy.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,673 (2.29/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
I know it reads driver clocks instead of hardware clocks on my GTX 260, which, depending on which "clock step-range" I'm in, can be lower or higher than my actual clocks, though the difference is never huge. But you should post your validation link and explain further how you cheated. It may be useful to W1z in sorting this out.

:toast:
 

Z999z3mystorys

New Member
Joined
Jan 11, 2009
Messages
5 (0.00/day)
http://www.techpowerup.com/gpuz/dbda5/

1200 core with a 2000 (4000 effective) memory clock...

How was this done? One hyphenated word: auto-underclocking.

Basically, some GPUs will underclock themselves to a low speed and only ramp up to their listed speed when a 3D application is started.

So you can set your speed sky-high, not run any 3D programs, and the GPU will stay at its underclocked speed, but GPU-Z will think it's running at full speed. GPU-Z needs a monitoring system that reads the real clock speeds and reports those, not the set clock speeds.
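For anyone curious what "read the real clock instead of the set clock" looks like in practice, here is a minimal sketch using nvidia-smi (a modern NVIDIA tool, not whatever GPU-Z used internally back then): it samples the clock the GPU is actually running at while the system sits idle and compares it against the maximum/set clock, which is exactly the gap this trick exploits.

```python
# Minimal sketch: sample the clock the GPU is actually running at, not just
# the clock that was requested. Uses nvidia-smi purely to illustrate the idea;
# this is not how GPU-Z itself reads clocks.
import subprocess
import time

def read_clocks():
    """Return (current_mhz, max_mhz) for the graphics clock of GPU 0."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=clocks.current.graphics,clocks.max.graphics",
        "--format=csv,noheader,nounits",
    ], text=True)
    current, maximum = (int(x) for x in out.strip().split(","))
    return current, maximum

# Sample a few times while idle: a card sitting in its 2D power state will
# report a current clock far below the "set" (max/overclocked) clock.
for _ in range(5):
    current, maximum = read_clocks()
    print(f"current: {current} MHz   set/max: {maximum} MHz")
    time.sleep(1)
```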
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,673 (2.29/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
LoL, nice results haha. Yeah, another user here by the name of Solaris17 was doing some heavy OC-ing on an 8600GT(S) and found out GPU-Z was misreporting, though that was months ago. But with submissions like this, maybe it can be caught and fixed.

:toast:
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
One can always fake a score by memory editing...

There is no way to 100% guarantee it isn't happening. The only way I can think of is to establish a secured connection with the validation server and have the validation server initiate and record the measurements. That way, the window to modify the result client-side is very slim, if not completely non-existent, unless the encryption on the connection is broken.

Still, the value could be altered at its origin. If those aren't the actual clocks the card is using (that is, if the clocks really were changed), the card could be damaged. I have no idea how you'd go about finding those memory addresses, assuming they're even accessible (not protected by the BIOS and/or OS).

I'll have to ponder that one...
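To make the server-side idea above a bit more concrete, here is a rough, hypothetical sketch of one piece of it: the tool signs each reading together with a server-issued nonce, so a result edited after the fact fails verification. The key handling is entirely made up for illustration; a real tool would still have to protect the key and itself from patching, which is the "altered at its origin" problem.

```python
# Hypothetical sketch of a signed validation submission. A per-validation
# nonce from the server plus an HMAC over the payload means a result edited
# client-side after measurement no longer verifies. Not GPU-Z's actual scheme.
import hmac, hashlib, json, os

SHARED_KEY = b"embedded-tool-key"   # assumption: some key shipped inside the tool

def sign_reading(nonce: bytes, core_mhz: int, mem_mhz: int) -> dict:
    payload = json.dumps({"nonce": nonce.hex(), "core": core_mhz, "mem": mem_mhz})
    sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def server_verify(submission: dict, expected_nonce: bytes) -> bool:
    expected = hmac.new(SHARED_KEY, submission["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, submission["sig"]):
        return False                                   # payload edited after signing
    return json.loads(submission["payload"])["nonce"] == expected_nonce.hex()

nonce = os.urandom(16)                                 # server issues this per validation
submission = sign_reading(nonce, core_mhz=1200, mem_mhz=2000)
submission["payload"] = submission["payload"].replace("1200", "1400")  # tamper attempt
print(server_verify(submission, nonce))                # False -- the edit is detected
```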
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.22/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
I don't think it matters so much, since you can't actually use the application that way.

If you were to use what GPU-Z is actually for, then you wouldn't be able to show those clocks and have anyone take you seriously.

Take Vantage, for example. You complete a run at a given clock and then bring up CPU-Z/GPU-Z for a screenshot, but you decide to tweak the reported graphics clocks. Your Vantage score looks like crap next to the claimed clocks, because anyone with your real, pre-tweak clocks would get the same score (see the sketch below).
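As a rough illustration of that cross-check (the numbers are made up, and real benchmarks don't scale perfectly linearly with core clock), a validation site or a skeptical reader could flag submissions whose score is far below what the claimed clock implies:

```python
# Illustrative plausibility check: score should scale (very roughly) with the
# claimed core clock. Reference values and the helper name are hypothetical.
def looks_plausible(claimed_core_mhz: float, score: float,
                    ref_core_mhz: float = 650.0, ref_score: float = 10000.0,
                    tolerance: float = 0.25) -> bool:
    """Flag submissions whose score is far below what the claimed clock implies."""
    expected = ref_score * (claimed_core_mhz / ref_core_mhz)
    return score >= expected * (1.0 - tolerance)

print(looks_plausible(650, 10200))   # True  -- score matches the claimed clock
print(looks_plausible(1200, 10200))  # False -- "1200 MHz" claim, stock-clock score
```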

I never see anyone take validation that seriously, as we usually trust each other.

It's not like it's hard to sniff out a liar with this many enthusiasts.

-my 2 cents
 

Z999z3mystorys

New Member
Joined
Jan 11, 2009
Messages
5 (0.00/day)
Well, I guess my main point is that it's way too easy. If you have to do memory editing to fool the system, that's one thing, but if you can just stumble on this without really trying, like you can with this method, that's another matter.
 

Z999z3mystorys

New Member
Joined
Jan 11, 2009
Messages
5 (0.00/day)
Anyway, my main concern is: how can you compare results if people can cheat this way? The absurdly high scores are easy to ignore, but the borderline ones are the ones I'm concerned about: did they really get their core clock to 800, or did they fake it? I know it's not that important in the end, but in the spirit of competition, people like me want to compare scores with a fair degree of certainty that they aren't faked.

Take 3DMark: people compete for high scores there, and we can reasonably believe that the scores are real.

Basically, having something that's so easy to cheat at makes competing pointless.
 
Joined
May 19, 2007
Messages
4,520 (0.73/day)
Location
Perth AU
Processor Intel Core i9 10980XE @ 4.7Ghz 1.2v
Motherboard ASUS Rampage VI Extreme Omega
Cooling EK-Velocity D-RGB, EK-CoolStream PE 360, XSPC TX240 Ultrathin, EK X-RES 140 Revo D5 RGB PWM
Memory G.Skill Trident Z RGB F4-3000C14D 64GB
Video Card(s) Asus ROG Strix GeForce RTX 4090 OC WC
Storage M.2 990 Pro 1TB / 10TB WD RED Helium / 3x 860 2TB Evos
Display(s) Samsung Odyssey G7 28"
Case Corsair Obsidian 500D SE Modded
Power Supply Cooler Master V Series 1300W
Software Windows 11
You can fool it easily, but people know which clocks are real and which aren't. You can easily just OC really high to grab a screenshot and then drop it back down, even knowing the card would never run that high.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.22/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Z999z3mystorys said:
Anyway, my main concern is: how can you compare results if people can cheat this way? The absurdly high scores are easy to ignore, but the borderline ones are the ones I'm concerned about: did they really get their core clock to 800, or did they fake it? I know it's not that important in the end, but in the spirit of competition, people like me want to compare scores with a fair degree of certainty that they aren't faked.

Take 3DMark: people compete for high scores there, and we can reasonably believe that the scores are real.

Basically, having something that's so easy to cheat at makes competing pointless.

but I would never compete with people in GPU-Z validations. lol

Then again, I don't compete. I only play along. The numbers mean one thing, but as long as my name is up on the list of entries, I'm happy.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
there has never been any effort to establish some kind of 100% validated scoring system in gpu-z. this may be something to work on in the future. the 2d/3d clock trick has been around since the first days gpu-z has been out
 