
Who to believe? GPU-Z vs MSI Afterburner

Luis Garcia

New Member
Joined
Jul 4, 2017
Messages
4 (0.00/day)
Hello,

I've noticed that GPU-Z and MSI Afterburner show different values for the memory clock:

[screenshot: GPU-Z and MSI Afterburner readings side by side]

My graphics card is an MSI GeForce GTX 970 100ME. The screenshot above shows the card at idle. As you can see, Afterburner says the memory clock is 324 MHz while GPU-Z says 162 MHz, so the question is simple: which utility is telling the truth? Which value is correct?
 

Both are correct: it's GDDR5, and 162 MHz × 2 = 324 MHz. GPU-Z reports the real memory clock, while Afterburner reports the DDR-doubled rate.

http://www.geeks3d.com/20100613/tut...-clock-real-and-effective-speeds-demystified/
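The relationship between the two readings can be sketched in a few lines of Python (`gddr5_clocks` is a hypothetical helper for illustration, not part of either tool):

```python
def gddr5_clocks(real_mhz):
    """GDDR5 clock views: real clock, DDR-doubled clock, and the
    quad-data-rate 'effective' figure used in marketing, all in MHz."""
    return real_mhz, real_mhz * 2, real_mhz * 4

# GTX 970 at idle: GPU-Z shows the real clock, Afterburner the doubled one.
real, doubled, effective = gddr5_clocks(162)
print(real, doubled, effective)  # 162 324 648
```

So neither tool is lying; they simply report the same clock at different multipliers.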
 
But why would you use the DDR metric for GDDR5? Can GDDR5 work in a GDDR3 mode (for 2D states)?

Because to me, QDR is QDR and DDR is DDR.
You either use real values for frequency or effective values, and effective values for GDDR5 memory should always be QDR (i.e. 4× the real clock).

Also, what do you do with GDDR5X/GDDR6?
You have a real 1250 MHz, a DDR effective 2500 MHz, a QDR ("8n") effective 5000 MHz, and an "octa" data rate (QDR "16n") of 10,000 MHz.
Which is the actual effective VRAM frequency for the original GTX 1080?
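The ambiguity above comes down to which multiplier you apply to the real clock. A minimal sketch, assuming the usual marketing convention of transfers per real clock cycle (the `MULTIPLIER` table and `effective_mhz` helper are illustrative, not from either tool):

```python
# Transfers per real clock cycle, per memory type (illustrative values).
MULTIPLIER = {
    "GDDR3": 2,   # double data rate
    "GDDR5": 4,   # quad data rate
    "GDDR5X": 8,  # doubled prefetch on top of QDR ("16n")
}

def effective_mhz(real_mhz, mem_type):
    """Real memory clock -> 'effective' data rate in MT/s."""
    return real_mhz * MULTIPLIER[mem_type]

# Original GTX 1080 (GDDR5X): ~1250 MHz real clock -> ~10,000 MT/s effective.
print(effective_mhz(1250, "GDDR5X"))  # 10000
```

On this convention, the GTX 1080's "10 GHz" memory is a 1250 MHz real clock quoted at 8× data rate, which is why a tool reporting 2× or 4× looks "wrong" even though the underlying clock is the same.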
 