
GPU-Z box ordering

Joined
Mar 21, 2021
Messages
2,780 (4.46/day)
Location
Colorado, U.S.A.
System Name HP Compaq 8000 Elite CMT
Processor Intel Core 2 Quad Q9550
Motherboard Hewlett-Packard 3647h
Memory 16GB DDR3
Video Card(s) NVIDIA GeForce GT 1030 GDDR5 (fan-less)
Storage 2TB Seagate Firecuda 3.5"
Display(s) Dell P2416D (2560 x 1440)
Power Supply 12V HP proprietary
Software Windows 10 Pro 64-bit
I find the arrangement of the clock rates potentially confusing:

GPU Clock 1228 MHz | Memory 1502 MHz | Boost 1468 MHz

Would this not be better arranged as

GPU Clock/Boost 1228 MHz/1468 MHz | Memory 1502 MHz

Otherwise it almost looks like the memory slows down on boost.
 

Attachments

  • GPU-Z.jpg (9.3 KB)
Joined
Feb 20, 2020
Messages
6,074 (5.97/day)
Location
Texas
System Name Ghetto Rigs x299 & z490 & x99
Processor 9940x w/Optimus SigV2 & 10900k w/Optimus Foundation & 5930k w/Noctua D15
Motherboard X299 Rampage VI Apex & z490 Maximus XII Apex & x99 Sabertooth
Cooling D5 combo/280 GTX/ VRM water block copper/280 GTX/ D5 Top/Optimus sigV2/TitanXp/Mora 360x2
Memory Trident-Z 3600C16 4x8gb & Trident-Z Royal 4000c16 2x16gb & Trident-Z 3200c14 4x8gb
Video Card(s) 1080ti FTW3 & Titan Xp & evga 980ti gaming
Storage 970 evo plus 500gb & 970 evo 500gb many sammy 2.5" ssd's and WD BLK hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 second floor for 2nd rad x2/ Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1200P2 & 1000P2 with APC AX1500 & 850P2 with CyberPower-GX1325U & 750P2
Mouse Redragon Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and Linux Cinnamon 20.2x2 & win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
Doesn't memory have a boost clock too?
 
Joined
Apr 28, 2011
Messages
990 (0.23/day)
Location
Botevgrad, Bulgaria, Europe
System Name Main PC/OldPC/3rd PC
Processor Intel Core i7-3770K Ivy Bridge/Core i5-3470 Ivy Bridge/Core i3-4330 Haswell
Motherboard ASUS P8Z77-V/ASRock Z68 Pro3 Gen1/ASUS H81M2
Cooling Cooler Master Hyper 212 EVO/Intel Box cooler/Intel Box cooler
Memory 32GB Corsair Vengeance/32GB ADATA/16GB ADATA
Video Card(s) SAPPHIRE R9 290 Tri-X OC 4GB/MSI RX 480 8GB/SAPPHIRE R9 390 8GB
Storage 2x1TB ADATA SSDs in RAID0+3 HDDs/2xCrucial 1TB SSDs in RAID0+3 HDDs/Samsung 1TB SSD+8TB+4TB HDDs
Display(s) Philips 274E5QHAB@HDMI + Philips 273EQH@DVI (both 27")
Case Fractal Design Define R4 Titanium
Audio Device(s) Kenwood Mini HiFi system/Microlab speakers/Philips HDMI (main)+LG TV monitor HDMI + Apple headphones
Power Supply Cooler Master Silent ProM 600 W (modular)
Mouse Microsoft Ergonomic Sculpt Desktop 2.0 (combo)@Razer Goliath mousepad (Medium speed)
Keyboard Microsoft Ergonomic Sculpt Desktop 2.0 (combo)
Software Win10 64-bit (Main PC v.1809 RTM Enterprise/2nd PC v.1903 Insider Preview Pro/3rd PC - same as 2nd)
Could you please post a screenshot of the whole GPU-Z window, without cropping it?
 
Joined
Feb 6, 2021
Messages
1,588 (2.38/day)
Location
Germany
Processor AMD Ryzen 7 5800X3D (-30 CO)
Motherboard MSI B550 Tomahawk
Cooling Noctua NH-D15
Memory 2x16GB G.Skill Trident Z Neo 3600 MHz CL16
Video Card(s) Gainward RTX 4090 Phantom Golden Sample
Storage Corsair MP 600 Pro 1TB, Crucial P5 Plus 2TB, 2x Samsung 870 QVO 4TB (external)
Display(s) Alienware AW2723DF, LG GP850-B, LG GN950-B
Case Streacom BC1 V2
Audio Device(s) Bose Companion Series 2 III, Sennheiser GSP600 and HD599 SE - Sharkoon Gaming DAC Pro S V2
Power Supply bequiet! Dark Power Pro 12 1200W Titanium
Mouse Logitech G303 Shroud Edition, G502 X
Keyboard Corsair K65 RGB Mini
VR HMD Oculus Rift S
Joined
May 8, 2016
Messages
1,470 (0.61/day)
System Name BOX
Processor Core i7 6950X @ 4,25GHz (1,28V)
Motherboard X99 SOC Champion (BIOS F23c + bifurcation mod)
Cooling Thermalright Venomous-X + 2x Delta 38mm PWM (Push-Pull)
Memory Patriot Viper Steel 4000MHz CL16 4x8GB (@3240MHz CL12.12.12.24 CR2T @ 1,48V)
Video Card(s) Titan Xp Star Wars "Empire" edition
Storage Samsung SM961 256GB NVMe, 1x WD10EZEX (1TB), 2x HGST HUS726060ALE610 (6TB)
Display(s) LG 27GP950-B
Case Fractal Design Meshify 2 XL
Audio Device(s) Motu M4 (audio interface) + ATH-A900Z + Behringer C-1
Power Supply Seasonic X-760 (760W)
Mouse Logitech RX-250
Keyboard HP KB-9970
Software Windows 10 Pro x64
Benchmark Scores https://www.passmark.com/baselines/V10/display.php?id=157143230295
@GerKNG Max boost clock doesn't mean anything (it's seen only for a fraction of a second before the card throttles).
Since the "guaranteed boost clock" shown in GPU-Z is more pointless than the memory clock, I'd say the frequency placement is good as is.
I think most NVIDIA cards with cooling problems will either throttle to base clock or have VRM issues (blown MOSFETs/shunt failure) before improper cooling becomes a serious problem.
 
Joined
Mar 21, 2021
Location
Colorado, U.S.A.

Attachments

  • GPU-Z.jpg (86.1 KB)
Joined
Feb 6, 2021
Location
Germany
@GerKNG Max boost clock doesn't mean anything (it's seen only for a fraction of a second before the card throttles).
Since the "guaranteed boost clock" shown in GPU-Z is more pointless than the memory clock, I'd say the frequency placement is good as is.
I think most NVIDIA cards with cooling problems will either throttle to base clock or have VRM issues (blown MOSFETs/shunt failure) before improper cooling becomes a serious problem.
We're talking about memory.
 
Joined
Mar 21, 2021
Location
Colorado, U.S.A.
Yes, the present ordering suggests (at least to me) that boost goes with memory.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
24,876 (3.67/day)
Processor Core i7-8700K
Memory 32 GB
Video Card(s) RTX 3080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Historically there were only GPU and memory clocks; then came shader, and eventually boost, for which I repurposed the third field (shader clock).

Changing it now will probably confuse everyone. I do have plans for a rewrite of GPU-Z, which will include a new GUI; at that point, changing the clock order seems like a great idea.
 
Joined
Mar 21, 2021
Messages
2,780 (4.46/day)
Location
Colorado, U.S.A.
System Name HP Compaq 8000 Elite CMT
Processor Intel Core 2 Quad Q9550
Motherboard Hewlett-Packard 3647h
Memory 16GB DDR3
Video Card(s) NVIDIA GeForce GT 1030 GDDR5 (fan-less)
Storage 2TB Seagate Firecuda 3.5"
Display(s) Dell P2416D (2560 x 1440)
Power Supply 12V HP proprietary
Software Windows 10 Pro 64-bit
Technology is a harsh mistress.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
37,639 (6.68/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
@GerKNG Max boost clock doesn't mean anything (it's seen only for a fraction of a second before the card throttles).
Since the "guaranteed boost clock" shown in GPU-Z is more pointless than the memory clock, I'd say the frequency placement is good as is.
I think most NVIDIA cards with cooling problems will either throttle to base clock or have VRM issues (blown MOSFETs/shunt failure) before improper cooling becomes a serious problem.
It helps to find the correct BIOS though, all three values, because the AIB pages typically show those values.

I've gotten so used to the way GPU-Z is sequenced that it's exactly the same in the VGA BIOS collection.
 
Joined
Oct 12, 2014
Messages
23 (0.01/day)
Location
Australia
Historically there were only GPU and memory clocks; then came shader, and eventually boost, for which I repurposed the third field (shader clock).

Changing it now will probably confuse everyone. I do have plans for a rewrite of GPU-Z, which will include a new GUI; at that point, changing the clock order seems like a great idea.
So what happens if you scan a card that has shader cores? Will the "Boost" label change to "Shader Core Clock", or stay "Boost" and confuse people? I also agree with the OP that "Boost" looks like it refers to the memory clock's boost (if it has one) and not the GPU core clock's boost.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Location
Republic of Texas (True Patriot)
So what happens if you scan a card that has shader cores? Will the "Boost" label change to "Shader Core Clock", or stay "Boost" and confuse people? I also agree with the OP that "Boost" looks like it refers to the memory clock's boost (if it has one) and not the GPU core clock's boost.
SPs
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
So what happens if you scan a card that has shader cores? Will the "Boost" label change to "Shader Core Clock"
Correct
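In other words, the third field's caption presumably just depends on whether the card's architecture has a separate shader clock domain. A rough sketch of that selection logic (purely hypothetical; the function and parameter names are made up and this is not GPU-Z's actual code):

```python
# Hypothetical sketch of how the third clock field could pick its caption.
# has_shader_domain is an invented name, used here only for illustration.

def third_field_caption(has_shader_domain: bool) -> str:
    """Tesla-era NVIDIA cards had a separate, faster shader clock domain,
    so the field shows the shader clock; on newer cards the same field was
    repurposed to show the boost clock instead."""
    return "Shader Core Clock" if has_shader_domain else "Boost"

print(third_field_caption(True))   # a card with a separate shader domain
print(third_field_caption(False))  # a modern card with a boost clock
```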
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Location
Republic of Texas (True Patriot)
Joined
Oct 8, 2015
Messages
604 (0.23/day)
Location
Earth's Troposphere
System Name cvasi-rocket
Processor AMD Ryzen R7 5800X3D @ 1V doing 4GHz
Motherboard Asus ROG Crosshair VI Extreme
Cooling Air;oldie but a goodie/brand new monoblock in box
Memory 4 8GB SK Hynix dual rank DDR4 dimms @ 2133MT-3200MT
Video Card(s) Gigabyte RX 6900XT Gaming OC @underclocked and undervolted
Storage 480 GB Corsair MP510 , Kingston A400 480 GB SSD
Display(s) Dell UltraSharp U2410 , HP 24x
Case in the box
Audio Device(s) onboard
Power Supply Seasonic Focus PX-550
Mouse Razer Deathadder (2018 model)
Keyboard wireless , Logitech
Software W10.something or another
@regorwin56 As far as I recall from the early-to-mid 2000s, when I got into PCs, all Radeon GPUs had the shaders running at the same clocks as the rest of the GPU; I can't speak for earlier ones, and NVIDIA has its own GPU architectures.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
On NVIDIA's Tesla architecture (~2006), the shader cores (aka the GPU cores, the math units) were clocked at a higher rate than the rest of the GPU (like the TMUs and ROPs).
Nowadays all GPUs run their important parts at the same clock speed.

I remember seeing that AMD didn't want GPU-Z to display shaders before
So I'm wondering if NVIDIA has the same requirement
There is no such requirement from either company
 
Joined
Aug 27, 2021
Messages
106 (0.23/day)
On NVIDIA's Tesla architecture (~2006), the shader cores (aka the GPU cores, the math units) were clocked at a higher rate than the rest of the GPU (like the TMUs and ROPs).
Nowadays all GPUs run their important parts at the same clock speed.


There is no such requirement from either company
I may have misremembered that, sorry
 
Joined
Mar 21, 2021
Location
Colorado, U.S.A.
The other thing that I find strange about some of the boxes is that they suggest they are selectable, perhaps simply because they have a white background.
 

Attachments

  • GPU-Z.jpg (109.4 KB)
  • CPU-Z.jpg (99.9 KB)
Joined
Apr 28, 2011
Location
Botevgrad, Bulgaria, Europe
Nothing "selectable" here!?
Please note the static text boxes (also known as "labels") simply use a sunken-edge effect, a common technique in Windows GUI programming to make the boxes and the text in them stand out and be easier to read. :)
If the sunken edge were not there, the GUI would look much worse and be harder to read. :)
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
He means the checkboxes, which are not in a "disabled" state (gray background). I specifically decided against that, so as not to suggest "unknown" or "feature disabled", which would puzzle the user. I'd rather have them click and notice that they can't disable CUDA.
 