
GPU-Z box ordering

Joined: Mar 21, 2021
Messages: 5,510 (3.64/day)
Location: Colorado, U.S.A.
System Name CyberPowerPC ET8070
Processor Intel Core i5-10400F
Motherboard Gigabyte B460M DS3H AC-Y1
Memory 2 x Crucial Ballistix 8GB DDR4-3000
Video Card(s) MSI Nvidia GeForce GTX 1660 Super
Storage Boot: Intel OPTANE SSD P1600X Series 118GB M.2 PCIE
Display(s) Dell P2416D (2560 x 1440)
Power Supply EVGA 500W1 (modified to have two bridge rectifiers)
Software Windows 11 Home
I find the clock rate arrangement potentially confusing:

GPU Clock 1228 MHz | Memory 1502 MHz | Boost 1468 MHz

would this not be better arranged as

GPU Clock/Boost 1228 MHz/1468 MHz | Memory 1502 MHz

or else it almost looks like the memory slows down on boost.
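For illustration, a minimal sketch contrasting the current row with the proposed one (illustration only; these are not GPU-Z's actual format strings):

    #include <cstdio>

    int main()
    {
        // Current GPU-Z row: Boost sits next to Memory, which reads ambiguously.
        std::printf("GPU Clock %d MHz | Memory %d MHz | Boost %d MHz\n",
                    1228, 1502, 1468);
        // Proposed row: base and boost grouped together, memory on its own.
        std::printf("GPU Clock/Boost %d/%d MHz | Memory %d MHz\n",
                    1228, 1468, 1502);
        return 0;
    }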
 

Attachments: GPU-Z.jpg
Hi,
Doesn't memory have a boost clock too?
 
Could you please post a screenshot of the whole GPU-Z window, without cropping it?
 
@GerKNG The max boost clock doesn't mean anything (it's seen only for a fraction of a second before the card throttles).
Since the "Guaranteed boost clock" shown in GPU-Z is even more pointless than the memory clock, I'd say the frequency placement is good as is.
I think most NVIDIA cards with cooling problems will either throttle to base clock or have VRM issues (blown MOSFETs/shunt failure) before improper cooling becomes a serious problem.
 

Attachments: GPU-Z.jpg
@GerKNG The max boost clock doesn't mean anything (it's seen only for a fraction of a second before the card throttles).
Since the "Guaranteed boost clock" shown in GPU-Z is even more pointless than the memory clock, I'd say the frequency placement is good as is.
I think most NVIDIA cards with cooling problems will either throttle to base clock or have VRM issues (blown MOSFETs/shunt failure) before improper cooling becomes a serious problem.
We're talking about memory.
 
Yes, the present ordering suggests (at least to me) that boost goes with memory.
 
Historically there were only GPU and memory clocks, then shader was added, and eventually boost, for which I repurposed the 3rd field (shader clock).

Changing it now will probably confuse everyone. I do have plans for a rewrite of GPU-Z, which will include a new GUI; at that point, changing the clock order seems like a great idea.
 
Technology is a harsh mistress.
 
@GerKNG The max boost clock doesn't mean anything (it's seen only for a fraction of a second before the card throttles).
Since the "Guaranteed boost clock" shown in GPU-Z is even more pointless than the memory clock, I'd say the frequency placement is good as is.
I think most NVIDIA cards with cooling problems will either throttle to base clock or have VRM issues (blown MOSFETs/shunt failure) before improper cooling becomes a serious problem.
It helps to find the correct BIOS though, all three values, because the AIB pages typically show those values.

I've gotten so used to the way GPU-Z is sequenced that it's exactly the same in the VGA BIOS collection.
 
Historically there were only GPU and memory clocks, then shader was added, and eventually boost, for which I repurposed the 3rd field (shader clock).

Changing it now will probably confuse everyone. I do have plans for a rewrite of GPU-Z, which will include a new GUI; at that point, changing the clock order seems like a great idea.
So what happens if you scan a card that has shader cores? Will the "Boost" label change to "Shader Core Clock", or stay "Boost" and confuse people? I also agree with the OP that "Boost" looks like it's referring to the memory clock's boost (if it has one) and not the GPU core clock's boost.
 
So what happens if you scan a card that has shader cores? Will the "Boost" label change to "Shader Core Clock", or stay "Boost" and confuse people? I also agree with the OP that "Boost" looks like it's referring to the memory clock's boost (if it has one) and not the GPU core clock's boost.
SPs
 
So what happens if you scan a card that has shader cores? Will the "Boost" label change to "Shader Core Clock"
Correct
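For illustration, a minimal sketch (hypothetical names, not GPU-Z's actual code) of how that repurposed third field can switch its caption per card:

    #include <cstdio>

    struct GpuClocks {
        int  coreMHz, memoryMHz, thirdMHz;
        bool hasShaderDomain;  // true on e.g. pre-Fermi NVIDIA cards with a separate shader clock
    };

    void printClockRow(const GpuClocks& c)
    {
        // The third field reads "Shader" on cards with a shader clock domain, "Boost" otherwise.
        const char* third = c.hasShaderDomain ? "Shader" : "Boost";
        std::printf("GPU Clock %d MHz | Memory %d MHz | %s %d MHz\n",
                    c.coreMHz, c.memoryMHz, third, c.thirdMHz);
    }

    int main()
    {
        printClockRow({1228, 1502, 1468, false});  // modern card: third field is Boost
        printClockRow({602, 1107, 1296, true});    // Tesla-era card (GTX 280-like): Shader
        return 0;
    }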
 
@regorwin56 As far as I recall from the early-to-mid 2000s when I got into PCs, all Radeon GPUs had the shaders running at the same clock as the rest of the GPU; I can't speak about earlier ones, and Nvidia has its own GPU architectures.
 
NVIDIA, like AMD, doesn't want to display shaders(?)
On NVIDIA's Tesla architecture (~2006), the shader cores (aka the GPU's math units) were clocked at a higher rate than the rest of the GPU (like the TMUs and ROPs).
Nowadays all GPUs run their important parts at the same clock speed.

I remember seeing that AMD didn't want GPU-Z to display shaders before, so I was wondering if NVIDIA has the same requirement.
There is no such requirement from either company.
 
On NVIDIA's Tesla architecture (~2006), the shader cores (aka the GPU's math units) were clocked at a higher rate than the rest of the GPU (like the TMUs and ROPs).
Nowadays all GPUs run their important parts at the same clock speed.


There is no such requirement from either company.
I may have misremembered that, sorry.
 
The other thing I find strange about some of the boxes is that they suggest they are selectable, perhaps simply because they have a white background.
 

Attachments: GPU-Z.jpg, CPU-Z.jpg
Nothing "Selectable" here!?
Please note the static text boxes (also known as "labels") simply use a sunken-edge effect, a common technique in Windows GUI programming to make the boxes and the text in them stand out and easier to read. :)
If the sunken edge were not there, the GUI would look much worse and be harder to read. :)
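For anyone curious, a minimal Win32 sketch of such a sunken label (a hypothetical helper; GPU-Z's actual code may differ):

    #include <windows.h>

    // Creates a read-only label with a recessed border. SS_SUNKEN (or the
    // WS_EX_STATICEDGE extended style) draws the sunken edge; the control is
    // still a plain STATIC label, so there is nothing to select or edit.
    HWND CreateSunkenLabel(HWND hParent, HINSTANCE hInst,
                           const wchar_t* text, int x, int y, int w, int h)
    {
        return CreateWindowExW(
            WS_EX_STATICEDGE,                   // sunken client edge
            L"STATIC", text,
            WS_CHILD | WS_VISIBLE | SS_SUNKEN,
            x, y, w, h,
            hParent, NULL, hInst, NULL);
    }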
 
He means the checkboxes, which are not in the "disabled" state (gray background). I specifically decided against that, so as not to suggest "unknown" or "feature disabled", which would puzzle the user. I'd rather have them click and notice they can't disable CUDA.
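A minimal sketch of that behavior in a Win32 dialog procedure (IDC_CUDA and cudaSupported are assumed names for illustration, not GPU-Z's actual code):

    #include <windows.h>

    #define IDC_CUDA 1001             // assumed control ID for the CUDA checkbox
    static BOOL cudaSupported = TRUE; // assumed: capability detected at startup

    INT_PTR CALLBACK DlgProc(HWND hDlg, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_COMMAND &&
            LOWORD(wParam) == IDC_CUDA && HIWORD(wParam) == BN_CLICKED)
        {
            // The box stays in its normal (non-grayed) state; clicking it just
            // snaps it back to the detected value, so the user sees that the
            // capability cannot be toggled.
            CheckDlgButton(hDlg, IDC_CUDA,
                           cudaSupported ? BST_CHECKED : BST_UNCHECKED);
            return TRUE;
        }
        return FALSE;
    }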
 