
GPU-Z frequency location alteration idea/request?

Nice, this will be a good update to GPU-Z.
 


My suggestion would be to add another row between the GPU and Default clock with "Current Clocks", and update the "Pixel Fillrate" and "Texture Fillrate" entries to reflect the "Current Clock" instead, like memory bandwidth does.
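For what it's worth, the arithmetic behind that suggestion is trivial - pixel/texture fillrate and memory bandwidth all scale linearly with their respective clocks, so swapping in the current clock is just a different input. A minimal C sketch (the struct, field names and card specs are made up for illustration, not GPU-Z's actual internals):

#include <stdio.h>

/* Illustrative only -- not GPU-Z's real code or data layout. */
typedef struct {
    int rops;           /* render output units */
    int tmus;           /* texture mapping units */
    int bus_width_bits; /* memory bus width in bits */
    double mem_mult;    /* effective transfers per memory clock, e.g. 8 for GDDR6 (assumption) */
} GpuSpecs;

/* Fillrates scale linearly with whichever GPU clock you plug in, so
 * showing "current" instead of "default" numbers only changes the argument. */
static void print_rates(GpuSpecs g, double gpu_mhz, double mem_mhz)
{
    double pixel_gps = g.rops * gpu_mhz / 1000.0; /* GPixel/s */
    double texel_gts = g.tmus * gpu_mhz / 1000.0; /* GTexel/s */
    double bw_gbs = (g.bus_width_bits / 8.0) * mem_mhz * g.mem_mult / 1000.0; /* GB/s */
    printf("%7.1f GPixel/s %7.1f GTexel/s %7.1f GB/s\n", pixel_gps, texel_gts, bw_gbs);
}

int main(void)
{
    GpuSpecs card = { 64, 160, 256, 8.0 }; /* hypothetical card */
    print_rates(card, 2100.0, 1218.0);     /* "default" clocks  */
    print_rates(card, 2283.0, 1218.0);     /* "current" clocks  */
    return 0;
}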
 
^^^ That is a good idea.

Let's hope you make it beautiful and modern. Maybe include an Aero look.

Why not replace the AMD/NVIDIA brand logo with a photo of the actual graphics card in use?!
Also, Technologies should be near the top, not at the bottom.
You can order it this way:
Row 1: Name
Row 2: GPU; Die Size; Transistors (count in billions/millions)
Row 3: Manufacturing Process; Revision; Release Date
Row 4: AIB; Device ID
Row 5: VBIOS Version
Row 6: Shaders; Shader Clock; ROPs/TMUs count
Row 7: VRAM Type; Memory Clock; Bus Width
Row 8: Bus Interface
Row 9: Pixel & Texture Fillrate; Memory Bandwidth
Row 10: DirectX Support; Technologies; Computing
Row 11: Driver Data
 
One change that was not made in 2.60.0 was the speed ordering:

GPU Clock/Memory/Boost
  • The first refers to the GPU
  • The second to the memory
  • The third to the GPU again
 
One change that was not made in 2.60.0 was the speed ordering:

GPU Clock/Memory/Boost
  • The first refers to the GPU
  • The second to the memory
  • The third to the GPU again

In older versions, the box now called "Boost" was labeled "Shader" instead. So-called "uncore" (GPU Clock) and "core" (Shader) clocks... :rolleyes:

 
Now don't go confusing me...

Life is hard enough already
 
In older versions, the box now called "Boost" was labeled "Shader" instead. So-called "uncore" (GPU Clock) and "core" (Shader) clocks... :rolleyes:

Honestly?
I would just drop support for older GPUs, since there's no way (or point) to keep it all in one build.
Just declare one of the older GPU-Z versions a "Legacy" release and tweak it a bit to fit those cards better.
Making a GPU-Z "Legacy edition" for Windows XP/7 should be done eventually (maybe adding a few old "weird" GPUs like Matrox Parhelia and SiS parts, plus some extra checks for old-driver compatibility?).
Well, you'd only have to do it once, since there won't be any new cards that support Windows XP or 7 at this point, after all.

Side note: I'm using GPU-Z v0.7.2 on XP because the old drivers (NV 93.71) have issues with newer GPU-Z versions (BSOD/crash).
 
GPU-Z needs to be fixed; then the confusion would eventually stop ;)
It's not broken though??

Older versions had the thing called "Boost" replaced by something called "Shader". So called "uncore" (GPU Clock) and "core" (Shader) clocks... :rolleyes:

View attachment 359277
These boxes change automatically depending on which GPU is installed. Yours has a shader clock that can be unlinked from the core clock and run at a different speed, which is why it's shown - you can overclock the shader domain separately. An RTX card doesn't support this feature, so it doesn't display in this fashion.
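Presumably (this is a guess at the logic, not GPU-Z's real detection code, which is driver-based) the third box just switches its label on a per-GPU capability flag, something like this C sketch:

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical capability flag for illustration only. */
typedef struct {
    bool unlinked_shader_clock; /* true on e.g. Tesla-era NVIDIA GPUs */
} GpuCaps;

static const char *third_box_label(GpuCaps caps)
{
    /* Cards with a separately clocked shader domain get a "Shader"
     * box; newer cards show the rated "Boost" clock instead. */
    return caps.unlinked_shader_clock ? "Shader" : "Boost";
}

int main(void)
{
    GpuCaps old_card = { true }, rtx_card = { false };
    printf("old card: GPU Clock / Memory / %s\n", third_box_label(old_card));
    printf("RTX card: GPU Clock / Memory / %s\n", third_box_label(rtx_card));
    return 0;
}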
 
It's not broken though??

What is the difference between the upper and lower rows, stating GPU Clock vs. Default Clock, Memory vs. Memory, Boost vs. Boost? When do you ever see a difference between the values in these rows?
Why does a "Boost" box even exist, when we know these are marketing clocks - there are a base clock, a game clock and a boost clock? Which is which in GPU-Z? :D

 
What is the difference between the upper and lower rows, stating GPU Clock vs. Default Clock, Memory vs. Memory, Boost vs. Boost? When do you ever see a difference between the values in these rows?
Why does a "Boost" box even exist, when we know these are marketing clocks - there are a base clock, a game clock and a boost clock? Which is which in GPU-Z? :D

That's not GPU-Z's or the author's fault that NVidia is fKn stupid and advertises this garbage.

So again.

GPU-Z works great!! I'd just love to see some live clocks on the main tab, because of exactly what you're saying. Catch my drift?
 
That's a direct contradiction: it doesn't work well, because of the chaotic arrangement of things on the main tab, and it does need fixes to make all areas work - even you admit that.



No..
So it's W1zzard's fault that NV advertises base clocks and boost clocks, both of which modern video cards hardly ever (if ever) actually run, and you're bitching about the arrangement of things on the main tab?

If the information we are given isn't accurate, then it doesn't matter what the Fk order it's put in.

None of this means GPU-Z is broken. That's just you making shit up.
 
NVidia is fKn stupid and advertises this garbage

So it's W1zzard's fault that NV advertises

It's not NVIDIA, but AMD - the aforementioned clocks are for an AMD Radeon.
GPU-Z's fault is that the clocks on the main tab don't represent anything: they are neither actual/live readings, nor do they represent base/game/boost values.

 
So it's W1zzard's fault that NV advertises base clocks and boost clocks, both of which modern video cards hardly ever (if ever) actually run, and you're bitching about the arrangement of things on the main tab?

If the information we are given isn't accurate, then it doesn't matter what the Fk order it's put in.

None of this means GPU-Z is broken. That's just you making shit up.
The main reason I like GPU-Z is that it shows the official specs of the card I own on the main page - it's a very quick advanced spec-info tool. You can then google GPU-Z screenshots and whatnot to find and compare against other cards. Sensors are a nice added bonus, but not the main reason I use the tool, which is why I asked W1zzard above to try not to lose any of the static spec info.

Interestingly, the TPU web site has a section that shows this info as well.
 
Side note: I'm using GPU-Z v0.7.2 on XP because the old drivers (NV 93.71) have issues with newer GPU-Z versions (BSOD/crash).
how about reporting the issue, so that I can fix it?

they are neither actual/live readings, nor do they represent base/game/boost values.
What do you see on Advanced -> General?

These are the base/game/boost clocks reported by the card (driver). The first tab should show these game and boost numbers.

Which card is that?
 
how about reporting the issue, so that I can fix it?
I can?
It's a Windows XP driver on a Windows XP OS - I thought those were "not supported"?
 
It's not NVIDIA, but AMD - the aforementioned clocks are for an AMD Radeon.
GPU-Z's fault is that the clocks on the main tab don't represent anything: they are neither actual/live readings, nor do they represent base/game/boost values.


This is nitpicking - "base clock" is the absolute worst that AMD will warranty here under maximum synthetic load; it shouldn't go below this. 2280 MHz is the nominal, default clock speed, and it will go up to 2695 MHz if cooling and power constraints allow. The GPU-Z readings of 2283/2693 are accurate - more so than the manufacturer's description - and the mismatch is down to clock granularity: 2280 and 2695 are invalid targets, so the hardware rounds off to the nearest valid one.
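To picture the snapping: if the silicon only accepts clocks at fixed steps, any requested target rounds to the nearest reachable value. A hedged C sketch - the step size here is an assumption picked for illustration, not AMD's documented granularity:

#include <math.h>
#include <stdio.h>

/* Snap a requested clock to the nearest valid hardware target.
 * step_mhz is an ASSUMED value; real granularity varies per GPU. */
static double snap_clock(double requested_mhz, double step_mhz)
{
    return round(requested_mhz / step_mhz) * step_mhz;
}

int main(void)
{
    double step = 6.7; /* assumed ~6.7 MHz granularity */
    printf("2280 MHz -> %.0f MHz\n", snap_clock(2280.0, step)); /* lands on a nearby valid clock */
    printf("2695 MHz -> %.0f MHz\n", snap_clock(2695.0, step)); /* 2693 with this step */
    return 0;
}

(Compile with -lm.)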

I can?
It's a Windows XP driver on a Windows XP OS - I thought those were "not supported"?
GPU-Z is still supported on XP and Vista - @W1zzard fixed two bugs I found and reported on Vista, regarding the installer and driver version detection, in this release. Thank you very much, by the way! :toast:
 
I can?
It's a Windows XP driver on a Windows XP OS - I thought those were "not supported"?


Of course, always report - even if it's unsupported, I'll still look into it and do my best to fix it, work around it, or explain why it's not supported.
 
@W1zzard - Sorry to be a bother, but have an additional request, if I may.

The camera for screenshots saves as a GIF. Is there any way to have it save as PNG or JPEG?

Many thanks in advance, and thanks for your time!
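(No idea how GPU-Z captures or encodes its screenshots internally, but for scale: writing raw pixels out as a PNG is tiny with a single-header library like stb_image_write. A hedged C sketch with a dummy buffer, not GPU-Z's actual capture path:)

/* Dump an RGB buffer to PNG using the public-domain stb_image_write
 * single-header library (github.com/nothings/stb). Illustration only;
 * the buffer here is a test pattern, not a real screen capture. */
#define STB_IMAGE_WRITE_IMPLEMENTATION
#include "stb_image_write.h"

int main(void)
{
    enum { W = 320, H = 200 };
    static unsigned char rgb[W * H * 3]; /* pretend this holds the capture */
    for (int i = 0; i < W * H * 3; i++)
        rgb[i] = (unsigned char)(i & 0xFF); /* fill with a simple test pattern */
    return stbi_write_png("screenshot.png", W, H, 3, rgb, W * 3) ? 0 : 1;
}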
 
I hope I'm not an absolute arse but...

I wish these were full-on lowercase:

[screenshot: the labels in question]
 