
Hardware-accelerated GPU scheduling (HAGS): worth enabling?

izy
Hi, I'm just curious whether enabling it gives any benefits or not. From what I've heard, it helps people with lower-end CPUs (in CPU-bound scenarios), or it can help with 1% lows / input latency in games.
I didn't notice any changes with my 3700X, maybe because it's an 8-core CPU and has spare cores for scheduling? I haven't benchmarked it in many games/apps, but at first look I didn't notice any benefit from having it on.
Share your experience and list your CPU/GPU in your post, so we can see which hardware/scenarios it makes a difference for (if it does), and whether that difference is positive or negative.
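If you'd rather check the current state from a script than dig through the Settings app, here's a minimal Python sketch (read-only, so it can't break anything) that reads the registry value the Windows toggle writes: HwSchMode under the GraphicsDrivers key, where 2 = on and 1 = off.

```python
# Minimal sketch: read the HAGS toggle straight from the registry.
# HwSchMode = 2 means HAGS is on, 1 means off; if the value is missing,
# Windows uses its default for your GPU/driver combo.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

def hags_state() -> str:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        try:
            value, _ = winreg.QueryValueEx(key, "HwSchMode")
        except FileNotFoundError:
            return "not set (OS default)"
    return {1: "off", 2: "on"}.get(value, f"unknown ({value})")

print("HAGS:", hags_state())
```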
 
Your CPU is Zen 2 based with the chiplet fully enabled, i.e. two CCXs (4 cores each, with 16 MB of L3) making up one CCD (one chiplet).
As far as I've read (correct me if I'm wrong about this), cross-talk between CCXs within the same die goes over the IF (Infinity Fabric), so some added internal latency appears.
Late edit: it should not make a great difference for general-purpose gaming.
 
Yes, but I don't think low-threaded games use both CCXs (World of Warcraft, for example), so I don't know if it really helps, since the CPU is mostly free anyway.
A screenshot with unlocked FPS in WoW, HAGS on (WoW is hard to benchmark anyway):
[WoW screenshot attached]

Edit: Anyway, I've heard that in GPU-bound scenarios you can get lower performance with it ON.
 
I didn't notice any changes
I didn't notice any benefit from having it on.

I think you answered your own question. You tried it, you didn't notice. End of story.

For you and your system, it provides no help.
 
I was curious in general, not just about my system. People report different results around the internet, and maybe in some scenarios there are benefits and I just don't notice them, as I haven't done any in-depth benchmarking.
 
HAGS was hyped as some sort of savior that would provide immediately measurable benefits to presentation latency and graphics performance, but that is not the case.

Treat HAGS as groundwork being laid for future low-level optimizations in graphics drivers. Whether it is on or off should make no difference to end users right now. Leave it enabled if your hardware supports it; otherwise, do not lose any sleep over it. Here is a post by Microsoft on how it works:

Cheers :toast:
 
maybe in some scenarios there are benefits and I just don't notice them, as I haven't done any in-depth benchmarking.
This is actually why I was very careful how I worded my last reply.

IMO, too many people allow benchmarking results to influence, not just their decisions, but their perceptions as well. And that, IMO, is wrong.

Too many times I have seen gamers who were perfectly happy with their computers, thoroughly enjoying and totally engrossed in their games, until they ran a benchmark program and learned somebody else had a few more FPS, or got a few more clock cycles out of their CPU, or was running a few degrees cooler. Then suddenly, they "saw" stuttering they never saw before. Or suddenly this game "seemed" sluggish, no longer as fun. Nonsense. The placebo effect works both ways. It can make one think things are better, or it can make one think things are worse. Yet in reality, in a blind, side-by-side A/B test, they could not tell the difference.

I am not saying benchmark programs are worthless or harmful. I just think they frequently are used incorrectly. I use them with new builds to establish a baseline. Then, in a year or two, or after a major upgrade, I "might" run it again to see how it compares to the original baseline - not to someone else's computer.
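To make the "baseline, not leaderboard" idea concrete, here is a toy Python sketch of that workflow. The file name and scores are made up for illustration; the point is you compare against your own history, not someone else's rig.

```python
# Toy sketch: save your first scores as a baseline, then report the
# percent change on later runs. bench_baseline.json is a made-up name.
import json, os, statistics

BASELINE_FILE = "bench_baseline.json"

def record_or_compare(new_scores):
    if not os.path.exists(BASELINE_FILE):
        with open(BASELINE_FILE, "w") as f:
            json.dump(new_scores, f)
        print("Baseline saved:", round(statistics.mean(new_scores)))
        return
    with open(BASELINE_FILE) as f:
        baseline = json.load(f)
    old, new = statistics.mean(baseline), statistics.mean(new_scores)
    print(f"Baseline {old:.0f} -> now {new:.0f} ({(new - old) / old:+.1%})")

record_or_compare([10180, 10220, 10200])  # e.g. three runs of the same test
```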

I was curious in general, not just about my system
Yeah, I get that. But effectively, every single one of the ~1.6 billion Windows computers out there became a unique computer within the first few minutes after the very first boot, as users set up their own unique networking, attached their own peripherals (from thousands of different makers), and installed and configured their own security software and applications.

My point is, such comparisons really don't even satisfy curiosity. You would need two 100% identical computers, set up identically. You would pretty much have to sit at the end of a Dell assembly line and catch the next two identical computers as they fall off the conveyor belt, then set them up identically, with the only difference being the feature enabled on one and disabled on the other. And even then, it would NOT tell you how it will work (or not work) with your computer.
 
You're right, but maybe we could see the difference between, for example, a 4-, 6-, or 8-core CPU, or between Zen 2, Zen 3, and Intel (or different GPUs), etc. Maybe some setups do see improvements. From what I've read around, it seems you can get some gains with a 4-core CPU when you're CPU-bound and have a decent GPU, or you could get higher 1% lows even with better CPUs. Anyway, my question was out of curiosity, as I don't see people talking about it anymore. If some folks with 4-core CPUs see this post, maybe they'll enable it (if there is a real benefit for their setup; many don't even know the option exists in Windows).
 
Try enabling CPPC and CPPC Preferred Cores, then see if it makes a difference. Also, enter developer mode in the NVIDIA Control Panel and enable GPU performance counters.
 
See where? That's my point. On benchmarks? Sure. In real-world gameplay? Maybe not.

And again, everything else would have to be equal to ensure nothing else influences or skews the results. For example, it would have to be the exact same motherboard running the exact same BIOS firmware version. The exact same RAM. Same drives. Even the voltages coming out of the PSUs.

Again, I understand your point. I am just saying, unless EVERYTHING else is identical, any resulting scores will have to be posted with an asterisk (*). And that will always leave doubt as to whether the resulting score is truly valid.
 
I agree, and I understand your point too. The margins would be too small to be sure whether there are improvements or not, but maybe you would notice better 1% lows (or less stuttering, etc.).
I found a YouTube video from last year with some benchmarks, HAGS off vs. on (it would be more interesting if it had been done with lower-end hardware):
 
Bottom line would be: I don't believe it hurts anything.
 
Certainly does not hurt to try and see what happens.
 
IMO it's easy to fall into an obsession with things like FPS. I went through a stage years ago where I was headed in that direction, running FRAPS and checking it frequently, but I eventually stopped, because in some cases 60 FPS mattered and in some cases it didn't. I like to have at least 60 FPS in shooters, but in some other genres 40 to 50 FPS is fine with me. I don't need FRAPS to tell me when my FPS is too low. It's obvious.
 
^ THIS.

Feel/frame pacing is the most important thing, and it's something you feel. Some games feel like garbage even at high FPS.
 
With HAGS on, and after I changed line-based to MSI for my GPU, the CPU score increased in 3DMark (not by much, just about 200-300 points more than what I was getting before). ^_^
 
What's line-based?
 
You can switch from line-based signaling to MSI (Message Signaled Interrupts) for your GPU with NVCleanstall, or you can use the MSI_util_v3 tool. NVIDIA, for some reason, is still using line-based signaling.
You can find more info (and the tool) here:
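If you're curious where that setting actually lives, here is a read-only Python sketch that walks the same registry branch the MSI utility edits: the MSISupported value under each PCI device's Interrupt Management key (1 = MSI, 0 = line-based). Nothing here writes anything; flipping the value is what the tool does, and that needs admin rights and some care.

```python
# Read-only sketch: list PCI devices that expose the MSISupported flag,
# i.e. the same value MSI_util_v3 toggles. 1 = MSI, 0 = line-based.
import winreg

ENUM = r"SYSTEM\CurrentControlSet\Enum\PCI"
MSI_KEY = r"Device Parameters\Interrupt Management\MessageSignaledInterruptProperties"

def subkeys(path):
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        return [winreg.EnumKey(key, i) for i in range(winreg.QueryInfoKey(key)[0])]

for dev in subkeys(ENUM):
    try:
        instances = subkeys(f"{ENUM}\\{dev}")
    except OSError:
        continue
    for inst in instances:
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                                f"{ENUM}\\{dev}\\{inst}\\{MSI_KEY}") as k:
                msi, _ = winreg.QueryValueEx(k, "MSISupported")
            print(dev, inst, "MSI" if msi else "line-based")
        except OSError:
            pass  # device does not expose the MSI key
```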
 
What's line-based?

They are referring to the PCI Express interrupt method.

With HAGS on, and after I changed line-based to MSI for my GPU, the CPU score increased in 3DMark (not by much, just about 200-300 points more than what I was getting before). ^_^

This is nonsense, and your result is within the margin of error. You should re-enable message signaled interrupts; modern GPUs are designed to work with them.
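A crude way to sanity-check a "margin of error" argument is to do several runs per setting and see whether the mean +/- 2x standard deviation intervals overlap. Quick Python sketch with made-up scores (not the actual runs from this thread), just to show how easily a ~300-point gap can sit inside run-to-run spread:

```python
# Made-up example scores, chosen only to illustrate the overlap test.
import statistics

before = [10050, 10350, 10200]  # hypothetical runs, setting A
after  = [10400, 10550, 10450]  # hypothetical runs, setting B

def summarize(label, runs):
    m, s = statistics.mean(runs), statistics.stdev(runs)
    print(f"{label}: {m:.0f} +/- {s:.0f}")
    return m, s

m1, s1 = summarize("before", before)
m2, s2 = summarize("after", after)

# Crude check: do the mean +/- 2*stdev intervals intersect?
overlap = (m1 - 2 * s1) <= (m2 + 2 * s2) and (m2 - 2 * s2) <= (m1 + 2 * s1)
print("difference within noise?", overlap)
```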
 
NVIDIA by default is using line-based signaling, and that is exactly what I did: I enabled MSI. It could be within the margin of error, but I was getting around 10,200 across multiple benchmark runs before, and after changing those two settings I'm getting 10,500. But you might be right.
[3DMark screenshot attached]
 
NVIDIA has not used line-based signaling in its drivers by default for several years at this point, unless you are running an obsolete platform.

Pay close attention to the IRQ value in the tool; if it is negative, the device is not operating in line-based mode:

[Screenshot: MSI utility showing a negative IRQ value]


Either way, these are low-level "tweaks" that more often than not backfire on the end user; believe me when I tell you that you are reading way too much into it. This setting is intended to lower interrupt latency overhead and will not make your 3DMark scores improve. In fact, if you set the incorrect interrupt mode, all you are going to do is brick your system, unless you have prepared a way to restore it beforehand.

If you want more performance, overclock or upgrade.
 
I know how it works, and my GPU (2060 Super) was using line-based signaling after a clean driver install (tested twice). I wasn't expecting higher performance, but after turning HAGS on and enabling MSI on the GPU, I ran 3DMark to test stability and noticed that I get higher scores than before; not by much, but I wasn't hitting those values before.
 
If anyone cares, these are the results for my NVIDIA card using the latest driver (531.18) on Windows 10.

PUBG - HAGS OFF
Avg: 85.417 - Min: 62 - Max: 167
Avg: 86.781 - Min: 64 - Max: 168
Avg: 86.392 - Min: 64 - Max: 162

PUBG - HAGS ON
Avg: 87.528 - Min: 64 - Max: 170
Avg: 87.319 - Min: 64 - Max: 167
Avg: 87.306 - Min: 64 - Max: 166

CS:GO - HAGS OFF
207.14 fps ( 4.83 ms/f) 14.659 fps variability
206.02 fps ( 4.85 ms/f) 13.715 fps variability
207.24 fps ( 4.83 ms/f) 13.806 fps variability

CS:GO - HAGS ON
206.27 fps ( 4.85 ms/f) 13.481 fps variability
206.68 fps ( 4.84 ms/f) 13.531 fps variability
209.72 fps ( 4.77 ms/f) 12.714 fps variability

AoE 2 - HAGS OFF
1152
1152
1145

AoE 2 - HAGS ON
1172
1152
1159

Crysis Remastered - HAGS OFF
Run 1
Average FPS: 53.46
Average FPS: 69.94
Average FPS: 70.21
Average FPS: 70.08

Run 2
Average FPS: 54.28
Average FPS: 69.15
Average FPS: 70.51
Average FPS: 70.42

Run 3
Average FPS: 58.56
Average FPS: 70.40
Average FPS: 70.35
Average FPS: 70.22

Crysis Remastered - HAGS ON
Run 1
Average FPS: 54.81
Average FPS: 70.94
Average FPS: 70.88
Average FPS: 70.91

Run 2
Average FPS: 58.94
Average FPS: 71.17
Average FPS: 72.41
Average FPS: 70.93

Run 3
Average FPS: 58.75
Average FPS: 71.01
Average FPS: 71.13
Average FPS: 71.01
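
For anyone who wants the deltas without a calculator, here is a quick Python sketch over the PUBG and CS:GO averages above; they work out to roughly +1.4% and +0.4% with HAGS on, which is small enough to be hard to separate from run-to-run noise.

```python
# Averages copied from the result tables above.
from statistics import mean

results = {
    "PUBG":  {"off": [85.417, 86.781, 86.392], "on": [87.528, 87.319, 87.306]},
    "CS:GO": {"off": [207.14, 206.02, 207.24], "on": [206.27, 206.68, 209.72]},
}

for game, runs in results.items():
    off, on = mean(runs["off"]), mean(runs["on"])
    print(f"{game}: off {off:.2f} fps, on {on:.2f} fps ({(on - off) / off:+.2%})")
```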
 
Are all those games heavily threaded?

Where it's most likely to help is in single- and dual-threaded games, where CPU cores are often maxed out.
 
It's also required for DLSS 3.
 