
Idle issue since 5060 ti installed

Mine's always showing PerfCap as VRel, even at idle.

I ran Unigine Superposition on the 1080p Extreme benchmark preset for you; I picked that because you can download and run it for free to make comparisons.

View attachment 408280

This is an old AM4 system on PCIe 4.0. Gigabyte X570 running W11 24H2. Everything is bone-stock, and the Asus 5060Ti isn't a factory OC model.
This is so weird. I've been going crazy over this for 2 days haha.
I'll post mine. On the left it's me playing Cyberpunk completely maxed out to try and max the GPU, and on the right it's the Superposition benchmark.
I'm also on AM4, but I'm on PCIe 3.0 since my board is an MSI A520M.
 

Attachments

  • cyberpunk maxed out.png
  • superstition.png
I don't think you have anything to worry about.

You're getting 100% GPU utilisation, your lower scores are likely down to CPU differences, and your GPU clocks are in the right ballpark. GPU Boost is opportunistic and depends on the silicon lottery - ASUS says your OC model should boost to at least 2647MHz, and you're getting 2700MHz.

It would seem that the 5060Ti is rare among modern GeForce cards in that it's not artificially power-limited by a restrictive TBP for the sake of product segmentation. The only way I can reliably hit 100% TBP is with power-virus tests like OCCT and Furmark, so I'm not in the least bit concerned about my card or your card failing to hit 100% TBP when gaming.
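If you want to sanity-check this outside of GPU-Z, here's a minimal sketch using Python's pynvml bindings (from the nvidia-ml-py package - an assumption on my part that you have Python handy). It logs board power against the driver's enforced power limit while you game; NVML reports board power in milliwatts, which is close to, but not exactly, GPU-Z's "Board Power Draw":

Code:
# pip install nvidia-ml-py   (assumed package; it provides the pynvml module)
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)                        # first GPU
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0    # mW -> W
print(f"Enforced board power limit: {limit_w:.0f} W")

try:
    for _ in range(30):                                              # ~30 s of samples while you game
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0     # mW -> W
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"{draw_w:6.1f} W  ({100 * draw_w / limit_w:5.1f}% of limit)  "
              f"{core_mhz} MHz  {util}% load")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()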
 
Shouldn't I be drawing a bit more in Superposition and in games? We supposedly have the same GPU, but mine is the OC version. I don't know, this is weird.
 
Nope, not weird at all.
The performance difference between using a 5600X and a 5800X3D is likely to be much larger than the 1.7% difference between the vanilla and OC models. Factory OCs are basically a marketing scam: a 1.7% overclock will result in about a 1% increase in performance. The silicon lottery is way more significant than the factory OC.

Look at any CPU review - using the exact same graphics card with a range of different processors can have massive impacts on GPU performance.
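To put the factory-OC numbers above in perspective, here's the back-of-envelope arithmetic. The non-OC boost clock and the ~0.6 clock-to-fps scaling factor below are illustrative assumptions on my part, not ASUS/Nvidia figures:

Code:
# Back-of-envelope: how much fps a 1.7% factory OC actually buys you.
oc_boost_mhz   = 2647    # ASUS's advertised minimum boost for the OC model (from the post above)
base_boost_mhz = 2603    # illustrative non-OC boost, inferred from the ~1.7% gap

clock_gain = oc_boost_mhz / base_boost_mhz - 1
perf_gain  = clock_gain * 0.6          # assumed scaling: fps rises sub-linearly with core clock

print(f"Factory OC clock gain: {clock_gain:.1%}")   # ~1.7%
print(f"Expected fps gain:     {perf_gain:.1%}")    # ~1%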
 
Thank you so much for taking your time with this. I'm starting to feel more relieved. So even getting only 79.9% draw in Superposition is considered normal?
 
Yeah, I think so.
Like I said earlier, it's unusual for a recent GeForce model to have more power headroom than it needs. Most of the 20, 30, 40, and 50-series cards will bump into their board power limits in many games.

For whatever reason, Nvidia gave the 5060Ti 20W more power budget than the 4060Ti, yet it uses lower-voltage GDDR7, and that more than offsets the additional power draw of the extra CUDA cores the 5060Ti has. The end result is that the 5060Ti doesn't need all 180W in most situations, though it's nice that the headroom is there, because CP2077 with path-tracing, frame-gen, and DLSS4 all running at once does occasionally manage to show PWR as the PerfCap reason.
 
Ok, I thought everything was good, but it seems more issues are present. I reinstalled my drivers because I was getting a blurry image at low resolution playing CS2. Then I tried Furmark and I was only drawing 120W with 8x MSAA. As soon as I changed the default settings to performance in the Nvidia control panel I was back to 180W. It only happens with 8x MSAA. This is so weird. On the left I have the default plan, as you can see, and on the right I have the performance plan. (The drivers were at default on the left, right after installation.) I swear I'm going insane with this.
 

Attachments

  • default plan.png
  • perfomance plan.png
I never see "vRel" or "PWR" when on desktop/browsing forums on GPU-z - it's just "Idle :
gpuz.png
Granted, that's on RTX 2070... but I don't see why it would be any different cards ?
Maybe GPU-z fails to detect bus activity properly on 50-series ?
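For what it's worth, you can cross-check GPU-Z against the driver's own throttle-reason bitmask with a short pynvml sketch (again assuming the nvidia-ml-py package). NVML only exposes a subset of what GPU-Z's PerfCap column shows - the VRel flag specifically comes from NVAPI, which GPU-Z queries directly - but the idle and power-cap flags are there:

Code:
# pip install nvidia-ml-py   (assumed package name)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Bitmask of reasons the driver is currently limiting clocks
mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)

reasons = {
    pynvml.nvmlClocksThrottleReasonGpuIdle:           "Idle",
    pynvml.nvmlClocksThrottleReasonSwPowerCap:        "Power cap (PWR)",
    pynvml.nvmlClocksThrottleReasonSwThermalSlowdown: "Thermal slowdown",
    pynvml.nvmlClocksThrottleReasonHwSlowdown:        "Hardware slowdown",
}
active = [name for bit, name in reasons.items() if mask & bit]
print("Active limit reasons:", ", ".join(active) or "none")

pynvml.nvmlShutdown()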
 
MSAA is ancient; hardly any modern games use it.

Chances are good that most modern GPUs will be ROP-limited in Furmark with 8x MSAA, which basically breaks Furmark, preventing it from doing its one and only job of stressing the GPU cores with a high-power workload.

When you set the Nvidia image quality slider to performance, it's just an override that disables AA entirely. The slider in the old Nvidia control panel offers just three presets, and the "performance" preset is simply this:

[Screenshot: the "performance" preset's settings in the Nvidia control panel]

...so it's giving you the 180W power draw that you'd expect with no MSAA set in Furmark, because MSAA is disabled with the driver override.

You should definitely set everything back to defaults and put it back on "application decides", because most modern games will have issues unless they can use the AA/AF/AO/gamma settings the developer intended. You absolutely do not want to enforce global driver overrides on a modern system. Honestly, most of the old Nvidia control panel stuff is irrelevant in 2025 unless you're trying to run retro Windows XP games from 20 years ago.

Unless you understand exactly what each setting does and why you're changing it, you're better off switching to the new Nvidia app or just ignoring it entirely - that old Nvidia control panel does more harm than good, and most of what's still relevant in 2025 can be set through Windows settings rather than the Nvidia control panel.
 
The thing is, I notice a huge fps difference in games between having that option on default and on performance. And from my understanding that shouldn't happen on a 3000-series GPU, let alone on a 5000-series.
 
Well yeah, of course the performance is higher. You're forcing lower quality graphics and cutting your in-game settings menu out of the equation. You'd expect higher fps at lower graphics settings, and doing it via the driver is just turning some of those in-game settings to "off" with the driver override.

It doesn't matter what you've set in-game: Nvidia's "performance" preset disables AA, AF, and hardware AO, forces fast approximate filtering, and changes a few other things that affect image quality. You're basically running some graphics settings at minimum, or below what the game can set as minimum.

Some games are okay with this and still look okay.
Some games are okay with this but look like ass.
Some games break.

If you want to fine-tune your game to get you the most fps for the most cost-effective settings, you can usually look up the game on Digital Foundry - they typically give you the most efficient in-game settings to get the best balance of image quality and framerate.
 
I find it very weird that just by adjusting the slider I get different power draws, 120W and 180W, in Furmark at 8x MSAA. Can you confirm whether you have the same issue if you set the settings to default? Sorry for the hassle, but I'm considering some upgrades to my PC and I have to make sure everything's fine before I invest money into this.
 
Why is that weird? It's exactly the expected behaviour.

just by adjusting the slider I get different 120 and 180w on furmark 8x msaa.
I don't think you understand what the slider does. Let me be clear - the slider set to performance is an override that IGNORES the application setting and forcibly disables MSAA.

You are not running Furmark at 8x MSAA with the slider set to performance.

This is also why you shouldn't mess with the slider and just set everything back to the default "let the application decide" because it's doing stuff you don't understand and there's plenty of stuff that will misbehave or look wrong when you force global overrides via the Nvidia control panel.

Nvidia's sliders are global overrides; in simple terms, they break every game's own graphics settings menu.
 
Sure, I understand. But as I have an RTX 3070, a 1660 Super, and a 5060Ti, why does it only occur with the 5060Ti? I did the test on 3 different GPUs and only on the 5060Ti am I having this issue; the others all draw maximum watts and have the same performance in the same game regardless of what I do with the slider, unlike the 5060Ti (either performance or default). Sorry if I'm being naive, I like to learn about this. Not being rude - I'm not a native English speaker, so sorry if I sound rude.
 
This is definitely not normal behavior, contrary to what some people here are saying.

However, it seems there are no solutions so far.
 
Maximum wattage when gaming was normal for 20, 30, 40, and many 50-series GPUs, but you are seeing what I'm seeing, and it's also apparent in reviews. Look at W1zzard's power consumption charts - the power draw is lower than the 180W TBP in gaming loads, only reaching 180W in raytracing or Furmark.

Vsync, GSync, and your monitor refresh rate can have an impact on power draw if you're hitting your refresh rate.

You haven't said how much fps you're gaining when you use the performance preset. If it's ~5-10%, then that's expected when the driver override disables AA, AF, and texture LOD settings. If it's more, then you might have some kind of power management issue with your GPU+motherboard combination. At that point I would wipe the slate clean and do (in this order):
  1. The latest BIOS update for your motherboard - one that's ideally newer than the 5060Ti
  2. Enable "Above 4G Decoding" and "Resizable BAR" in your freshly-reset BIOS
  3. Update to the latest AMD chipset driver for your A520 board
  4. Remove the current graphics driver using DDU in safe mode
  5. Install 576.88 WHQL
If you really want to test whether it's a power management problem, then don't use the slider in the control panel because that turns off all kinds of other stuff that also impacts performance, which means you can't isolate the variable you're trying to test. Go to the advanced "manage 3D settings" section and just change that one value:

[Screenshot: the "Power management mode" setting in Manage 3D Settings]
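And if you want to double-check step 2 (Resizable BAR) from the list above without rebooting into the BIOS, one rough way is to compare the BAR1 aperture against total VRAM with pynvml (assuming nvidia-ml-py again) - with ReBAR active the aperture should roughly match the card's full VRAM instead of the legacy 256MB window. GPU-Z also reports Resizable BAR status:

Code:
# pip install nvidia-ml-py   (assumed package name)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

vram_gib = pynvml.nvmlDeviceGetMemoryInfo(handle).total / 2**30          # total VRAM
bar1_gib = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle).bar1Total / 2**30  # BAR1 aperture

print(f"VRAM: {vram_gib:.1f} GiB, BAR1 aperture: {bar1_gib:.1f} GiB")
# Rule of thumb: BAR1 roughly equal to VRAM means Resizable BAR is active;
# a ~0.25 GiB aperture means the legacy 256 MB window (ReBAR off).
print("Resizable BAR looks", "enabled" if bar1_gib >= 0.9 * vram_gib else "disabled")

pynvml.nvmlShutdown()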
 
You're trying to create a problem. This is what anyone who has ever dealt with customers has to put up with: people who waste time and sometimes cause people with real issues to be dismissed.
 
You just got one person 3 replies up from this one saying this is not normal behavior. Why would you say I'm trying to create a problem when I'm literally trying to figure it out and learn about it? No one needs to reply to a post - only if they want to help or know something about it. But maybe that's just the way I think.

I tried it in Furmark as you said; the power plan had no effect on fps, either in Furmark or in games. Only manually changing the AA did. So maybe it's a driver issue and I just have to wait it out. I'm seeing a 40 to 60 fps difference in games.
 
Good, that's working as intended with the power plan at least. 40-60fps from disabling AA and AF alone is huge, unless these are eSports titles that are running at hundreds of fps.
What games, exactly? (approximate before/after fps numbers, too)
 
That issue was fixed - I had some issues with the 3D settings which were making the game cap its fps. Thank you.
I already managed to go to a friend's house with a 5070 and do some testing on his PC to make some comparisons. We had the same idle behaviours and the same wattage lock with 8x MSAA.
The only difference is that he hits the power limit in Furmark while I only hit the voltage limit.
Screenshots below. Him on the left and me on the right.
 

Attachments

  • limbu.png
  • me.png