
RX 580 hits max frequency only when on high graphics

chefi

New Member
Joined
Nov 8, 2022
Messages
2 (0.00/day)
I've been feeling recently like my RX 580 isn't performing like it used to.

So I downloaded FurMark to stress test it, and it turns out that when I run the FurMark test with Anti-Aliasing off, the GPU clock dances between 1230 MHz and 1270 MHz.
(screenshot: FurMark run with AA off)


But when I run the test with Anti-Aliasing set to 8X MSAA, it sits at 1366 MHz, which is the clock speed the card is supposed to run at.
(screenshot: FurMark run with 8X MSAA)

If I OC the card, it only reaches the OC clock speed when the test is on 8X MSAA.

You can also see that even though the load is 100% in both cases, the fan speed is significantly higher at the lower clock speed.

WTH is going on?
 
Have you recently updated your video drivers?
 
I've been feeling recently like my RX 580 isn't performing like it used to.
...
The upper screenshot probably shows thermal throttling (which should begin at 80°C on Polaris).
 
AMD considers FurMark a power virus, and under normal conditions the driver will automatically limit the card's maximum frequency.



Try running something else to test your GPU. I personally avoid running that POS software on any of my GPUs.
 
Yes, I reinstalled the AMD driver about a month ago. It is still the most up-to-date (Recommended) driver.
Hello.
Sorry to inform you, but it is not recommended to install the latest driver for Polaris cards.
On the contrary, you should uninstall it and use an older driver that was made for those cards.
Recent drivers only fix (very rarely) some game bugs.
Please uninstall the driver using DDU in safe mode, with the network disconnected.
Have the proper driver downloaded to your computer in advance.
Install it.
Reconnect the network.
 
...

Try running something else to test your GPU. I personally avoid running that POS software on any of my GPUs.
In that thread you linked (Tom's...), one member puts it well: FurMark, like Prime95, stress-tests the hardware harder than you ever will, so that it performs well in all the situations you actually do need. If the hardware lacks good thermal protection, that's not the FurMark author's fault. And yes, I do like it, by the way.

@chefi, don't worry about the frequencies in FurMark; all you're seeing there are thermally-corrected clock speeds, so the card survives the stress test. Which is good. If you open GPU-Z in the background of any game, you'll see the card at your set frequencies, as long as it isn't limited by high temperatures (thermal protection).
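(Side note, not from the original reply: if you'd rather log clocks than watch GPU-Z, on Linux the amdgpu driver exposes the active core clock in `/sys/class/drm/card0/device/pp_dpm_sclk`, with the active state marked by a trailing `*`. A minimal Python sketch that parses that readout; the sample string is made up for illustration.)

```python
# Linux/amdgpu only; on Windows GPU-Z covers the same job. The sysfs file
# /sys/class/drm/card0/device/pp_dpm_sclk lists DPM states, one per line,
# with the currently active state marked by a trailing '*'.
def active_sclk_mhz(pp_dpm_sclk_text: str) -> int:
    """Return the active core clock in MHz from a pp_dpm_sclk readout."""
    for line in pp_dpm_sclk_text.splitlines():
        if line.rstrip().endswith("*"):
            # An active line looks like "7: 1366Mhz *"
            return int(line.split(":")[1].strip().split("Mhz")[0])
    raise ValueError("no active DPM state marked")

# Illustrative sample; on a real box you'd read the file itself.
sample = "0: 300Mhz\n1: 600Mhz\n2: 1366Mhz *"
print(active_sclk_mhz(sample))  # -> 1366
```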
 
I've been feeling recently like my RX 580 isn't performing like it used to.
...
Furmark is absolutely useless for testing what you are trying to test, as it (as mentioned above) is a power virus - an entirely unrealistic load that does not resemble gaming loads at all, but instead loads the GPU hardware in specific ways that pull as much power as possible and generate as much heat as possible. This means that your GPU will reduce its clocks when running Furmark. Period. I don't think I've seen a single GPU I've owned maintain its clocks while running FurMark.

If anything, the much lower fan speed of the lower screenshot shows that for whatever reason, your GPU is consuming much less power with MSAA enabled, which is why it's maintaining clocks unlike the above screenshot where it's throttling slightly. This seems to indicate some other bottleneck - maybe FurMark at that resolution with 8x MSAA is exceeding the VRAM of your card?


Overall, nothing in those screenshots looks like an indication that anything is wrong with your cooling. Of course we aren't seeing any power draw numbers, but given that the GPU is running hot enough to have its fans at 97% with temperatures still sitting at the throttle limit, you're clearly pushing a lot of power through the card - and it's still only clocking down by ~6.8%. That's nothing. A <7% performance drop from cooling wouldn't "feel like your GPU isn't performing like it used to". For most people a performance change below 10% isn't noticeable at all.
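(For the arithmetic: the ~6.8% above presumably comes from the exact clock in the screenshots; using only the range quoted in the post's text, the drop works out to roughly 7-10%.)

```python
# Quick check using only the clocks quoted in the post:
# rated 1366 MHz, observed 1230-1270 MHz under FurMark with AA off.
rated = 1366
for observed in (1270, 1230):
    drop = (rated - observed) / rated * 100
    print(f"{observed} MHz is {drop:.1f}% below rated")
# -> 1270 MHz is 7.0% below rated
# -> 1230 MHz is 10.0% below rated
```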


If you're worried about your cooling: have you cleaned your cooler recently? Does your case have good airflow? You could always repaste your GPU, but without seeing power draw numbers I'd say that seems unnecessary based on the information provided.


To aid in further troubleshooting: please post your full system specs.

Other than that: have you installed any new software on your PC that runs in the background? What is your RAM and CPU usage looking like on the desktop and in-game?

A more useful tool for ensuring you aren't losing performance over time: run 3DMark or a similar scored benchmark that saves your scores. Then re-test later to look for any changes. 3DMark is quite good for this as it tests and scores both CPU and GPU combined and separately, allowing for pretty close monitoring of any performance changes over time.
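To make that concrete, here's a sketch of the kind of score log you'd keep between runs. The dates and scores below are invented for illustration, not taken from the thread.

```python
# Hypothetical score log from repeated runs of the same benchmark preset.
# Re-running the identical preset and comparing against the first (baseline)
# run shows whether performance is genuinely drifting over time.
runs = [("2022-05-01", 4980), ("2022-08-01", 4955), ("2022-11-01", 4510)]

baseline_date, baseline = runs[0]
for date, score in runs[1:]:
    change = (score - baseline) / baseline * 100
    print(f"{date}: {score} ({change:+.1f}% vs {baseline_date})")
# -> 2022-08-01: 4955 (-0.5% vs 2022-05-01)
# -> 2022-11-01: 4510 (-9.4% vs 2022-05-01)
```

A drift of a few tenths of a percent is run-to-run noise; a consistent drop near 10% is worth investigating.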
 
Furmark is absolutely useless for testing what you are trying to test...
...
I have to reply further. :)
FurMark only appears to be useless nowadays because of how big and power hungry graphics hardware has become. Being so big, it heats up quickly, and the overclocking headroom is no longer observable.
Besides, the graphics companies themselves proved the point when they implemented many clock-speed steps depending on load, temperature, power, and voltage.

As for the card not running as hot with 8x AA: Radeon GPUs up until the RX 6xxx series can't do 8 Z samples per pixel per clock, so the chip has to wait on the ROPs to do their thing, and it rests a while. :)
 
Last edited:
I have to reply further. :)
...
No, it was useless long before that, on cards of any TDP.

Both Nvidia and AMD limit the card under FurMark, so you're NOT testing the worst-case limits you'd otherwise never see - you're testing with limitations. One of them is a lower peak clock, and thus more voltage than that lower clock needs, which kills the card's efficiency and creates more heat. Because of that, you're not even testing a realistic heat load in FurMark. You're also not testing your clock V/F curve, because it gets changed. You're running the GPU in FurMark; that's what you're testing, and not a single bit more. An underclocked GPU reaches 99% utilization with a lower load as well, so you're also not testing behaviour under full utilization. That only happens when you let the GPU boost properly while working on a real load, which it doesn't in FurMark.

It is 100% pointless, but some people remain adamant it's not. Power to them. Literally :)
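The voltage point can be sketched with a rough back-of-the-envelope model: dynamic power scales roughly with C·V²·f, so holding a capped clock at a voltage meant for a higher clock wastes power. The voltage and clock values below are invented illustrative numbers, not real Polaris DVFS points.

```python
# Rough dynamic-power model: P ~ C * V^2 * f, expressed relative to a
# baseline operating point (v0, f0). All numbers here are made up.
def rel_dynamic_power(v: float, f: float, v0: float = 1.15, f0: float = 1366.0) -> float:
    """Dynamic power relative to the baseline point (v0, f0)."""
    return (v / v0) ** 2 * (f / f0)

# Same capped 1230 MHz clock, once at the stock (higher-state) voltage and
# once at a hypothetical voltage matched to that clock:
forced = rel_dynamic_power(1.15, 1230)   # driver-capped clock, unmatched voltage
proper = rel_dynamic_power(1.05, 1230)   # matched voltage for the lower clock
print(f"{forced:.2f} vs {proper:.2f}")   # -> 0.90 vs 0.75
```

In this toy model the unmatched voltage burns about 20% more power at the same clock, which is the efficiency loss (and extra heat) being described.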
 
FurMark only appears to be useless nowadays because of how big and power hungry graphics hardware has become. Being so big, it heats up quickly, and the overclocking headroom is no longer observable.
...
True - it was somewhat useful back when clocks were fixed and GPUs didn't have boost or automatic clock regulation. But by that metric we'd be talking about nearly a decade back, and even then FurMark was only useful for illustrating absolute worst-case thermals. Today it's only really "useful" for stress testing cooling solutions, but at the risk of damaging your GPU unless it has very good hotspot detection (which most today thankfully have).
As for the card not running as hot with 8x AA: Radeon GPUs up until the RX 6xxx series can't do 8 Z samples per pixel per clock, so the chip has to wait on the ROPs to do their thing, and it rests a while. :)
Interesting! That explains where the bottleneck lies, at least.
 
In that thread you linked (Tom's...) one member says it well: Furmark like Prime95 is to stress-test the hardware more than you need...
...
It's a completely useless test that most people run for the wrong reasons. When the driver limits the clock frequency before the card can even thermal throttle, what does it stress?

Older GPUs that lack some protection features, and that are all out of warranty, can actually be killed by running this useless and pointless software. Yes, I really hate it. :p

 
Last edited:
Alright - I have to admit I always ran FurMark with a fair bit of caution. I never ran it for long, just a quick ~30 s run to see what power the system draws or how effective my fan curve is, and I usually limit my cards in some way even before running this furry mess of a donut. :D
 