
RX 9000 series GPU Owners Club

I found the issue with V-Sync, and honestly, I feel a bit retarded for not realizing it sooner. I'm sharing my experience in case others run into the same problem.

I have a 2K, 165Hz G-Sync/Freesync Pro monitor. With my old RTX 3080, the NVIDIA driver automatically read the monitor's capabilities, and V-Sync just worked—if the game and GPU could handle it, I'd get 165 FPS. However, after switching to an AMD GPU, enabling any flavour of V-Sync capped my framerate at 60 FPS!

So, what was the issue?
The key difference is in how NVIDIA and AMD handle refresh rate detection. NVIDIA queries the monitor directly, but AMD relies on the Windows settings. If the refresh rate in Settings > System > Display > Advanced Display Settings is left at the default 60Hz, AMD’s Adrenalin software assumes that’s the monitor's maximum refresh rate.

Once I manually changed it to 165Hz in Windows, everything started to work as expected.
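If you'd rather check or change this setting programmatically instead of digging through the Settings app, here is a minimal Win32 sketch. It only touches the same desktop mode that Windows Settings exposes (primary display assumed; the 165 Hz value is just my monitor's maximum, substitute your own):

Code:
/* Build with: cl refresh.c user32.lib  (or gcc refresh.c -luser32 with MinGW) */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm = {0};
    dm.dmSize = sizeof(dm);

    /* Read the current desktop mode of the primary display. */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "EnumDisplaySettings failed\n");
        return 1;
    }
    printf("Current: %lux%lu @ %lu Hz\n",
           dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Request the same resolution at 165 Hz (example value; the monitor must
     * actually expose such a mode). CDS_TEST checks without applying. */
    dm.dmDisplayFrequency = 165;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    LONG rc = ChangeDisplaySettings(&dm, CDS_TEST);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        rc = ChangeDisplaySettings(&dm, 0);  /* actually apply it */

    printf("ChangeDisplaySettings returned %ld\n", rc);
    return 0;
}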
 
Yep, your desktop refresh rate is your max refresh rate. This isn't new. You could have asked, we would have gladly helped. :)
 
Now I feel even more retarded :laugh:
 
Don't worry, I kept denying everyone who said I had a CPU (Ryzen 2700) bottleneck because the CPU was only using 30% of its power xD. After the swap, the new CPU (Ryzen 5700X) uses 50% of its power at most, but the GPU finally boosts properly -.-
I think I'm dumber :D
 
So if I have a 240 Hz monitor, FreeSync won't go beyond 60 Hz unless I change that setting in Windows?!
 
Pretty much, yes. If it's 240 Hz, just use it at 240 Hz, and you won't have a problem.
 
I'm glad I kept an eye on this thread as I would have been struggling with the same issue, although I doubt I'll be reaching those sorts of heights until multi frame generation rocks out.
 
That's the beauty of Freesync - you don't have to worry about matching your max refresh rate because your monitor automatically adapts to whatever FPS you've got (within a certain range of course). :)
 
I'm glad I kept an eye on this thread as I would have been struggling with the same issue, although I doubt I'll be reaching those sorts of heights until multi frame generation rocks out.
Depends on games, Space Dwarves (Deep Rock Galactic) works for me with around 350 fps :D (with 75Hz display :) )
 
Yea, apparently certain games are set to get better.

That's the beauty of Freesync - you don't have to worry about matching your max refresh rate because your monitor automatically adapts to whatever FPS you've got (within a certain range of course). :)
Yea, but then flicker starts to rear its ugly head, no? This is where multi frame generation might come in handy, not sure if you could use it selectively (predictively) to stop wild frame jumps?
 
Currently I don't think so, at least not for FSR frame generation or Nvidia (M)FG. But LS (Lossless Scaling) can do it as far as I know.

Just set it to 2X and cap the max framerate in the AMD/Nvidia control panel, and it should only create as many frames as it needs.

Example: you get 91 frames and cap at 120, so it should only create 29 fake frames (but with lower quality than the real ones).
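To make that arithmetic concrete, here's a tiny sketch. The helper is hypothetical, not an actual Lossless Scaling or driver API; it just shows how many frames a 2X mode ends up adding under an FPS cap:

Code:
#include <stdio.h>

/* Hypothetical helper: with a 2X frame-generation mode, output is at most
 * double the rendered framerate, clamped by the cap. Everything above the
 * rendered frames is generated ("fake"). */
static int generated_frames(int rendered_fps, int fps_cap)
{
    int doubled = rendered_fps * 2;
    int shown = (doubled < fps_cap) ? doubled : fps_cap;
    return (shown > rendered_fps) ? shown - rendered_fps : 0;
}

int main(void)
{
    /* The example above: 91 rendered frames capped at 120 -> 29 generated. */
    printf("%d generated frames\n", generated_frames(91, 120));
    return 0;
}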
 

I've read that AMD will be announcing more FSR 4 partners at Computex but I wonder how things would have looked without something like Optiscaler...
It would have looked preeeetty bad. Even with Optiscaler, a game still requires FSR 3.1 to use FSR 4, so most FSR titles are still not supported.

I've been trying out different games where Optiscaler works.

FSR 3.1, 360p upscaled to 1080p: [comparison GIF]

FSR 4.0, 360p upscaled to 1080p: [comparison GIF]

Naturally, 1440p and 4K will be even better.

The results are great, just needs to actually support more games.
 
Love how good FSR 4 looks even with a really low render resolution. I've been trying things out in the opposite direction. Lots of older games only go up to the "FSR Quality" preset, which renders at 66% of the target resolution. With Optiscaler you can crank that up to 100% and maybe even add a bit of CAS. The result is a detailed, sharp, stable image that is perfect for streaming 1080p @ 120 Hz to my TV.
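For reference, the render-resolution math behind those presets; the helper below is just an illustration using the 66% Quality figure mentioned above:

Code:
#include <stdio.h>

/* Internal render resolution for a given upscaler scale factor.
 * 0.66 roughly matches the "FSR Quality" preset mentioned above;
 * 1.00 is the native-scale override (effectively FSR as anti-aliasing). */
static void print_render_res(int target_w, int target_h, double scale)
{
    printf("%4.0f x %4.0f internal for a %dx%d target at %.0f%% scale\n",
           target_w * scale, target_h * scale, target_w, target_h, scale * 100.0);
}

int main(void)
{
    print_render_res(1920, 1080, 0.66);  /* roughly 1267 x 713 */
    print_render_res(1920, 1080, 1.00);  /* full 1920 x 1080 */
    return 0;
}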
 
Here is something to try out. Lower temps with the same performance, with some adjustments in AMD Adrenalin. Tested on Sapphire Pure 9070 XT.

Is your AMD Radeon RX 9070 XT running hotter or drawing more power than you'd like? While its raster performance is impressive for the price, the stock power efficiency can be a concern, especially compared to Nvidia's RTX 5070 Ti. But what if you could significantly lower power consumption, temps, and fan noise WITHOUT losing gaming performance? In this video, we dive deep into undervolting and power limiting the RX 9070 XT using AMD's Adrenalin software. I'll show you the exact settings I used (-85mV Core Offset, -15% Power Limit, +Memory OC) on my Sapphire PURE model to achieve:
✅ Lower Power Draw: Cutting up to 80W in some games!
✅ Reduced Temps & Noise: Making for a cooler, quieter rig.
✅ MATCHING (or even BEATING) Stock FPS: Seriously, no performance loss!

We test this optimized profile against both stock and a power-hungry overclocked setup across 10 demanding games at 1440p, including Spider-Man 2, Kingdom Come: Deliverance 2, Black Ops 6, and more.

PLUS: We'll see how this tuned RX 9070 XT stacks up against an undervolted RTX 5070 Ti in terms of raw performance AND efficiency (performance-per-watt). The results might surprise you! If you own an RX 9070 XT or are considering one, this is essential viewing for optimizing your experience. Stop leaving performance and efficiency on the table!
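Quick sanity check of what "performance-per-watt" means there. All numbers below are hypothetical placeholders, not measurements from the video or this thread:

Code:
#include <stdio.h>

int main(void)
{
    /* Hypothetical placeholder values only, purely to show the metric. */
    double fps         = 120.0;  /* average FPS, assumed unchanged by the undervolt */
    double stock_watts = 330.0;  /* assumed stock board power */
    double tuned_watts = 250.0;  /* assumed power after -85 mV / -15% power limit */

    printf("stock: %.2f FPS per watt\n", fps / stock_watts);
    printf("tuned: %.2f FPS per watt\n", fps / tuned_watts);
    return 0;
}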
 
I use +10% Power Limit because there isn't +30% :shadedshu:
 
Yea, but then flicker starts to rear its ugly head, no? This is where multi frame generation might come in handy, not sure if you could use it selectively (predictively) to stop wild frame jumps?
That entirely depends on your monitor and its low framerate compensation (LFC) algorithm.

For example, mine has a Freesync range of 48-144 Hz. When I have 72-144 FPS, my frame rate matches my monitor's refresh rate, and the image is buttery smooth. When I've got under 72 FPS, LFC kicks in, and my monitor doubles or triples its refresh rate relative to my frame rate (so for example, it works at 100 Hz with 50 FPS, or 90 Hz with 30 FPS). If the frame rate is rock stable, there's no problem as long as I'm within 48-144 FPS/Hz. But when there are minor fluctuations in the 50-55 FPS range, even if you can't see them, they can result in the monitor rapidly turning LFC on and off, which leads to some backlight flicker. It's not a big deal, but it's quite observable on a light image.

But like I said, it depends on your monitor alone.
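As a rough illustration of that doubling/tripling behaviour (the real LFC logic lives in the monitor and driver and is more involved; the 48-144 Hz range and the "kicks in below half the max refresh" rule are taken straight from the description above):

Code:
#include <stdio.h>

/* Illustrative only: pick a refresh rate for a given FPS by repeating each
 * frame 2, 3, 4... times so the effective refresh stays inside the VRR range,
 * mimicking the doubling/tripling described above. */
static int lfc_refresh(int fps, int vrr_min, int vrr_max)
{
    if (fps <= 0)
        return vrr_min;             /* guard for nonsense input */

    if (fps >= vrr_max / 2)
        return fps;                 /* high enough: refresh simply tracks FPS */

    int mult = 2;                   /* LFC engaged: start by doubling */
    while (fps * mult < vrr_min ||
           (fps * (mult + 1) <= vrr_max && fps * mult < vrr_max / 2))
        mult++;                     /* triple, quadruple, ... as FPS drops */
    return fps * mult;
}

int main(void)
{
    printf("50 FPS -> %d Hz\n", lfc_refresh(50, 48, 144));  /* 100 Hz, doubled */
    printf("30 FPS -> %d Hz\n", lfc_refresh(30, 48, 144));  /* 90 Hz, tripled  */
    return 0;
}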
 
-85mV Core Offset
If you enjoy random crashes when you play your games, -85mV is perfectly fine!

I wish people would stop spreading greedy, unstable settings. That's sad.
 
It may not suit everyone; treat it as a starting point for testing. It works fine for me so far. You have to find your own sweet spot for your hardware.
 
Love how good FSR 4 looks even with a really low render resolution. I've been trying things out in the opposite direction. Lots of older games only go up to the "FSR Quality" preset, which renders at 66% of the target resolution. With Optiscaler you can crank that up to 100% and maybe even add a bit of CAS. The result is a detailed, sharp, stable image that is perfect for streaming 1080p @ 120 Hz to my TV.
If upscaling gets any better there'll be like no reason to play native. It'll be a huge deal for consoles hooked up to massive TVs.

I do enjoy native AA though, I think that's what you're referring to?
 
It may not suit everyone; treat it as a starting point for testing. It works fine for me so far. You have to find your own sweet spot for your hardware.
The goal of this message is not to start a war or to be rude.

I believe it would be responsible to edit your post to add what you just wrote.
 
If upscaling gets any better there'll be like no reason to play native. It'll be a huge deal for consoles hooked up to massive TVs.

I do enjoy native AA though, I think that's what you're referring to?

Yeah, I'm referring to using FSR 4 anti-aliasing with a native-resolution input, similar to Nvidia's DLAA. It's mostly overkill at 4K, but very useful at 1080p.
 
The goal of this message is not to start a war or to be rude.

I believe it would be responsible to edit your post to add what you just wrote.
I am not being rude. You seem on edge, chill out. My post was a tip, a starting point for lowering temps while keeping performance. But you keep making things up over nothing.
 
I think there's a language barrier, and sometimes what looks like a rude remark is just badly chosen words. If I see something like that, I assume it's just the language barrier and move on with the topic :)
Or I'm delusional xD
 