
PowerColor Radeon RX 9060 XT Reaper 8 GB

That is not a PCIe 8x vs 16x problem. It's something else. Maybe a driver glitch?
Most likely a PCIe problem in general. I'd say Nvidia would sort it out quickly if it were a driver problem.
Nvidia has been known for good drivers, and now they release a new version every week, fixing this and that.
Also, if you recall, the RTX 5060 series launch was postponed due to performance issues that needed to be sorted out.
IMHO, it's most probably a hardware issue and Nvidia is trying to work around it as much as possible, but that has limits.

Anyway, given this bug, I would not recommend 8 GB variants of RTX 5060 (Ti) for anyone with PCIe 5.0 boards, just to be safe.

My video showed the "High Preset" in native 1080p with High RT. The main differences between that and the "Ultra Preset" (besides the high-res textures, of course) are things like Mesh Quality, Distant Shadows, Ambient Occlusion and Sky/Clouds. I ran the benchmark again with those settings maxed out and I couldn't spot any "borked textures", although there were a few instances where performance was just awful.

But I think this is more due to the "horsepower" of the GPU itself, not the VRAM, because, as I said, the textures retain their original quality (and visuals; I seriously didn't notice any significant difference between the High and Ultra presets).
Holy sheet. How can you play this blurry mess? FXAA+TXAA (+upscaling) is a disastrous combination.
 
8GB? Any graphics card with 8GB is useless.
The minimum low-end standard nowadays should be 12GB.
The minimum mid-range should be 16GB, and anything above that defined as high-end+.
 
Holy sheet. How can you play this blurry mess? FXAA+TXAA (+upscaling) is a disastrous combination.
It's not upscaled; the game just looks really bad no matter what kind of AA you use. Even with DLAA it looks blurry (idk how that's possible).

Any graphics card with 8GB is useless.
I think that any GPU at this price should have at least 12GB, but I wouldn't say that 8GB GPUs are useless. If you already have one, you can basically play any game with decent settings at native 1080p without any major issues, even with some light 1440p.
 
It's not upscaled; the game just looks really bad no matter what kind of AA you use. Even with DLAA it looks blurry (idk how that's possible).


I think that any GPU at this price should have at least 12GB, but I wouldn't say that 8GB GPUs are useless. If you already have one, you can basically play any game with decent settings at native 1080p without any major issues, even with some light 1440p.
I initially thought the blurry quality in your video and screenshots was from FXAA+TXAA. I'm starting to think the texture quality is being degraded with the settings you're using due to a lack of VRAM, just not as severely degraded as when you use the hi-res textures. I can't recreate your settings exactly because I have Linux installed and the benchmark disables the RT option for AMD cards in this game for some reason. My textures look noticeably better on High than yours do. How does it look if you turn off RT?

Screenshot From 2025-06-11 12-21-54.png
Captura de pantalla 2025-06-10 212606.png
Screenshot From 2025-06-11 12-24-41.png
imagen_2025-06-10_220047988.png
Screenshot From 2025-06-11 12-25-14.png
 
I forgot to say that I recorded the video using the Windows Game Mode recording tool; for some reason the Nvidia app doesn't properly capture RTSS metrics in this game. I think that's the main reason why my footage looks that blurry. This is what it looks like when recorded with the Nvidia app:
(1080p with DLAA, capped at 30 fps)

Compared to the B580 benchmarks available on YouTube, apparently there's no problem at all with the textures.
It still looks blurry. You should just take a screenshot while the benchmark is running rather than taking it from the video capture.

Captura de pantalla 2025-06-11 150828.png
Screenshot From 2025-06-11 12-08-45.png
 
It's not upscaled; the game just looks really bad no matter what kind of AA you use. Even with DLAA it looks blurry (idk how that's possible).

That's a GPU and memory limitation, both impacting performance and image quality. At these higher quality settings, the dynamic texture hit and LOD scaling kick in more heavily. In this sort of scenario, the system will scale settings down, at times so significantly that selecting a high preset still results in medium or even low-quality visuals... basically the illusion of higher quality settings.
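To illustrate what I mean (purely a made-up sketch in Python, not any actual engine's code; all names, sizes and the budget are invented), this is roughly how a streaming system might quietly demote texture mip levels once it runs out of VRAM headroom, while the menu still says "High":

```python
# Illustrative sketch only: how an engine *might* silently trade texture
# quality for VRAM headroom. All names, sizes and the budget are made up.

def mip_size_mb(base_mb: float, mip_level: int) -> float:
    """Each mip level quarters the memory footprint of the one above it."""
    return base_mb / (4 ** mip_level)

def fit_textures_to_budget(textures: dict[str, float], budget_mb: float) -> dict[str, int]:
    """Pick a mip level per texture so the total fits the budget.

    Starts at mip 0 (full quality) and keeps demoting the most expensive
    texture until everything fits -- the user still sees "High" in the menu.
    """
    chosen = {name: 0 for name in textures}

    def total() -> float:
        return sum(mip_size_mb(textures[n], chosen[n]) for n in chosen)

    while total() > budget_mb:
        # Demote whichever texture currently costs the most VRAM.
        worst = max(chosen, key=lambda n: mip_size_mb(textures[n], chosen[n]))
        chosen[worst] += 1
    return chosen

if __name__ == "__main__":
    scene = {"terrain": 512.0, "hero_character": 256.0, "buildings": 384.0}
    # With plenty of headroom nothing is touched; with a tight budget the
    # "High" preset quietly becomes something closer to Medium/Low.
    for budget in (6000.0, 600.0):
        print(budget, fit_textures_to_budget(scene, budget))
```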

I'm in a similar boat, but on a much smaller scale, in a few of my favourite games at 1440p, running a 10GB 3080. Still a solid card for that res and it usually hits my targets without much compromise. Luckily, I haven't jumped into some of these newer, more demanding titles that batter even top-shelf GPUs at high res, but where the pear-shaped wobblers are felt in the current games I'm playing, the gameplay is usually strong enough to let it slide (or go unnoticed).

Have you tried upscaling? FG?
 
It still looks blurry. You should just take a screenshot while the benchmark is running rather than taking it from the video capture.
Sure.

That's a GPU and memory limitation, both impacting performance and image quality.
I don't think that's the case here tbh.
 

Attachments

  • 20250611161213_1.jpg
  • 20250611161344_1.jpg
  • 20250611161354_1.jpg
  • 20250611161408_1.jpg
Here are some more examples of how PCIe bandwidth can affect performance when VRAM runs out. A 5060 Ti testing video should be coming next week.
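For rough context on the bandwidth gap involved once data has to spill over the bus, here's a quick back-of-envelope in Python (theoretical per-direction figures after encoding overhead; real-world throughput is lower):

```python
# Approximate per-direction theoretical PCIe bandwidth, GB/s per lane
# after encoding overhead (published figures; real-world is lower).
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total per-direction link bandwidth for a given generation and lane count."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card on an older board has far less headroom for VRAM spillover:
for gen in ("3.0", "4.0", "5.0"):
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")
```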


Sure.


I don't think that's the case here tbh.
Okay. That looks much closer when I compare side by side. The video capture seems to be the culprit.
 
Okay. That looks much closer when I compare side by side. The video capture seems to be the culprit.
I don't know if I would call that OK, though. Sure, it looks closer to the example you provided. But still, for an AAA game released in 2025, as demanding as it is, when using high settings and RT, my expectation of graphical fidelity is much higher.

That's a GPU and memory limitation, both impacting performance and image quality. At these higher quality settings, the dynamic texture hit and LOD scaling kick in more heavily. In this sort of scenario, the system will scale settings down, at times so significantly that selecting a high preset still results in medium or even low-quality visuals... basically the illusion of higher quality settings.
Is that only an illusion of higher quality settings, or the illusion of an 8GB graphics card running OK in 2025 using the high preset in the latest AAA game?

I think there should be a separate quality preset called "dynamic", explaining to the gamer exactly what it does. When you select the high preset, it should be firmly high. If 8GB isn't sufficient, it should be immediately visible through a lack of performance, not compensated for by secretly lowering the image quality and tricking the gamer.

I have no problem with 8GB VRAM cards in 2025. But be honest with the gamer and set an honest price. Make it known that it's a low-end card with a low-end price. Then, maybe, the game studios will stop trying to make their games run on the high preset with RT on low-end cards. The expectation for a 300 USD card is different than for a 200 USD card.
 
The new beta drivers let you retest The Last of Us Part 1 and fix the "Avg performance charts sheets".

AMD Software: Adrenalin Edition 25.6.2 Optional is already available on AMD's website.

 
Is that only an illusion of higher quality settings, or the illusion of an 8GB graphics card running OK in 2025 using the high preset in the latest AAA game?

I think there should be a separate quality preset called "dynamic", explaining to the gamer exactly what it does. When you select the high preset, it should be firmly high. If 8GB isn't sufficient, it should be immediately visible through a lack of performance, not compensated for by secretly lowering the image quality and tricking the gamer.

I have no problem with 8GB VRAM cards in 2025. But be honest with the gamer and set an honest price. Make it known that it's a low-end card with a low-end price. Then, maybe, the game studios will stop trying to make their games run on the high preset with RT on low-end cards. The expectation for a 300 USD card is different than for a 200 USD card.

Yep, hardware limitations (esp. 8GB) are definitely guilty as charged. Dynamic scaling and on-the-fly adjustments have been a clever trick for years, and are still a very useful feature, but only when the effects go unnoticed or the compromise is subtle. Problem is, we've reached a point where "smart" adjustments in newer demanding games are no longer smart but "dumb" quality sacrifices, all in the name of stingy GPU manufacturers who want mainstream gamers to stagnate on their "pay more, get less" milking philosophy. It's not just VRAM but also increasingly aggressive cuts across performance segments, where the performance gap between top-tier and lower-tier GPUs is widening with every generation.

Newer games just aren't playing nice anymore. They demand more, expect more, and they're done babysitting skimped hardware. Can't blame them; even developers seem less motivated to meticulously optimize every nook and cranny of a game when "mainstream" hardware continues to stagnate. To make matters worse, they're literally marketing increasingly VRAM-hungry features like RT/PT/FG and baking RT effects into modern games (esp. UE5), all while desperately praying people blindly pull the trigger before noticing the performance cracks. It's smoke and mirrors and the milking-moos know it.

I think there should be a separate quality preset called "dynamic", explaining to the gamer exactly what it does. When you select the high preset, it should be firmly high.

Great idea! Or keep all options available but add a manual override, like a checkbox to lock in fixed quality presets. This is not only useful for the user to identify the level of compromise but also lets reviewers benchmark GPUs in true like-for-like conditions, instead of the skewed results we're stuck with today. So many times I've seen people backing these 8GB newer-gen cards purely based on how they stack up in the charts, intentionally ignoring how modern games dynamically adjust image quality based on hardware capabilities/limitations. Doesn't matter how many times these concerns are brought up; it's the same ole FPS-only song, dance and knock-yourself-out gig. It's like racing to bake two cakes at the same time: one uses all the right ingredients and the other cuts corners left and right. As long as they finish baking together, who cares if one tastes great and the other is like chewing cardboard, as long as the timer says they're equal!
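Just to sketch the idea (a hypothetical settings shape in Python, not any real game's API; names and the threshold are made up), the override could look something like this:

```python
# Hypothetical sketch of the suggested "lock presets" override -- not any
# real game's settings API, just the shape of the idea.
from dataclasses import dataclass

@dataclass
class QualitySettings:
    preset: str            # "low", "medium", "high", "ultra"
    allow_dynamic: bool    # the proposed checkbox: False = firm preset

def effective_texture_quality(s: QualitySettings, vram_headroom_mb: float) -> str:
    if not s.allow_dynamic:
        return s.preset    # locked: what you picked is what you get
    # Dynamic mode: quietly step down if headroom is tight (made-up threshold).
    if vram_headroom_mb < 500 and s.preset in ("high", "ultra"):
        return "medium"
    return s.preset

print(effective_texture_quality(QualitySettings("high", True), 300.0))   # -> medium
print(effective_texture_quality(QualitySettings("high", False), 300.0))  # -> high
```

With the lock enabled, reviewers would at least know every card is rendering the same workload, even if the slower ones tank in the charts.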
 
Yep, hardware limitations (esp. 8GB) are definitely guilty as charged. Dynamic scaling and on-the-fly adjustments have been a clever trick for years, and are still a very useful feature, but only when the effects go unnoticed or the compromise is subtle. Problem is, we've reached a point where "smart" adjustments in newer demanding games are no longer smart but "dumb" quality sacrifices, all in the name of stingy GPU manufacturers who want mainstream gamers to stagnate on their "pay more, get less" milking philosophy. It's not just VRAM but also increasingly aggressive cuts across performance segments, where the performance gap between top-tier and lower-tier GPUs is widening with every generation.

Newer games just aren't playing nice anymore. They demand more, expect more, and they're done babysitting skimped hardware. Can't blame them; even developers seem less motivated to meticulously optimize every nook and cranny of a game when "mainstream" hardware continues to stagnate. To make matters worse, they're literally marketing increasingly VRAM-hungry features like RT/PT/FG and baking RT effects into modern games (esp. UE5), all while desperately praying people blindly pull the trigger before noticing the performance cracks. It's smoke and mirrors and the milking-moos know it.
I think the answer is easy. Drop the non-transparent, aggressive dynamic scaling (or make it a selectable option in the settings) and just make your games yours. I remember the days of "but can it run Crysis?" At that time, Crytek didn't care whether current hardware was capable of running the game. They just made their game, and it was a success.

GPU manufacturers are giving us less for more each generation. And game studios are compensating for it by making their games look like crap. I mean, I understand: a guy owning an RTX 4060, paying 300 USD for it, expects his GPU to at least run games at 1080p and, of course, with at least high settings. Game studios are just trying to reach a greater audience and make more money. That's their motive.

If there are buyers, there is no incentive to change it. It's the reality we are living in. I don't like it, but I understand.

My view is simple. There is a GPU which is low-end. Set a low price for it. Make games that look nice and are playable at 1080p low/medium on a low-end GPU. A potential buyer comes along and doesn't like it. So he either buys something more expensive, or nothing.
 
I remember the days of "but can it run Crysis?" At that time, Crytek didn't care whether current hardware was capable of running the game. They just made their game, and it was a success.

That game was the most advanced, modern and best-looking ever created. 12 years later, the only thing they can achieve is Minecraft with RT, or lipstick on a pig's mouth.
 
paying 300 USD for it, expects his GPU to at least run games at 1080p

*1440p at 60 FPS with high settings*

The 720p and 1080p days are over. I also like retro games like Mount & Blade 1. That can be played on the Ryzen 7600X's integrated graphics in Windows 11 Pro at WQHD.

--

Hardware Unboxed also recycles their articles on TechSpot.

Much better to check TechSpot every 3 months to find their content. Much easier and faster to check the pictures and text there.
I just checked the pictures with the lines stating "Framerate". Enough to get a viewpoint on those 8 GB cards.

 
I don't have much of a choice here in my country; I have to choose an 8GB card, and even the B580 is more expensive.

Should I get the 5060 8GB or the 9060 XT 8GB? I plan to use it until 2028 at least, with a Ryzen 5500 (PCIe 3.0), playing AAA games at 1080p low/medium settings. I'm poor; the 9060 XT 16GB is impossible for me right now.
 
Get the RX 9060 XT 8 GB if you intend to keep that Ryzen 5500. For a yet unknown reason, the RTX 5060 (Ti) 8 GB variants seem to be affected by some bug which causes significant performance drops with older PCIe generations. If possible, save up some money so you can buy the 16 GB version. It will make your life much easier in the future if you can't afford to upgrade your rig every 2-3 years.
 