
AMD Software Adrenalin 22.7.1 Released, Includes OpenGL Performance Boost and AI Noise-Suppression

Happy to help test this new OpenGL driver. There are still some minor bits to fix, but the jump from what used to be the worst OpenGL driver on Windows to its current state is nothing short of amazing. Almost no rendering issues and double the framerate in yuzu is amazing.
 
I'll test it in Unigine Valley to get a bunch of numbers. 'cuz numbers are good and all that.

I'll use it in Minecraft to get an actually good framerate. Hopefully, finally. The only other OpenGL game I play is Stardew Valley (2D, 60fps) but that worked fine before, surely that can't break.......

OK here we go. Sapphire Pulse RX 6400 in Optiplex 9020 i7-4790, 16GB 1600MHz CL11

Minecraft 1.19 with Fabric, Sodium, Lithium, Phosphor performance mods
Forest biome standing on tallest tree, 32 chunk render distance (usual maximum), Fabulous (highest) settings

22.6.1 June 29 driver: 95-106 fps
22.7.1 beta July 26 driver: 132-147 fps

39% improvement, seems legit.

Valley Benchmark using OpenGL
Extreme HD setting (1080p)

22.6.1 June 29 driver: 38.2 fps, 1605 score
22.7.1 beta July 26 driver: 46.9 fps, 1964 score

22-23% improvement, but what you see here visually tells you everything. The old driver varies by 20 fps up and down throughout the entire benchmark run, while the new driver stays very even at that upper frame rate. A gigantic visual improvement.

Valley Benchmark using OpenGL
Extreme HD (1080p), but reduce: 8xAA to 4xAA, Render Quality from Ultra to High (to simulate a suite of settings more targeted to this GPU)

22.6.1 June 29 driver: 51.2 fps, 2143 score
22.7.1 beta July 26 driver: 58.9 fps, 2463 score

15% improvement; interesting that it's smaller at lower quality settings. The old driver's 20 fps variation was equally visible here, and of course equally fixed with the new driver.
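For anyone who wants to sanity-check the percentages above, a quick Python helper (the inputs are just the driver numbers quoted in this post):

```python
def pct_gain(old_fps, new_fps):
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

# Minecraft, low and high ends of the quoted ranges
print(round(pct_gain(95, 132)), round(pct_gain(106, 147)))  # 39 39
# Valley Extreme HD
print(round(pct_gain(38.2, 46.9)))                          # 23
# Valley Extreme HD with 4xAA / High render quality
print(round(pct_gain(51.2, 58.9)))                          # 15
```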
 
Well, this broke HDR for me - Windows 10 now thinks my monitor doesn't support HDR. First AMD software regression in a long time, but still, I now have to choose between HDR and OpenGL performance (HDR wins that argument).
 
Well, this broke HDR for me - Windows 10 now thinks my monitor doesn't support HDR. First AMD software regression in a long time, but still, I now have to choose between HDR and OpenGL performance (HDR wins that argument).

no issues here.

 
I went ahead and tested the performance difference in Minecraft on my main system (R7 5800X, RX 6800 XT).

I used a 1.17.1 instance with shaders and optimization mods, and I also tested my favorite mod pack (All The Mods 3).

I saw a substantial improvement nearly across the board. The only exception was running SEUS PTGI shaders.

The settings and location are identical, and the time of day is approximately the same.



1.17 no shaders:
22.6.1: 252 FPS
22.7.1: 500 FPS



1.17 Sildur's Enhanced Default:
22.6.1: 139 FPS
22.7.1: 239 FPS



1.17 SEUS PTGI:
22.6.1: 40 FPS
22.7.1: 51 FPS



1.12.2 All The Mods 3:
22.6.1: 155 FPS
22.7.1: 240 FPS

All I can say is, about fricken time, AMD.

EDIT: Yay, they also fixed that stupid S3 sleep bug I was having.

Whenever I would wake my PC from S3 sleep, I'd find that the driver crashed and my underclock settings would be reset. That doesn't appear to be happening anymore.
 

Thanks for the numbers! Looks great, I'll be trying my 6600XT this evening with shaders. But I see that along with modest PTGI FPS improvements, the Radeon dark shadow bug is still present. Oh well at least it's no worse and many other shaders look good.
 
Not 100% ready for prime time. Just installed on Asus AMD Advantage laptop (5900HX/6800M). Lost all input on reboot - USB device, trackpad, and keyboard. I had to System Restore to roll back.
 
Not 100% ready for prime time. Just installed on Asus AMD Advantage laptop (5900HX/6800M). Lost all input on reboot - USB device, trackpad, and keyboard. I had to System Restore to roll back.

How on Earth would a graphics driver break your keyboard, trackpad and USB functionality? That seems incredibly odd to me. Something seems off.
 
The boost in these drivers is nice.

41% gain in FPS

 
Quite a few peeps on Reddit lost HDR too, so it's not system specific. Out of interest, do you use an 8-bit or 10-bit monitor? And do you use Display Stream Compression for your resolution/framerate mix? I'm trying to figure out if it's a DSC-related bug (my monitor uses DSC at 4K 144 Hz) or a bit-depth-related issue.
 
How on Earth would a graphics driver break your keyboard, trackpad and USB functionality? That seems incredibly odd to me. Something seems off.
I don't know if the keyboard and trackpad use a USB connection and it's a USB problem, or if the whole system froze. Everything worked until a few seconds after the desktop came up - definitely a driver issue.

I updated through the driver app. Might try downloading from the website.
 
How on Earth would a graphics driver break your keyboard, trackpad and USB functionality? That seems incredibly odd to me. Something seems off.
The gfx driver installs a PCI driver, and USB is driven by PCI on the mobo, so until the driver is reloaded you lose USB - just (educated?) guessing from stuff I've read on other forums.
 
Where will you test it/use it?
Beyond the obvious Minecraft, there are various id Software titles, the vast majority of indie games, and the often forgotten emulators. For Linux gamers there are also Linux ports and Wine/Proton. (Then there are also non-gaming workloads like CAD, etc.)
 
Great, it isn't limited to the latest GPUs
 
Quite a few peeps on Reddit lost HDR too, so it's not system specific. Out of interest, do you use an 8-bit or 10-bit monitor? And do you use Display Stream Compression for your resolution/framerate mix? I'm trying to figure out if it's a DSC-related bug (my monitor uses DSC at 4K 144 Hz) or a bit-depth-related issue.
The LG 34GP83A-B is 10-bit at 144 Hz; if I overclock to 160 Hz it drops to 8-bit. I'm currently running at 10-bit.

No screen compression.
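For anyone wondering why DSC and bit depth both come up here: the uncompressed bandwidth a mode needs is just pixels × refresh × bits per pixel, and DSC kicks in when that exceeds the link's payload (roughly 25.92 Gbit/s for DisplayPort 1.4 HBR3). A rough back-of-the-envelope sketch, ignoring blanking intervals, which add real-world overhead:

```python
def raw_gbps(width, height, hz, bits_per_channel):
    """Uncompressed RGB video bandwidth in Gbit/s (ignoring blanking)."""
    return width * height * hz * bits_per_channel * 3 / 1e9

# 4K 144 Hz 10-bit exceeds DP 1.4's ~25.92 Gbit/s payload -> needs DSC
print(round(raw_gbps(3840, 2160, 144, 10), 1))  # 35.8
# 3440x1440 144 Hz 10-bit (the 34GP83A-B's native mode) fits without DSC
print(round(raw_gbps(3440, 1440, 144, 10), 1))  # 21.4
```

Which is consistent with the 4K 144 Hz monitor above needing DSC while the ultrawide doesn't.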
 
The gfx driver installs a PCI driver, USB is driven by PCI on mobo, so until driver reloaded you lose USB - just (educated?) guessing from stuff I've read on other forums.

I'm quite aware of the PCIe filter driver, but I've never seen a single case where that would break system input basically permanently that way.

I don't know if the keyboard and trackpad use a USB connection and it's a USB problem, or if the whole system froze. Everything worked until a few seconds after the desktop came up - definitely a driver issue.

I updated through the driver app. Might try downloading from the website.

Give it a try, with a clean install option ticked. Might have some conflict with an earlier version of some file in there somewhere. It'd be a wild guess though.
 
How on Earth would a graphics driver break your keyboard, trackpad and USB functionality? That seems incredibly odd to me. Something seems off.
Same thing happened to me: when I installed these drivers, my screen went blank and never recovered, and all my USB devices died, leaving me with no other recourse than to hit the power button and hard reboot my system.

I don't know if the keyboard and trackpad use a USB connection and it's a USB problem, or if the whole system froze. Everything worked until a few seconds after the desktop came up - definitely a driver issue.

I updated through the driver app. Might try downloading from the website.
I used the full install driver package from AMD and had the same issue: the monitor failed to recover when it went blank during the install, and all my USB devices powered off and failed to come back, so I had to hard reboot. I also tried DDU and a reinstall - same thing: no monitor, no USB devices, hard reboot.
 
With the download from AMD.com and a factory reset, 22.7.1 seems to have installed right.
 
The improvement in my modded Minecraft 1.6.4 average FPS is huge! Chunk loading still sucks (which is why the 1% and 0.1% lows are so low), but LOOK at those average and maximum improvements!

RX6600 - Old driver (can't remember what version):
18-05-2022, 17:29:35 javaw.exe benchmark completed, 10548 frames rendered in 52.157 s
Average framerate : 202.2 FPS
Minimum framerate : 171.2 FPS
Maximum framerate : 252.4 FPS
1% low framerate : 48.1 FPS
0.1% low framerate : 30.7 FPS

RX 6600 - New driver (22.7.1):
29-07-2022, 10:46:34 javaw.exe benchmark completed, 29361 frames rendered in 89.625 s
Average framerate : 327.5 FPS
Minimum framerate : 168.4 FPS
Maximum framerate : 589.4 FPS
1% low framerate : 6.2 FPS
0.1% low framerate : 3.2 FPS

It still can't compete with my GTX 1060 6GB in minimum values though......

GTX 1060 6GB:
17-05-2022, 21:38:06 javaw.exe benchmark completed, 46906 frames rendered in 137.562 s
Average framerate : 340.9 FPS
Minimum framerate : 254.3 FPS
Maximum framerate : 463.5 FPS
1% low framerate : 122.3 FPS
0.1% low framerate : 64.4 FPS

Even though Radeon performance is still not great, the improvement in average FPS is very noticeable and a lot of the framerate stutters are gone. It's actually enjoyable now rather than just "playable".
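For context on those 1% and 0.1% low figures: one common way logging tools derive them is to average the slowest 1% (or 0.1%) of frame times and convert that back to FPS, which is why a few chunk-loading hitches tank the number even when the average is high. A minimal sketch (the frame-time list is hypothetical, and exact definitions vary between tools):

```python
def percent_low_fps(frame_times_ms, fraction=0.01):
    """FPS figure derived from the slowest `fraction` of frames (0.01 = "1% low")."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))        # how many frames to average
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                        # ms per frame -> FPS

# 99 smooth 10 ms frames (100 fps) plus a single 50 ms chunk-loading hitch:
frames = [10.0] * 99 + [50.0]
print(round(percent_low_fps(frames)))  # 20 - one hitch dominates the 1% low
```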
 
It's just a pity that the new drivers have reduced OC VRAM on the RX 6700 XT from 2150 to 2120 MHz :/
 
In general, isn't the RX 6600 about 50-60% faster than the GTX 1060? So on a larger sample of games we should expect the RX 6600 to beat the GTX 1060 soundly.
Sure, this is just one game, but those minimum/low framerates and your description of the behavior worry me. Being familiar with how OpenGL works, this sounds like vertex buffer updates have become very slow, which could be very bad news for newer games that update a lot of geometry.

I don't have a spare computer to throw my Radeon card into to check right away, but this needs more investigation, and it doesn't sound like mature OpenGL support to me.

But from a subjective standpoint, how does the gameplay feel compared to the old driver and the Nvidia card when moving around to load new chunks? With dips this low, you'd notice it, right? Granted, Minecraft is jerky even on Nvidia hardware, but I'm curious whether you can sense the difference.
 
It's just a pity that the new drivers have reduced OC VRAM on the RX 6700 XT from 2150 to 2120 MHz :/

I highly doubt you'll see a difference in performance from 30 MHz on the memory.
 
I highly doubt you'll see a difference in performance from 30 MHz on the memory.
It's not about the difference in performance, but about the fact itself and the reason for it. Oddly enough, until now the OC from 2000 MHz always went to 2150; now it's 2120, and only once 2150 MHz :confused:
 

In Minecraft, you can't really expect performance scaling that makes sense. You can upgrade from an RTX 2080 to an RTX 3080 and see no improvement at all, even though that would be a big upgrade in other games. There are a couple of reasons for that, and they boil down to Minecraft basically being an inefficient Java game (along with the additional inefficiencies of modpacks). The engine's inefficiency means it benefits more from processor IPC, and that's what limits your framerate. A CPU bottleneck, basically.

I'm not sure why the minimums are so crap, but those big framerate stutters only happen when you're moving around rendering new chunks. Stay in the same place for any amount of time and the 0.1% lows go way up to over 200 fps, and it's butter smooth. Nvidia cards suffer from the same thing but manage to keep framerates above 60. But yes, you can definitely feel that it runs worse on Radeon. I think there's a lot of improvement still to come.

But this problem seems to be specific to Minecraft, other games run just fine. Going from the 1060 to the RX6600 with SAM, my minimum framerates went up by 63% in Mad Max and 80% in Forza Horizon 5. So those games definitely scale very well.
 