After toying with the new AMD RX Vega 56, I have finally run all the tests I needed to understand how this graphics card works. Anyone is welcome to use this data, post it in other forums for other people's benefit, or use it as a source for drawing conclusions of any sort.
POWERCOLOR AMD RX VEGA 56
Using a Sony X800D 4K TV, I have run 20 game benchmarks at 3840x2160, and the first thing I will compare is the performance difference in my tested games between the two drivers used: 17.9.3 vs. 17.10.2.
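For anyone who wants to reproduce the numbers: each result is just the average FPS over a 15-second Fraps run. Below is a minimal Python sketch of that reduction, assuming a Fraps-style frametimes CSV with a header row and one cumulative timestamp in milliseconds per frame (check the layout of your own files, it may differ):

```python
import csv

def average_fps(path: str) -> float:
    """Compute average FPS from a Fraps-style frametimes CSV.

    Assumed layout (an assumption, verify against your own files):
    a header row, then one row per frame containing the frame index
    and a cumulative timestamp in milliseconds.
    """
    timestamps_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps_ms.append(float(row[1]))
    frames = len(timestamps_ms) - 1  # count intervals between frames
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return frames / duration_s

print(f"{average_fps('frametimes.csv'):.1f} FPS")
```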
TEST SETUP
Intel Core i7 5775C 4.3 GHz OC {all cores}
Gigabyte GA-Z97X-Gaming 3
Crucial Ballistix Tactical 2x8 GB DDR3 1600 MHz CL8
AMD Radeon RX Vega 56 8 GB HBM2 {reference design}
Windows 7 Pro x64
Radeon Software 17.9.3 WHQL/Radeon Software 17.10.2 BETA
So let's start with a performance comparison between the two driver versions:
As you can see, the newer beta drivers improve performance in 3 games but decrease it in 7, and percentage-wise the decreases are much larger than the increases. The remaining 10 games are unaffected. The thing is that the improvements in those 3 games are not really noticeable, as they run more or less smoothly at 4K anyway. Much worse is the fact that performance has been reduced in games that already run very slowly at 4K. With this in mind, it's a no-brainer to stay away from the beta drivers.
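For clarity, the percentage change per game is simply (new - old) / old. A quick Python sketch of that comparison (the game names and FPS values below are placeholders, not my measured results):

```python
# Per-game percentage change between the two drivers:
# positive = 17.10.2 is faster, negative = slower.
# The FPS values below are placeholders, not my measured results.
results = {
    "Game A": (42.0, 44.5),   # (17.9.3 FPS, 17.10.2 FPS)
    "Game B": (27.0, 24.3),
    "Game C": (60.0, 60.0),
}

for game, (old, new) in results.items():
    delta = (new - old) / old * 100.0
    print(f"{game}: {old:.1f} -> {new:.1f} FPS ({delta:+.1f} %)")
```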
DirectX vs. OpenGL vs. Mantle vs. Vulkan
I can confirm that both drivers, 17.9.3 and 17.10.2, work with the DirectX and OpenGL APIs. However... both drivers crash when the Mantle or Vulkan API is selected. YES, THAT'S RIGHT! Even though Mantle and Vulkan are "officially" supported APIs and the Vulkan libraries are installed, whenever one of those application programming interfaces is selected to render any of my 20 games that actually support them, the game crashes instantly! EPIC FAIL, AMD!
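If you want to rule out a broken runtime before blaming the driver, a quick sanity check is to see whether the Vulkan loader even loads. A minimal Python/ctypes sketch, assuming the standard Windows loader name vulkan-1.dll (this only proves the loader is present, not that AMD's ICD renders correctly):

```python
import ctypes

# Try to load the standard Windows Vulkan loader and resolve its
# core entry point. Success here only means the loader is present
# and functional, not that the installed AMD ICD renders correctly.
try:
    loader = ctypes.WinDLL("vulkan-1")
except OSError:
    print("vulkan-1.dll not found - no Vulkan runtime installed")
else:
    try:
        loader.vkGetInstanceProcAddr
        print("Vulkan loader present, vkGetInstanceProcAddr resolved")
    except AttributeError:
        print("DLL loaded but core entry point missing - broken install")
```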
OVERCLOCKING 17.9.3 vs. 17.10.2
The trade-off with the 17.10.2 drivers is that AMD Radeon Software can overclock the RX Vega 56 above its 1590 MHz maximum core boost clock without games crashing; with the 17.9.3 drivers I could not do that.
The workaround for overclocking on the 17.9.3 drivers is to use a third-party program like MSI Afterburner.
REAL CORE CLOCKS vs. REFERENCE CORE CLOCKS
The worst thing about how the RX Vega 56 works together with Radeon Software is that the "real world" core boost clocks are always lower than the reference core boost clocks. This is unlike NVIDIA hardware with NVIDIA software, where the "real world" core boost clocks are always higher than the reference core boost clocks (I have owned plenty of NVIDIA graphics cards to confirm this).
With that said, I made a comparison chart that shows how the "real world" core boost clocks behave in my tested games during the 15-second Fraps benchmark runs I have been doing since 2009. I show the difference in core boost clocks between:
1. RX Vega 56 on the default AMD Wattman profile, where the minimum power state is 0.
2. RX Vega 56 on an altered AMD Wattman profile, where the minimum power state is 6.
3. RX Vega 56 on an MSI Afterburner profile, no overclocking, default power limit and fan.
4. RX Vega 56 on an MSI Afterburner profile, 1650 MHz OC, power limit +50 %, unlocked fan.
The most important thing to understand here is that this comparison only holds in MY BENCHMARKS, because the observed core clocks are specific to the scenes/places where I benchmark each game. In most games those places are static, and in a few games they are dynamic, resulting in higher core clocks at certain moments. The point, however, is that even with AMD Wattman set so the core clock should never drop below 1537 MHz and should reach the 1590 MHz maximum boost clock (how it should be), the algorithm utterly fails: real in-game clocks are significantly lower than what the RX Vega is capable of. When MSI Afterburner is "activated", however, it overrides the AMD Wattman commands going to the video card, making it run much closer to its reference core boost clocks, even though they still don't reach the 1590 MHz target.
That 1590 MHz core boost clock is displayed in every program I can think of, even though the card's vendor states that the maximum boost clock is 1471 MHz... This leaves me wondering what is actually the case here.
Better yet, MSI Afterburner allows stable overclocking, not only raising the reference core boost clock but also raising the "real world" core boost clocks proportionally. I did not dare to overclock the card above 1650 MHz, as it is a reference design model and I don't have a warranty either. HBM2 stays stable at 800 MHz only with MSI Afterburner active; when the AMD Wattman settings are in charge, the "real world" memory clock drops from 800 MHz to 500 MHz...
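If you want to check the real-vs-reference gap on your own card, log the sensors to a CSV (GPU-Z can do this) and summarize the core clock column afterwards. A Python sketch, assuming a "GPU Clock [MHz]" column name - the actual header in your log may differ:

```python
import csv

REFERENCE_BOOST_MHZ = 1590.0  # advertised RX Vega 56 boost clock

def clock_summary(path: str, column: str = "GPU Clock [MHz]") -> None:
    """Summarize real core clocks from a sensor log CSV.

    The column name is an assumption - match it to the header
    row of your own GPU-Z (or similar) log file.
    """
    clocks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clocks.append(float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip malformed rows
    if not clocks:
        raise SystemExit(f"no usable samples in column {column!r}")
    avg = sum(clocks) / len(clocks)
    print(f"min {min(clocks):.0f} / avg {avg:.0f} / max {max(clocks):.0f} MHz")
    print(f"avg is {avg / REFERENCE_BOOST_MHZ * 100:.1f} % of the "
          f"{REFERENCE_BOOST_MHZ:.0f} MHz reference boost")

clock_summary("sensor_log.csv")
```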
FAN SPEED AND TEMPERATURE INFLUENCE ON THROTTLING
I have observed that the increased GPU temperatures caused by the default low-RPM fan mode under default AMD Wattman settings do not affect the card's frequencies. In other words, if you increase fan RPM to lower the temperatures, you will not get higher clock speeds and higher FPS. This is something I did not expect, as I thought throttling was driven by temperature - it is not...
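One way to sanity-check this is to correlate the temperature and core clock samples from the same sensor log: a coefficient near zero supports the "temperature doesn't throttle it" reading. A Python sketch, with the same caveat that the column names are assumptions:

```python
import csv

def pearson(xs, ys):
    """Pearson correlation coefficient, computed by hand."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Column names are assumptions - match them to your own log header.
temps, clocks = [], []
with open("sensor_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        try:
            temps.append(float(row["GPU Temperature [C]"]))
            clocks.append(float(row["GPU Clock [MHz]"]))
        except (KeyError, TypeError, ValueError):
            continue

# Near 0: clocks don't track temperature. Strongly negative:
# higher temperature goes with lower clocks, i.e. thermal throttling.
print(f"temperature/clock correlation: {pearson(temps, clocks):+.2f}")
```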
HBCC INFLUENCE
Our forum member @fullinfusion can tell you in detail how the high bandwidth cache controller (HBCC) affects performance on AMD RX Vega cards. I have already wasted a lot of time on this card...
FURMARK, SUPERPOSITION AND CINEBENCH
All these programs were run with the MSI Afterburner settings, no overclocking.
As you can see, the max core boost clock was only 1350 MHz during the whole run, even though FurMark "detected" 1640 MHz as its reference value for whatever reason.
Max core boost clock would only go as high as 1350 MHz.
Finally, a program where the max core boost clock ran even higher than 1590 MHz.
4K benchmarks: GTX 980 Ti Strix vs. GTX 1070 Strix vs. GTX 1080 SC vs. RX Vega 56 OC
Coming up next (in two hours' time).