
AMD Radeon RX 6900 XT Graphics Card OpenCL Score Leaks

AleksandarK

News Editor
Staff member
AMD has launched its RDNA 2 based graphics cards, codenamed Navi 21. These GPUs are set to compete with NVIDIA's Ampere offerings, with the lineup covering the Radeon RX 6800, RX 6800 XT, and RX 6900 XT graphics cards. Until now, we have had reviews of the first two, but not the Radeon RX 6900 XT. That is because the card is coming at a later date, specifically on December 8th, in just a few days. As a reminder, the Radeon RX 6900 XT is a Navi 21 XTX model with 80 Compute Units, for a total of 5120 Stream Processors. The graphics card pairs a 256-bit memory bus and 128 MB of Infinity Cache with 16 GB of GDDR6 memory. When it comes to frequencies, it has a base clock of 1825 MHz and a boost clock of 2250 MHz.
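As a sanity check on the numbers above: RDNA 2 carries 64 stream processors per Compute Unit, so the 80-CU count maps directly to the quoted shader count. A quick sketch:

```python
# Sketch: deriving the stream-processor count from the CU count quoted above.
# RDNA/RDNA 2 GPUs pack 64 stream processors per Compute Unit.
SP_PER_CU = 64

compute_units = 80  # Navi 21 XTX (RX 6900 XT)
stream_processors = compute_units * SP_PER_CU
print(stream_processors)  # 5120, matching the article's figure
```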

Today, in a GeekBench 5 submission, we get to see the first benchmark of AMD's top-end Radeon RX 6900 XT graphics card. Running the OpenCL test suite, the card was paired with AMD's Ryzen 9 5950X 16C/32T CPU and scored 169779 points. That makes it about 12% faster than the RX 6800 XT, but still slower than the competing NVIDIA GeForce RTX 3080, which scores 177724 points. However, we need to wait for a few more benchmarks to appear before jumping to any conclusions, including the TechPowerUp review, which is expected to arrive once the NDA lifts. Below, you can compare the score to other GPUs in the GeekBench 5 OpenCL database.
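For context, the relative standings implied by the quoted scores work out as follows (note the RX 6800 XT figure below is back-calculated from the ~12% claim, not a measured score):

```python
# Sketch: relative standings computed from the scores quoted above.
rx6900xt = 169779  # leaked GeekBench 5 OpenCL score
rtx3080 = 177724   # RTX 3080 score quoted in the article

# The article says the 6900 XT is ~12% faster than the RX 6800 XT,
# which implies an RX 6800 XT score of roughly:
rx6800xt_est = round(rx6900xt / 1.12)
print(f"Estimated RX 6800 XT score: {rx6800xt_est}")  # ~151588

# The RTX 3080's lead over the RX 6900 XT in this test:
gap_pct = (rtx3080 - rx6900xt) / rx6900xt * 100
print(f"RTX 3080 lead: {gap_pct:.1f}%")  # ~4.7%
```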


View at TechPowerUp Main Site
 
Somehow Geekbench manages to be utter crap even for OpenCL.


Actual OpenCL performance:

[attached image: OpenCL benchmark comparison chart]
 
CDNA should be more optimized for OpenCL, no?
 
3% slower than the RTX 3090 at 4K
 
Forums would be a better place if Geekbench just rolled over and died.

All it's good for is generating hype, conjecture, and FUD.
 
What did the 6800 XT score here, and where did it end up in gaming?

Forums would be a better place if Geekbench just rolled over and died.

All it's good for is generating hype, conjecture, and FUD.
Can we add userbench to that? Some people still think it's The Gospel too... :p
 
those 3% are definitely worth the $500 and 100W extra power draw
Not that I disagree, but also

8GB / 50% more VRAM
Very strong productivity performance
NVIDIA software suite (GameStream, recording, voice, etc.)
Stronger RT performance
DLSS

Granted, if any or all of those mean bupkiss to you, that's fine too, but there's a bit more to it than 3% for +$500 and +100 W. In some ways at least, you're definitely getting more product.

Not to mention availability, the 6900XT is going to be unobtanium, and Nvidia might just sell a few 3090's to people who want a GPU right away/sooner.
 
Not to mention availability, the 6900XT is going to be unobtanium, and Nvidia might just sell a few 3090's to people who want a GPU right away/sooner.

With the 3090 costing $2000-2400 CAD, they're both unobtanium as far as my wallet is concerned.
 
On old games. With new games that come with RT, it's a 3070 competitor for $1000. No thanks.

I guess it's good that zero of the games in my library support RT.

So for some of us, that isn't a deciding factor in anything.
 
Buying a $1000 graphics card to play new games at PS4/Xbox one era graphics is borderline insane.

We have PC games without RT that are far above PS4/Xbox graphics, so I'm not sure what you mean there.

And the point of my post is it will all depend on the games you play and not everyone cares for RT performance.

When 50% of games have it and we are on the NVIDIA 4000 series and RX 7000 series, then maybe it will be a bigger factor for me.
 
Last edited:
Buying a $1000 graphics card to play new games at PS4/Xbox one era graphics is borderline insane.

Pure rasterization in every game is going to be supported for at least another 2-3 years. By the time RT becomes mainstream in most games being released, the current high-end RT cards are going to be at least a generation behind and comparatively slow at RT.
 
Pure rasterization in every game is going to be supported for at least another 2-3 years. By the time RT becomes mainstream in most games being released, the current high-end RT cards are going to be at least a generation behind and comparatively slow at RT.

I have no RTRT games, and just like other things such as AF that took literal years to perfect, RTRT will be the same. Currently AF is essentially "free", in that it doesn't cost enough performance to matter for the improved image quality.

I intend to buy a 6xxx-series card, will still probably play at 1080p for another year with eye candy turned to max, and will use upgrade mods that make games look as good as or better than RTRT with its mediocre textures and too-shiny items.
 
Somehow Geekbench manages to be utter crap even for OpenCL.


Actual OpenCL performance:

[attached image: OpenCL benchmark comparison chart]

"RDNA is not made for compute," they say... and yet it's beating the Radeon VII there. And both are 7 nm chips.

Heh, anyone who has seen the RDNA ISA can tell that AMD has put a lot of work into improving the assembly code this generation. The real issue is that RDNA remains unsupported on ROCm/HIP, so the Radeon VII remains the best consumer-ish card to run in a ROCm/HIP setting (maybe the RX 580 if you want something cheaper). AMD's compute line is harder to get into as long as these RDNA cards don't work with ROCm/HIP.

ROCm/OpenCL support is pleasant to hear, but I bet it still doesn't run Blender reliably (I realize ROCm is under active development and things continue to improve, but I really haven't had much luck with OpenCL)
 
Screw that.
The 6900 series shouldn't exist - it's the true "XT" card artificially inflated to rip-off pricing just because it's close enough to Nvidia's own rip-off card that AMD can get away with it.

Historically, the XT model has always been the full silicon, and the overclocked, binned version of that silicon was called the XTX.

What we have with the 6900 is the XTX variant, and the 6800 XT is what would normally have been the vanilla SKU, with a cut-down core harvesting defective dies. The current vanilla 6800 would then be an OEM-only LE version: at 60 CU out of a potential 80 CU, it's a seriously hampered piece of silicon using some incredibly defective dies. In an RX 6800, 25% of the Compute Units are disabled, and that's huge.
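The CU math behind that argument is easy to lay out; a quick sketch (the 72-CU count for the RX 6800 XT is AMD's published spec, added here for completeness):

```python
# Sketch: how much of the full 80-CU Navi 21 die each SKU enables.
FULL_CU = 80  # Navi 21 XTX (RX 6900 XT)

skus = {"RX 6900 XT": 80, "RX 6800 XT": 72, "RX 6800": 60}
for name, cu in skus.items():
    disabled_pct = (FULL_CU - cu) / FULL_CU * 100
    print(f"{name}: {cu} CU, {disabled_pct:.0f}% of CUs disabled")
# RX 6800: 25% of CUs disabled, as the post notes
```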
 
ROCm/OpenCL support is pleasant to hear, but I bet it still doesn't run Blender reliably (I realize ROCm is under active development and things continue to improve, but I really haven't had much luck with OpenCL)

When I had my RX 5700 XT in October, Blender was still buggy with OpenCL enabled on the latest open-source driver. You can use it fine if you disable OpenCL though.
 
On old games. With new games that comes with RT, it’s a 3070 competitor for $1000. No thanks.
there he is, the cute little shill, moving goalposts as usual!
 
there he is, the cute little shill, moving goalposts as usual!
I'd call it looking at the big picture, personally. If the games you play use RT and you want to use those abilities, the AMD cards aren't as good, it seems. Their RT performance, at this moment, is no better than NVIDIA's first-gen RT. It's just that simple. If you don't care, or don't plan on playing those games, it makes sense to put the blinders on and ignore that performance metric. But more RT titles come out monthly, and that will ramp up as time goes on now that the consoles all have it - a lot faster than the last two years with only NVIDIA in the market.
 