Tuesday, July 25th 2017
AMD Radeon RX Vega Put Through 3DMark
Ahead of its July 27 unveiling at AMD's grand media event on the sidelines of SIGGRAPH, performance benchmarks of the elusive Radeon RX Vega consumer graphics card have surfaced once again. Someone with access to an RX Vega sample, with its GPU clocked at 1630 MHz and memory at 945 MHz, put it through 3DMark. One can tell that it's RX Vega and not the Vega Frontier Edition from its 8 GB of video memory (the Frontier Edition ships with 16 GB).
In three test runs, the RX Vega-powered machine yielded graphics scores of 22,330 points, 22,291 points, and 20,949 points. This puts its performance on par with or below that of the GeForce GTX 1080, but comfortably above the GTX 1070. The test bench consisted of a Core i7-5960X processor and graphics driver version 22.19.640.2.
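For context, here is a back-of-envelope figure for the memory bandwidth implied by the leaked 945 MHz memory clock; note that the 2048-bit HBM2 bus width is an assumption taken from the usual RX Vega rumors, not something this leak confirms:

```python
# Rough HBM2 bandwidth implied by the leaked 945 MHz memory clock.
# Assumes the widely rumored 2048-bit bus; not confirmed by the leak.
mem_clock_hz = 945e6
bus_width_bits = 2048
transfers_per_clock = 2  # HBM2 is double data rate

bandwidth_gb_s = mem_clock_hz * transfers_per_clock * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # ~483.8 GB/s
```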
Source: VideoCardz
175 Comments on AMD Radeon RX Vega Put Through 3DMark
The improvement we see here is from IPC gains and uarch changes...
Honestly, I don't see the point of this card:
costly HBM2 memory, a delayed product, and it's still outclassed by a one-year-old GPU.
This is worse than Fury X; people used to compare Fury X with the 980 Ti,
but this, this is just so bad...
We need another Jim Keller-type guy to research and design a new GPU uarch.
This testing was done with the rig in my specs:
5960X @ 4.8, Asus X99-M WS, 4x8 GB DDR4 @ 2666 CL14, 512 GB NVMe, 2x EVGA SC BLACK 1080 Tis (SLI disabled, OBVIOUSLY)
50% TDP, rest of card settings at stock: graphics score of 20,343, drawing 125 W at the GPU (under mining stress load)
55% TDP, rest of card settings at stock: graphics score of 23,158, drawing 138 W at the GPU (under mining stress load)
58% TDP, rest of card settings at stock: graphics score of 24,016, drawing 144 W at the GPU (under mining stress load)
So with a similarly sized GPU, I can basically run my cards silent for the same performance. The fan curve never broke 20%; my ceiling fan is louder.
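A quick points-per-watt calculation on those three runs (a minimal sketch, using the scores and wattages exactly as reported above) shows efficiency flattening out past the 55% TDP mark:

```python
# Perf-per-watt for the TDP-limited 1080 Ti runs reported above.
runs = [
    (50, 20343, 125),  # (TDP limit %, 3DMark graphics score, GPU watts)
    (55, 23158, 138),
    (58, 24016, 144),
]
for tdp, score, watts in runs:
    print(f"{tdp}% TDP: {score / watts:.1f} points/W")
# 50%: 162.7 | 55%: 167.8 | 58%: 166.8
```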
They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...
-Massive hype
-Tons of delays
-Completely new arch with high power usage
I am sure it will (and it does) dominate at certain professional workloads, just like Bulldozer did. However, gamers will say "Why didn't they just die-shrink Phenom II/Polaris?!" Polaris is actually more memory-starved than compute-starved. Tests (that I have replicated, btw) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.
But that's what it is
To those that predicted a +20/30% performance win (because drivers!) from the 'not meant for gaming' Vega... yeah, hope you learned something here
Fury X v2 is reality
Let's face it, GCN is like Intel Core right now: as DX12 has landed, it's been scaled up to the max. Not the best timing for AMD, and I really hope they have something up their sleeve for Navi besides gluing GCN together.
Perhaps the only advantage of a dual RX 480 would have been a much, much cheaper-to-produce high end that could have been to market a year ago.
@ratirt not sure how much more you need here to know what's what. Even if it still gains 10% from drivers, which I find reasonably plausible (and still a feat on its own, tbh; go look at pre- and post-driver-launch benches of the past), it would make ~24k points and still be much too hungry for too little performance, at a high cost to make. Keep in mind that a high board TDP also severely limits OC headroom.
Have you read/watched reviews of Frontier? The Radeon menus don't even work yet, lol. There are clearly some massive driver issues.
Will AMD ever get their drivers to work? Nobody knows. But at the very least the potential is there, and Frontier clearly wasn't using all of its features.
Exactly. If Vega really turns out this bad, it will be because Vega was never built for gaming. GCN = gaming, Vega = professional work.
However, if this is the case, it was insanely bone-headed for AMD to even bother making a gaming version of Vega, and I will expect a GDDR6 large-Polaris card by early 2018.
This would actually explain why AMD was so cocky at first and is now acting super cagey and unsure of themselves.
In other news, you're double-posting again ;)
First, are the clocks given in the 3DMark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.
Second, per the Gigabyte review I linked, all the game benchmark results were obtained at the card's default settings, not overclocked (and we don't know anything about overclocking a Vega).
As shown on the clock-profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on load and card temperature. The average is around 1982 MHz.
www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html
It might seem that 1924 MHz is very different from the max of 2038 MHz because there is a 2 instead of a 1, but the difference is actually quite small (~6%). Plus, it is hard to compare a moving clock speed against a fixed-clock benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3DMark score, you get ~23,260. The 1630 MHz Vega received a score of 22,330, which is 4% lower. Yes, you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.
So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080, according to these leaked 3DMark scores.
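For anyone who wants to check that arithmetic, here is a minimal sketch. Linear clock-to-score scaling is an assumption, and the ~22,580 base score is implied by the figures above rather than stated:

```python
# Clock-adjusted comparison from the comment above.
# Assumes 3DMark graphics score scales linearly with core clock (a
# simplification) and a base GTX 1080 score of ~22,580 at 1924 MHz,
# which is implied by the ~23,260 adjusted figure rather than stated.
rated_clock = 1924   # MHz
avg_clock = 1982     # MHz, average observed on the Gigabyte card
base_score = 22580   # assumed score at rated_clock

adjusted = base_score * avg_clock / rated_clock
vega_score = 22330   # leaked RX Vega score at 1630 MHz

print(f"adjusted GTX 1080 score: {adjusted:.0f}")        # ~23260
print(f"Vega deficit: {1 - vega_score / adjusted:.1%}")  # ~4.0%
```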
If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that's the highest score) at 375 W. So AMD is getting about the same performance as Nvidia at roughly 54% higher power draw (put the other way, the GTX 1080 draws about 35% less). That suuuucks in my book. Others may not care.
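The percentage depends on which card you take as the baseline; with the same two figures:

```python
# Power comparison using the figures above (243 W measured, 375 W rumored).
gtx1080_w, vega_w = 243, 375
print(f"Vega draws {vega_w / gtx1080_w - 1:.0%} more power")      # 54% more
print(f"GTX 1080 draws {1 - gtx1080_w / vega_w:.0%} less power")  # 35% less
```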