Tuesday, July 25th 2017

AMD Radeon RX Vega Put Through 3DMark

Ahead of its July 27 unveiling at AMD's grand media event on the sidelines of SIGGRAPH, performance benchmarks of the elusive Radeon RX Vega consumer graphics card have surfaced once again. Someone with access to an RX Vega sample, with its GPU clocked at 1630 MHz and memory at 945 MHz, put it through 3DMark. One can tell it's RX Vega and not the Radeon Vega Frontier Edition by its 8 GB of video memory (the Frontier Edition ships with 16 GB).

In three test runs, the RX Vega-powered machine yielded graphics scores of 22,330 points, 22,291 points, and 20,949 points. This puts its performance on par with or slightly below that of the GeForce GTX 1080, but comfortably above the GTX 1070. The test bench consisted of a Core i7-5960X processor and graphics driver version 22.19.640.2.
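For a quick sanity check, the three leaked runs can be averaged and set against rough stock Fire Strike graphics scores for the GTX 1070 and GTX 1080; the ~19,000 and ~22,500 reference values below are ballpark assumptions, not figures from the leak:

```python
# Averaging the three leaked RX Vega graphics scores and comparing
# them against assumed stock reference scores (not from the article).
vega_runs = [22330, 22291, 20949]

mean_score = sum(vega_runs) / len(vega_runs)
spread = max(vega_runs) - min(vega_runs)

gtx1070 = 19000  # assumed stock graphics score
gtx1080 = 22500  # assumed stock graphics score

print(f"mean: {mean_score:.0f}, spread between runs: {spread}")
print(f"vs GTX 1070: {mean_score / gtx1070 - 1:+.1%}")
print(f"vs GTX 1080: {mean_score / gtx1080 - 1:+.1%}")
```

The run-to-run spread of nearly 1,400 points suggests at least one of the three results was held back by something other than the silicon.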
Source: VideoCardz

175 Comments on AMD Radeon RX Vega Put Through 3DMark

#51
bug
My question is, if it won't beat the GTX 1080, why does it need HBM?
Posted on Reply
#52
Supercrit
This is not a gaming card, wait for the gaming version for 20% extra performance! /s
Posted on Reply
#53
chaosmassive
yo, it's still 4096 cores, same as Fiji silicon
the improvement we see here is from IPC gains and uarch changes...
honestly I don't see the point of this card
with costly HBM2 memory chips and a delayed launch, yet still outclassed by a one-year-old GPU

this is worse than Fury X, people used to compare Fury X with 980 Ti
but this, this is just so bad....

we need another Jim Keller type guy to research/design new GPU uarch
Posted on Reply
#54
EarthDog
SupercritThis is not a gaming card, wait for the gaming version for 20% extra performance! /s
It is... Take a second to read the title, the link, etc.. RX Vega (the gaming card).. not Vega Frontier Edition. ;)
Posted on Reply
#55
Supercrit
EarthDogIt is... Take a second to read the title, the link, etc.. RX Vega (the gaming card).. not Vega Frontier Edition. ;)
Poe's law so powerful /s didn't even work.
Posted on Reply
#56
cdawall
where the hell are my stars
So this brought up some curiosity from me; let's see what it takes to get a 1080 Ti, which is a similarly sized GPU with similar bandwidth numbers etc., to similar performance.

This testing is done with the rig in my specs

5960X@4.8, Asus X99m WS, 4x8GB DDR4 @2666CL14, 512GB NVMe, 2 EVGA SC BLACK 1080Ti's (SLi disabled OBVIOUSLY)

50% TDP, rest of the card settings at stock: graphics score of 20,343, drawing 125 W at the GPU (under mining stress load)

55% TDP, same settings: graphics score of 23,158, drawing 138 W at the GPU (under mining stress load)

58% TDP, same settings: graphics score of 24,016, drawing 144 W at the GPU (under mining stress load)
So with a similarly sized GPU I can basically run my cards silent for the same performance. Fan curve never broke 20%, my ceiling fan is louder.
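To put those three runs in perspective, points per watt works out almost flat across the power limits; the 250 W reference board power used below is an assumption (it lines up with the measured draws divided by the TDP percentages):

```python
# Points-per-watt for the three power-limited 1080 Ti runs quoted above.
# The 250 W reference board power is assumed; scores and measured draws
# are from the post.
REFERENCE_TDP_W = 250

runs = [
    # (TDP limit, graphics score, measured watts)
    (0.50, 20343, 125),
    (0.55, 23158, 138),
    (0.58, 24016, 144),
]

for tdp, score, watts in runs:
    cap = REFERENCE_TDP_W * tdp  # power cap implied by the TDP slider
    print(f"{tdp:.0%} TDP: {score} pts at {watts} W "
          f"(cap {cap:.0f} W) -> {score / watts:.1f} pts/W")
```

Efficiency barely moves between 50% and 58% TDP, which is why the card can sit near its best perf/W while staying essentially silent.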
Posted on Reply
#57
RejZoR
Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX, or just shrunk the Fury X, clocked it higher and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...
Posted on Reply
#58
Captain_Tom
Hope this is cut down RX Vega or it is going to be a pathetic launch. Even if it's still just immature drivers, it should never be released in this state. A 300w card competing with a 1070. WTF?!


They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...
Posted on Reply
#59
cdawall
where the hell are my stars
Captain_TomHope this is cut down RX Vega or it is going to be a pathetic launch. Even if it's still just immature drivers, it should never be released in this state. A 300w card competing with a 1070. WTF?!


They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...
HBM is a waste of money. A fat Polaris card with GDDR5X would have happily competed with the 1080.
Posted on Reply
#60
EarthDog
SupercritPoe's law so powerful /s didn't even work.
There is a reason most forums have emoticons.. to clear up how the written word's tone should be read... I'd consider using them for sarcasm. :)
Posted on Reply
#61
Captain_Tom
RejZoRStill don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX, or just shrunk the Fury X, clocked it higher and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...
A lot of parallels to Bulldozer going on here:

-Massive hype
-Tons of delays
-Completely new arch with high power usage

I am sure it will (And it does) dominate at certain professional workloads just like Bulldozer did. However gamers will say "Why didn't they just die shrink Phenom II/Polaris?!"
cdawallHBM is a waste of money. A fat Polaris card with GDDR5X would have happily competed with the 1080.
Polaris is actually more memory starved than compute. Tests (That I have replicated btw) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.
Posted on Reply
#62
ratirt
RejZoRStill don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX, or just shrunk the Fury X, clocked it higher and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...
Wait for Navi and Volta. Both will be dual GPUs
Posted on Reply
#63
Vayra86
I'll not say I predicted this already when the 1080 prices were slashed and I bought one.

But that's what it is

To those that predicted a +20/30% performance win (because drivers!) from the 'not meant for gaming' Vega... yeah, hope you learned something here

Fury X v2 is reality
Posted on Reply
#64
cdawall
where the hell are my stars
Captain_TomPolaris is actually more memory starved than compute. Tests (That I have replicated btw) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.
Two Polaris dies' worth of BS would have allowed a 512-bit GDDR5X bus, which would in turn fix that issue. Remember the 352-bit GDDR5X bus on the 1080 Ti is about even with the HBM bus on Vega.
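The "about even" claim checks out on paper. A back-of-envelope bandwidth calculation (the per-pin data rates below are assumed typical values: 11 Gbps for the 1080 Ti's GDDR5X, and the leaked 945 MHz HBM2 clock at double data rate):

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

# GTX 1080 Ti: 352-bit bus, GDDR5X at an assumed 11 Gbps per pin.
gtx_1080ti = bandwidth_gbs(352, 11.0)

# Vega: 2048-bit HBM2 (two 1024-bit stacks); 945 MHz at double
# data rate gives 1.89 Gbps per pin.
vega = bandwidth_gbs(2048, 2 * 0.945)

print(f"1080 Ti: {gtx_1080ti:.1f} GB/s, Vega: {vega:.1f} GB/s")
```

Both land at roughly 484 GB/s, within a fraction of a percent of each other.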
Posted on Reply
#65
Vayra86
cdawallTwo Polaris dies' worth of BS would have allowed a 512-bit GDDR5X bus, which would in turn fix that issue. Remember the 352-bit GDDR5X bus on the 1080 Ti is about even with the HBM bus on Vega.
They would also be stuck with a board TDP of over 320 W that way. A 512-bit GDDR5X setup is quite costly, I reckon, and a single RX 480 happily goes towards 170 W typical draw

Let's face it, GCN is like Intel Core right now; now that DX12 has landed, it's been scaled up to the max. Not the best timing for AMD, and I really hope they have something up their sleeve for Navi besides gluing GCN together.
Posted on Reply
#66
ratirt
Vayra86I'll not say I predicted this already when the 1080 prices were slashed and I bought one.

But that's what it is

To those that predicted a +20/30% performance win from the 'not meant for gaming' Vega... yeah, hope you learned something here

Fury X v2 is reality
So what did you predict for Vega? Leaked benchmarks? Hype train? The card is not out yet, so chillax.
Posted on Reply
#67
cdawall
where the hell are my stars
Vayra86They would also be stuck with a board TDP of over 320 W that way. A 512-bit GDDR5X setup is quite costly, I reckon, and a single RX 480 happily goes towards 170 W typical draw
So that would be different than the Vega stuff in what way? Board power is 350 W and they can't even get enough HBM to sell the cards.
Posted on Reply
#68
Vayra86
cdawallSo that would be different than the Vega stuff in what way? Board power is 350 W and they can't even get enough HBM to sell the cards.
Yeah, you're right, they have no place to go feasibly; it's horrifying, and to me it shows a glaring lack of planning ahead. Given how well they set up their CPU side right now, you'd expect different. They should have pulled Navi towards 2H 2017 and skipped Vega altogether.

Perhaps the only advantage of dual RX480 would have been a much, much cheaper to produce high end, that could have been to market a year ago.

@ratirt not sure how much more you need here to know what's what. Even if it still gains 10% from drivers, which I find reasonably plausible (and still a feat on its own tbh; go look at pre- and post-launch driver benches of the past), it would make 24k points and still be much too hungry for too little performance at too high a cost to make. Keep in mind that a high board TDP also severely limits OC headroom.
Posted on Reply
#69
Captain_Tom
Vayra86I'll not say I predicted this already when the 1080 prices were slashed and I bought one.

But that's what it is

To those that predicted a +20/30% performance win (because drivers!) from the 'not meant for gaming' Vega... yeah, hope you learned something here

Fury X v2 is reality
RX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization. It is a monster card, period.

Have you read/watched reviews of Frontier? The Radeon Menu's don't even work yet lol. There are clearly some massive driver issues.


Will AMD ever get their drivers to work? Nobody knows. But at the very least the potential is there, and Frontier clearly wasn't using all of its features.
cdawallSo that would be different than the Vega stuff in what way? Board power is 350 W and they can't even get enough HBM to sell the cards.
Exactly.


If Vega really turns out this bad, it will be because Vega was never built for gaming. GCN=gaming, Vega=professional work.

However if this is the case it is insanely bone-headed for AMD to even bother making a gaming version of Vega, and I will expect a GDDR6 large Polaris card by early 2018.
Posted on Reply
#70
cdawall
where the hell are my stars
Captain_TomRX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization. It is a monster card, period.

Have you read/watched reviews of Frontier? The Radeon Menu's don't even work yet lol. There are clearly some massive driver issues.


Will AMD ever get their drivers to work? Nobody knows. But at the very least the potential is there, and Frontier clearly wasn't using all of its features.
TBR doesn't work at all. So unless something changes that is just another missing cog from the hype train.
Posted on Reply
#71
Patriot
ERazerthat gotta hurt for ppl that waited this long
I bought a 1080ti while waiting, and it still hurts lol.
Posted on Reply
#72
Captain_Tom
cdawallTBR doesn't work at all. So unless something changes that is just another missing cog from the hype train.
And I 100% agree with you on that. But like I said - the potential is there. If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and then now they are acting super cagey and unsure of themselves.
Posted on Reply
#73
Vayra86
Captain_TomAnd I 100% agree with you on that. But like I said - the potential is there. If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and then now they are acting super cagey and unsure of themselves.
You gotta admit progress has been quite stagnant over the past six months driver wise. We can get all hung up on TBR but core count + clocks already tell me more than enough.

In other news you're double posting again ;)
Posted on Reply
#74
cdawall
where the hell are my stars
Captain_TomAnd I 100% agree with you on that. But like I said - the potential is there. If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and then now they are acting super cagey and unsure of themselves.
I don't think even a combo of Intel's and Nvidia's driver teams could fix a card that only competes with a 1080 Ti downclocked to a 55% TDP... that is a lot of ground to make up.
Posted on Reply
#75
Daven
cdawallDid you even read that review?

Stock boost was 2038 MHz.
Clocks are definitely difficult to decipher nowadays. I like to learn but I'm not sure the picture is very clear yet. I have a question and a comment:

First, are clocks given in the 3dmark benchmark leak fixed? Are all turbos, boosts, etc. disabled? If so, then a 1630 MHz Vega is performing almost the same as a 1924 MHz GTX 1080.

Second, from the Gigabyte article I linked, all the game benchmark results are performed with the default settings of the Gigabyte card not the overclocked results (we don't know anything about overclocking a Vega).

As shown on the clock profiles page, the clocks range from 1696 to 2038 MHz. I assume they change based on the load and card temperature. The average is around 1982 MHz.
www.techpowerup.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/34.html

It might seem that 1924 MHz is very different from the max of 2038 MHz because there is a 2 instead of a 1, but the difference is actually quite small (~5.5%). Plus, it is hard to compare a moving clock speed to a fixed-clock benchmark result. But if we take the average of 1982 MHz on the Gigabyte card (3% above 1924 MHz) and adjust the 3DMark score, you get ~23,260. The 1630 MHz Vega received a score of 22,330, which is 4% lower. Yes, you can probably overclock a GTX 1080 an additional 4% to get above 2 GHz. You also might be able to overclock a Vega 4%. Time will tell.

So again, all I'm saying is that Vega is equivalent to an overclocked (factory or manual) GTX 1080 according to these leaked 3dmark scores.

If we look at power consumption, TPU measured 243 W peak on the factory-overclocked Gigabyte card. Rumors peg the liquid-cooled Vega (I'm assuming that is the highest score) at 375 W. So AMD is getting about the same performance as Nvidia at roughly 54% higher power. That suuuuuuuuuuuucccccccccccccckkkkkkkkkkkkksss in my book. Others may not care.
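Daven's arithmetic above can be reworked in a few lines. The ~22,580 stock GTX 1080 score used below is an assumed value (it is the base score implied by the ~23,260 clock-adjusted figure), and the 375 W Vega board power is the rumor quoted above:

```python
# Reworking the clock and power comparison from the post above.
gtx1080_avg_clock = 1982    # MHz, average boost of the Gigabyte card
gtx1080_ref_clock = 1924    # MHz, clock behind the assumed base score
gtx1080_base_score = 22580  # assumed; implied by the ~23,260 adjusted figure

# Scale the score linearly with clock (a simplification).
adjusted = gtx1080_base_score * gtx1080_avg_clock / gtx1080_ref_clock

vega_score = 22330
deficit = 1 - vega_score / adjusted  # Vega's shortfall vs adjusted 1080

# 243 W measured on the 1080 vs the rumored 375 W Vega board power.
power_delta = 375 / 243 - 1

print(f"clock-adjusted 1080 score: {adjusted:.0f}")
print(f"Vega shortfall: {deficit:.1%}")
print(f"Vega power premium: {power_delta:.0%}")
```

Under these assumptions the score gap is indeed about 4%, while the power gap is in the 50%+ range.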
Posted on Reply