
AMD Radeon RX Vega Put Through 3DMark

I think the big issue with this card is that it comes so late; the predicted disastrous power efficiency compared to the competition is secondary. For fanboys and fantasyland dreamers, Vega is shaping up to be a disappointment. The unrealistic expectations were fueled by AMD themselves, but that shouldn't come as a surprise; the "Fury is an overclocker's dream" line was golden. For those who have been paying attention, Vega is more or less in line with what RTG is capable of doing compared to Nvidia.
 
My question is, if it won't beat the GTX 1080, why does it need HBM?
 
This is not a gaming card, wait for the gaming version for 20% extra performance! /s
 
yo, it's still 4096 cores, the same as the Fiji silicon
the improvement we see here is from IPC gains and uarch changes...
honestly I don't see the point of this card
with costly HBM2 memory and a delayed product, yet it's still outclassed by a year-old GPU

this is worse than Fury X; people used to compare Fury X with the 980 Ti
but this, this is just so bad....

we need another Jim Keller-type guy to research/design a new GPU uarch
 
It is... Take a second to read the title, the link, etc. RX Vega (the gaming card), not Vega Frontier Edition. ;)
Poe's law is so powerful the /s didn't even work.
 
So this piqued my curiosity: let's see what it takes to get a 1080 Ti, which is a similarly sized GPU with similar bandwidth numbers and so on, to these performance numbers.

This testing was done with the rig in my specs:

5960X@4.8, Asus X99m WS, 4x8GB DDR4 @2666CL14, 512GB NVMe, 2 EVGA SC BLACK 1080Ti's (SLi disabled OBVIOUSLY)

50% TDP, rest of the card settings at stock. This yielded a graphics score of 20343 and consumed 125 W at the GPU (under mining stress load).

55% TDP, rest of the card settings at stock. This yielded a graphics score of 23158 and consumed 138 W at the GPU (under mining stress load).

58% TDP, rest of the card settings at stock. This yielded a graphics score of 24016 and consumed 144 W at the GPU (under mining stress load).

So with a similarly sized GPU I can basically run my cards silent for the same performance. The fan curve never broke 20%; my ceiling fan is louder.
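
For anyone who wants to sanity-check the scaling, here's a quick points-per-watt sketch (Python) using nothing but the scores and GPU power figures from the three runs above:

```python
# Graphics score and measured GPU power from the three power-limited runs above
runs = [
    ("50% TDP", 20343, 125),
    ("55% TDP", 23158, 138),
    ("58% TDP", 24016, 144),
]

for label, score, watts in runs:
    # "efficiency" here is simply 3DMark graphics points per watt of GPU power
    print(f"{label}: {score} pts @ {watts} W -> {score / watts:.0f} pts/W")

# Works out to roughly 163-168 pts/W at every limit, i.e. near-flat efficiency
# in this range, which is why dropping the power limit costs so little score.
```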
 
Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX, or just shrunk the Fury X, clocked it higher and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...
 
Hope this is a cut-down RX Vega, or it is going to be a pathetic launch. Even if it's still just immature drivers, it should never be released in this state. A 300 W card competing with a 1070. WTF?!


They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...
 
Hope this is a cut-down RX Vega, or it is going to be a pathetic launch. Even if it's still just immature drivers, it should never be released in this state. A 300 W card competing with a 1070. WTF?!


They could have just made a 4608-SP Polaris card with HBM and it would have done FAR better...

HBM is a waste of money; a fat Polaris card with GDDR5X would have happily competed with the 1080.
 
Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX, or just shrunk the Fury X, clocked it higher and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...

A lot of parallels to Bulldozer going on here:

-Massive hype
-Tons of delays
-Completely new arch with high power usage

I am sure it will (and it does) dominate in certain professional workloads, just like Bulldozer did. However, gamers will say "Why didn't they just die-shrink Phenom II/Polaris?!"

HBM is a waste of money; a fat Polaris card with GDDR5X would have happily competed with the 1080.

Polaris is actually more memory-starved than compute-starved. Tests (that I have replicated, btw) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.
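
Rough numbers behind that, if anyone wants to plug them in themselves. This assumes the 8 GB reference RX 480 config (256-bit bus, 8 Gbps GDDR5); the near-linear scaling itself is the claim above, not something this sketch proves:

```python
# Reference RX 480 (8 GB): 256-bit GDDR5 bus at 8 Gbps effective
bus_width_bits = 256
data_rate_gbps = 8.0

stock_bw = bus_width_bits * data_rate_gbps / 8   # bits -> bytes: 256 GB/s
boosted_bw = stock_bw * 1.30                     # the "+30% bandwidth" case: ~333 GB/s

print(f"Stock bandwidth: {stock_bw:.0f} GB/s")
print(f"+30% bandwidth:  {boosted_bw:.0f} GB/s")
```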
 
Still don't quite understand why they haven't just slammed two Polaris GPUs on a single card with internal CrossFireX, or just shrunk the Fury X, clocked it higher and called it a day. It would probably be nothing in terms of R&D compared to RX Vega as a whole new GPU design, and they'd essentially get this kind of performance anyway. Unless AMD is trolling us all on purpose with such scores and all the mystery around RX Vega, to first generate interest (drama) via underwhelming results and then slam the market with crazy performance, taking NVIDIA by surprise and generating even more drama, because it would shake everything we've known about Vega to date. Not really sure which it is anymore...
Wait for Navi and Volta. Both will be dual GPUs.
 
I'll not say I predicted this already, back when the 1080 prices were slashed and I bought one.

But that's what it is.

To those who predicted a +20/30% performance win (because drivers!) from the 'not meant for gaming' Vega... yeah, hope you learned something here.

Fury X v2 is reality.
 
Polaris is actually more memory-starved than compute-starved. Tests (that I have replicated, btw) have shown that the RX 480 would have scaled its performance almost linearly with 30% more bandwidth.

Two Polaris dies' worth of BS would have allowed a 512-bit GDDR5X bus, which would in turn fix that issue. Remember, the 352-bit GDDR5X bus on the 1080 Ti is about even with the HBM bus on Vega.
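
Back-of-the-envelope on that comparison, assuming the 1080 Ti's 11 Gbps GDDR5X, Vega's 2048-bit HBM2 interface at roughly 1.89 Gbps per pin (the commonly quoted ~484 GB/s), and 10 Gbps for the hypothetical 512-bit GDDR5X card:

```python
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_bits * gbps_per_pin / 8

print(f"GTX 1080 Ti, 352-bit GDDR5X @ 11 Gbps:  {bandwidth_gbs(352, 11):.0f} GB/s")
print(f"Vega, 2048-bit HBM2 @ ~1.89 Gbps/pin:   {bandwidth_gbs(2048, 1.89):.0f} GB/s")
print(f"Hypothetical 512-bit GDDR5X @ 10 Gbps:  {bandwidth_gbs(512, 10):.0f} GB/s")

# ~484 GB/s for the first two, so "about even" checks out;
# a 512-bit GDDR5X bus would land around 640 GB/s.
```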
 
Two Polaris dies' worth of BS would have allowed a 512-bit GDDR5X bus, which would in turn fix that issue. Remember, the 352-bit GDDR5X bus on the 1080 Ti is about even with the HBM bus on Vega.

They would also be stuck with a board TDP of over 320 W that way. 512-bit GDDR5X is quite costly, I reckon, and a single RX 480 happily goes towards 170 W typical draw.

Let's face it, GCN is like Intel Core right now: as DX12 has landed, it's been scaled up to the max. Not the best timing for AMD, and I really hope they have something up their sleeve for Navi besides gluing GCN together.
 
I'll not say I predicted this already, back when the 1080 prices were slashed and I bought one.

But that's what it is.

To those who predicted a +20/30% performance win from the 'not meant for gaming' Vega... yeah, hope you learned something here.

Fury X v2 is reality.
So what did you predict for Vega? Leaked benchmarks? Hype train? The card is not out yet, so chillax.
 
They would also be stuck with a board TDP of over 320 W that way. 512-bit GDDR5X is quite costly, I reckon, and a single RX 480 happily goes towards 170 W typical draw.

So that would be different from the Vega stuff in what way? Board power is 350 W and they can't even get enough HBM to sell the cards.
 
So that would be different from the Vega stuff in what way? Board power is 350 W and they can't even get enough HBM to sell the cards.

Yeah, you're right, they have no place to go feasibly; it's horrifying and to me shows a glaring lack of planning ahead. As well as they've set up their CPU side right now, you'd expect different. They should have pulled Navi towards 2H 2017 and skipped Vega altogether.

Perhaps the only advantage of a dual RX 480 would have been a much, much cheaper-to-produce high end that could have been brought to market a year ago.

@ratirt not sure how much more you need here to know what's what. Even if it still gains 10% from drivers, which I find reasonably plausible (and still a feat on its own tbh, go look at pre- and post-launch driver benches of the past), it would make 24k points and still be much too hungry for too little performance, at a high cost to make. Keep in mind that a high board TDP also severely limits OC headroom.
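
For the arithmetic behind that 24k figure (the baseline below is just back-derived from the numbers in this post, not an extra data point):

```python
target_score = 24_000   # "it would make 24k points"
driver_uplift = 0.10    # the hypothetical 10% gain from drivers

implied_baseline = target_score / (1 + driver_uplift)
print(f"Implied current graphics score: ~{implied_baseline:,.0f}")

# ~21,800, which is still short of the 23-24k a heavily power-limited 1080 Ti posts above.
```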
 
I'll not say I predicted this already, back when the 1080 prices were slashed and I bought one.

But that's what it is.

To those who predicted a +20/30% performance win (because drivers!) from the 'not meant for gaming' Vega... yeah, hope you learned something here.

Fury X v2 is reality.

RX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization. It is a monster card, period.
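
Where that 13.1 TF number comes from, roughly. The ~1.6 GHz boost clock below is an assumption; it's simply the clock that makes the quoted figure line up with Fiji's shader count:

```python
shaders = 4096           # same SP count as Fiji, as noted earlier in the thread
ops_per_clock = 2        # one FP32 fused multiply-add = 2 ops per shader per clock
boost_clock_ghz = 1.6    # assumed boost clock that makes the 13.1 TF figure add up

tflops = shaders * ops_per_clock * boost_clock_ghz / 1000   # GFLOPS -> TFLOPS
print(f"Peak FP32 throughput: ~{tflops:.1f} TFLOPS")
```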

Have you read/watched reviews of Frontier? The Radeon menus don't even work yet, lol. There are clearly some massive driver issues.


Will AMD ever get their drivers to work? Nobody knows. But at the very least the potential is there, and Frontier clearly wasn't using all of its features.

So that would be different from the Vega stuff in what way? Board power is 350 W and they can't even get enough HBM to sell the cards.

Exactly.


If Vega really turns out this bad, it will be because Vega was never built for gaming. GCN=gaming, Vega=professional work.

However, if this is the case, it is insanely bone-headed for AMD to even bother making a gaming version of Vega, and I will expect a large GDDR6 Polaris card by early 2018.
 
RX Vega is a 13.1+ TF card with HBM2, RPM, HBC, and finally Tiled Rasterization. It is a monster card, period.

Have you read/watched reviews of Frontier? The Radeon menus don't even work yet, lol. There are clearly some massive driver issues.


Will AMD ever get their drivers to work? Nobody knows. But at the very least the potential is there, and Frontier clearly wasn't using all of its features.

TBR doesn't work at all. So unless something changes, that is just another missing cog from the hype train.
 
TBR doesn't work at all. So unless something changes, that is just another missing cog from the hype train.

And I 100% agree with you on that. But like I said - the potential is there. If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and why they are now acting super cagey and unsure of themselves.
 
And I 100% agree with you on that. But like I said - the potential is there. If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and why they are now acting super cagey and unsure of themselves.

You gotta admit progress has been quite stagnant over the past six months driver-wise. We can get all hung up on TBR, but core count + clocks already tell me more than enough.

In other news you're double posting again ;)
 
And I 100% agree with you on that. But like I said - the potential is there. If Vega doesn't pan out, it's because the arch proved just WAY too complicated for AMD's driver team to come to terms with.

This would actually explain why AMD was so cocky at first and why they are now acting super cagey and unsure of themselves.

I don't think even a combo of Intel's and Nvidia's driver teams could fix a card that only competes with a 1080 Ti downclocked to a 55% TDP... that is a lot of ground to make up.
 