Thursday, August 3rd 2017

AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

TweakTown has put forth an article in which they claim to have received information from industry insiders regarding the upcoming Vega 56's performance. Remember that Vega 56 is the slightly cut-down version of the flagship Vega 64, featuring 56 next-generation compute units (NGCUs) instead of Vega 64's, well, 64. This means that while Vega 64 has the full complement of 4,096 Stream processors, 256 TMUs, 64 ROPs, and a 2048-bit wide 8 GB HBM2 memory pool offering 484 GB/s of bandwidth, Vega 56 makes do with 3,584 Stream processors, 224 TMUs, 64 ROPs, the same 8 GB of HBM2 memory, and slightly lower memory bandwidth at 410 GB/s.
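For the curious, those figures follow directly from Vega's per-NGCU ratios (64 stream processors and 4 texture units per compute unit) and the width of the HBM2 bus. A minimal sketch, assuming the commonly quoted effective HBM2 data rates of 1.6 Gbps (Vega 56) and 1.89 Gbps (Vega 64); the helper name is ours:

```python
# Derive the headline specs from the CU count and memory bus.
# Per-NGCU ratios: 64 stream processors and 4 TMUs. Bandwidth in GB/s
# is (bus width in bits / 8) * effective data rate in Gbps.
def gpu_specs(cus, data_rate_gbps, bus_bits=2048):
    return {
        "stream_processors": cus * 64,
        "tmus": cus * 4,
        "bandwidth_gbs": bus_bits / 8 * data_rate_gbps,
    }

print(gpu_specs(56, 1.6))   # → {'stream_processors': 3584, 'tmus': 224, 'bandwidth_gbs': 409.6}
print(gpu_specs(64, 1.89))  # → {'stream_processors': 4096, 'tmus': 256, 'bandwidth_gbs': 483.84}
```

The 484 GB/s and 410 GB/s figures in the text are simply these products rounded.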

The Vega 56 has been announced to retail for about $399, or $499 with one of AMD's new (famous or infamous, depending on your mileage) Radeon Packs. The RX Vega 56 card was running on a system configured with an Intel Core i7-7700K @ 4.2 GHz, 16 GB of DDR4-3000 RAM, and Windows 10, at a 2560 x 1440 resolution.
The results in a number of popular games were as follows:

Battlefield 1 (Ultra settings): 95.4 FPS (GTX 1070: 72.2 FPS; 32% in favor of Vega 56)
Civilization 6 (Ultra settings, 4x MSAA): 85.1 FPS (GTX 1070: 72.2 FPS; 17% in favor of Vega 56)
DOOM (Ultra settings, 8x TSAA): 101.2 FPS (GTX 1070: 84.6 FPS; 20% in favor of Vega 56)
Call of Duty: Infinite Warfare (High preset): 99.9 FPS (GTX 1070: 92.1 FPS; 8% in favor of Vega 56)

If these numbers hold true, NVIDIA's GTX 1070, whose average pricing stands at around $460, will have a much-reduced value proposition compared to the RX Vega 56. The AMD contender (which did arrive a year after NVIDIA's Pascal-based cards) delivers around 20% better performance (at least in the admittedly sparse games line-up) while costing around 13% less in greenbacks. Coupled with the lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, this could potentially be a killer deal. However, I'd recommend you wait for independent, confirmed benchmarks and reviews in controlled environments. I dare suggest you won't need to look much further than your favorite tech site on the internet for that, when the time comes. Source: TweakTown
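For reference, the per-game leads and the price delta quoted above are straightforward to recompute from the leaked figures. A quick sketch (the ~$460 GTX 1070 street price is the article's; variable names are ours):

```python
# Leaked 1440p results: (Vega 56 FPS, GTX 1070 FPS) per game.
leaked = {
    "Battlefield 1": (95.4, 72.2),
    "Civilization 6": (85.1, 72.2),
    "DOOM": (101.2, 84.6),
    "CoD: Infinite Warfare": (99.9, 92.1),
}

# Percentage lead of Vega 56 over the GTX 1070 in each title.
leads = {game: (vega / gtx - 1) * 100 for game, (vega, gtx) in leaked.items()}
avg_lead = sum(leads.values()) / len(leads)   # ~19.5% across these four titles

# Price delta at street pricing: $399 MSRP vs ~$460 average for the 1070.
price_delta = (1 - 399 / 460) * 100           # ~13% cheaper

for game, lead in leads.items():
    print(f"{game}: +{lead:.0f}% for Vega 56")
print(f"average lead: {avg_lead:.1f}%, price advantage: {price_delta:.1f}%")
```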

169 Comments on AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

#1
AndreiD
Isn't this pretty bad? Considering all of those games tend to favor AMD?
It's probably going to end up slightly slower or on par with a 1070 in TPU's summary, while using more power and seemingly costing more based on MSRP.
Custom 1070s will probably eat Vega 56 alive, this just doesn't look good at all.
#2
RejZoR
Again, how do games favor graphics cards from a vendor with hardly any market share? Or more importantly, why? Which makes me think AMD cards are simply... I don't know... better balanced for workloads that matter?

When NVIDIA is better at something, everyone is raving about NVIDIA's "performance supremacy", but when AMD does it, it's because games favor them. C'mon, and I'm being called an AMD fanboy for pointing out shit like this... :rolleyes:
#3
Vya Domus
Just as expected, the cut-down Vega looks to be the best value.
#4
cowie
xbox PlayStation?

20% more than a 1070 is a 1080 beater too
#5
Fouquin
Assimilator said:
You can't blame NVIDIA for making a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.
25% that goes directly to the retailer, not nVidia. I'm sure they're ecstatic that their excellent design is lining the pockets of the companies stocking their shelves with it.
#6
B-Real
AndreiD said:
Isn't this pretty bad? Considering all of those games tend to favor AMD?
It's probably going to end up slightly slower or on par with a 1070 in TPU's summary, while using more power and seemingly costing more based on MSRP.
Custom 1070s will probably eat Vega 56 alive, this just doesn't look good at all.
Actually, you are the first one complaining about these results. Yes, AMD is faster in DX12 BF1, but it isn't 30%+ faster... It's faster in CoD IW, but AMD had nearly a 20% advantage there, and here it has less than 10%. It's faster in Doom under Vulkan. However, Civilization in DX12 is head-to-head (the 1060 6GB vs RX 580 8GB shows a 10% difference for AMD, but compared to the 480 there is a 2 fps difference at 1440p).

So overall, IF these results are true, there won't be a 20% overall difference between the two; more like 10 or at maximum 15%. But given the HBM2 and the promising features (that will be used by NV-supported titles like Far Cry 5), the Vega 56 looks charming.

I was wondering whether AMD was trolling us and using Vega56 at the comparison events with the 1080, emphasizing Sync... We will see.
#7
bug
RejZoR said:
Again, how do games favor graphics cards from a vendor with hardly any market share? Or more importantly, why? Which makes me think AMD cards are simply... I don't know... better balanced for workloads that matter?

When NVIDIA is better at something, everyone is raving about NVIDIA's "performance supremacy", but when AMD does it, it's because games favor them. C'mon, and I'm being called an AMD fanboy for pointing out shit like this... :rolleyes:
Are we pretending sponsored titles do not exist now?
#8
oxidized
londiste said:
enhanced sync will use as much tdp as it can, always.
chill is not a perfect solution and won't really drop consumption that much for active gaming. plus, it is still working with a whitelist of games.

the numbers sound about right but should fit into 'trading blows with 1070' well enough. all the listed games have a tendency of running noticeably better on amd hardware.
Don't even bother trying to discredit something coming from AMD with this guy
#9
Vya Domus
Fast Sync and Enhanced Sync work in pretty much the same way. All frames are rendered but only a portion of them are shown, which means TDP isn't reduced. The only feature that can reduce TDP is Radeon Chill.
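A rough back-of-the-envelope model of the difference (hypothetical numbers and function names, not any driver's actual code):

```python
def sync_mode(render_ms, refresh_hz, seconds=1.0):
    """Fast Sync / Enhanced Sync sketch: the GPU renders continuously and
    only the newest completed frame is shown at each refresh, so it never
    idles and power draw stays at its maximum."""
    rendered = int(seconds * 1000 / render_ms)  # GPU busy 100% of the time
    displayed = int(seconds * refresh_hz)       # one frame per refresh
    return rendered, displayed, rendered - displayed  # extras are discarded

def frame_limited(render_ms, cap_fps, seconds=1.0):
    """Frame limiter / Radeon Chill sketch: render one frame per target
    interval, then wait until the next one, so the GPU idles in between."""
    frames = int(seconds * cap_fps)
    busy_fraction = render_ms * cap_fps / 1000  # fraction of time rendering
    return frames, frames, busy_fraction

# e.g. 5 ms frames on a 144 Hz panel: 200 rendered, 144 shown, 56 discarded;
# all 200 still cost full power. Capped at 140 fps, the GPU is only ~70% busy.
```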
#10
Recus
But it's catching GTX 1080.

Plot twist: it's Vega 64 but TweakTown can't stop hype.

#11
Vya Domus
Fury was literally a stone's throw away from Fury X; it wouldn't surprise me at all if the story is the same with Vega.
#12
Sempron Guy
Where in the world can I get a $350 GTX 1070 these days?
Vya Domus said:
Fast Sync and Enhanced Sync work in pretty much the same way. All frames are rendered but only a portion of them are shown, which means TDP isn't reduced. The only feature that can reduce TDP is Radeon Chill.
frame limiter works for me
#13
londiste
Sempron Guy said:
frame limiter works for me
with a frame limiter you are negating the entire point of enhanced sync.
#14
Sempron Guy
londiste said:
with a frame limiter you are negating the entire point of enhanced sync.
Exactly; I've been using a frame limiter + FreeSync ever since, and have no reason to use enhanced sync. As I said, that combo works for me and I have no complaints.
#15
Vya Domus
Sempron Guy said:
Exactly; I've been using a frame limiter + FreeSync ever since, and have no reason to use enhanced sync. As I said, that combo works for me and I have no complaints.
Isn't it unnecessary to use a frame limiter with FreeSync? I mean, framerates won't go over the maximum refresh rate anyway. Unless you want a lower framerate for some reason.
#16
Darmok N Jalad
HD64G said:
When the NDA ends on Vegas?
I thought what happens in Vegas is always under NDA. ;)

Sorry, could help it.
#17
Liviu Cojocaru
Darmok N Jalad said:
I thought what happens in Vegas is always under NDA. ;)

Sorry, could help it.
*couldn't* :P
#18
RejZoR
oxidized said:
Don't even bother trying to discredit something coming from AMD with this guy
Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I automatically forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time nor the last.

Will I be less of an AMD fanboy if I keep on shitting on AMD relentlessly day after day like all of you are doing and which will literally not change ANYTHING or is simple voting with wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm literally starting to not give a s**t anymore with every passing day of listening to all of you whining little children.

People kept pissing over the R9 Fury X and yet it turned out to be a graphics card that aged just the same as NVIDIA's offerings. In fact it performs better now than NVIDIA's stuff for the most part, still justifying the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult a concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.
#19
Vya Domus
Everyone is biased to a degree; if you can't acknowledge that and think everyone besides you is a fanboy, you are either delusional or simply a troll.

I also find it hilarious when one gets called a fanboy yet they own products from the opposing camp.

People need to tell the difference between being a fan and a fanboy. Because, believe it or not, you can like one company and not be a mindless moron.
#20
bug
RejZoR said:
Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I automatically forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time nor the last.

Will I be less of an AMD fanboy if I keep on shitting on AMD relentlessly day after day like all of you are doing and which will literally not change ANYTHING or is simple voting with wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm literally starting to not give a s**t anymore with every passing day of listening to all of you whining little children.

People kept pissing over the R9 Fury X and yet it turned out to be a graphics card that aged just the same as NVIDIA's offerings. In fact it performs better now than NVIDIA's stuff for the most part, still justifying the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult a concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.
Not really, no: https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/3
It gained significantly in a few titles, but that's not the norm.
#21
Raevenlord
News Editor
Assimilator said:
Do you even proofread? The word you're looking for is "median".

And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.

The price of GTX 1070 cards has only been pushed up because of the cryptomining BS. Vega 56 is unlikely to offer a better hashrate-per-watt than GTX 1070, which means GTX 1070 prices will stay high and they will continue to be bought in volume by miners, whereas Vega 56 will be bought in much smaller quantities by gamers. So NVIDIA still wins in terms of the pure numbers game, and therefore in revenue.

You can't blame NVIDIA for making a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.
I do. Thank you for your correction. And the word I'm looking for isn't median either. It's average. You'll pardon me for not being 100% correct at all times in a non-native language. I'm sure you aren't either - even in your native language.

I don't blame NVIDIA. Where did I blame NVIDIA? What I did was mention current GTX 1070 pricing, which is what users look at. No one will approach this from the point of view of "Man, I'll be buying a GTX 1070 at $460 and it will be offering me much better bang for buck than the $399 Vega 56. I mean, its MSRP is much lower, and that's what matters, right?"

So yeah, I completely ignored that fact. On purpose. Because it doesn't make sense to me.
#22
the54thvoid
Vya Domus said:
Fury was literally a stone's throw away from Fury X; it wouldn't surprise me at all if the story is the same with Vega.
On the shop shelf perhaps.
#23
Sempron Guy
Vya Domus said:
Isn't it unnecessary to use a frame limiter with FreeSync? I mean, framerates won't go over the maximum refresh rate anyway. Unless you want a lower framerate for some reason.
I couldn't explain it right, but I see tearing at 75 fps in all the games I tested. Capping it at 74 fps does the trick, though I'm not sure about the explanation behind it.
#24
Vya Domus
the54thvoid said:
On the shop shelf perhaps.
Care to elaborate ? Don't know what you mean.

Sempron Guy said:
I couldn't explain it right, but I see tearing at 75 fps in all the games I tested. Capping it at 74 fps does the trick, though I'm not sure about the explanation behind it.
That's odd.
#25
oxidized
RejZoR said:
Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I automatically forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time nor the last.

Will I be less of an AMD fanboy if I keep on shitting on AMD relentlessly day after day like all of you are doing and which will literally not change ANYTHING or is simple voting with wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm literally starting to not give a s**t anymore with every passing day of listening to all of you whining little children.

People kept pissing over the R9 Fury X and yet it turned out to be a graphics card that aged just the same as NVIDIA's offerings. In fact it performs better now than NVIDIA's stuff for the most part, still justifying the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult a concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.
Wow, you sound pretty mad, but whatever. Every, and I say EVERY, post I read coming from you is always pro-AMD, I swear, I'm not even joking. That's the difference between someone biased and someone neutral; I don't care whose hardware I buy, I just buy whatever is best for my money.

Vya Domus said:
Everyone is biased to a degree; if you can't acknowledge that and think everyone besides you is a fanboy, you are either delusional or simply a troll.

I also find it hilarious when one gets called a fanboy yet they own products from the opposing camp.

People need to tell the difference between being a fan and a fanboy. Because, believe it or not, you can like one company and not be a mindless moron.
Wrong, I'm as neutral as one can be. I don't care about brands and stuff; I only buy what I think, and what is said, to be the best thing. And AMD, aside from Ryzen, isn't the best in any kind of way atm, not even with Polaris.