
AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

AMD has tiny market share...

...a lot of games favor Radeon architecture.

Ok, how in bloody hell does that even work? If AMD has a small market share, why would anyone bother specializing their engines to favor AMD? Just pointing out the obvious. You know, maybe AMD is just good at it? Why can't that be a possibility? Why does that only apply when NVIDIA is good at it?
 
Isn't this pretty bad? Considering all of those games tend to favor AMD?
It's probably going to end up slightly slower or on par with a 1070 in TPU's summary, while using more power and seemingly costing more based on MSRP.
Custom 1070s will probably eat Vega 56 alive, this just doesn't look good at all.
 
Again, how do games favor graphics cards from a vendor with hardly any market share? Or, more importantly, why? Which makes me think AMD cards are simply... I don't know... better balanced for the workloads that matter?

When NVIDIA is better at something, everyone is raving about NVIDIA's "performance supremacy", but when AMD does it, it's because games favor them. C'mon, and I'm being called an AMD fanboy for pointing out shit like this... :rolleyes:
 
Just as expected, the cut-down Vega looks to be the best value.
 
You can't blame NVIDIA that they made a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.

25% that goes directly to the retailer, not nVidia. I'm sure they're ecstatic that their excellent design is lining the pockets of the companies stocking their shelves with it.
 
Isn't this pretty bad? Considering all of those games tend to favor AMD?
It's probably going to end up slightly slower or on par with a 1070 in TPU's summary, while using more power and seemingly costing more based on MSRP.
Custom 1070s will probably eat Vega 56 alive, this just doesn't look good at all.
Actually, you are the first one complaining about these results. Yes, AMD is faster in DX12 BF1, but it isn't 30%+ faster... It's faster in CoD IW, but AMD had nearly a 20% advantage there, and here it has less than 10%. It's faster in Doom under Vulkan. Civilization in DX12, however, is head-to-head (the 1060 6GB vs. RX 580 8GB shows a 10% difference in AMD's favor, but compared to the 480 there is only a 2 fps difference at 1440p).

So overall, IF these results are true, there won't be a 20% overall difference between the two, more like 10% or 15% at most. But given the HBM2 and the promising features (which will be used by NV-supported titles like Far Cry 5), Vega 56 looks charming.

I was wondering whether AMD was trolling us by using Vega 56 at the comparison events against the 1080, emphasizing Sync... We will see.
 
Again, how do games favor graphics cards from a vendor with hardly any market share? Or, more importantly, why? Which makes me think AMD cards are simply... I don't know... better balanced for the workloads that matter?

When NVIDIA is better at something, everyone is raving about NVIDIA's "performance supremacy", but when AMD does it, it's because games favor them. C'mon, and I'm being called an AMD fanboy for pointing out shit like this... :rolleyes:
Are we pretending sponsored titles do not exist now?
 
Enhanced Sync will draw as much power as it can, always.
Chill is not a perfect solution and won't really drop consumption that much in active gaming. Plus, it still works off a whitelist of games.

The numbers sound about right, but should fit into "trading blows with the 1070" well enough. All the listed games have a tendency to run noticeably better on AMD hardware.

Don't even bother trying to discredit something coming from AMD with this guy
 
Fast Sync and Enhanced Sync work in pretty much the same way. All frames are rendered, but only a portion of them are displayed, which means power draw isn't reduced. The only feature that can reduce it is Radeon Chill.
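Side note: here's a minimal toy model of that idea (plain Python; the render and refresh rates are made-up numbers purely for illustration). The GPU keeps rendering flat out and only the newest completed frame gets shown at each display refresh, so the discarded frames still cost full power:

```python
def simulate_enhanced_sync(render_fps=300, refresh_hz=75, duration_s=1.0):
    """Toy model of Fast Sync / Enhanced Sync: the GPU renders uncapped,
    but only the newest completed frame is scanned out at each refresh.
    The frames rendered in between are discarded, which is why power
    draw is not reduced: the GPU never gets to idle."""
    frame_time = 1.0 / render_fps      # time the GPU takes per frame (assumed)
    refresh_period = 1.0 / refresh_hz  # time between display refreshes
    rendered = displayed = 0
    t = 0.0
    next_refresh = refresh_period
    while t < duration_s:
        t += frame_time                # GPU finishes another frame
        rendered += 1
        if t >= next_refresh:          # refresh hits: show only the newest frame
            displayed += 1
            next_refresh += refresh_period
    print(f"rendered {rendered}, displayed {displayed}, "
          f"discarded {rendered - displayed}")

simulate_enhanced_sync()
```

With these illustrative numbers, roughly 300 frames get rendered but only about 75 ever reach the screen; the rest are pure power draw, which is the point being made above.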
 
But it's catching GTX 1080.

Plot twist: it's actually Vega 64, but TweakTown can't stop the hype.

[Benchmark charts: Battlefield 1, Civilization VI, CoD: Infinite Warfare, and Doom at 2560x1440]
 
Fury was literally a stone's throw away from the Fury X; it wouldn't surprise me at all if the story is the same with Vega.
 
Where in the world can I get a $350 GTX 1070 these days?

Fast Sync and Enhanced Sync work in pretty much the same way. All frames are rendered, but only a portion of them are displayed, which means power draw isn't reduced. The only feature that can reduce it is Radeon Chill.

A frame limiter works for me.
 
With a frame limiter you are negating the entire point of Enhanced Sync.

Exactly. I've been using a frame limiter + FreeSync ever since; no reason to use Enhanced Sync. As I said, that combo works for me and I have no complaints.
 
Exactly. I've been using a frame limiter + FreeSync ever since; no reason to use Enhanced Sync. As I said, that combo works for me and I have no complaints.

Isn't it unnecessary to use a frame limiter with FreeSync? I mean, framerates won't go over the maximum refresh rate anyway, unless you want a lower framerate for some reason.
 
Don't even bother trying to discredit something coming from AMD with this guy

Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I somehow forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time, nor will it be the last.

Will I be less of an AMD fanboy if I keep shitting on AMD relentlessly day after day like all of you are doing, which will literally not change ANYTHING, or is simply voting with my wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm starting to not give a s**t anymore with every passing day of listening to all of you whining little children.

People kept pissing over the R9 Fury X, and yet it turned out to be a graphics card that aged just as well as NVIDIA's offerings. In fact, it performs better than NVIDIA's stuff now for the most part, which still justifies the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult a concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.
 
Everyone is biased to a degree; if you can't acknowledge that and think everyone besides you is a fanboy, you are either delusional or simply a troll.

I also find it hilarious when someone gets called a fanboy even though they own products from the opposing camp.

People need to understand the difference between being a fan and being a fanboy. Because, believe it or not, you can like one company and not be a mindless moron.
 
Discredit what exactly? Do you think I'm never wrong and that I never admit it? Go F off. Yeah, I somehow forgot that dropped frames are still rendered, just not displayed. Shit, look, I've admitted my mistake. It's not the first time, nor will it be the last.

Will I be less of an AMD fanboy if I keep shitting on AMD relentlessly day after day like all of you are doing, which will literally not change ANYTHING, or is simply voting with my wallet not enough? If you hate AMD, then buy NVIDIA and stfu already. If I want to buy RX Vega literally out of curiosity, then call me an AMD fanboy. I'm starting to not give a s**t anymore with every passing day of listening to all of you whining little children.

People kept pissing over the R9 Fury X, and yet it turned out to be a graphics card that aged just as well as NVIDIA's offerings. In fact, it performs better than NVIDIA's stuff now for the most part, which still justifies the late arrival. If it came later, that's AMD's release schedule. If you don't like it, then buy NVIDIA again. You know, it's not that difficult a concept... but it gets really annoying listening to all of you whining about the same thing day after day and calling me an AMD fanboy day after day just because I'm not shitting all over AMD like all of you are.

Not really, no: https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/3
It gained significantly in a few titles, but that's not the norm.
 
Do you even proofread? The word you're looking for is "median".

And you're also completely ignoring that GTX 1070's MSRP is $349, i.e. $50 less than Vega 56, which is extremely fair considering the (supposed) relative performance of these cards - in fact I'd say the 1070 still wins on price/performance if these Vega 56 numbers are truthful. So calling Vega 56 a "GTX 1070 killer" is laughable.

The price of GTX 1070 cards has only been pushed up because of the cryptomining BS. Vega 56 is unlikely to offer a better hashrate-per-watt than GTX 1070, which means GTX 1070 prices will stay high and they will continue to be bought in volume by miners, whereas Vega 56 will be bought in much smaller quantities by gamers. So NVIDIA still wins in terms of the pure numbers game, and therefore in revenue.

You can't blame NVIDIA that they made a great card that is in such high demand, regardless of the reason, that it commands a nearly 25% price premium over its MSRP.

I do. Thank you for your correction. And the word I'm looking for isn't median either. It's average. You'll pardon me for not being 100% correct at all times in a non-native language. I'm sure you aren't either - even in your native language.
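For what it's worth, either statistic can be computed over the same per-game numbers, and they usually land close together unless one title is a big outlier. A quick sketch (plain Python; the Vega 56 vs. GTX 1070 ratios below are made up purely for illustration, not from the leak):

```python
from statistics import mean, median

# Hypothetical per-game performance ratios, Vega 56 relative to GTX 1070.
# Values above 1.0 mean Vega is ahead in that title (illustrative only).
ratios = [1.31, 1.09, 1.08, 1.02, 1.00, 0.98, 0.95]

print(f"mean (average): {mean(ratios):.3f}")   # simple average across games
print(f"median:         {median(ratios):.3f}") # middle value, ignores outliers
```

A summary like TPU's relative-performance chart is an average over many games, so a couple of strongly AMD-leaning titles pull the mean up more than they would the median.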

I don't blame NVIDIA. Where did I blame NVIDIA? What I did was mention current GTX 1070 pricing, which is what users look at. No one will approach this from the point of view of "Man, I'll be buying a GTX 1070 at $460 and it will be offering me much better bang for buck than the $399 Vega 56. I mean, its MSRP is much lower, and that's what matters, right?"

So yeah, I completely ignored that fact. On purpose. Because it doesn't make sense to me.
 
Fury was literally a stone's throw away from the Fury X; it wouldn't surprise me at all if the story is the same with Vega.

On the shop shelf perhaps.
 
Isn't it unnecessary to use a frame limiter with FreeSync? I mean, framerates won't go over the maximum refresh rate anyway, unless you want a lower framerate for some reason.

I can't explain it exactly, but I see tearing at 75 fps in all the games I tested. Capping it at 74 fps does the trick, though I'm not sure of the explanation behind it.
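My guess (not confirmed by anyone in the thread): at exactly the panel's maximum refresh the framerate sits at the top edge of the FreeSync range, so capping a frame or two below keeps variable refresh engaged and the tearing goes away. A minimal sleep-based limiter sketch in plain Python, using 74 fps on a 75 Hz screen as the example values:

```python
import time

def frame_limited_loop(cap_fps=74, run_for_s=2.0):
    """Minimal frame limiter: after each frame, sleep out the rest of the
    frame budget so output never exceeds cap_fps. Keeping the cap just
    under the panel's max refresh (74 on a 75 Hz screen here) keeps the
    framerate inside the variable-refresh window."""
    budget = 1.0 / cap_fps
    frames = 0
    start = time.monotonic()
    deadline = start + budget
    while time.monotonic() - start < run_for_s:
        # ... render one frame here (stubbed out in this sketch) ...
        frames += 1
        spare = deadline - time.monotonic()
        if spare > 0:
            time.sleep(spare)   # CPU/GPU idle here, unlike with Enhanced Sync
        deadline += budget
    elapsed = time.monotonic() - start
    print(f"{frames} frames in {elapsed:.2f} s = {frames / elapsed:.1f} fps")

frame_limited_loop()
```

The idle time in that sleep is also why a frame cap (or Chill) actually lowers power draw, while Enhanced Sync does not.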
 