Thursday, August 3rd 2017

AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

TweakTown has put forth an article claiming to have received information from industry insiders regarding the upcoming Vega 56's performance. Remember that Vega 56 is the slightly cut-down version of the flagship Vega 64, packing 56 next-generation compute units (NGCUs) instead of Vega 64's, well, 64. This means that while the Vega 64 has the full complement of 4,096 Stream processors, 256 TMUs, 64 ROPs, and a 2048-bit wide 8 GB HBM2 memory pool offering 484 GB/s of bandwidth, Vega 56 makes do with 3,584 Stream processors, 224 TMUs, 64 ROPs, the same 8 GB of HBM2 memory, and slightly lower memory bandwidth at 410 GB/s.

The Vega 56 has been announced to retail for about $399, or $499 with one of AMD's new (famous or infamous, depending on your mileage) Radeon Packs. The RX Vega 56 card was benchmarked on a system configured with an Intel Core i7-7700K @ 4.2 GHz, 16 GB of DDR4-3000 RAM, and Windows 10, at a resolution of 2560 x 1440.
The results in a number of popular games were as follows:

Battlefield 1 (Ultra settings): 95.4 FPS (GTX 1070: 72.2 FPS; 32% in favor of Vega 56)
Civilization 6 (Ultra settings, 4x MSAA): 85.1 FPS (GTX 1070: 72.2 FPS; 18% in favor of Vega 56)
DOOM (Ultra settings, 8x TSAA): 101.2 FPS (GTX 1070: 84.6 FPS; 20% in favor of Vega 56)
Call of Duty: Infinite Warfare (High preset): 99.9 FPS (GTX 1070: 92.1 FPS; 8% in favor of Vega 56)
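For readers who want to sanity-check the quoted deltas, the percentages are simply the FPS ratios. A quick sketch, using only the figures from the list above:

```python
# Sanity-check the leaked figures: Vega 56's percentage advantage
# over the GTX 1070 is (vega_fps / gtx_fps - 1) * 100.
results = {
    "Battlefield 1":          (95.4, 72.2),
    "Civilization 6":         (85.1, 72.2),
    "DOOM":                   (101.2, 84.6),
    "CoD: Infinite Warfare":  (99.9, 92.1),
}

for game, (vega_fps, gtx_fps) in results.items():
    delta = (vega_fps / gtx_fps - 1) * 100
    print(f"{game}: +{delta:.1f}% in favor of Vega 56")
```

Rounded to whole percentages, these work out to roughly 32%, 18%, 20%, and 8% respectively.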

If these numbers hold true, NVIDIA's GTX 1070, whose average pricing stands at around $460, will have a much reduced value proposition compared to the RX Vega 56. The AMD contender (which did arrive a year after NVIDIA's Pascal-based cards) delivers around 20% better performance (at least in the admittedly sparse games line-up) while costing around 13% less in greenbacks. Coupled with the lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, this could be a killer deal. However, I'd recommend you wait for independent, confirmed benchmarks and reviews in controlled environments. I daresay you won't need to look much further than your favorite tech site for those, when the time comes. Source: TweakTown
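The value-proposition claim above can also be checked with simple arithmetic. This sketch assumes the $399 MSRP for Vega 56 and the ~$460 average street price for the GTX 1070 quoted in the article, and averages the four leaked results:

```python
# Rough value comparison built from the leaked averages.
# Assumed prices: Vega 56 at its $399 MSRP, GTX 1070 at ~$460 street.
vega_price, gtx_price = 399.0, 460.0
avg_perf_ratio = (95.4 / 72.2 + 85.1 / 72.2 + 101.2 / 84.6 + 99.9 / 92.1) / 4

price_discount = (1 - vega_price / gtx_price) * 100
perf_per_dollar = avg_perf_ratio * gtx_price / vega_price

print(f"Average performance advantage: +{(avg_perf_ratio - 1) * 100:.1f}%")
print(f"Price discount vs. GTX 1070:   {price_discount:.1f}%")
print(f"Relative perf-per-dollar:      {perf_per_dollar:.2f}x")
```

That works out to roughly a 20% performance advantage at a 13% lower price, or close to 1.4x the performance per dollar, if the leak is accurate.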

169 Comments on AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

#151
renz496
"Captain_Tom said:
FP16 (It will be a 26+ TFLOP card in many upcoming games), HBC, and asynchronous compute. There are more, but those are the big ones.


Although you are very "vocal" in these forums, so I already know you are aware of them, and I can't wait to see what your fanboy response will be.
But it's not going to double your frame rate, and the entire scene won't be computed in FP16, because certain effects specifically need FP32 precision; only some effects can use it. That 26 TFLOPS figure is massive, but the use cases are limited.
Posted on Reply
#152
Assimilator
"Th3pwn3r said:
By far the worst post in this thread. If you don't realize that AMD has gone against the grain then you need to read more.
What does that even mean?
Posted on Reply
#153
renz496
"Captain_Tom said:
Volta is professional-only at the moment, and I don't expect that to change till MAYBE the end of 2018. In fact I am pretty sure Nvidia confirmed the next series is another Maxwell... cough... Pascal refresh (but likely with more GDDR5X/6). It will be stronger, but lack all of these features.

The only exception I can think of is possibly a cut-down Volta sold as a Titan card for $1500 - $2000.
Another case of people underestimating what NVIDIA can do. It seems you were hoping NVIDIA would not integrate these more advanced features into their GPUs so AMD could get on top. Then let me tell you this:
1) NVIDIA launched the 1080 Ti in March and dropped the 1080's price by $100 even without a competing AMD product in the segment.
2) AMD supplied pro drivers with Vega FE as an added feature that NVIDIA's Titan did not have; about a month later, NVIDIA responded by releasing pro drivers for the Titan.

So you still think NVIDIA will let AMD easily one-up them? Consumer Volta by the end of 2018? That's wishful thinking, right?
Posted on Reply
#154
HisDivineOrder
I hope it's true. That would be nice for AMD, which is nice for competition.

That being said, I have to think that anyone who wanted that level of performance for that price bought in already. If they waited this long, wouldn't you wait a few months to see what nVidia does next? I mean, I know you can always argue "Wait a while, get more for less" but in this case... AMD is so late with their product that they've already crossed into nVidia's next generation and they're only barely keeping up with what came out way back when.

If AMD had leapfrogged nVidia, it'd be a different story. Mostly matching nVidia at similar pricing isn't really going to cut it imo. Perhaps AMD believes that tying Ryzen success to Vega will boost Vega, too? I just don't know. Are there really that many AMD fans out there that'll ditch their 1070's to get similar performance for similar pricing?
Posted on Reply
#155
laszlo
Wow! So much BS in every Vega thread/news comments... green/red war nonstop... Fudzpowerup?
Posted on Reply
#156
Manu_PT
"Captain_Tom said:
No surprise here.

$400, and it slightly loses to the 1080 while having FAR better long-term technology. This will sell well, and Vega64 should be at least 15% stronger than this!
"future proof" on a GPU? Cool story bro.
Posted on Reply
#157
Vya Domus
"Frick said:
@Vya Domus @oxidized I want to hear your arguments on whether HPC has a use in gaming. No videos.
Every new feature AMD does either doesn't really exist , or it sucks , even before you can see the damn things in action. If it's AMD it's all fake news , everyone knows that right ?
Posted on Reply
#158
vega22
"Frick said:
@Vya Domus @oxidized I want to hear your arguments on whether HPC has a use in gaming. No videos.



Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.
It will for sure come down to how deep they're invested already. But I have seen plenty who have already bailed, cashing out to those late to the game: some from panic at the exchanges, and others preempting the jump in difficulty.

Burst is probably the wrong word, but I think the tide has changed and the swell now recedes.
Posted on Reply
#159
bug
"Fluffmeister said:
I'm just impressed to hear Maxwell is worthless beyond gaming and mining (lol).
I can confirm. Just the other day I told Maxwell I had laundry to do. It was totally useless.
Posted on Reply
#160
oxidized
"Frick said:
@Vya Domus @oxidized I want to hear your arguments on whether HPC has a use in gaming. No videos.



Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.
HBM on Fury was supposed to be a new era, sure... a new era...

I can only imagine what Vega will be like with its HBM2 and HBC...

Anyway, I'll just wait; I have no idea whether Vega will be good or bad yet. I only have a bad feeling: the TDPs are way too high, and the fact that they reached NVIDIA's performance a year later makes me think I'm right that AMD probably put a huge amount of money into Ryzen and left RTG with basically nothing. So I'd say they're actually doing even too well, in my opinion; Polaris was good enough for a company in not that great a situation at the moment. On only one thing I'm pretty sure: HBM2 will barely affect performance in most uses of these cards, especially videogames. Fury set an example.
Posted on Reply
#161
jabbadap
"Frick said:
@Vya Domus @oxidized I want to hear your arguments on whether HPC has a use in gaming. No videos.

Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.
Well, I think you mean HBCC, not high-performance computing aka HPC. HBCC has its use if a game demands more VRAM than your card has. When AMD demoed it, they were using Vega with "2 GB" of available VRAM and running with HBCC off and on.
(Demo video: https://www.youtube.com/watch?v=zAIM-EGYa14)
"Frick said:
@Vya Domus
Bursting or simply upgrading? Aye it will burst as mining gets harder, but not completely for a while.
And then there will be the next coin to mine, and all the crap begins from the start again.
Posted on Reply
#162
Frick
Fishfaced Nincompoop
"Vya Domus said:

And then there will be the next coin to mine, and all the crap begins from the start again.
There were years between the Bitcoin craze and this latest craze though, so we're likely to have some quiet time. I hope. :oops:
Posted on Reply
#163
xenocide
"Manu_PT said:
"future proof" on a GPU? Cool story bro.
The more I read that post of his the more hilariously stupid it sounds.
Posted on Reply
#164
Frick
Fishfaced Nincompoop
"xenocide said:
The more I read that post of his the more hilariously stupid it sounds.
Depends on how you look at it I guess, and whether you consider turning down settings to be "proper" gaming. The 7970 lasted for a really long while.
Posted on Reply
#165
xenocide
"Frick said:
Depends on how you look at it I guess, and whether you consider turning down settings to be "proper" gaming. The 7970 lasted for a really long while.
While that's true, to imply--or outright say--that AMD GPUs have some magical ability to be future proof is kind of absurd. GCN was an anomaly that benefited from advancements in APIs such as DirectX 12 and Vulkan/OGL more than its competitors. A lot of the cards that came out around that time are still viable with some adjustments, depending on the games you play. No way in hell are you maxing out BF1 with something like a 7870 or 670, but you can easily drop some settings and get it to work well enough. Yea, in modern games the 7xxx series may have "aged" better than the 6xx series, but those cards are what, 5 years old now? None of them run modern games well. It's just that while they both run them okayish, one does so slightly more okayish.
Posted on Reply
#166
Vya Domus
"xenocide said:
While that's true, to imply--or outright say--that AMD GPUs have some magical ability to be future proof is kind of absurd. GCN was an anomaly that benefited from advancements in APIs such as DirectX 12 and Vulkan/OGL more than its competitors. A lot of the cards that came out around that time are still viable with some adjustments, depending on the games you play. No way in hell are you maxing out BF1 with something like a 7870 or 670, but you can easily drop some settings and get it to work well enough. Yea, in modern games the 7xxx series may have "aged" better than the 6xx series, but those cards are what, 5 years old now? None of them run modern games well. It's just that while they both run them okayish, one does so slightly more okayish.
I think you have the wrong idea of what "future proofing" is. Future proofing is not the ability to run games maxed out for ages; that is obviously impossible. No, what it means is being able to play games without any major handicap while still being supported by new software and not getting left out. And in this regard GCN is in fact more future proof, even though you aren't maxing out games anymore. Not everyone is dropping $300+ every 2 years, so slightly more okayish is actually preferable for something that's 4 years old.
Posted on Reply
#167
Liviu Cojocaru
5 days left, and hopefully all of these nonsense disputes will go away :)
Posted on Reply
#168
Th3pwn3r
"Liviu Cojocaru said:
5 days left, and hopefully all of these nonsense disputes will go away :)
It'll just create new ones, and more idiotic posts from biased people.
Posted on Reply
#169
bug
"Liviu Cojocaru said:
5 days left and hopefully all of this nonsense disputes will go away :)
Only if somehow Vega beats expectations. Otherwise we'll be back into "futureproofing" and "drivers need to mature" discussion territory.
Posted on Reply