Wednesday, April 26th 2017

AMD Radeon Vega in the League of GTX 1080 Ti and TITAN Xp

In an AMA (ask me anything) session with the Tom's Hardware community, AMD desktop processor marketing exec Don Woligrosky answered a variety of AMD Ryzen platform-related questions. He did not shy away from making a key comment about the company's upcoming high-end graphics card, Radeon Vega, either. "Vega performance compared to the Geforce GTX 1080 Ti and the Titan Xp looks really nice," Woligrosky stated. This implies that Radeon Vega is in the same performance league as NVIDIA's two top consumer graphics SKUs, the $650 GeForce GTX 1080 Ti and the $1,200 TITAN Xp.

It is conceivable that AMD's desktop processor marketing execs have access to privileged information from other product divisions, and if the claim holds, it makes NVIDIA's recent memory speed bump for the GTX 1080 a failed gambit. NVIDIA similarly bumped the memory speed of the GTX 1060 6 GB to make it more competitive against the Radeon RX 580. Woligrosky also commented on a more plausible topic: the royalty-free AMD FreeSync becoming the dominant adaptive-sync technology, far outselling NVIDIA G-SYNC.
Source: Tom's Hardware

196 Comments on AMD Radeon Vega in the League of GTX 1080 Ti and TITAN Xp

#151
oxidized
etayorius: I definitely think the 480 is a better buy. Both cards are at around the same level of performance, but the 480 has a DX12 advantage (which is the future) and two extra GB of RAM that will be useful down the road. If someone plays mostly DX11 and older games, just grab the 1060, as it is way superior in that regard. But if someone buys with DX12 in mind, the RX 480 is the clear winner, since it has proven to be faster in DX12 titles. Even after the NVIDIA DX12 driver, the 480 is still ahead. It is very simple: DX11 goes to NVIDIA hands down, while in DX12/Vulkan, with current titles, AMD is the clear leader. Maybe NVIDIA finds a way to perform faster with a comparable GPU in the future, who knows.
You left out the overclocking part, where the 1060 can reach the 480's DX12 levels and widen the gap that already exists in DX11.
Posted on Reply
#152
etayorius
oxidized: You left out the overclocking part, where the 1060 can reach the 480's DX12 levels and widen the gap that already exists in DX11.
I never denied it; in fact, I was the first to bring it up. Not long ago I saw a reference RX 480 8 GB on sale on VisionTek's site for $170, and I even saw an article today saying some RX 480 8 GB versions are selling for $180 after rebate. At that price point it's not even a contest. The thing is, the RX 480 is cheaper most of the time and seems more future-proof than the 1060, which is why I would recommend the 480 over the 1060 depending on what type of games you play. But I never said the 1060 was a bad card; in fact, I was an inch away from getting a GTX 1060 6 GB, but I decided to wait for Volta or Vega to replace my GTX 780 Ti. I like both brands and I've been on NVIDIA 95% of the time, but I like being fair to AMD, and I see a lot of disinfo and baseless hate toward their products, and even some biased benchmarks from the tech press. To be honest, AMD does not deserve all the sh*t they've been through. Being fair to both companies, it's not hard to understand that one of them is better at DX11 while the other is currently better at DX12, with the latter also having an extra 2 GB. But yes, you can just grab a 1060 and overclock it past the 480's DX12 performance; that's a very valid point.
Posted on Reply
#153
Prima.Vera
evernessince: Unfortunately, even if AMD did crush NVIDIA in performance, they would not gain much market share. Most people will always buy NVIDIA simply because that's what they've always bought. Heck, I still see people spreading the falsehood that AMD drivers are junk, even though they have gotten vastly better over the last three years and have had far fewer issues as of late.
Not entirely true. It's not a fanboy thing. I was on ATI/AMD until the 4870X2 and CrossFire HD 5870s. And yes, the AMD drivers were utter crap, junk, especially for CrossFire. In the rare situation where CF was working, the game was lagging and stuttering like hell.
My next card was a 780 Ti, then a 1080. No more CF/SLI for me, ever.
Posted on Reply
#154
ShurikN
While I stand by my claim that the 480 is a better buy, the 1060 outsells the RX 480 4.5:1, even though it came out two months later and the two perform at the same level. The mindset people have can never be changed that easily. AMD will always fight an uphill battle. Even when AMD/ATI had a vastly superior product (in every tier), they could only get 50% of the market.
Posted on Reply
#156
Captain_Tom
efikkan: It's just a matter of weeks now before AMD fans retreat to their last stand, claiming Vega is still better due to "newer" memory technology, better FP16 performance (which no games will use anytime soon), etc. We all remember such arguments from the past:
R9 390X vs. GTX 970: R9 390X is more "future proof" because it has 8 GB memory.
Fury X vs. GTX 980 Ti: Memory size suddenly doesn't matter, Fury X still more "future proof" with faster and "newer" memory.
RX 480 vs. GTX 1060: Memory size suddenly makes RX 480 more future proof again.
It all depends on the wind direction…
omg buddy you are just so delusional.

390x VS 970 ?!?!?!

The 390X beat the 980, and in fact it nearly matches the 980 Ti in a lot of the newest games. The Fury X is generally in between the 1070 and 1080 depending on what games you play (worst case, it trades blows with the 1070).


But who are these Vega AMD fanboys? Most people seem to be pretty conservative with their expectations...
Posted on Reply
#157
ratirt
ShurikN: While I stand by my claim that the 480 is a better buy, the 1060 outsells the RX 480 4.5:1, even though it came out two months later and the two perform at the same level. The mindset people have can never be changed that easily. AMD will always fight an uphill battle. Even when AMD/ATI had a vastly superior product (in every tier), they could only get 50% of the market.
Well, it's not our problem if people don't do research on a product before buying it. Maybe at some point they will pay more attention to what's going on. If somebody is a slacker, they will just take the easy route and buy the most heavily marketed product, even if it's not worth buying or they could get something better for the same price or less.
Posted on Reply
#158
Captain_Tom
oxidized: Alright, if you really did that, I also suggest you go check your eyes :)

www.guru3d.com/articles-pages/msi-geforce-gtx-1060-gaming-x-plus-review,1.html
:)

www.techpowerup.com/reviews/Sapphire/RX_570_Pulse/
:)

www.anandtech.com/show/11278/amd-radeon-rx-580-rx-570-review
:)

www.gamersnexus.net/hwreviews/2882-msi-rx-580-gaming-x-review-vs-gtx-1060/page-5
:)
Are you even reading the links you just posted?

I am not surprised you linked Anandshill and gamersnexus, as they consistently show laughably different results from the rest of the tech sites. But let's throw one of your links in here:

www.guru3d.com/articles_pages/msi_geforce_gtx_1060_gaming_x_plus_review,41.html

^Well, there you go: the top 580 beats the top 1060 in the "Shoot Out." Meanwhile, the 580s, 480s, 470s, and 1060 cards are all jumbled up in the list.


The techpowerup review also adds to what I am saying:

1) If you remove the BS games people play FAR less than big games like BF1, Dishonored 2, and Tomb Raider (among many others), the 480 is generally above the 1060 (sometimes by 10%+).

2) But let's humor you and include laughable things like a failed Assassin's Creed and Anno; even then the 1060 wins by like 5%. But again, I think we should be able to agree that more people care about BF1 performance than Styx, LMAO.

3) So even if we accept your BS "1060 is 5% stronger than 480" premise, no, I do not believe it makes sense for anyone to take 5% more performance in exchange for 25-33% less VRAM and NVIDIA's horrific long-term performance losses.


Check your eyes; Project CARS' rigged benchmarks are probably in the way...
Posted on Reply
#160
Captain_Tom
ShurikNHow?
www.techpowerup.com/reviews/MSI/GTX_1080_Gaming_X_Plus_11_Gbps/30.html
I just saw you post a bench showing exactly what I said. So I guess... that's how? Read your own link?

^There are several incredibly popular games with the Fury X in between the 1070 and 1080 (with decent wins over the 1070). And worst case, it trades blows with the 1070.

Pay attention to the performance summaries over time, because they keep changing, lol. When the 1070 first came out it crushed the Fury X in TPU's summary, but then a few months ago the Fury X won on average. Now they are pretty much tied, but it will change again...
Posted on Reply
#161
notb
ratirt: Well, it's not our problem if people don't do research on a product before buying it. Maybe at some point they will pay more attention to what's going on. If somebody is a slacker, they will just take the easy route and buy the most heavily marketed product, even if it's not worth buying or they could get something better for the same price or less.
I would love to see how you do research on everything you buy: food, clothes, car windscreen wipers, pencils, plastic food containers... ;)

Just accept the fact that people want to just buy a graphics card, plug it into the PC (get it plugged in...) and play a game.
And they buy NVIDIA (and Intel, Logitech, HP etc), because they know these brands and they expect them to just work.
But the other side of this situation is that companies like NVIDIA and Intel... well... listen to their customers. They usually aim to make stuff that just works.
So sure, AMD's products are often more cutting-edge, but very likely also a bit less friendly to buy and own. People on this forum call this "exciting" but for most of the population it is just abstruse and tiresome.
Posted on Reply
#162
ratirt
Captain_Tom: I just saw you post a bench showing exactly what I said. So I guess... that's how? Read your own link?

^There are several incredibly popular games with the Fury X in between the 1070 and 1080 (with decent wins over the 1070). And worst case, it trades blows with the 1070.

Pay attention to the performance summaries over time, because they keep changing, lol. When the 1070 first came out it crushed the Fury X in TPU's summary, but then a few months ago the Fury X won on average. Now they are pretty much tied, but it will change again...
This mostly depends on how you look at it. Modern games sometimes perform better on the Fury than on the 1070, and I assume this will continue in the future. For me, AMD is the best buy, and honestly, I've been thinking about buying a Fury until Vega shows up.
Posted on Reply
#163
ratirt
notb: I would love to see how you do research on everything you buy: food, clothes, car windscreen wipers, pencils, plastic food containers... ;)

Just accept the fact that people want to just buy a graphics card, plug it into the PC (get it plugged in...) and play a game.
And they buy NVIDIA (and Intel, Logitech, HP etc), because they know these brands and they expect them to just work.
But the other side of this situation is that companies like NVIDIA and Intel... well... listen to their customers. They usually aim to make stuff that just works.
So sure, AMD's products are often more cutting-edge, but very likely also a bit less friendly to buy and own. People on this forum call this "exciting" but for most of the population it is just abstruse and tiresome.
Dude, read my post. Are we talking about food or clothes? I think not. Besides, don't judge a book by its cover; clothes mean nothing to me, and I make my own food. Electronics are a different thing.
By your logic, when you buy a car, do you just buy whatever you can drive, or do you look for something that suits your needs? Or do you just get anything that runs and call it good? And please don't tell me that cars are expensive and that's why you're more careful with that purchase. Considering the price, the technology, and what they do, cars are relatively cheap for their magnitude compared to video cards, whose prices can be astronomical for their purpose. That's my opinion.
Posted on Reply
#164
oxidized
etayorius: I never denied it; in fact, I was the first to bring it up. Not long ago I saw a reference RX 480 8 GB on sale on VisionTek's site for $170, and I even saw an article today saying some RX 480 8 GB versions are selling for $180 after rebate. At that price point it's not even a contest. The thing is, the RX 480 is cheaper most of the time and seems more future-proof than the 1060, which is why I would recommend the 480 over the 1060 depending on what type of games you play. But I never said the 1060 was a bad card; in fact, I was an inch away from getting a GTX 1060 6 GB, but I decided to wait for Volta or Vega to replace my GTX 780 Ti. I like both brands and I've been on NVIDIA 95% of the time, but I like being fair to AMD, and I see a lot of disinfo and baseless hate toward their products, and even some biased benchmarks from the tech press. To be honest, AMD does not deserve all the sh*t they've been through. Being fair to both companies, it's not hard to understand that one of them is better at DX11 while the other is currently better at DX12, with the latter also having an extra 2 GB. But yes, you can just grab a 1060 and overclock it past the 480's DX12 performance; that's a very valid point.
Alright, whatever. We can keep ignoring all the facts, no problem.
Captain_Tom: Are you even reading the links you just posted?

I am not surprised you linked Anandshill and gamersnexus, as they consistently show laughably different results from the rest of the tech sites. But let's throw one of your links in here:
www.guru3d.com/articles_pages/msi_geforce_gtx_1060_gaming_x_plus_review,41.html
^Well, there you go: the top 580 beats the top 1060 in the "Shoot Out." Meanwhile, the 580s, 480s, 470s, and 1060 cards are all jumbled up in the list.
The techpowerup review also adds to what I am saying:
1) If you remove the BS games people play FAR less than big games like BF1, Dishonored 2, and Tomb Raider (among many others), the 480 is generally above the 1060 (sometimes by 10%+).
2) But let's humor you and include laughable things like a failed Assassin's Creed and Anno; even then the 1060 wins by like 5%. But again, I think we should be able to agree that more people care about BF1 performance than Styx, LMAO.
3) So even if we accept your BS "1060 is 5% stronger than 480" premise, no, I do not believe it makes sense for anyone to take 5% more performance in exchange for 25-33% less VRAM and NVIDIA's horrific long-term performance losses.
Check your eyes; Project CARS' rigged benchmarks are probably in the way...
Hey, I linked four sites I know over others, and still you keep cherry-picking and saying "if we remove the BS games like..." Can you even hear yourself? There were many other games tested. I could say the same about Doom and Hitman, which fall clearly on AMD's side, much more clearly than any of those you mentioned, but that's not something we should do. Those are the games, those are the tests. Face the truth: the 1060 was and still is a bit better over a list of 20+ games, and it is the overall better product. Now, if you completely don't care about power consumption (costs), temperatures, and overclockability, fine, get whatever you want, but those are the real facts.
Posted on Reply
#165
ShurikN
Captain_Tom: I just saw you post a bench showing exactly what I said. So I guess... that's how? Read your own link?

^There are several incredibly popular games with the Fury X in between the 1070 and 1080 (with decent wins over the 1070). And worst case, it trades blows with the 1070.

Pay attention to the performance summaries over time, because they keep changing, lol. When the 1070 first came out it crushed the Fury X in TPU's summary, but then a few months ago the Fury X won on average. Now they are pretty much tied, but it will change again...
You are cherry-picking the results. GTA V, FO4, and the newest Ghost Recon are more popular than COD and DOOM, and the Fury loses in those, or gets destroyed in some cases. I gave you the average across all 20-something games from a review from five days ago. You replied with three games across two resolutions.
And all of that was a response to your "worst case is it trades blows with the 1070."
If that were the worst case, the Fury X would be ahead on average.
Posted on Reply
#166
ratirt
ShurikN: You are cherry-picking the results. GTA V, FO4, and the newest Ghost Recon are more popular than COD and DOOM, and the Fury loses in those, or gets destroyed in some cases. I gave you the average across all 20-something games from a review from five days ago. You replied with three games across two resolutions.
And all of that was a response to your "worst case is it trades blows with the 1070."
If that were the worst case, the Fury X would be ahead on average.
Modern games are not cherry-picking, I think. They are an indication of how the cards will perform in the future. If you go with older games that give your card better results, that would be cherry-picking.

BTW, it's time to say that 1080p is not a good indicator of video card performance, even though a huge number of users still play at that resolution. 2K will be the lowest within a year. I myself plan to buy a new 2K-capable monitor; I'm just waiting for Vega. I need to know the benefits, and FreeSync is way cheaper than G-Sync. Worth the wait.
Posted on Reply
#167
EarthDog
Polglass: At least AMD gives a full breakdown of the technology and methodology they have used. NVIDIA gives you nothing.
what????
Posted on Reply
#168
msroadkill612
I am not saying anything new, but the understressed big picture, IMO, is the potential magic of a Vega + Zen combo PC.

GPU and CPU are two parts of a whole, a team, but only AMD makes both.

At the moment they have a very good launch of a new-generation CPU and an almost coinciding introduction of a new-generation GPU.

I.e., the two projects have run roughly in parallel. One campaign, two fronts.

So they have had the unique opportunity to really address the whole problem.

It beggars belief that they can't come up with some important synergies.

Especially since the next campaign is combining the two in a Raven Ridge APU. Now that's an integrated, powerful, SoC-like PC.
Posted on Reply
#169
medi01
FordGT90Concept: VESA embedded DisplayPort Adaptive-Sync. It's an open standard Intel and NVIDIA have access to. I wouldn't be surprised if NVIDIA already has a working implementation of it ready to go, but corporate needs to decide if/when to discontinue the G-SYNC program. I doubt they will until monitor manufacturers stop cooperating.
They are already using it inside "G-Sync" notebooks.
I recall "it's no big deal, there is a standard that allows that in notebooks" several weeks before AMD announced FreeSync. It can't be that NV didn't know about it; it's just that they couldn't vendor-lock with it.
oxidized: ..., so they together decided that it was the moment to stop giving money...
Oh, please. There is a thing called confirmation bias: bigger market share => more biased trash talk => more people falling victim to it.
And there was at least one clearly paid shill, chizow, who was always there to post shit about... oh, you know, one company. Who knows how many subtle ones are out there.
Posted on Reply
#170
burebista
btarunr: "Vega performance compared to the Geforce GTX 1080 Ti and the Titan Xp looks really nice,"

Sorry, but I can't resist. :D
Posted on Reply
#171
bug
medi01: They are already using it inside "G-Sync" notebooks. I recall "it's no big deal, there is a standard that allows that in notebooks" several weeks before AMD announced FreeSync. It can't be that NV didn't know about it; it's just that they couldn't vendor-lock with it.
G-Sync is more capable (e.g. it does not restrict refresh rates to keep overdrive working); that's why NVIDIA won't drop it. Though being able to milk it must also be part of the reason.
Posted on Reply
#172
medi01
bug: G-Sync is more capable (e.g. it does not restrict refresh rates to keep overdrive working)
I don't see inherent FreeSync problems with motion-blur reduction. The G-Sync chip has it built in, while with FreeSync vendors need to roll their own solutions; that's the only difference, from what I see.
Posted on Reply
#173
bug
medi01: I don't see inherent FreeSync problems with motion-blur reduction. The G-Sync chip has it built in, while with FreeSync vendors need to roll their own solutions; that's the only difference, from what I see.
Yes. And then you have to check the quality of the implementation before buying, because some vendors go overboard with overdrive.

To be clear, the technologies are similar for all intents and purposes. But since there are some differences, by giving up on G-Sync, NVIDIA would actually be dropping features. And that leads to user complaints and even lawsuits.
Posted on Reply
#174
Slizzo
TheGuruStud: Apple has a stake in it. That's all you need to know. :laugh:
OT, but Beats was popular way before Apple bought it out. And yes, Apple owns them outright now. That's how Jay-Z got to be the richest rapper in the world (i.e., a billionaire).
Posted on Reply
#175
efikkan
Captain_Tom: omg buddy you are just so delusional.

390x VS 970 ?!?!?!

The 390X beat the 980, and in fact it nearly matches the 980 Ti in a lot of the newest games.
Grow up.
You are not even close to correct. The stock R9 390X was slightly ahead of the stock GTX 970, but considering that the majority of GTX 970s available were clocked much higher, nearly all GTX 970s sold would beat even custom versions of the R9 390X. Also take into consideration that the R9 390X was more expensive; the GTX 970 was the superior choice in every way. That's not my opinion, but a fact.
ShurikN: While I stand by my claim that the 480 is a better buy, the 1060 outsells the RX 480 4.5:1, even though it came out two months later and the two perform at the same level. The mindset people have can never be changed that easily. AMD will always fight an uphill battle. Even when AMD/ATI had a vastly superior product (in every tier), they could only get 50% of the market.
The GTX 1060 (stock) performs better unless you cherry-pick games, and it is better in all aspects: performance, performance/price, efficiency, energy consumption, OC headroom, etc. If you take into account typical custom versions of the GTX 1060 vs. custom RX 480s, the GTX 1060 looks even better.

There is nothing in GCN that makes it inherently better at Direct3D 12, as many claim; it is simply the fact that many of the early Direct3D 12 games have been AMD-sponsored and/or console ports.

It's sad to see AMD focusing so much of their resources on getting developers to optimize for their hardware, instead of spending those resources actually making better hardware.
Posted on Reply