Monday, February 18th 2019

AMD Radeon VII Retested With Latest Drivers

Just two weeks ago, AMD released their Radeon VII flagship graphics card. It is based on the new Vega 20 GPU, which is the world's first graphics processor built using a 7 nanometer production process. Priced at $699, the new card offers performance levels 20% higher than Radeon RX Vega 64, which should bring it much closer to NVIDIA's GeForce RTX 2080. In our testing we still saw a 14% performance deficit compared to RTX 2080. For the launch-day reviews AMD provided media outlets with a press driver dated January 22, 2019, which we used for our review.
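As an aside on the arithmetic: a 14% deficit means Radeon VII delivers 86% of RTX 2080's performance, which, flipped around, makes the 2080 roughly 16% faster than the VII. A quick sketch using the figures quoted above:

```python
# Converting a performance deficit into the competitor's lead
# (the 14% figure comes from the review text above).
deficit = 0.14                    # Radeon VII trails RTX 2080 by 14%
vii_relative = 1 - deficit        # VII delivers 86% of the 2080's performance
rtx_lead = 1 / vii_relative - 1   # so the 2080 is about 16.3% faster
```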

Since the first reviews went up, people in online communities have speculated that these were early drivers and that newer drivers will significantly boost the performance of Radeon VII, making up lost ground against RTX 2080. There's also the mythical "fine wine" phenomenon, where the performance of Radeon GPUs supposedly improves significantly over time. We've put these theories to the test by retesting Radeon VII with AMD's latest Adrenalin 2019 19.2.2 drivers, using our full suite of graphics card benchmarks.
In the chart below, we show the performance deltas compared to our original review. For each title, three resolutions are tested: 1920x1080, 2560x1440, and 3840x2160 (in that order).
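The deltas in the chart are simple ratios of new to old frame rates. A minimal sketch of the calculation (the FPS numbers below are made up for illustration, not our actual measurements):

```python
# Per-title performance delta: new_fps / old_fps - 1, evaluated at each
# of the three tested resolutions. Numbers here are illustrative only.
old_fps = {"Battlefield V": {"1920x1080": 110.0, "2560x1440": 88.0, "3840x2160": 52.0}}
new_fps = {"Battlefield V": {"1920x1080": 115.5, "2560x1440": 92.4, "3840x2160": 54.6}}

def perf_deltas(old, new):
    """Return the relative change per title and resolution."""
    return {
        title: {res: new[title][res] / fps - 1.0 for res, fps in by_res.items()}
        for title, by_res in old.items()
    }

deltas = perf_deltas(old_fps, new_fps)
# e.g. deltas["Battlefield V"]["1920x1080"] is approximately 0.05 (a +5% gain)
```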



Please do note that these results include performance gained by the washer mod and thermal paste change that we had to do when reassembling the card. These changes reduced hotspot temperatures by around 10°C, allowing the card to boost a little higher. To determine which performance improvements came from the new driver and which from the thermal changes, we first retested the card using the original press driver (with washer mod and new TIM). The result was a +0.2% performance improvement.

Using the latest 19.2.2 drivers added +0.45% on top of that, for a total improvement of +0.653%. Taking a closer look at the results, we can see that two titles have seen significant gains from the new driver version: Assassin's Creed Odyssey and Battlefield V both achieve several-percent improvements, so it looks like AMD has worked some magic in those games to unlock extra performance. The remaining titles see small but statistically significant gains, suggesting that there are some "global" tweaks AMD can implement to improve performance across the board; unsurprisingly, these gains are smaller than title-specific optimizations.
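As a sanity check on those totals: successive gains compound multiplicatively rather than add, which is why +0.2% and +0.45% land at roughly +0.65% overall (the quoted +0.653% presumably comes from the unrounded per-game averages). A quick sketch:

```python
# Combining two successive relative gains (figures from the text above).
thermal_gain = 0.002    # +0.2% from the washer mod and fresh thermal paste
driver_gain = 0.0045    # +0.45% from the 19.2.2 driver, on top of that
total_gain = (1 + thermal_gain) * (1 + driver_gain) - 1
# total_gain is approximately 0.0065, i.e. about +0.65% overall
```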

Looking further ahead, it seems plausible that AMD can increase the performance of Radeon VII down the road, though we doubt enough optimizations can be discovered to match RTX 2080, unless a lot of developers suddenly jump on the DirectX 12 bandwagon (which seems unlikely). It's also a question of resources: AMD can't spend time and money micro-optimizing every single title out there. Rather, the company seems to be doing the right thing: investing in optimizations for big, popular titles like Battlefield V and Assassin's Creed. Given how many new titles are coming out on Unreal Engine 4, and how much AMD is lagging behind in those titles, I'd focus on UE4 optimizations next.

182 Comments on AMD Radeon VII Retested With Latest Drivers

#126
Frutika007
medi01: Remind me how 780Ti fared against 290x.
"Mythical" eh?


Yay, that feeling aspect of AMD products is really something special now, isn't it?

I mean, there should be reasons $139 1050Ti outsells 1.5-2 times faster 570 priced at $99, shouldn't there?
Kepler architecture's failure has nothing to do with AMD's 'FineWine' myth. AMD didn't get better, Kepler got worse. Learn the difference. And the 1050Ti at $139 and the RX 570 at $99?? Are you on drugs or something?? You wanna know the reason?? Here's your reason: the GTX 1050Ti launched in 2016 for $130. The RX 570 launched in 2017 for $170, and that's precisely when the mining craze started and AMD GPUs just disappeared from stock. A one-year gap for $40 more is not an appealing deal, not to mention the lack of availability due to mining.
JB_Gamer: Isn't it the case - the problem for AMD - that ALL games are tested and optimized for NVIDIA GPUs?
No, there are many games that are tested and optimized for AMD.
SIGSEGV: hahaha....


but I hate their (NVIDIA) approach of reducing performance on older GPUs through driver updates.
That's beyond my comprehension.
You are totally wrong. Nvidia doesn't reduce the performance of older GPUs through driver updates. That's just a lie and a misconception spread by AMD cultists.

Wow, already got -1 rating from that ignorant delusional braindead AMD cultist named medi01
rvalencia: From www.techpowerup.com/reviews/AMD/Radeon_VII/

Assassin's Creed Origins (NVIDIA Gameworks, 2017)

Battlefield V RTX (NVIDIA Gameworks, 2018)

Civilization VI (2016)

Darksiders 3 (NVIDIA Gameworks, 2018), old game remaster; where's Titanfall 2?

Deus Ex: Mankind Divided (AMD, 2016)

Divinity Original Sin II (NVIDIA Gameworks, 2017)

Dragon Quest XI (Unreal 4 DX11, large NVIDIA bias, 2018)

F1 2018 (2018). Why? Microsoft's Forza franchise is larger than this Codemasters game.

Far Cry 5 (AMD, 2018)

Ghost Recon Wildlands (NVIDIA Gameworks, 2017), missing Tom Clancy's The Division

Grand Theft Auto V (2013)

Hellblade: Senuas Sacrif (Unreal 4 DX11, NVIDIA Gameworks)

Hitman 2

Monster Hunter World (NVIDIA Gameworks, 2018)

Middle-earth: Shadow of War (NVIDIA Gameworks, 2017)

Prey (DX11, NVIDIA Bias, 2017 )

Rainbow Six: Siege (NVIDIA Gameworks, 2015)

Shadows of Tomb Raider (NVIDIA Gameworks, 2018)

SpellForce 3 (NVIDIA Gameworks, 2017)

Strange Brigade (AMD, 2018),

The Witcher 3 (NVIDIA Gameworks, 2015)

Wolfenstein II (2017, NVIDIA Gameworks), Results different from www.hardwarecanucks.com/forum/hardware-canucks-reviews/78296-nvidia-geforce-rtx-2080-ti-rtx-2080-review-17.html when certain Wolfenstein II map exceeded RTX 2080'
From top to bottom: WRONG, WRONG, WRONG.

AC Origins Gameworks?? It wasn't even NVIDIA-sponsored; nobody sponsored that.
Battlefield 5 Gameworks?? Are you having a giggle, mate?? Do you even know what Gameworks is?? Oh my god, the ignorance and delusion in this comment is blowing my mind. BF5 only has RTX and DLSS. It still uses the Frostbite engine, which vastly favours AMD.
Let me correct you.

"Assassin's Creed Origins (NVIDIA Gameworks, 2017)" - WRONG

"Battlefield V RTX (NVIDIA Gameworks, 2018)" - WRONG

"Darksiders 3 (NVIDIA Gameworks, 2018), old game remaster, where's Titan Fall 2." - WRONG

"Dragon Quest XI (Unreal 4 DX11, large NVIDIA bias, 2018)" - WRONG

"Ghost Recon Wildlands (NVIDIA Gameworks, 2017)" - WRONG

"Hellblade: Senuas Sacrif (Unreal 4 DX11, NVIDIA Gameworks)" - WRONG

Hitman 2 - it's an AMD title I think, not sure

"Monster Hunter World (NVIDIA Gameworks, 2018)" - not sure, but probably WRONG

"Middle-earth: Shadow of War (NVIDIA Gameworks, 2017)" - WRONG

"Prey (DX11, NVIDIA Bias, 2017 )" - WRONG

"Rainbow Six: Siege (NVIDIA Gameworks, 2015)" - WRONG

"Shadows of Tomb Raider (NVIDIA Gameworks, 2018)" - WRONG, it only has RTX, and even that hasn't been patched in yet

"Wolfenstein II (2017, NVIDIA Gameworks)" - WRONG

Most of the games here aren't even sponsored by NVIDIA and don't have any Gameworks in them. The only Gameworks game in this list is The Witcher 3. Several of the games in your NVIDIA list are actually AMD-sponsored titles, like Wolfenstein 2 and Prey.
B-Real: Radeon VII buyers get RE2, Division 2 and DMC5. 1080Ti buyers get ZERO games. That's a $150-180 packet.


As poor as the whole RTX roundup is. :D


False, the 1050Ti doesn't have an AMD counterpart. The GTX 1050 goes against (and trades blows with) the RX 560, the RX 470/570 against the 1060 3GB, and the RX 570/580 against the 1060 6GB. Anyway, it's true that anyone who buys a 1050Ti for the same price as the RX 570 4GB is just literally stupid af.
Who cares? RE2 is already cracked, DMC5 will get cracked, and Division 2 will suck as Division 1 did. Not a single appealing game. I would rather take a price reduction than those games. So ultimately the 1080Ti is a much better choice. Not to mention the 1080Ti came out two years ago for the same $699 price tag. LOL. After two years, with a 7nm process and the same $699 price tag, Radeon 7 still can't beat the 1080Ti. How pathetic! Where's the generational improvement, I wonder? And the RTX 2060 is a great GPU for the price. So the whole RTX roundup isn't poor.
#127
eidairaman1
The Exiled Airman
siluro818: *looks at the 2080Ti in system specs*

If you say so bro xD
He still does by trolling AMD topics
HD64G: Since you are fond of numbers that prove things scientifically, here is an example proving that AMD GPUs aren't shown in their best form at launch. This helps us customers get an equal or better product at a better price than it will deserve only a few months later. And for a customer who keeps his hardware for at least 3 years, this is an opportunity.

#128
Frutika007
eidairaman1: He still does by trolling AMD topics


*looks at the AMD FX cpu and AMD 290 gpu in system specs*

If you say so bro
notb: Oh come on. This game was launched with Vega as a confirmation of AMD-Bethesda cooperation.
Counting Prey as favouring AMD (i.e. it could be even worse) it's:
AMD: 4
Nvidia: 14
undecided: 4

AMD: 4/18 ~= 22%
Nvidia: 14/18 ~= 78%
which is basically the market share these companies have in discrete GPUs.

Do you think it should be 50:50? Or what? And why?
Actually his list is completely wrong.
#129
EarthDog
notb: Oh come on. This game was launched with Vega as a confirmation of AMD-Bethesda cooperation.
Counting Prey as favouring AMD (i.e. it could be even worse) it's:
AMD: 4
Nvidia: 14
undecided: 4

AMD: 4/18 ~= 22%
Nvidia: 14/18 ~= 78%
which is basically the market share these companies have in discrete GPUs.

Do you think it should be 50:50? Or what? And why?
I don't think games for a review need to be chosen by sponsor for 'balance', but by relative popularity and genre coverage. Meaning, the most popular games of each genre (FPS, 3PS, RTS, MMO, racing, etc.). When people go to buy games, if they have an NV or AMD card, do they only play those titles? No... it isn't even a concern for the sane.

There is simply no way everyone will be happy. But this method ensures game types are covered and the more popular ones and f-all to the AMD/NVIDIA sponsor pissing match.
#130
danbert2000
I think we can all agree that out of the box, the Radeon VII is somewhere between a 2070 and a 2080, and that the improved drivers didn't change that. And it has no room for overclocking unless you get a golden sample and put it under water. AMD fans need to sit this one out, come back when Navi shows its cards. There's really not much of an argument. Nvidia is better at the high end and better at power efficiency out of the box, and that's how 95% of people will use these cards. Radeon VII is a stopgap card and it's not going to magically become better than the 2080 through drivers. It is a fine card but not the best.
#131
Vayra86
Mistral: If I may, a suggestion for benchmarks going forward: have a small icon next to the game titles, indicating if they are nVidia or AMD sponsored.
Got a better idea, put a black sticker over everything and only reveal the card brands after people formed an opinion. So all you get is Card ABCD with price ABCD and performance ABCD.

Then, when everything settles, you reveal the brands. It'd be very interesting in terms of perceived brand loyalty. Honestly, this whole brand loyalty thing is totally strange to me, for either camp. Neither AMD nor Nvidia is in it to make you happy; they're in the game to make money and keep the gears turning for the company. The consumer, and mostly the 'gamer', is just a target, nothing else. Ironically, BOTH camps are now releasing cards that almost explicitly tell us 'Go f*k yourself, this is what you get, like it or not', and the only reason is that the general performance level is what it is. AMD's midrange is 'fine' for most gaming, and Nvidia's stack last gen was also 'fine' for most gaming. Radeon VII is a leftover from the pro segment, and Turing in a very similar way is a derivative of Volta, a GPU only released for pro markets. On top of that, even the halo card isn't a full die.

We're getting scraps and leftovers and bicker about who got the least shitty ones. How about taking the stance of a critical consumer towards BOTH companies instead. If you want to win, that's how it works. And the most powerful message any consumer could ever send is simply not buying it.
#132
cucker tarlson
Having owned nvidia for 4 years, I don't think I have ever once complained about how my card performs in AMD-friendly games. It doesn't matter when performance is consistent; when it stops being consistent, then people start making ridiculous game-bias lists.
#133
moproblems99
Vayra86: Got a better idea, put a black sticker over everything and only reveal the card brands after people formed an opinion. So all you get is Card ABCD with price ABCD and performance ABCD.

Then, when everything settles, you reveal brands. It'd be very interesting.
That would be a really cool review. However, I think power consumption numbers would give it away.
#134
jmcosta
medi01: You mean, when you buy a 780Ti, you expect it to be beaten by a 960 later on, right?


Why the hell not!
You might discover interesting things, if you check how results were changing over time in TP charts.
Fury used to beat the 980Ti only at 4K at launch (and even then, barely); at 1440p it was about 10% behind:

From 10% behind to several % ahead. Not bad, is it?

And one more point: FineWine in general refers to the graceful aging of AMD cards, especially in contrast with what nVidia does to its customers.
It's not about some magic dust coming into play later on, or AMD purposefully crippling the card at launch (the way a certain other company does).
Unfortunately that card has an insufficient memory buffer for a lot of games nowadays, which results in occasional stutter.
That minuscule performance gain isn't gonna give you a better experience overall.
I used to own one and played many titles at 1440p, and it was the worst mistake I ever made.
There is a "FineWine" effect, but it comes mostly from having bad drivers in the first months after a card's release; then, slowly, there is improvement over time to the level it should have had in the first place.
#135
rtwjunkie
PC Gaming Enthusiast
rvalencia: Wolfenstein II (2017, NVIDIA Gameworks),
I’m pretty sure my copy ran on Vulkan.
#136
king of swag187
robert3892: I'd like to test this myself, but sadly my Radeon 7 arrived Dead on Arrival and I had to send it back for a replacement. I suppose I'll get the replacement next week.
Get a 2080 :laugh:
jmcosta: Unfortunately that card has an insufficient memory buffer for a lot of games nowadays, which results in occasional stutter.
That minuscule performance gain isn't gonna give you a better experience overall.
I used to own one and played many titles at 1440p, and it was the worst mistake I ever made.
There is a "FineWine" effect, but it comes mostly from having bad drivers in the first months after a card's release; then, slowly, there is improvement over time to the level it should have had in the first place.
The VRAM isn't even an issue to me tbh, it's the lack of drivers and how poorly Kepler aged
danbert2000: I think we can all agree that out of the box, the Radeon VII is somewhere between a 2070 and a 2080, and that the improved drivers didn't change that. And it has no room for overclocking unless you get a golden sample and put it under water. AMD fans need to sit this one out, come back when Navi shows its cards. There's really not much of an argument. Nvidia is better at the high end and better at power efficiency out of the box, and that's how 95% of people will use these cards. Radeon VII is a stopgap card and it's not going to magically become better than the 2080 through drivers. It is a fine card but not the best.
14% slower than a 2080 and 5% slower than a 1080 Ti (on average, according to TPU)
#137
moproblems99
king of swag187: 14% slower than a 2080 and 5% slower than a 1080 Ti (on average, according to TPU)
Didn't the new drivers bring it to within 10% of the 2080? Almost in the middle.
#138
Frutika007
moproblems99: Didn't the new drivers bring it to within 10% of the 2080? Almost in the middle.
In two games, yes. But on average it gained less than 1% over the previous driver. So now it's 13% behind the RTX 2080 and 4% behind the 1080Ti.
#139
Nkd
Prima.Vera: This card is unfortunately $200 more than it should have been. Nobody is going to pick this over a 1080Ti, for example...
The card will sell because it is really for dual-purpose users who need all that RAM for content creation. I got a 2080Ti, but to say no one is going to buy it is overblown. That is your opinion, not a fact; there are plenty of people picking up this card who see value in it for their workload.
#140
_larry
notb: Man, it's you again on your vendetta against Nvidia. I hoped you would give up after R VII launch.

FineWine doesn't refer to "graceful aging of AMD cards". It refers to AMD not being able to provide proper drivers at the time of launch. So with Nvidia you get that extra 10% the first day, and with AMD you have to wait.

Basically, you just said that instead of getting $1000 today you'd rather get by monthly installments over a year, because then you'd have the sense of earning money.

Also, I would love to learn a way to revoke your rating rights, because you're just running around giving a -1 to anyone who doesn't share your love for Radeon chips. It undermines the already little sense that ranking system has.
My R9 290 cost me ~$250 in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him ~$400 in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...
#141
efikkan
_larry: My R9 290 cost me ~$250 in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him ~$400 in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...
It's not like Nvidia isn't improving their drivers either. In fact, over the past decade there has been no driver improvement large enough to shift the relative positioning between competitors, except perhaps Nvidia's 337.50, the update where Nvidia applied DirectX 12-style driver-side optimizations to DirectX 11 and OpenGL, resulting in one of the greatest driver improvements ever. AMD decided not to do the same kind of optimizations in their driver, to retain a larger performance delta with DirectX 12 and to keep alive the myth that AMD is somehow better at DirectX 12…
#142
Vayra86
_larry: My R9 290 cost me ~$250 in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him ~$400 in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...
Now ask a Fury X owner how he feels versus a 980ti owner...

Both camps have their great and not-so-great cards, and both offer fantastic value propositions. It's hit or miss, all the time, for both green and red. In the end, yes, I think we can agree that the 290 versus the 780 (comparing the non-X to the Ti is not fair, neither on price nor performance, nor market share - 290s and 780s sold like hotcakes and the higher models a whole lot less!) is a win for the 290 when it comes to both VRAM and overall performance, but not in terms of power/noise/heat. Also, you needed a pretty good AIB 290 or you'd have a vacuum cleaner - and those surely weren't 250 bucks.

Regardless. Consistency is an issue, and the driver approach of the two camps is different. I think it's a personal view on what is preferable: full performance at launch, or increasing performance over time, with a minor chance of gaining a few % over the competition near the end of a lifecycle. It also depends a lot on how long you tend to keep a GPU.

And yes, absolutely, a 780Ti in 2017 and onwards ran into lots of problems; I've experienced those first hand. At the same time, I managed to sell that card for a nice sum of 280 EUR, which was exactly the amount I bought it for a year earlier, and which meant a serious discount on my new GTX 1080. Longevity isn't just an advantage. Keeping a card for a long time also means that by the time you want to upgrade, the value is almost gone, while you're slowly bleeding money over time because perf/watt is likely lower. Running ancient hardware is not by definition cheaper - that is only true if the hardware is obsolete and will never be replaced anyway. If you are continuously upgrading, keeping a GPU for longer than 1-1.5 years (make it 3 with the current-gen junk being released) is almost NEVER cheaper than replacing it with something at a similar price point and reselling your old stuff at a good price. And you get a bonus: you stay current.
#143
turbogear
All this talk about the Fury X wakes old memories. :p
I used to own one until 2017, when I replaced it with a Vega 64.
At that time I had a 1080p monitor @120Hz without FreeSync.
For that resolution the Fury X was not bad. The many games I played back then all ran at Ultra settings with good frame rates.
I sold it on eBay at the time for over 300€. :D
Somebody may still be using it.
#144
king of swag187
My Fury X is still going strong at 1080p 144Hz, but it's quite underutilized by my brother (720p 60Hz, lol)
#145
OneMoar
There is Always Moar
drivers are not going to fix this card being a rebranded instinct card

AMD no longer makes gaming cards they make compute accelerators

this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia
#146
notb
OneMoar: drivers are not going to fix this card being a rebranded instinct card

AMD no longer makes gaming cards they make compute accelerators

this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia
Which is really ironic since their compute accelerators aren't really a hit (that's why they're rebranding them for gaming).
They wanted to unify workstation and high-end gaming. This whole business strategy turned out to be a failure.

Everything could change if AMD focused on making purpose-built gaming chips and unified consoles and PC gaming, which would make sense for a change...

We'll see what happens with their datacenter products. IMO they'll give up and Intel will take over.
#147
bug
OneMoar: drivers are not going to fix this card being a rebranded instinct card

AMD no longer makes gaming cards they make compute accelerators

this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia
Tbh, Nvidia also puts a lot of compute resources into their cards. But when everybody was stuck on 28nm, Nvidia looked to get the most out of the die space, and starting with Maxwell they cut back on compute resources in their gaming cards.
That doesn't make AMD cards lesser gaming cards. But, in comparison, it does make GCN resemble NetBurst from an efficiency point of view.
#148
OneMoar
There is Always Moar
Enough. Will the sub-50-post new members kindly go pound sand.
Your YouTube drama is not welcome here.
#149
Vayra86
R4WN4K: RIP TechPowerUp. The AMD cultists have spoken.


Oh look, another AMD cultist giving AdoredTV's review link as irrefutable proof of Radeon 7's performance increase via the new driver.
And it looks like INSTG8R is also an AMD cultist, giving a -1 rating as I revealed the dark side of his cult.
You are getting a -1 from me because I don't need to see random idiot comments. You can find anything on the internet, just google it and boom, done.

Relevance = -1000

If you want to troll, go to that place you found those screens at. Don't do it here.
#150
Redwoodz
OneMoar: drivers are not going to fix this card being a rebranded instinct card

AMD no longer makes gaming cards they make compute accelerators

this card is never going to perform efficiently under gaming loads, if you wanna play games you buy nVidia
AMD has been doing the same thing since the 7970. It was Nvidia who removed compute performance from their GTX cards. Now they have added it back, charging you double and calling it RTX.