AMD Radeon VII Retested With Latest Drivers

*looks at the 2080Ti in system specs*

If you say so bro xD

He still does, by trolling AMD topics.

Since you are fond of numbers that prove things scientifically, here is an example showing that AMD GPUs aren't at their best form at launch. That helps us customers get an equal or better product at a better price than it will deserve only a few months later. And for a customer who keeps his hardware for at least 3 years, that is an opportunity.

 
He still does, by trolling AMD topics.



*looks at the AMD FX cpu and AMD 290 gpu in system specs*

If you say so bro

Oh come on. This game was launched with Vega as a confirmation of the AMD-Bethesda cooperation.
Counting Prey as favouring AMD (i.e. it could be even worse), it's:
AMD: 4
Nvidia: 14
undecided: 4

AMD: 4/18 ~= 22%
Nvidia: 14/18 ~= 78%
which is basically the market share these companies have in discrete GPUs.

Do you think it should be 50:50? Or what? And why?
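For reference, the arithmetic in a few lines of Python (the tallies are taken from the list above, not re-verified):

counts = {"AMD": 4, "Nvidia": 14, "undecided": 4}   # tallies from the list above
decided = counts["AMD"] + counts["Nvidia"]          # 18 titles with a clear sponsor
for vendor in ("AMD", "Nvidia"):
    print(f"{vendor}: {counts[vendor]}/{decided} ~= {counts[vendor] / decided:.0%}")
# AMD: 4/18 ~= 22%
# Nvidia: 14/18 ~= 78%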

Actually, his list is completely wrong.
 
Oh come on. This game was launched with Vega as a confirmation of the AMD-Bethesda cooperation.
Counting Prey as favouring AMD (i.e. it could be even worse), it's:
AMD: 4
Nvidia: 14
undecided: 4

AMD: 4/18 ~= 22%
Nvidia: 14/18 ~= 78%
which is basically the market share these companies have in discrete GPUs.

Do you think it should be 50:50? Or what? And why?
I don't think games for a review need to be chosen by who sponsors them for 'balance', but by relative popularity as well as genre coverage. Meaning, the most popular games of each genre (FPS, 3PS, RTS, MMO, racing, etc.). When people go to buy games, if they have an NV or AMD card, do they only play those titles? No... it isn't even a concern for the sane.

There is simply no way everyone will be happy. But this method ensures the game types and the more popular titles are covered, with f-all given to the AMD/NVIDIA sponsor pissing match.
 
I think we can all agree that out of the box, the Radeon VII is somewhere between a 2070 and a 2080, and that the improved drivers didn't change that. And it has no room for overclocking unless you get a golden sample and put it under water. AMD fans need to sit this one out, come back when Navi shows its cards. There's really not much of an argument. Nvidia is better at the high end and better at power efficiency out of the box, and that's how 95% of people will use these cards. Radeon VII is a stopgap card and it's not going to magically become better than the 2080 through drivers. It is a fine card but not the best.
 
If I may, a suggestion for benchmarks going forward: have a small icon next to the game titles, indicating whether they are nVidia- or AMD-sponsored.

Got a better idea: put a black sticker over everything and only reveal the card brands after people have formed an opinion. So all you get is Card ABCD with price ABCD and performance ABCD.

Then, when everything settles, you reveal the brands. It'd be very interesting in terms of perceived brand loyalty. Honestly, this whole brand loyalty thing is totally strange to me, for either camp. Neither AMD nor Nvidia is in it to make you happy; they're in the game to make money and keep the gears turning for the company. The consumer, and especially the 'gamer', is just a target, nothing else.

Ironically, BOTH camps are now releasing cards that almost explicitly tell us 'Go f*k yourself, this is what you get, like it or not', and the only reason is that the general performance level is what it is. AMD's midrange is 'fine' for most gaming, and Nvidia's stack last gen was also 'fine' for most gaming. Radeon VII is a leftover from the pro segment, and Turing in a very similar way is a derivative of Volta, a GPU only released for pro markets. On top of that, even the halo card isn't a full die.

We're getting scraps and leftovers and bickering about who got the least shitty ones. How about instead taking the stance of a critical consumer towards BOTH companies? If you want to win, thát's how it works. And the most powerful message any consumer could ever send is simply not buying it.
 
Having owned Nvidia for 4 years, I don't think I have ever once complained about how my card performs in AMD-friendly games. It doesn't matter when performance is consistent. When it stops being consistent, then people start making ridiculous game-bias lists.
 
Got a better idea: put a black sticker over everything and only reveal the card brands after people have formed an opinion. So all you get is Card ABCD with price ABCD and performance ABCD.

Then, when everything settles, you reveal the brands. It'd be very interesting.

That would be a really cool review. However, I think power consumption numbers would give it away.
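A blinded chart like that would be simple to put together. Here's a minimal sketch of the idea in Python (the card names and performance numbers are made-up placeholders, not real data):

import random

# Anonymize the cards, publish only the blinded results, and keep
# the label-to-card mapping secret until the reveal.
cards = {"Brand A Model 1": 100, "Brand B Model 2": 93}  # made-up relative perf
labels = list("ABCDEFGH")[: len(cards)]
random.shuffle(labels)
mapping = dict(zip(labels, cards))  # secret until everyone has formed an opinion
for label in sorted(mapping):
    print(f"Card {label}: relative performance {cards[mapping[label]]}")
print("Reveal:", mapping)  # published only after the dust settles

As noted above, though, the power consumption numbers would probably still give the brands away.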
 
You mean, when you buy a 780 Ti, you expect it to be beaten by a 960 later on, right?


Why the hell not!
You might discover interesting things if you check how results changed over time in TPU charts.
At launch, Fury used to beat the 980 Ti only at 4K (and even then, barely); at 1440p it was about 10% behind:

From 10% behind to several % ahead. Not bad, is it?

And one more point: FineWine in general refers to the graceful aging of AMD cards, especially in contrast with what nVidia does to its customers.
It's not about some magic dust coming into play later on, or AMD purposefully crippling cards at launch (the way a certain other company does).

Unfortunately that card has an insufficient memory buffer for a lot of games nowadays, which results in occasional stutter.
That minuscule performance gain isn't gonna give you a better experience overall.
I used to own one and played many titles at 1440p, and it was the worst mistake I ever made.
There is a "FineWine" effect, but it comes mostly from having bad drivers in the first months after a card's release; then, slowly, there is improvement over time to the level it should have had in the first place.
 
I'd like to test this myself, but sadly my Radeon VII was dead on arrival and I had to send it back for a replacement. I suppose I'll get the replacement next week.
Get a 2080 :laugh:

Unfortunately that card has an insufficient memory buffer for a lot of games nowadays, which results in occasional stutter.
That minuscule performance gain isn't gonna give you a better experience overall.
I used to own one and played many titles at 1440p, and it was the worst mistake I ever made.
There is a "FineWine" effect, but it comes mostly from having bad drivers in the first months after a card's release; then, slowly, there is improvement over time to the level it should have had in the first place.
The VRAM isn't even an issue to me tbh, it's the lack of drivers and how poorly Kepler aged.

I think we can all agree that out of the box, the Radeon VII is somewhere between a 2070 and a 2080, and that the improved drivers didn't change that. And it has no room for overclocking unless you get a golden sample and put it under water. AMD fans need to sit this one out, come back when Navi shows its cards. There's really not much of an argument. Nvidia is better at the high end and better at power efficiency out of the box, and that's how 95% of people will use these cards. Radeon VII is a stopgap card and it's not going to magically become better than the 2080 through drivers. It is a fine card but not the best.
It's 14% slower than a 2080 and 5% slower than a 1080 Ti (on average, according to TPU).
 
This card is unfortunately $200 more than it should have been. Nobody is going to pick this over a 1080 Ti, for example...

The card will sell because it is really for dual-purpose users who need all that RAM for content creation. I've got a 2080 Ti, but to say no one is going to buy it is overblown. That is your opinion, not a fact; there are plenty of people picking up this card who see value in it for their workload.
 
Man, it's you again on your vendetta against Nvidia. I hoped you would give up after the R VII launch.

FineWine doesn't refer to "graceful aging of AMD cards". It refers to AMD not being able to provide proper drivers at the time of launch. So with Nvidia you get that extra 10% on day one, and with AMD you have to wait.

Basically, you just said that instead of getting $1000 today, you'd rather get it in monthly installments over a year, because then you'd have the sense of earning money.

Also, I would love to learn a way to revoke your rating rights, because you're just running around giving a -1 to anyone who doesn't share your love for Radeon chips. It undermines what little sense the rating system already has.

My R9 290 cost me $250~ in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him $400~ in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...
 
My R9 290 cost me $250~ in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him $400~ in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...
It's not like Nvidia isn't improving their drivers either. In fact, over the past decade there has been no driver improvement large enough to shift the relative positioning between competitors, well, except perhaps for Nvidia 337.50, the update where Nvidia applied the driver-side optimizations of DirectX 12 to DirectX 11 and OpenGL, resulting in one of the greatest driver improvements ever. AMD decided not to do the same kind of optimizations in their driver, to retain a larger performance delta with DirectX 12 and to keep alive the myth that AMD is somehow better in DirectX 12…
 
My R9 290 cost me $250~ in 2014...
The performance has only gone up since I got it because of drivers and optimisations. I went from 1080p to 1440p without a second thought.
My friend has a 780Ti that constantly had stuttering issues in games... It cost him $400~ in 2014...
All in all, the AMD card is the better investment, because they are cheaper and just get better with updates...

Now ask a Fury X owner how he feels versus a 980 Ti owner...

Both camps have their great and not-so-great cards, and both offer fantastic value propositions. It's hit or miss, all the time, for both green and red.

In the end, yes, I think we can agree that the 290 versus the 780 (comparing the non-X to the Ti is not fair, neither on price, performance, nor market share: 290s and 780s sold like hotcakes and the higher models a whole lot less!) is a win for the 290 when it comes to both VRAM and overall performance, but not in terms of power/noise/heat. Also, you needed a pretty good AIB 290 or you'd have a vacuum cleaner, and those surely weren't 250 bucks.

Regardless. Consistency is an issue, and the driver approach of both camps is different. I think it's a personal view on what is preferable: full performance at launch, or increasing performance over time, with a minor chance of gaining a few % over the competition near the end of a lifecycle. It also depends a lot on how long you tend to keep a GPU.

And yes, absolutely, a 780 Ti in 2017 and onwards ran into lots of problems; I've experienced those firsthand. At the same time, I managed to sell that card for a nice sum of 280 EUR, which was exactly the amount I bought it for a year earlier, and that meant a serious discount on my new GTX 1080.

Longevity isn't just an advantage. Keeping a card for a long time also means that by the time you want to upgrade, the value is almost gone, while you're also slowly bleeding money over time because perf/watt is likely lower. Running ancient hardware is not by definition cheaper; that is only true if the hardware is obsolete and will never be replaced anyway. If you are continuously upgrading, keeping a GPU for longer than 1~1.5 years (make it 3 with the current gen junk being released) is almost NEVER cheaper than replacing it with something of a similar price point and reselling your old stuff at a good price. And you get a bonus: you stay current.
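To make that concrete, a back-of-the-envelope sketch in Python. The 280 EUR buy/sell figures come from the example above; everything else (resale value after 3 years, wattage gap, usage hours, energy price) is a purely hypothetical assumption for illustration:

# Depreciation plus the extra electricity an older, less efficient
# card burns compared to a newer one at the same performance level.
def net_cost(buy_eur, resale_eur, years, extra_watts=0,
             hours_per_day=2, eur_per_kwh=0.25):
    extra_energy_eur = extra_watts / 1000 * hours_per_day * 365 * years * eur_per_kwh
    return (buy_eur - resale_eur) + extra_energy_eur

# Upgrade-and-resell: bought at 280, sold at 280 a year later.
print(net_cost(280, 280, years=1))                 # ~0 EUR
# Hold for 3 years: resale value collapses, perf/watt lags newer cards.
print(net_cost(280, 80, years=3, extra_watts=60))  # ~233 EUR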
 
All this talk about the Fury X wakes old memories. :p
I owned one until 2017, when I replaced it with a Vega 64.
At that time I had a 1080p monitor @120Hz without FreeSync.
For that resolution the Fury X was not bad. The large number of games I played back then all worked at Ultra settings with good frame rates.
I sold it on eBay at the time for over 300€. :D
Somebody may still be using it.
 
My Fury X is still going well at 1080p 144Hz, but it's quite underutilized by my brother (720p 60Hz lol).
 
Drivers are not going to fix this card being a rebranded Instinct card.

AMD no longer makes gaming cards; they make compute accelerators.

This card is never going to perform efficiently under gaming loads. If you wanna play games, you buy nVidia.
 
Drivers are not going to fix this card being a rebranded Instinct card.

AMD no longer makes gaming cards; they make compute accelerators.

This card is never going to perform efficiently under gaming loads. If you wanna play games, you buy nVidia.
Which is really ironic, since their compute accelerators aren't really a hit (that's why they're rebranding them for gaming).
They wanted to unify workstation and high-end gaming. This whole business strategy turned out to be a failure.

Everything could change if AMD focused on making purpose-built gaming chips and unifying consoles and PC gaming, which would make sense for a change...

We'll see what happens with their datacenter products. IMO they'll give up and Intel will take over.
 
Drivers are not going to fix this card being a rebranded Instinct card.

AMD no longer makes gaming cards; they make compute accelerators.

This card is never going to perform efficiently under gaming loads. If you wanna play games, you buy nVidia.
Tbh, Nvidia also used to put a lot of compute resources in their cards. But when everybody was stuck on 28 nm, Nvidia looked to get the most out of the die space, and starting with Maxwell they cut back on compute resources in their gaming cards.
That doesn't make AMD's cards lesser gaming cards. But, in comparison, it does make GCN resemble NetBurst from an efficiency point of view.
 
RIP TechPowerUp. The AMD cultists have spoken.

Oh look, another AMD cultist giving AdoredTV's review link as irrefutable proof of the Radeon VII's performance increase via the new driver.
And it looks like INSTG8R is also an AMD cultist, giving a -1 rating as I revealed the dark side of his cult.
 
Enough. Will the sub-50-post new members kindly go pound sand?
Your YouTube drama is not welcome here.
 
RIP TechPowerUp. The AMD cultists have spoken.

Oh look, another AMD cultist giving AdoredTV's review link as irrefutable proof of the Radeon VII's performance increase via the new driver.
And it looks like INSTG8R is also an AMD cultist, giving a -1 rating as I revealed the dark side of his cult.

You are getting a -1 from me because I don't need to see random idiot comments. You can find anything on the internet, just google it and boom, done.

Relevance = -1000

If you want to troll, go to the place where you found those screens. Don't do it here.
 
Drivers are not going to fix this card being a rebranded Instinct card.

AMD no longer makes gaming cards; they make compute accelerators.

This card is never going to perform efficiently under gaming loads. If you wanna play games, you buy nVidia.


AMD has been doing the same thing since the 7970. It was Nvidia who removed compute performance from their GTX cards. Now they have added it back, charging you double and calling it RTX.
 