Friday, July 12th 2019

AMD Retires the Radeon VII Less Than Five Months Into Launch

AMD has reportedly discontinued production of its flagship Radeon VII graphics card. According to a Cowcotland report, AMD no longer finds it viable to produce and sell the Radeon VII at prices competitive with NVIDIA's RTX 2080, especially when its latest Radeon RX 5700 XT performs within 5-12 percent of the Radeon VII at less than half its price. AMD probably expects custom-design RX 5700 XT cards to narrow that gap even further. The RX 5700 XT also has a much lower BOM (bill of materials) cost than the Radeon VII, thanks to the simplicity of its ASIC, a conventional GDDR6 memory setup, and far lighter electrical requirements.

In stark contrast to the RX 5700 XT, the Radeon VII is based on a complex MCM (multi-chip module) that combines not just a 7 nm GPU die but also four 32 Gbit HBM2 stacks and a silicon interposer. It also has much steeper VRM requirements. Making matters worse is the now-obsolete "Vega" architecture it's based on, which loses badly to "Navi" in performance per Watt. The future of AMD's high-end VGA lineup is uncertain. Given how close "Navi" comes to performance-per-Watt parity with NVIDIA on the RX 5700, AMD may be tempted to design a larger "Navi"-based GPU die with a conventional GDDR6 memory subsystem to take another swing at NVIDIA's high end.
Source: Cowcotland

123 Comments on AMD Retires the Radeon VII Less Than Five Months Into Launch

#51
Patriot
Y'all need some learning. When Fury and Vega were designed around HBM, they were among the first products with HBM/HBM2.
AMD helped design the HBM standard, and then all the FPGAs and compute cards started using it.
The memory makers then decided they could charge more and raised the price 4x or more.
So between the Vega demo and the price announcement, the price of its HBM quadrupled, which wrecked its competitiveness.
The VII was therefore never a planned consumer product; it was always a stopgap built from defective MI50s.
Its price can't be cut at all without losing money, and the 5700 XT is nipping at its heels as it is.
Big Navi was too power hungry and got bumped to 7 nm+ for a ~15% power reduction.
Posted on Reply
#52
Metroid
I knew this would happen. Vega was always a failed GPU; it was good for AMD to have Vega, because now they know never to make that mistake again. I also predicted this back in January 2019: when Lisa Su said they would launch the Radeon VII and not the Ryzen 3000 series, I knew AMD was up to something, and that was to shift the whole 7 nm production to the Radeon VII for maximum profit, then EOL it once Navi released.
Posted on Reply
#53
turbogear
Well, I own one and am quite satisfied with it. With the water cooler, I run it at 1900 MHz at 1071 mV.
Hopefully they will keep supporting it with driver updates; otherwise I will be screwed. :ohwell:
Posted on Reply
#54
EatingDirt
As I've said many times before, this GPU was nothing but failed Vega Instinct cards (-4 CUs) turned into consumer cards: rather than throw away functional silicon, AMD could sell it and make back some of the money that went into manufacturing it.
Posted on Reply
#55
vega22
EarthDog, post: 4079733, member: 79836"
What good was/is HBM? It was more expensive than GDDR5X and GDDR6. Sure, it offered more bandwidth, but for what? It doesn't matter at 1080p or 2560x1440. Maybe 4K, but no card that had HBM/HBM2 had enough horsepower for 60 FPS 4K Ultra/High... so what? Sew buttons... :p

HBM, in its current implementation really didn't bring much to the people. So, yay for that innovation? What really did it bring us???? They pushed GCN forward several generations... middling performance at more power use (but its cheaper!!!)... what did I miss?

Where was this sentiment when RTX came out with ray tracing?
Great card for rendering, great for AI and other heavy compute applications, great for all the things other than mental masturbation. If all you do is sit in a dark room playing with yourself, there are much better cards for that job.

If HBM is such a flop, why are both AMD and NV still selling their top-tier cards with it? Add-in cards that cost more than most people's entire gaming system, for just one card. Why are chips with it going into self-driving cars?

Sure, it is not needed for gamers, because gaming GPUs are still not powerful enough to require that bandwidth. But they will be soon. Who knows if GDDR will have caught up by that point?

How long will it be before HBM is in CPUs with 3D-stacked GPUs and a whole gaming system is on a single chip?

Who knows what will come of it, but AMD has for sure helped it get where it is.

As for NV and RTX, I'm guessing you're just a salty fanboy for bringing it up, tbh.

I think they did well in pushing RTRT; it is no doubt the future for gaming and CGI. I never gave them stick for it, just for the way they acted like they created it, totally ignoring the fact it has been around since Half-Life 2...
Posted on Reply
#56
theoneandonlymrk
EatingDirt, post: 4079955, member: 176500"
As I've said many times before, this GPU was nothing but failed Vega Instinct cards (-4 CUs) turned into consumer cards: rather than throw away functional silicon, AMD could sell it and make back some of the money that went into manufacturing it.
Half of Nvidia's range are failed versions of a better GPU too, as always.
Vega is not a failure; it was designed for multiple use cases and is competitive in a few. Many bought them and are happy with them, as iMac Pro owners will be with a better version of it.
I am not surprised; to me the Radeon VII was always a stopgap answer that AMD's core fans wanted.
And yes, hopefully that Sapphire 5900/5800/5700/5600 rumour has weight; it sure would help nudge me away from the lowly Vega 64 I'm struggling with (sarcasm, that bit).
Posted on Reply
#57
EarthDog
vega22, post: 4079958, member: 41040"
Great card for rendering, great for AI and other heavy compute applications, great for all the things other than mental masturbation. If all you do is sit in a dark room playing with yourself, there are much better cards for that job.

If HBM is such a flop, why are both AMD and NV still selling their top-tier cards with it? Add-in cards that cost more than most people's entire gaming system, for just one card. Why are chips with it going into self-driving cars?

Sure, it is not needed for gamers, because gaming GPUs are still not powerful enough to require that bandwidth. But they will be soon. Who knows if GDDR will have caught up by that point?

How long will it be before HBM is in CPUs with 3D-stacked GPUs and a whole gaming system is on a single chip?

Who knows what will come of it, but AMD has for sure helped it get where it is.

As for NV and RTX, I'm guessing you're just a salty fanboy for bringing it up, tbh.

I think they did well in pushing RTRT; it is no doubt the future for gaming and CGI. I never gave them stick for it, just for the way they acted like they created it, totally ignoring the fact it has been around since Half-Life 2...
Bruh... gaming. This is marketed as a gaming card; we weren't really talking about much else. Great for compute... they tried to bring it down to the consumer level to fill a gap... ick. It didn't do much, that HBM. 99.9% of people here couldn't give two poos about compute. This is a lightweight forum of 'enthusiasts' and 'gamers'. There's a few who care, but really, this is a gaming card and HBM has little place on it so far.

I really don't appreciate being called salty, though, particularly when it's pretty far from the truth and I was just responding to your post. :)
Posted on Reply
#58
Bones
Metroid, post: 4079942, member: 178915"
I knew this would happen. Vega was always a failed GPU; it was good for AMD to have Vega, because now they know never to make that mistake again. I also predicted this back in January 2019: when Lisa Su said they would launch the Radeon VII and not the Ryzen 3000 series, I knew AMD was up to something, and that was to shift the whole 7 nm production to the Radeon VII for maximum profit, then EOL it once Navi released.
As far as I'm concerned, something like the Radeon VII was needed for AMD to learn what these chips would be like and how they'd behave, since they were based on a different process node.
I mean, you know, yet you don't "know" exactly what it will do, how well it will hold up, how it will perform... All of this is like anything else: growing pains on 7 nm, which is expected no matter who makes it.

The thing now is to see if they can do better with the next release. They probably will; the question is how much better?
We'll find out soon enough.
Posted on Reply
#59
xkm1948
EarthDog, post: 4079973, member: 79836"
Bruh... gaming. This is marketed as a gaming card; we weren't really talking about much else. Great for compute... they tried to bring it down to the consumer level to fill a gap... ick. It didn't do much, that HBM. 99.9% of people here couldn't give two poos about compute. This is a lightweight forum of 'enthusiasts' and 'gamers'. There's a few who care, but really, this is a gaming card and HBM has little place on it so far.

I really don't appreciate being called salty, though, particularly when it's pretty far from the truth and I was just responding to your post. :)
Completely agree. Also, saying this one more time: Vega/Fury/Radeon VII are bad for compute. No CUDA support and horrible TensorFlow support. No sane researcher would use these unless they enjoy spending 90% of their time debugging FOR AMD instead of getting actual research done.

The only logical compute uses for these GCN+HBM cards are crypto mining or whatever super-niche applications where AMD actually bothered to polish their software support enough for normal use.
Posted on Reply
#60
jmcslob
I feel for Radeon VII owners...
I owned an HD2900...
my heart goes out.
Posted on Reply
#61
xkm1948
jmcslob, post: 4079988, member: 67555"
I feel for Radeon VII owners...
I owned an HD2900...
my heart goes out.
HD2900 oof

Literally the worst ATi GPU ever
Posted on Reply
#62
vega22
@EarthDog

Sorry, I didn't mean to offend you. I don't know you well enough to really judge your stance on that. But you must understand my point of view too? By that I mean, bringing team green up in a team red discussion does tend to be more of a fanboy move.

Personally I like to beat on them all when they take the piss. We all know they all do it too, be it rebrands, false specs, or the old bait and switch. That's why I don't put as much weight on release-day reviews as I do on those where reviewers get retail items themselves.
Posted on Reply
#63
bug
ratirt, post: 4079654, member: 165024"
It is good news. This means AMD is planning to release something bigger based on Navi :) Maybe I will be changing my V64 after all. Hopefully this year. That would be great :)
Yes, that's what it means :wtf:
Posted on Reply
#64
EarthDog
vega22, post: 4079993, member: 41040"
@EarthDog

Sorry, I didn't mean to offend you. I don't know you well enough to really judge your stance on that. But you must understand my point of view too? By that I mean, bringing team green up in a team red discussion does tend to be more of a fanboy move.

Personally I like to beat on them all when they take the piss. We all know they all do it too, be it rebrands, false specs, or the old bait and switch. That's why I don't put as much weight on release-day reviews as I do on those where reviewers get retail items themselves.
I don't understand how that makes me a fanboy. I simply wished the same sentiment applied to team green: the same people who were begging for patience with Vulkan and whatever else AMD came up with API-wise don't have the same patience for another company's features... hence why I wished for the same sentiment, for being FAIR to BOTH sides and not just one.

Anyway, that's off the beaten path for this thread, but I wanted to clarify why I said that.
Posted on Reply
#65
kings
No surprise, really.

The Radeon VII was just a cry from AMD: "do not forget us, we're still here."

The card never made much sense; I don't blame AMD for trying to get away from it as soon as possible!
Posted on Reply
#66
lexluthermiester
Manu_PT, post: 4079691, member: 168799"
So when nVidia launches a new expensive GPU and replaces it months later, the internet goes wild: "enjoy getting robbed lel", "you got owned by ngeedia lel".

Now AMD completely eliminates a GPU from its lineup while the 5700 XT is within 12% of its performance for half the price, and everyone is "good news! Means they will release something better soon!", "no problem! The 5700 XT is such a great card anyway".

Amazing.
Careful there, your extreme fanboy is showing.

EarthDog, post: 4079998, member: 79836"
for being FAIR to BOTH sides and not just one.
Yes, this. We also need to remember that business is business, and sometimes things just change.

kings, post: 4080001, member: 180022"
The Radeon VII was just a cry from AMD: "do not forget us, we're still here."
No it wasn't.
kings, post: 4080001, member: 180022"
The card never made much sense; I don't blame AMD for trying to get away from it as soon as possible!
2080 level performance for less money... How does that not make sense?
Posted on Reply
#67
Fluffmeister
bug, post: 4079994, member: 157434"
Yes, that's what it means :wtf:
I do find it fascinating how people react depending on their brand preference.

They would be torching Nvidia's shiny new headquarters if this was an Ngreedia card.

The spin here is..... good news! AMD are lovely to their customers! This must mean something better is coming!

Nvidia can't buy that mindshare for love nor money.
Posted on Reply
#68
kings
lexluthermiester, post: 4080019, member: 134537"
No it wasn't.

2080 level performance for less money... How does that not make sense?
It's a very expensive Instinct card, aimed at gaming and sold at gaming prices. It's far from ideal. If Nvidia had not raised the prices on the RTX cards, it probably would not have existed, given the lack of margin.

AMD had to show something after the Turing launch; Vega was getting more and more behind, so that's what AMD got to show off! And there's no harm in it; that's what they had!

And the fact that they retired the card after 5 months further reinforces the idea that it was a card made only to make a statement, nothing more!
Posted on Reply
#69
lexluthermiester
Fluffmeister, post: 4080036, member: 101373"
I do find it fascinating how people react depending on their brand preference.
Right? I've always found it very interesting how people internalize brand loyalty. I've even caught myself doing it years ago. It's a weird psychological condition, and it happens in all areas of life.
Fluffmeister, post: 4080036, member: 101373"
The spin here is..... good news! AMD are lovely to their customers! This must mean something better is coming!
That's actually not a spin. Navi has turned out better than AMD was expecting and they seem to be preparing to release more advanced versions of that technology. Seems like standard business planning to me.
Fluffmeister, post: 4080036, member: 101373"
Nvidia can't buy that mindshare for love nor money.
Oh, that's not true. Lots of people, myself included, really enjoy GeForce/Quadro cards. You couldn't get me to trade my 2080 for a Radeon right now even if you paid me. RTX is rockin', and there is no argument based on logic, reason, and merit that can refute that. That view has nothing to do with being a fanboy and everything to do with actual comparative performance. If AMD had a compelling GPU out that stomped on NVidia's offerings, I'd have one. That of course presumes one can afford the upper-level RTX cards. For people on a budget, I have, do, and will continue to recommend Radeon GPUs, because bang-for-buck they offer a lot of performance value.

With the EOL of the Radeon VII, AMD opens the door for the 5800, 5800XT, 5800XTX, 5900XT and 5900XTX. These are the cards that will likely hit the market next, and I say bring it on.

kings, post: 4080042, member: 180022"
It's a very expensive Instinct card, aimed at gaming and sold at gaming prices. It's far from ideal.
That's an opinion. For the performance offered, its price was competitive with NVidia's offering.
kings, post: 4080042, member: 180022"
If Nvidia had not raised the prices on the RTX cards, it probably would not have existed, given the lack of margin.
NVidia didn't raise the prices on RTX. After release they only went down in price. I was watching very closely.
kings, post: 4080042, member: 180022"
AMD had to show something after the Turing launch; Vega was getting more and more behind
They were only behind in the top-tier market, and the Radeon VII fixed that with a card that hit 2080-level performance for less money. In the mid-range and budget markets, AMD beat NVidia's offerings handily and still does.
kings, post: 4080042, member: 180022"
so that's what AMD got to show off! And there's no harm in it, that's what they had!
Agreed and it was a good showing.
kings, post: 4080042, member: 180022"
And the fact that they retired the card after 5 months further reinforces the idea that it was a card made only to make a statement, nothing more!
It was a card that filled a gap, and maybe you're right; it did show one thing: AMD can compete in the upper range. IF they had upped their R&D game and released expanded versions of the Radeon VII, they could have competed with the 2080 Ti and even the RTX Titan, at similar price tiers.
Posted on Reply
#70
Darmok N Jalad
I suspect AMD planned the Radeon VII when the mining craze was still in full force, since all these products have a lead time. Sure, they would have still marketed it as a gaming card, but most gaming cards were going into mining rigs and prices were terrible. By the time it was ready to launch, mining had gone bust and the market was oversupplied, so it pretty much had to be a limited-run collector's edition. If mining were still in full force, they might have put a crazy price tag on it, kept the product going, and made good margins.

turbogear, post: 4079945, member: 145848"
Well, I own one and am quite satisfied with it. With the water cooler, I run it at 1900 MHz at 1071 mV.
Hopefully they will keep supporting it with driver updates; otherwise I will be screwed. :ohwell:
Macs are full of Vega cards, so I can't imagine support will be dropped anytime soon. Granted, Mac support is not the same as Windows gaming support, but I don't see AMD just letting it die.
Posted on Reply
#71
lexluthermiester
Darmok N Jalad, post: 4080061, member: 170588"
I suspect AMD planned Radeon VII when the mining craze was still in full force, as all these products have a lead time.
This is my thought as well.
Posted on Reply
#72
photonboy
ASUS Strix (3-fan) RX 5700 XT 16 GB:
a lot of people would want a card like THAT if AMD is going to drop the Radeon VII. Even 12 GB isn't enough for a lot of VIDEO EDITING users; Digital Foundry talks about this. With PCIe 4.0 boosting performance in some applications (not games), that's a card AMD should be selling. Just add another 8 GB of VRAM and problem solved. Will AMD do this, or just leave a hole in the market for NVidia to fill?

Wendell's buddy already noted that the RX 5700 XT beat the RTX 2080 Ti in his productivity test, though he admitted not comparing it against PCIe 3.0. Either way it beat the RTX 2080 Ti, and since the RTX 2080 Ti doesn't support PCIe 4.0, it can't pull ahead in those tests. A perfect setup for higher-end productivity would then be an R9 3900X (12-core), an RX 5700 XT 16 GB card, 64 GB of DDR4-3600 CL16, and an X570 motherboard with no goddamn chipset fan (a good heatsink instead).
Posted on Reply
#73
kings
lexluthermiester, post: 4080019, member: 134537"
IF they had upped their R&D game and released expanded versions of the Radeon7, they could have competed with 2080ti and even RTX Titan, but in similar price tiers.
In performance maybe, but in efficiency they would probably be cards to simply avoid.

The Radeon VII is already a roughly 300 W card; to gain 30% more performance, what monstrous consumption would it have?
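For a rough sense of the worry: assuming performance scales linearly with board power at fixed perf/Watt (a generous simplification; real silicon scales worse than linearly as clocks rise, so the true figure would be higher), the implied draw works out like this. The 300 W and 30% figures are the ballpark numbers from this thread, not official specs:

```python
# Naive linear perf-per-Watt scaling (optimistic for Vega):
base_power_w = 300   # approximate Radeon VII board power, per the thread
perf_uplift = 1.30   # hypothetical 30% performance target

implied_power_w = base_power_w * perf_uplift
print(implied_power_w)  # 390.0
```

Even under this best-case assumption, the result lands well past anything a conventional air cooler handles comfortably.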

Not to mention the problem of cooling a card with that much power draw!

I can see that happening with Navi, but with the Vega architecture I just think it would be a waste of time for AMD.
Posted on Reply
#74
newtekie1
Semi-Retired Folder
We knew this was a stop-gap card when it was released.
Posted on Reply
#75
lexluthermiester
kings, post: 4080084, member: 180022"
In performance maybe, but in efficiency they would probably be cards to simply avoid.
People who are looking for performance don't care how much electricity is being used. Let's be fair: the difference is counted in pennies per month, and I know this from experience. I've owned a GTX 590;
https://www.evga.com/products/specs/gpu.aspx?pn=C83BF35F-63BE-4DAA-9B7F-0CB8DAEA1AF6
That was a monstrosity of a card at 370 W of power draw under load. I expected my power bill to shoot up. It did not. From one month to the next, my bill went up by an entire $1, and that was in a summer month with the central air running. Into the winter there was very little impact on the power bill. So this constant argument (it's not just you) about power usage needs to stop. It's worth considering only until that consideration is compared against performance; then it needs to be ignored.
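The back-of-the-envelope math bears this out. A sketch, where the hours-per-day and the $/kWh rate are illustrative assumptions, not figures from the post:

```python
def monthly_cost_usd(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Incremental electricity cost of an extra load over a 30-day month."""
    kwh = extra_watts / 1000 * hours_per_day * 30
    return kwh * usd_per_kwh

# e.g. a 370 W card gaming 2 hours a day at an assumed $0.13/kWh:
print(round(monthly_cost_usd(370, 2, 0.13), 2))  # 2.89
```

Even doubling the hours or the electricity rate keeps the delta in single-digit dollars per month, which is the point being made above.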

kings, post: 4080084, member: 180022"
Not to mention the problem that would be to cool a card with so much power consumption!
It's not as big a problem as you'd think.

kings, post: 4080084, member: 180022"
I can see that happening with Navi, but with Vega architecture, I just think it would be a waste of time for AMD.
All I'm saying is that they could have done it, and with Navi they still can; it makes even more sense now. If I were AMD, I'd pull out all the stops and build an RTX Titan killer. They can do it, likely at a much better price to the end user, while still making a solid profit.

What I'm waiting to see is AMD's answer to RTX's RTRT. Come on AMD, get on it!
Posted on Reply