
AMD Radeon VII Detailed Some More: Die-size, Secret-sauce, Ray-tracing, and More

What are the drawbacks? What advantages does the 2080 have? You can't be talking about RTX and DLSS, can you?
Primarily a major difference in TDP: 215W vs. ~300W.

When you have competing products A and B, which perform and cost the same, but one of them has a major disadvantage, why would anyone ever buy it?
 
I don't consider that a major disadvantage. It's probably less than $20 a year. If that is the only disadvantage then I don't see a problem. Also, throw that 215W out after you start overclocking and lower that 300W when you undervolt.
 
According to AMD the cost of 7nm is significant, and with 16 GB of HBM2 I can't imagine it's cheap for them, but I assume they are at least making some money.

All that slide says is that while die sizes remain the same, costs go up. And when does a die not get smaller when going from a larger process to a smaller one? With that in mind, it shows they were profiting with decreasing margins until the jump to 7nm.
 
Late to the party again, but I'd say this is a decent answer to RTX. Maybe not the showstopper that Ryzen was, but damn decent nonetheless. It seems AMD has kicked it up a notch.

Good luck with ray tracing in software; if that were viable, we would have had it already. If they do it, it is just a desperate move to not look obsolete.
Raytracing has been done in software for decades, just not real-time.
Do not expect ray tracing in hardware until the end of 2020, and even then they will be years behind NVIDIA, who will by that time be readying their third-gen RTX cards for release.
You don't and can't know any of that.
 
[QUOTE="Kaotik, post: 3974472, member: 101367"]Unless proven otherwise it should be 64, as the Vega 20 diagrams from the Instinct release clearly show 4 Pixel Engines per Shader Engine.[/QUOTE]

I must admit I took the 128 ROPs report as a given. If the Instinct diagrams aren't just high-level copies of the Vega 10 slides, then it's definitely 64 ROPs for Vega 20.
 
AMD has confirmed that the card's ROP count is 64.
 
Good luck with ray tracing in software; if that were viable, we would have had it already. If they do it, it is just a desperate move to not look obsolete.

Do not expect ray tracing in hardware until the end of 2020, and even then they will be years behind NVIDIA, who will by that time be readying their third-gen RTX cards for release.

We need Intel to enter the market with ray tracing from the get-go in 2020.

I also have a feeling that AMD may be secretly working with Intel on ray-tracing tech to set up a unified standard against NVIDIA's RTX.

A 3rd-gen RTX card? Not happening, lol. NVIDIA is not going to refresh until 2020; that's when they will have 7nm. You really think NVIDIA is going to replace the RTX 20 series after less than 12 months? They don't have a process to shrink to, and they are not in a hurry. Heck, they stretched Pascal for two years. So NVIDIA is going to have three RTX generations in three years? Do you realize what you are saying?

Late to the party again, but I'd say this is a decent answer to RTX. Maybe not the showstopper that Ryzen was, but damn decent nonetheless. It seems AMD has kicked it up a notch.


Raytracing has been done in software for decades, just not real-time.

You don't and can't know any of that.


Yeah, he thinks NVIDIA is going to release three RTX generations in three years: 2018, 2019, and 2020, when Pascal went two years on its own. Not sure about that, rofl.
 
The only way they could have done this is by pricing the Radeon VII at $649 or $599, not $699. $699 is the same price as the RTX 2080, but the 2080 doesn't have the heat or power use, and it has RT cores, Tensor cores, etc. Overall the RTX 2080 is expensive because it has new tech in it. If I have to pay the same price, I will buy the one with the lower power draw, the lower heat, and the advanced tech in it.



$699 is a good price for that performance; plus, don't forget how the stock cooler looks. It's not a crappy blower-style design.
The rumor is that it costs close to $750 to make the Radeon 7 cards. So no, they are not making money. This is just to stop the bleeding.
 
What are the drawbacks? What advantages does the 2080 have? You can't be talking about RTX and DLSS, can you?
The drawback is the rumored price of $699 and the missing technology. If you can get the technology with the other product at the same price, why settle? It is like choosing between two identical cars, one with headlights and one without. The salesman can say, "hey, it's light out right now, maybe you won't need those headlights." AMD's engineering has always been adequate, but it sold by undercutting the competition's pricing. If AMD intends to match the competition's GPU pricing, I can't see how they improve their already dismal market share. NVIDIA's release and pricing led to a major crash in their sales and stock value; I am not sure why a strengthening AMD would want to embrace that model. AMD has a long way to go before they can price with the big boys.
 
It should be noted that NVIDIA has a huge Achilles' heel with the RTX series: the RT operations are INT-based, and the card needs to flush to switch between FP and INT operations.

Dedicated hardware acceleration for RT is a smokescreen IMO; the key is whether you can cut your FP or INT instructions down as small as possible and run as many in parallel as possible. AMD does have some FP division capability, so it's possible that some cards could be retrofitted for RT.
Source? I tried googling it and couldn't find anything.
 
I just remembered how the AMD fanboys were pissing all over the RTX 2080's $699 price tag at launch, but Radeon VII comes along at the same price and suddenly people are claiming it's great value.

No, great value would be if it weren't just a Vega respin with double the memory bandwidth, double the VRAM, 250 MHz higher clocks, and an extra $200 tacked onto the price. The die shrink to 7nm is going to help with power and heat, but this is still Vega/GCN 5 with all its limitations, and I honestly don't expect this card to outperform the RTX 2080 the way AMD is claiming.
 
For reference, Vega 20 would need about 40% more performance than Vega 10 to be on par with the RTX 2080. I do wonder which changes are going to make that possible.
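For what it's worth, a quick back-of-the-envelope sketch shows why that ~40% can't come from raw shader math alone. This uses announced boost clocks and CU counts, and peak FP32 throughput as a crude proxy for performance, so treat it as an assumption-laden estimate, not a benchmark:

```python
# Peak FP32 throughput from public specs (clocks and CU counts assumed
# from announcements; real gaming performance also depends on bandwidth,
# geometry throughput, and drivers).

def fp32_tflops(cus, boost_mhz, sp_per_cu=64, flops_per_sp=2):
    """Peak FP32 TFLOPS = CUs * SPs/CU * 2 FLOPs per clock * clock."""
    return cus * sp_per_cu * flops_per_sp * boost_mhz * 1e6 / 1e12

vega10 = fp32_tflops(cus=64, boost_mhz=1546)   # Vega 64 reference boost
vega20 = fp32_tflops(cus=60, boost_mhz=1800)   # Radeon VII announced boost

print(f"Vega 64: {vega10:.2f} TFLOPS, Radeon VII: {vega20:.2f} TFLOPS")
print(f"Raw FLOPS gain: {vega20 / vega10 - 1:.1%}")
```

On paper that's only about 9% more peak FLOPS, so most of the gap would have to be closed by the doubled memory bandwidth and better-sustained clocks.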
 
I just remembered how the AMD fanboys were pissing all over the RTX 2080's $699 price tag at launch, but Radeon VII comes along at the same price and suddenly people are claiming it's great value.

No, great value would be if it weren't just a Vega respin with double the memory bandwidth, double the VRAM, 250 MHz higher clocks, and an extra $200 tacked onto the price. The die shrink to 7nm is going to help with power and heat, but this is still Vega/GCN 5 with all its limitations, and I honestly don't expect this card to outperform the RTX 2080 the way AMD is claiming.
Both are bad value; one being worse than the other doesn't mean either card is good value.
 
Primarily a major difference in TDP: 215W vs. ~300W.

When you have competing products A and B, which perform and cost the same, but one of them has a major disadvantage, why would anyone ever buy it?

The RTX 2080 is around 225 W. It remains to be seen what the actual power usage is on the Radeon VII during gaming. For that, we wait for reviews.

The only way they could have done this is by pricing the Radeon VII at $649 or $599, not $699. $699 is the same price as the RTX 2080, but the 2080 doesn't have the heat or power use, and it has RT cores, Tensor cores, etc. Overall the RTX 2080 is expensive because it has new tech in it. If I have to pay the same price, I will buy the one with the lower power draw, the lower heat, and the advanced tech in it.

The rumor is that it costs close to $750 to make the Radeon 7 cards. So no, they are not making money. This is just to stop the bleeding.

I don't think that was how much it costs them to make; it was what they originally wanted to sell it at. Yeah, I have no doubt they are not making much on it.

Plus, let's hold off on the heat part. Wait for the reviews; you can't complain about heat when you haven't seen the temps yet. Will it use more power? Yeah, sure, but that doesn't mean it's going to run hot.
 
For reference, Vega 20 would need about 40% more performance than Vega 10 to be on par with the RTX 2080. I do wonder which changes are going to make that possible.


This video shows the performance of a Vega 64 clocked at 1,750 MHz against an RTX 2080 running at stock clocks. (Also, don't forget Vega 64 has 4 more CUs than Radeon VII, which makes up for that 50 MHz core-clock deficit.)
Even the memory on the AMD side is overclocked, and at those clocks the Vega has 580 GB/s of memory bandwidth, which is quite a lot.
This is pretty much what you would expect a Radeon VII to do, maybe a little bit better.
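That ~580 GB/s figure checks out with simple HBM2 math. The overclocked memory clock below is an assumption inferred from the bandwidth quoted above; the stock Vega 64 and Radeon VII numbers use their announced bus widths and memory clocks:

```python
# HBM2 bandwidth = bus width in bytes * effective data rate.
# HBM2 is double data rate, so effective rate = 2x the memory clock.
def hbm2_bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * mem_clock_mhz * 2 * 1e6 / 1e9

print(hbm2_bandwidth_gbs(2048, 945))    # stock Vega 64, 2 stacks: ~484 GB/s
print(hbm2_bandwidth_gbs(2048, 1130))   # OC'd Vega 64 (assumed clock): ~579 GB/s
print(hbm2_bandwidth_gbs(4096, 1000))   # Radeon VII, 4 stacks: ~1024 GB/s
```

The last line is why Radeon VII's "1 TB/s" marketing figure is plausible: doubling the stacks doubles the bus width.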
 
The RTX 2080 is around 225 W. It remains to be seen what the actual power usage is on the Radeon VII during gaming. For that, we wait for reviews.
AMD promises "25% more performance at the same power", whatever that means.
25% is not enough to be on par with RTX 2080.

But as you say, reviews will tell the truth.
 
If you can get the technology with the other product

I fail to see the missing technology. RTX is usable in one game...and the series is trash. DLSS looks like shit compared to the other available methods. I fail to see what benefits the 2080 has.
 
AMD promises "25% more performance at the same power", whatever that means.
25% is not enough to be on par with RTX 2080.

But as you say, reviews will tell the truth.
If you take it at face value, Radeon VII has 25% more performance for the same power consumption (295 W).
13% of that performance comes from the higher boost clock of 1800 MHz (remember, it's 4 CUs short).
The other 12% likely comes from Radeon VII's ability to hold its boost clock longer than Vega 64 does.

You know how it goes: they're likely talking about games where Vega 64 does really well against Turing. I highly doubt they're talking about an average.
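A hedged sketch of that decomposition. It uses the 1546 MHz reference boost for Vega 64, so the split lands a little differently from the 13/12 figures above (which presumably used observed clocks); the point is just how the arithmetic works:

```python
# Spec-sheet throughput scaling: 60 CUs at 1800 MHz vs 64 CUs at 1546 MHz.
# Clock values are reference/announced boosts, an assumption on my part.
paper_ratio = (60 * 1800) / (64 * 1546)
print(f"Spec-sheet scaling: {paper_ratio - 1:+.1%}")

# If the total claim is +25%, whatever the spec sheet doesn't explain
# would have to come from sustaining boost better (or other tweaks).
residual = 1.25 / paper_ratio
print(f"Residual needed from sustained clocks etc.: {residual - 1:+.1%}")
```

Either way you slice it, a large chunk of the claimed 25% has to come from behavior (sustained clocks) rather than from the spec sheet.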
 
I wonder why AMD is stuck at a maximum of 4096 SPs.
I mean... Fury, Vega (1), Vega II... they are almost identical.

Considering that the new chip is rather small at 331 mm², what stopped them from making, say, a 450 mm² chip and fitting 72 CUs in it, or 96?!
It would wipe the floor with the 2080 Ti: with 6144 SPs (let's say a few are cut for being defective; even with 5760 SPs it would still crush it with raw compute power and that massive 1 TB/s bandwidth), WHILE BEING A SMALLER CHIP thanks to 7nm.

Instead, they just shrunk Fury, then shrunk it again without adding anything :(
 
Because bigger = lower yields. AMD is all about mass production these days.
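The yield argument can be made concrete with the classic Poisson die-yield model. The defect density below is purely an assumed ballpark for an early 7nm node, not anything TSMC has published:

```python
import math

# Poisson die-yield model: Y = exp(-A * D0), where A is die area in cm^2
# and D0 is defect density in defects/cm^2. D0 = 0.2 is an assumed value
# for illustration only.
def poisson_yield(area_mm2, d0_per_cm2):
    return math.exp(-(area_mm2 / 100) * d0_per_cm2)

d0 = 0.2
small = poisson_yield(331, d0)   # Vega 20-sized die
big   = poisson_yield(450, d0)   # hypothetical bigger die
print(f"331 mm2: {small:.0%} yield, 450 mm2: {big:.0%} yield")
```

Even with a made-up D0, the bigger die loses roughly a fifth of its good dies, and every defective die wastes more of the expensive 7nm wafer.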
 
I just remembered how the AMD fanboys were pissing all over the RTX 2080's $699 price tag at launch, but Radeon VII comes along at the same price and suddenly people are claiming it's great value.

No, great value would be if it weren't just a Vega respin with double the memory bandwidth, double the VRAM, 250 MHz higher clocks, and an extra $200 tacked onto the price. The die shrink to 7nm is going to help with power and heat, but this is still Vega/GCN 5 with all its limitations, and I honestly don't expect this card to outperform the RTX 2080 the way AMD is claiming.

I see people justifying the power consumption, but I don't get it; only ONE comment stated "...because the 2080 is $699." Was its 10-series counterpart also $699 at launch?

AMD promises "25% more performance at the same power", whatever that means.
25% is not enough to be on par with RTX 2080.

But as you say, reviews will tell the truth.

Good spot, but it probably means exactly what it says: it's 25% more efficient. Assuming it's being compared to the V64, when consuming the same amount of power it does 25% more work. We could probably figure out how much power this card really sucks down with that bit: assuming power/perf scales linearly, a little guesstimation (2080 power * [V64/2080] ratio * [V7/V64] ratio) puts the card around 400-450 W.
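Writing a version of that guesstimate out, with every input flagged as an assumption: ~295 W board power for Vega 64, the RTX 2080 roughly 35% faster than Vega 64, and (crucially) perfectly linear power/perf scaling:

```python
# Back-of-envelope power guesstimate. All inputs are assumptions, and the
# linear power/perf scaling assumption is the shakiest part.
p_vega64 = 295.0       # W, Vega 64 board power (assumed)
perf_gap = 1.35        # RTX 2080 vs Vega 64 performance ratio (assumed)
eff_gain = 1.25        # AMD's claimed "25% more perf at same power"

naive = p_vega64 * perf_gap           # Vega 64 scaled linearly to 2080 perf
with_7nm = naive / eff_gain           # apply AMD's claimed efficiency gain
print(f"Naive: {naive:.0f} W, with claimed 7nm gain: {with_7nm:.0f} W")
```

The naive linear number lands right at the bottom of that 400-450 W range; if AMD's claimed 25% efficiency gain is real, the card would come in well under it, closer to its rated 300 W.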
 
I wonder why AMD is stuck at a maximum of 4096 SPs.
I mean... Fury, Vega (1), Vega II... they are almost identical.

Considering that the new chip is rather small at 331 mm², what stopped them from making, say, a 450 mm² chip and fitting 72 CUs in it, or 96?!
It would wipe the floor with the 2080 Ti: with 6144 SPs (let's say a few are cut for being defective; even with 5760 SPs it would still crush it with raw compute power and that massive 1 TB/s bandwidth), WHILE BEING A SMALLER CHIP thanks to 7nm.

Instead, they just shrunk Fury, then shrunk it again without adding anything :(
They say they have removed the 4-Shader-Engine limitation on GCN5 (Vega), but I don't believe that. Instead they put in things to mitigate the limitation, like DSBR, the NGG fast path, and HBCC, some of which are broken. AMD should dump GCN for gaming cards and start anew.

Even if I didn't have my Vega 56, I wouldn't buy this card at all; for one, it still has the same limitations as Fiji. They only increased the clock speed and added a tiny bit of improvement here and there. The only reason I bought my Vega 56 is that it didn't have the dreaded 4 GB limitation of the Fury, so new games won't choke, and I got it cheap after the mining crash.
 
I wonder why AMD is stuck at a maximum of 4096 SPs.
I mean... Fury, Vega (1), Vega II... they are almost identical.

Considering that the new chip is rather small at 331 mm², what stopped them from making, say, a 450 mm² chip and fitting 72 CUs in it, or 96?!
It would wipe the floor with the 2080 Ti: with 6144 SPs (let's say a few are cut for being defective; even with 5760 SPs it would still crush it with raw compute power and that massive 1 TB/s bandwidth), WHILE BEING A SMALLER CHIP thanks to 7nm.

Instead, they just shrunk Fury, then shrunk it again without adding anything :(

Because bigger = lower yields. AMD is all about mass production these days.

So... the rule of interactive entertainment is?
Think little - stay little.
Think big - get BIG.

Isn't that why NVIDIA has money for drivers and AMD doesn't? The money they get from thinking big pays for their drivers, while AMD fails again and again on drivers and still hasn't learned that game developer relations are important. Let me guess: the "solution" to developer relations is more GB/s of memory bandwidth and another 1000 MHz. They never learn. Do it wrong once, you're careless; do it wrong twice, you're a fool; do it wrong a third time, you're insane.
 
Do it wrong once, you're careless; do it wrong twice, you're a fool; do it wrong a third time, you're insane.

What does complaining about driver issues that don't exist make you?
 
Where do you see complaining? And where do you see "don't exist"?

AMD shills? I can respect that; you look like a fighter too. "Fight for your right to game on AMD, kill anyone who looks like they're against AMD"?
But you could use your brain: a game-developer relations program would benefit AMD more than a few more MHz and a few more GB/s of memory bandwidth.
You don't see the advantage of that? For your own good? What does that make you?
How does async compute enabled in all games sound to you? Should I mention how much faster Doom was with async enabled? And that was just one game where it was used... WITHOUT AMD's help... and that's just the beginning. Can you imagine what it could mean if AMD were involved? In all games?
Oh wait, you're a Radeon expert; I'm sorry, you must know more than I do.
 