
CHOO CHOOOOO!!!!1! Navi Hype Train be rollin'

What about them? There is an odd case of having better performance due to 1GB of extra VRAM, but that is really all she wrote, and it has little to do with the argument, which is that, once again, we are hearing repackaged AMD fine wine here. We know this doesn't really exist. In the larger scheme of things a 680 and a 7970 are equally obsolete and relegated to the budget/low-end performance level.

It needs no discussion that GCN is lacking the efficiency it needs to compete. That will only get worse as die space and power budget are limited. The best architecture is the one that can keep clear of these limitations. The moment you touch them on the current node is a sign you are getting behind the curve, and AMD has ignored those signs since 2013. Nvidia offers us an architectural update every time they risk having to move up from their cut-down big die. The only time we got the full Titan in a consumer chip was with the 780 Ti, and only because Hawaii existed.


Hawaii was a worse-performing graphics die but had potential for compute, based on what AMD saw as the limiting factors of Tahiti. It was also aimed at a node shrink which didn't happen, and it took a mediocre chip and made it more bland, with only 17% more performance than a 7970 GHz Edition while using 13% more power too.
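Put differently, here's a quick back-of-the-envelope check (a sketch in Python using the approximate figures above, not exact review numbers):
[code]
# Rough perf-per-watt check using the ballpark figures quoted above:
# +17% performance and +13% power for Hawaii vs. the 7970 GHz Edition.
perf_gain = 1.17    # relative performance (approximate)
power_gain = 1.13   # relative board power (approximate)

perf_per_watt_gain = perf_gain / power_gain - 1
print(f"Perf/W improvement: {perf_per_watt_gain * 100:.1f}%")  # ~3.5%
[/code]
In other words, almost all of the extra performance was bought with extra power, which is exactly the scaling problem being described.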

Adding more streaming processors made it worse, again due to poor management of resources, but they worked great at actual streaming loads like compute (mining, anyone?).
 
It needs no discussion that GCN is lacking the efficiency it needs to compete. [...]
It needs no discussion here; he mentioned that stuff, I replied. No more off-topic drivel on old tech will come from me here (though you clearly won't stop). Check my reply to him and move on.


Perhaps read the OP, both you and him.

Personally, I think you and him are just trolling for an argument; I can't believe how many times you want to argue about the same shit in so many different threads.
 
Amazing, all it takes to get the hype train rolling again is one random guy on YouTube saying AMD's next gen may end up faster than Nvidia's current gen. Did anyone even bother to check out the source?

Navi 1x and 2x are the same architecture and will share performance characteristics. The differences are feature sets and core configurations.
 
The so-called “RTG Knight” AdoredTV himself is starting to trash Navi in his latest video



Some TL;DW from reddit user u/WinterCharm


Detailed TL;DW (it's a 30-minute video):

The first half of the video discusses the possibility of Navi being good, mainly by talking about the advantage of a new node vs an old node, and theoretical improvements (AMD has made such strides before, for example matching the R9 390 with the RX 580, at lower power and cost). Then it discusses early rumors of Navi and how they were positive, so people's impressions have been positive up until now, despite some nervousness about the delay.

Now, the bad news:

1. Very early samples looked promising, but AMD hit a clock-speed wall that required a re-tape, hence missing the CES launch.
2. February reports said Navi was unable to match Vega 20 clocks.
3. March reports said clock targets were met, but thermals and power are a nightmare.
4. April: a Navi PCB leaked. It could be an engineering PCB, but 2x 8-pin = up to 375 W (ayyy, GTX 480++) power draw D:
5. Most recently, AdoredTV got a message from a known source saying "disregard faith in Navi. Engineers are frustrated and cannot wait to be done!"

The possible product lineup shown in this table is the "best case scenario" at this point. Expect worse.

RIP Navi. We never even knew you. :(

It's quite possible that RTG will be unable to beat the 1660Ti in perf/watt on a huge node advantage (7nm vs 12nm)

Edit: added more detail. Hope people don't mind.



It's quite possible that RTG will be unable to beat the 1660Ti in perf/watt on a huge node advantage


Let that sink in.


Nvidia on 7nm is going to be a bloodbath.
 
The so-called “RTG Knight” AdoredTV himself is starting to trash Navi in his latest video [...]


Here's some light on where RTG is.
 
The so-called “RTG Knight” AdoredTV himself is starting to trash Navi in his latest video [...]
Sadly, it all makes sense. AMD *really* needs to go back to the drawing board. Start from scratch.
 
Sadly, it all makes sense. AMD *really* needs to go back to the drawing board. Start from scratch.

Arcturus is the change, but it sounds like it will be an Instinct card.
 
33:30 is where the TL;DR is and only Navi 10 is expected any time soon (Q3). Copied verbatim:
[table="head"]Card|Chip|CUs|Memory|Performance|TDP|Price
RX 3080 XT|Navi 10|56||~RTX 2070|190W|$330
RX 3080|Navi 10|52|8GB GDDR6|Vega 64 + 10%|175W|$280
RX 3070 XT|Navi 10|48||Vega 64|160W|$250[/table]
Sounds to me like the first batch of cards performed fantastically, but then they discovered they performed fantastically because something was broken. Now they've fixed that something and it's just incremental GCN, nothing fantastic. Hopefully they learned something from the mistake that can help improve Arcturus, but I won't hold my breath.

Arcturus is the change, but it sounds like it will be an Instinct card.
Navi 20 is coming in 2020 (apparently using HBM2) and it will be replacing Radeon VII and debuting as a Radeon Instinct card. It's not clear what Arcturus is at this point (other than beyond Navi).


The tonal difference in what Lisa Su told investors is very telling, I think. She not only knows it's worse than expected, but also knows RTG is struggling. I hope she has a plan that works to make RTG competitive again on the technology front. With Intel entering the fray soon, AMD can't afford to dilly dally with GCN much longer.
 
33:30 is where the TL;DR is and only Navi 10 is expected any time soon (Q3). Copied verbatim:
[table="head"]Card|Chip|CUs|Memory|Performance|TDP|Price
RX 3080 XT|Navi 10|56||~RTX 2070|190W|$330
RX 3080|Navi 10|52|8GB GDDR6|Vega 64 + 10%|175W|$280
RX 3070 XT|Navi 10|48||Vega 64|160W|$250[/table]
These 3 cards are too close together in performance, power and price to be viable. The middle one is pointless.
On a more positive note, 2070 perf for $330 is pretty nice (see the rough perf-per-dollar sketch at the end of this post). NV can always drop prices, but those chips are massive and I doubt they'll sell them at a loss.

Now whether Navi hits those targets or not is to be seen. The lack of any proper info from AMD is worrisome, and I'm guessing it'll be just another Polaris. Good enough but not great. Then again with the amount of money AMD has been pouring into RTG, or better yet, lack thereof, what are we to expect anyway.

Another thing to bear in mind is that the entire Navi lineup will be out in 2020, so I personally wouldn't expect Arcturus (or whatever this next non-GCN arch will be called) before 2021...
In the meantime NV can just shrink Turing to 7nm and continue to rip us off.
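And here's the rough perf-per-dollar sketch mentioned above. The ~$499 RTX 2070 price is my own assumption and the Navi side is just the rumored table entry, so treat it as illustration only:
[code]
# Hypothetical perf-per-dollar comparison based on the rumored lineup above.
# Performance is normalized to RTX 2070 = 1.0; the $499 RTX 2070 price is an
# assumption for illustration, not a figure from the video.
cards = {
    "RTX 2070 (actual)":  {"perf": 1.00, "price": 499},
    "RX 3080 XT (rumor)": {"perf": 1.00, "price": 330},
}

for name, c in cards.items():
    print(f"{name}: {1000 * c['perf'] / c['price']:.2f} perf per $1000")
# 2.00 vs 3.03 -- the rumored card would offer roughly 50% more performance per
# dollar at those prices, which is why "2070 perf for $330" sounds so tempting.
[/code]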
 
Yeah, Adored said the numbers I copied are likely optimistic.

Navi was funded by Sony, so it should be a better overall architecture than Vega in terms of features and support, but where Vega translated fantastically to 7nm, Navi didn't. The delays are likely to improve yields, performance, and leakage.

What was to replace GCN likely started R&D immediately after Raja Koduri left at the end of 2017. New ISAs take about five years to develop, so 2022 or 2023 is the soonest we'll see it. I don't think that's Arcturus because it's way too soon (remember that Arcturus has been delayed because of the Navi delays). GCN's replacement could be what immediately follows Arcturus.
 
These 3 cards are too close together in performance, power and price to be viable. The middle one is pointless.
These may be release candidates. That's how you make products: test multiple versions, analyze, research the market and launch the few that make the most economic sense.
For AMD it's natural to make a "candidate" every 4 CUs.
On a more positive note, 2070 perf for $330 is pretty nice.
Maybe today, but not in 2020-2021 - the period when this card is going to be offered.
NV can always drop prices but those chips are massive and I doubt they'll sell them at a loss.
NV sells them with a big margin now, something they can always sacrifice if needed.
And 2020 will be halfway (or more) to another gen with similar performance/price to these virtual Navi chips. Just real.
Then again with the amount of money AMD has been pouring into RTG, or better yet, lack thereof, what are we to expect anyway.
You're a client. Why would you care how much R&D money Radeon has? It's AMD's business strategy. You should only look at what they sell you.

Basically, many AMD fans say something like this: objectively Radeon GPUs are sh*t, but AMD is small, poor and doesn't give a f*ck, which makes Radeon GPUs great.
In the meantime NV can just shrink Turing to 7nm and continue to rip us off.
I don't understand how you can talk about RTG lacking an R&D budget and then call Nvidia a rip-off in literally the next paragraph. Where is the R&D money coming from in your universe?
So what you meant here was: NV can shrink Turing and continue to sell a leading product - at a premium for being the only company that actually gives a f*ck (at least until Intel joins the race).
 
Yeah, Adored said the numbers I copied are likely optimistic. [...]
Here's one of my theories.
We have heard that Navi was supposed to be a lightweight GCN, with all the compute-heavy elements removed in favor of pure gaming performance. The PS5 is a gaming console after all, and while it's still old GCN at heart, at least it will not be as inefficient as Vega (when it comes to gaming).
So what if, while removing/changing all those quirks and features, they somehow screwed up, and now they can't hit clocks as high as V20?

Sony probably doesn't care; Navi inside the PS5 will definitely not clock as high as 1800 MHz.
That's why we heard rumors that Navi looks better than expected. Well yeah, for a ~1500MHz chip. But when you try to push it, it fails hard.
Which brings me to this.
Could we see a damage-control card? Take Polaris, buff it up, shrink it with some minor generational improvements to GCN, and call it a day...
I'm probably wrong, but it's something to ponder.
 
The so-called “RTG Knight” AdoredTV himself is starting to trash Navi in his latest video [...]
He pumped up the expectations so high they went through the roof; he reaps what he sowed.

They did blind tests with Vega; now they'll be doing deaf ones too :roll:
 
Navi cannot be a total failure imho. It just might not be as much more efficient than the latest nVidia GPUs as it should be, given the 7nm process. VFM will determine its success or failure as a product. For us enthusiasts it might be a mediocre product though. As customers we need competition to force the companies to lower prices. Let's wait and see.
 
Honestly, why did anyone ever think the first iteration(s) of Navi were going to be a massive performance jump? The arch was never touted as being that; first and foremost, it was going to finally get Vega's inefficiency and shitty margins out of the gaming segment. It seems to do that. And only that.

So really, we have no news here, just people who have been pulled into wishful-thinking mode and are now kicked back to reality.

Navi cannot be a total failure imho. It just might not be as much more efficient than the latest nVidia GPUs as it should be, given the 7nm process. VFM will determine its success or failure as a product. For us enthusiasts it might be a mediocre product though. As customers we need competition to force the companies to lower prices. Let's wait and see.

Well... I think a lack of commercial success does count as a total failure for most companies that like to sell product. AMD's been looking at that for a looong time now. Radeon VII is just the same: they move units, but they don't make a profit. Only when the mining craze was at its peak did they have some worthwhile margins... on old product. The consoles are really their best segment, and it's clear these chips are geared towards that before anything else.
 
4. April: a Navi PCB leaked. It could be an engineering PCB, but 2x 8-pin = up to 375 W (ayyy, GTX 480++) power draw D:

Traced for 2x 8-pin does not mean it will definitively have 2x 8-pin. By that logic all of nVidia's previous flagship reference boards were 450 W cards, since they had 2x 8-pin and 1x 6-pin on the PCB.
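For reference, this is how those connector ceilings are usually tallied (nominal PCIe spec limits only; the actual board power limit is set in firmware, not by which connectors are soldered on):
[code]
# Nominal power budget per source under the PCIe spec (ceilings, not measured draw).
PCIE_SLOT = 75   # W from the x16 slot
PIN_6     = 75   # W per 6-pin connector
PIN_8     = 150  # W per 8-pin connector

navi_leak_board = PCIE_SLOT + 2 * PIN_8          # 2x 8-pin            -> 375 W ceiling
nv_flagship_ref = PCIE_SLOT + 2 * PIN_8 + PIN_6  # 2x 8-pin + 1x 6-pin -> 450 W ceiling

print(navi_leak_board, nv_flagship_ref)  # 375 450
[/code]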
 
Sony probably doesn't care; Navi inside the PS5 will definitely not clock as high as 1800 MHz.
That's why we heard rumors that Navi looks better than expected. Well yeah, for a ~1500MHz chip. But when you try to push it, it fails hard.

Every architecture hits a power wall; welcome to the world of integrated circuits. Come on people, let's stop pretending this stuff is new.

Traced for 2x 8-pin does not mean it will definitively have 2x 8-pin.

Yeah but we better spew some sensationalist nonsense while we can because it's cool.
 
Every architecture hits a power wall; welcome to the world of integrated circuits. Come on people, let's stop pretending this stuff is new.
I'm not pretending anything; I simply drew the parallel between V20, which can hit those clocks, and Navi, which apparently can't. Because let's face it, it's all GCN.
Furthermore, shouldn't, in theory, a simpler arch (Navi) reach higher clocks than the compute-heavy V20?
But for the sake of argument let's put Vega aside for a moment. Navi is a Polaris successor after all.
Let's take the RX 580. It hits a wall at around 1600 MHz with (arguably) reasonable power draw. Now let's put Navi at maybe 1700 MHz. Okay-ish but nothing mind-blowing, right? But you also went from GloFo 14nm to TSMC 7nm. Plus the high power draw and temps that get mentioned as well. Now that's definitely worrisome.
So that takes me back to my previous statement that they probably screwed something up while slimming down the architecture, OR TSMC's 7nm is not the be-all-end-all process we hoped for.
Like I said, this is me thinking out loud (rough numbers sketched below).
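To put some purely illustrative numbers on that (the ~20% iso-power frequency gain for the node jump is a placeholder assumption on my part, not a confirmed foundry figure):
[code]
# Illustrative only: what a "node-alone" clock uplift might look like next to the
# rumored Navi clocks. The 20% speed gain for 14nm -> 7nm is a placeholder.
rx580_wall = 1600            # MHz, roughly where the RX 580 tops out (from the post above)
assumed_node_speedup = 1.20  # hypothetical iso-power frequency gain from the node alone

expected_from_node = rx580_wall * assumed_node_speedup
rumored_navi = 1700          # MHz, the ballpark figure discussed above

print(f"node-alone projection: ~{expected_from_node:.0f} MHz, rumored: ~{rumored_navi} MHz")
# ~1920 MHz projected vs ~1700 MHz rumored: if both numbers held, the uplift would be
# smaller than the node alone should deliver, which is exactly the worry above.
[/code]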
 
Navi cannot be a total failure imho.
Why would it be a failure?
AMD holds 20% of the dGPU market. You can't call AMD GPUs failures if you praise their CPUs.

It's almost impossible for it to be worse than the last Polaris cards. It'll be OK for gaming, and as the chips get bigger and TSMC's node gets better, Navi will get to 4K @ 60 fps. Maybe even with hardware RT similar to Nvidia's.

But that's one side of the coin. The other is delivering on expectations.
Literally from the day Vega launched and turned out to be somewhat disappointing, countless AMD fans started saying that it's just an interim step. That Navi will be the "Zen moment" for Radeon. Well... here we are, almost 2 years later. And even before the Navi reviews came out, the narrative moved to next-gen Arcturus...
 
Furthermore, shouldn't, in theory, a simpler arch (Navi) reach higher clocks than the compute-heavy V20?

No, it shouldn't; that's the point. There are endless ways to make a simpler architecture scale badly with clocks. It's all about the implementation; you don't know how AMD dealt with that and neither do I.

Let's take the RX 580. It hits a wall at around 1600 MHz with (arguably) reasonable power draw. Now let's put Navi at maybe 1700 MHz. Okay-ish but nothing mind-blowing, right? But you also went from GloFo 14nm to TSMC 7nm. Plus the high power draw and temps that get mentioned as well. Now that's definitely worrisome.

None of that means anything without context; maybe the RX 580/590 replacement draws more power, but maybe it is also notably faster.

I don't understand you people at all. What did you hope Navi would be? A high-performance, lower-power, cheap design? If so, get ready to be disappointed for the rest of your life, because you'll never see that, not from AMD nor anyone else. Free lunches in chip design do not exist anymore; you will always trade something for an improved metric somewhere else. Look at Turing: plenty fast and power efficient, right? Yeah, but it's huge and it costs a lot, for Nvidia and for you.

Regardless, good luck with this skewed perception of this industry.
 
I don't understand you people at all. What did you hope Navi would be? [...]
Why are you acting so triggered? Everything I said was simply for the sake of discussion, not to mention that all of the speculation is based on wild rumors. Yet you managed to take everything I said as fact...
 
Why are people getting triggered by rumours about pre-production chips?

Let's look at the shit we know for sure. GCN hardware, so it's going to be better suited to compute than gaming. Smaller node, so more heat in a smaller area. AMD doesn't have the same efficiency, so they will need 300 W to match NV's 200 W cards.

If they price it right they will still sell like hot cakes imo. People who cry about power draw and still use desktop parts are hypocrites.
 
People who cry about power draw

People don't really cry about power draw; they cry about perf/watt. This is true everywhere, and especially on mobile with its cooling/power restrictions. As we reach the end of a node, power draw becomes a crucial part of the equation; it's exactly what AMD has been struggling with for the last 5-7 years.
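A trivial illustration of why the ratio, not the raw wattage, is what people actually argue about (the numbers are made up for the example):
[code]
# Made-up numbers purely to illustrate perf/W versus raw power draw.
cards = {
    "Card A": {"fps": 100, "watts": 200},
    "Card B": {"fps": 100, "watts": 300},  # same framerate, 50% more power
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.2f} fps per watt")
# Card A: 0.50 fps/W, Card B: 0.33 fps/W -- identical performance, but B needs a
# bigger cooler, PSU and power budget, which is the actual complaint.
[/code]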
 
I just want David Wang to join Intel now; then the old ATi will be reborn inside Intel. Intel’s dGPU cannot arrive soon enough.
 