
AMD "Navi 4C" GPU Detailed: Shader Engines are their own Chiplets

Shame if true. Innovation should always be encouraged. Taking risks is where success lives.
 
"Navi 4C" is a future high-end GPU from AMD that will likely not see the light of day, as the company is pivoting away from the high-end GPU segment with its next RDNA4 generation."

This does not mean in any way that AMD is abandoning the high end. RDNA4 was a much more complex design than RDNA3, with over 2x the number of chiplets. Sources inside AMD have said they were struggling to get the design to work, and performance would have been only minimally improved over RDNA3. Rather than eat tons of resources trying to get it working, which would mean delays not just for RDNA4 but also for RDNA5, they are sticking to the lower-end monolithic N43/N44 designs for RDNA4, which are progressing well and will see huge uplifts in ray tracing. The high end simply shifts to RDNA5.

Given Blackwell is not out until 2025, AMD is not going to be that disadvantaged by being without high-end RDNA4. RDNA5 might be out in late 2025, say 6 months or so after Blackwell, and in the long run that means far stronger competitors to high-end Blackwell.

IMO, as long as they can get N43's 8600-level card to be a much stronger offering, more like a 7700 XT in raster but with much stronger RT and hardware-accelerated FSR3, they'll sell a ton. Being on 3 nm they could pack in a lot more CUs; give it a 192-bit bus, GDDR7 and 12 GB for, say, $299, and that would slay the upcoming 7700 XT.
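
To put rough numbers on that last config, here's a minimal bandwidth sketch in Python. The 32 Gbps GDDR7 pin speed is purely an assumption (final module speeds are unannounced); the 7700 XT's 18 Gbps GDDR6 is included for comparison:

```python
# Peak memory bandwidth: bus width (bits) x pin speed (Gbps) / 8 -> GB/s.
def bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float) -> float:
    return bus_width_bits * pin_speed_gbps / 8

# Hypothetical N43 card: 192-bit bus, assumed 32 Gbps GDDR7.
print(bandwidth_gbs(192, 32))  # 768.0 GB/s
# 7700 XT for comparison: 192-bit bus, 18 Gbps GDDR6.
print(bandwidth_gbs(192, 18))  # 432.0 GB/s
```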

If it's minimally improved over RDNA 3, that means they still can't beat the 4090. But there is no way RDNA 5th gen is coming out in early 2025; that's less than two years from now.
 
There was a shift in GPU segmentation since Turing: the 2080 Ti should've been the 80 card, a hypothetical 3090 Ti Super the 80 card for Ampere, and the 4090 the 80 for Ada. What used to be high end is now ultra enthusiast.

What do you mean?

Ever since Kepler, x80 cards have mainly been using small x04 dies, between 300-400 mm2. The 2080 actually used a big 104 die that was 545 mm2 (because they added RT and tensor cores).

The 3080 used a slightly cut down 102 die, which is what made it such a great value card, one of the best in NVIDIA's history.

And there's no way the 4090 should be the 4080. But at $1200, the 4080 should be using a cut-down 102 die, the 103 die should take the 104's place in the 4070/Ti, and the 104 die should take the 106's place in the 4060/Ti.

What changed with Turing is that they started releasing the 102 die right away, instead of a year later. But it's still a huge die with crazy performance. The shift actually happened with Kepler, because that's when they started using a mid-range chip in a high-end product. That was after the failure of the GTX 280 and 480, which used huge, power-hungry chips with poor performance.
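
To make the segmentation argument concrete, here's the same history laid out as data. Die sizes are approximate, and the mapping reflects this post's claims, not any official tiering:

```python
# x80-class card -> (die it shipped with, approx. area in mm^2).
# Most x80s use an x04-class die in the 300-400 mm^2 range;
# the 2080 (oversized 104) and 3080 (cut-down 102) are the outliers.
x80_dies = {
    "GTX 680":  ("GK104", 294),
    "GTX 980":  ("GM204", 398),
    "GTX 1080": ("GP104", 314),
    "RTX 2080": ("TU104", 545),  # large 104 die due to RT/tensor cores
    "RTX 3080": ("GA102", 628),  # cut-down flagship die, hence the value
    "RTX 4080": ("AD103", 379),  # back under 400 mm^2, but at $1200
}
for card, (die, area) in x80_dies.items():
    print(f"{card}: {die}, ~{area} mm^2")
```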
 
But there is no way RDNA 5th gen is coming out in early 2025; that's less than two years from now.
Where did I say it's coming out in early 2025? I said Blackwell is out in 2025 and RDNA5 might be 6 months behind it, meaning late 2025 at the earliest.
 
Ever since Kepler, x80 cards have mainly been using small x04 dies, between 300-400 mm2. [...]
Performance-wise, not in terms of die size. The 2080 should've been as powerful as the 2080 Ti to justify its 80 name; measured against that hypothetical card, the 3080 wouldn't look so great, and neither would the 4090.
 
Performance-wise, not in terms of die size. The 2080 should've been as powerful as the 2080 Ti to justify its 80 name; measured against that hypothetical card, the 3080 wouldn't look so great, and neither would the 4090.

I could agree and disagree at the same time.

2080 - 9% faster than 1080 Ti
1080 - 31% faster than 980 Ti
980 - 11% faster than 780 Ti
680 - 23% faster than 580

So yes, the improvement over the previous flagship was the smallest, but it had to happen because of RTX. They had to sacrifice raster performance improvement to include RT, and that's the generation they decided to do it.
It was kind of a failure, because there were pretty much no games with RT for a long time, DLSS 1 was garbage, and there were some memory failure issues if I remember correctly.

But now that RT is already here, performance improvements are pretty much back to what they were in the previous decade. Unfortunately they also come with a price increase, but that's another topic.
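
As a side note on reading those percentages: "X% faster" is not symmetric with "X% slower". A quick sketch using the figures above:

```python
# "New card is X% faster" vs. "old card is Y% slower" -- different numbers.
uplifts = {
    "RTX 2080 vs GTX 1080 Ti": 0.09,
    "GTX 1080 vs GTX 980 Ti":  0.31,
    "GTX 980 vs GTX 780 Ti":   0.11,
    "GTX 680 vs GTX 580":      0.23,
}
for pair, u in uplifts.items():
    # If the old card scores 100, the new one scores 100 * (1 + u),
    # so the old card's deficit is 1 - 1/(1 + u), not u itself.
    deficit = (1 - 1 / (1 + u)) * 100
    print(f"{pair}: {u * 100:.0f}% faster, i.e. the old card is {deficit:.1f}% slower")
```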
 
[...] Unfortunately they also come with a price increase, but that's another topic.
Herein lies the problem: we may get 80-series performance, but at Titan prices.

I believe they just increased margins, and with demand from AI it doesn't make sense to waste silicon on gaming GPUs.
 
I believe they just increased margins
Sure they did. First of all, the 3090 and other xx90 products were meant to replace the Titan, with a few features missing but at a lower price point than the TITAN RTX. Then COVID happened and they saw what enthusiasts are willing to pay for top products. The 3080 was only that cheap because of the 6800 XT; don't think for a second they would've priced their big chip (albeit cut down a bit) that low otherwise.
 
New renders. This is what Navi 4C was supposed to look like. Extremely difficult/complex to make possible and actually get working.

[Attached render: 1692286613741.png]

 