
AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

Uhm, where did they claim such a thing? All I can see is the slide claiming 35 games supporting FidelityFX, which does not mean that those 35 games have or will have any form of DXR or RT. FidelityFX has been around for a while now. And yeah, DirectX 12 Ultimate is more than just RT.
It's 5 games at the start. I double-clicked for some reason and didn't fix it, sorry about that.

Cons for AMD:
- 1st gen RT, so it's slower than Ampere's RT
- No DLSS, which gives a huge performance advantage in the second most popular game in the world (Fortnite)
- NVENC is far superior to AMD's implementation. (OBS Streaming software has native support for Nvidia but not AMD)
- Nvidia Broadcast is a huge asset to content creators, no equivalent from AMD
- Way behind on professional applications due to Nvidia's broad CUDA support.
Where did you get this from? Your thoughts?
There is DLSS in the form of Super Resolution. We don't know how it works, but it is there, and we will have to wait and see what it brings.
You don't know if it will be slower than Ampere's RT. The 6000 series will use Microsoft's DXR API with the Ray Accelerators in every CU.
Your way of perceiving this confuses me.
CUDA support :), NVENC :)
It's just the same as if I'd said NV sucks because it doesn't have AMD Cores.
 
Well, everything said about RT performance on Radeon is pure speculation at this point, so don't judge it before there is any evidence! Anyway, it remains to be seen how RT will be adopted in the future; right now it's a goodie used in about 20 games (none of which interests me). The future may bring broader adoption of RT, as the consoles can do it. But even then, the consoles are slower than the 6800 XT, and games will be optimized for consoles (as it's always been), so RT will probably run great on the AMD cards too.

I have tried all the different hardware encoders out there, and none of them were good. To be honest, they're all shit! Sorry, but if you want quality, you have to do it in software, so no reason to go green for me there. The only reason to go for NVENC would be if you had a tiny 4 or 6 core CPU that couldn't handle software encoding. But if you only had such a low-tier CPU, you wouldn't buy a 3080/6800 XT anyway.
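If you want to sanity-check that yourself, the comparison is easy to reproduce with ffmpeg (assuming a build with NVENC and AMF support enabled; input/output names here are placeholders):

# Software (CPU) encode with x264, quality-targeted via CRF
ffmpeg -i input.mp4 -c:v libx264 -preset medium -crf 20 -c:a copy out_x264.mp4

# Nvidia hardware encode (NVENC), constant-quality mode
ffmpeg -i input.mp4 -c:v h264_nvenc -preset slow -cq 20 -c:a copy out_nvenc.mp4

# AMD hardware encode (AMF)
ffmpeg -i input.mp4 -c:v h264_amf -quality quality -c:a copy out_amf.mp4

Compare the outputs and judge for yourself; the hardware encoders mainly buy you low CPU load, not quality.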

DLSS is absolutely overhyped. First of all, there is only a handful of games that support it. Then, the difference to native rendering is absolutely visible even with 2.0; it's just a blur. And, last but not least, it only makes sense for absolute hardcore esports guys, because the 3080 and 6800 XT seem to have no problems rendering at 4K/60 fps or 1440p/120 fps, so it's really only a thing for competitive 4K/120 fps+ gamers.

While I respect your opinion, for $50 more, I get all these features. At this price bracket, $50 is not much. 5900X and 3080 for me.
 
If AMD is going to compare a 6900 XT with Rage Mode (overclocked), surely they should compare it to an overclocked 3090 for a fair comparison.

Or just not use Rage Mode. You can't get a proper idea of performance, although they "are" just slides.

To be fair, if they are right and it's a one-click feature, it's basically PBO for GPUs. No one seems to have a problem with PBO, so you have that.

What uhh, what titles out there support AMD's ray tracing right now?

DXR is hardware agnostic; AMD just has to provide a back-end implementation.
 
It's 5 games at the start. I double-clicked for some reason and didn't fix it, sorry about that.


Where did you get this from? Your thoughts?
There is DLSS in the form of Super Resolution. We don't know how it works, but it is there, and we will have to wait and see what it brings.
You don't know if it will be slower than Ampere's RT. The 6000 series will use Microsoft's DXR API with the Ray Accelerators in every CU.
Your way of perceiving this confuses me.
CUDA support :), NVENC :)
It's just the same as if I'd said NV sucks because it doesn't have AMD Cores.

Marketing rule number 1: always show your best.

Yesterday's Radeon presentation clearly indicates they have matched Ampere's raster performance. They did not show RT numbers, and Super Resolution is something they are still working on. If the RT were as good as Ampere's, they would have shown numbers. Simple deduction.
 
And if it's not what you need in terms of noise, slap a water cooler on it and you're done. Water cooling always has better efficiency compared to air, to be honest, and because of that you can hold the boost clocks longer. I'm glad AMD has brought not one but two generations' leap forward that competes with the 3090, which costs $1,500 compared to $999. 24 GB of VRAM isn't a real reason to pay $500 more, considering Nvidia pays like $4 per extra memory chip. The margins are huge.
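As a rough sanity check on that margin claim (taking the ~$4-per-chip figure above at face value and assuming 1 GB memory modules): 24 GB − 16 GB = 8 extra chips, so about 8 × $4 ≈ $32 of extra memory cost set against a $500 price difference.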

That is exactly why I picked reference cards for the Vega 64, and well, the Radeon VII was only available as reference. :rolleyes:
I replaced the cooler with a water block and applied an undervolt, which helped keep the boost clock higher for much longer in games, giving higher gaming performance.
With the water block, the temperatures were great for both the Vega 64 and the Radeon VII, with a maximum of 40°C at full load. :D

The results from my benchmarks for both of these cards, tuned with undervolting and water cooling, are available in the Owners sections of the TPU forum for the respective cards. ;)
 
To be fair, if they are right and it's a one-click feature, it's basically PBO for GPUs. No one seems to have a problem with PBO, so you have that.



DXR is hardware agnostic; AMD just has to provide a back-end implementation.

So DXR works in all RTX titles? I assume no one really knows yet, but these games have "RTX: On" not "DXR: On"
 
Marketing rule number 1: always show your best.

Yesterday's Radeon presentation clearly indicates they have matched Ampere's raster performance. They did not show RT numbers, and Super Resolution is something they are still working on. If the RT were as good as Ampere's, they would have shown numbers. Simple deduction.
"Best" doesn't mean there is only one way of doing it. Sure, but this doesn't change a thing.
There aren't many games supporting DXR 1.0 and 1.1 at the moment. When NV released RTX there were none. So maybe give it some time for the cards to launch and then form an opinion when they are tested? AMD will host another presentation with a deep dive into the RDNA2 architecture. Expect this to be revealed then.
So DXR works in all RTX titles? I assume no one really knows yet, but these games have "RTX: On" not "DXR: On"
Well, NV was first to implement RT, so RTX it is, but that doesn't mean it won't change to DXR if AMD joins the club.
 
Marketing rule number 1: always show your best.

Yesterday's Radeon presentation clearly indicates they have matched Ampere's raster performance. They did not show RT numbers, and Super Resolution is something they are still working on. If the RT were as good as Ampere's, they would have shown numbers. Simple deduction.

When they showed those numbers back at the Ryzen event, that should have been from a 6900 XT, right? Except it wasn't; they didn't show their best.

So DXR works in all RTX titles? I assume no one really knows yet, but these games have "RTX: On" not "DXR: On"

There is no such thing as "enabling RTX"; it's a marketing trick Nvidia pulled to make it seem like it's their technology, and they seem to have pulled it off successfully. DXR is the feature that enables ray tracing; RTX is just the back-end implementation. There is no RTX API or anything like that.
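You can see that at the API level, too: a DX12 engine never asks the driver for "RTX", it queries the device for a DXR tier. A minimal sketch using the standard D3D12 feature check (error handling trimmed):

#include <d3d12.h>

// Ask the D3D12 device (any vendor) whether it supports DXR, and at what tier.
bool SupportsRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // TIER_1_0 is DXR 1.0; TIER_1_1 adds inline ray tracing (DXR 1.1).
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

The same query works on any vendor's driver that implements DXR, which is exactly why "RTX On" is branding rather than a separate API.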
 
Last edited:
While I respect your opinion, for $50 more, I get all these features. At this price bracket, $50 is not much. 5900X and 3080 for me.
It is not $50 more. There ARE no 3080s for €700. There never were $700 FE cards on the market for private customers!
The RTX 3080 for $700 was a big fat lie!

Since the Xbox uses DirectML, Super Resolution will be in most new games, don't worry!
 
It is not $50 more. There ARE no 3080s for €700. There never were $700 FE cards on the market for private customers!
The RTX 3080 for $700 was a big fat lie!

Since the Xbox uses DirectML, Super Resolution will be in most new games, don't worry!
It's not just the price that is higher. The cards are nowhere to be found. In my area, these cards' availability is January.
 
It is not $50 more. There ARE no 3080s for €700. There never were $700 FE cards on the market for private customers!
The RTX 3080 for $700 was a big fat lie!

Since the Xbox uses DirectML, Super Resolution will be in most new games, don't worry!

I expect the same for the 6800 XT; $649 will be another lie.

It's not just the price that is higher. The cards are nowhere to be found. In my area, these cards' availability is January.
That sucks. I should be getting mine in mid-November.
 
I expect the same for the 6800 XT; $649 will be another lie.
That depends on availability, not on whether somebody lied.
NV availability was non-existent, and that bumped the price. Nobody says NV lied about the MSRP, and likewise AMD isn't lying. Pull yourself together, bro :)
That sucks. I should be getting mine in mid-November.
If you get it, good for you :) Pre-orders are not my type of thing.
 
I expect the same for the 6800 XT; $649 will be another lie.


That sucks. I should be getting mine in mid-November.
Doubtful. I bought a Polaris and a Vega at AMD's day-one MSRP, and if not, I wouldn't just pay a liar anyway.
 
I expect the same for the 6800 XT; $649 will be another lie.
As I said, it remains to be seen. Right now you are just making assumptions.
Fact is, Nvidia lied to us. And they keep lying to us when they say RTX 3000 availability is not a supply issue.
Another fact is, it's not the first time they broadly lied to their customers (remember the GTX 970?).

For AMD, at least they have a record of not falsifying their bar graphs. As for availability, well, we'll see. But looking at their choice to use regular GDDR6 and make up for the lack of bandwidth with cache lets me hope that there will be more cards on launch day.
 
What's with DLSS/RT? I feel they're like async compute, which fanboys made overhyped. Come back in 5 years or more, then we can talk about ray tracing/super resolution. I remember:

1) Ryzen 1xxx/2xxx couldn't reach higher clocks, so Intel was better, until Ryzen 3xxx came out.
2) Ryzen 3xxx had higher memory latency, so Intel was best for gaming, until AMD announced Ryzen 5xxx.
3) Ryzen 5xxx has matched it, and people are looking for something else?
 
What's with DLSS/RT? I feel they're like async compute, which fanboys made overhyped. Come back in 5 years or more, then we can talk about ray tracing/super resolution. I remember:

1) Ryzen 1xxx/2xxx couldn't reach higher clocks, so Intel was better, until Ryzen 3xxx came out.
2) Ryzen 3xxx had higher memory latency, so Intel was best for gaming, until AMD announced Ryzen 5xxx.
3) Ryzen 5xxx has matched it, and people are looking for something else?

At least DLSS allows cards to age more gracefully; it'd be like having a 780 Ti right now that managed 4K60 (while really running the game at like 720p).
I see *that* tech, in either implementation, as a massive deal going forward.
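To put rough numbers on that idea (per-axis render scales as commonly reported for DLSS 2.0's modes; treat them as approximations), here's a quick sketch:

#include <cstdio>

int main()
{
    const int outW = 3840, outH = 2160;  // 4K output
    // Approximate per-axis render scales for DLSS 2.0 modes
    // (Quality ~67%, Balanced ~58%, Performance 50%, Ultra Performance ~33%).
    const struct { const char* name; double scale; } modes[] = {
        { "Quality",           2.0 / 3.0 },
        { "Balanced",          0.58      },
        { "Performance",       0.50      },
        { "Ultra Performance", 1.0 / 3.0 },
    };
    for (const auto& m : modes) {
        const int w = (int)(outW * m.scale + 0.5), h = (int)(outH * m.scale + 0.5);
        printf("%-17s renders %dx%d (%.0f%% of the output pixels)\n",
               m.name, w, h, 100.0 * (w * (double)h) / (outW * (double)outH));
    }
}

Ultra Performance at 4K comes out to 1280x720, i.e. the "really running at like 720p" case, while shading only about 11% of the output pixels.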
 
At least DLSS allows cards to age more gracefully; it'd be like having a 780 Ti right now that managed 4K60 (while really running the game at like 720p).
I see *that* tech, in either implementation, as a massive deal going forward.
The problem is you can't get DLSS in every game. Even after 5 years, DLSS will be implemented only in famous games.

Edit: I forgot to mention it: Hitman 2.
 
At least DLSS allows cards to age more gracefully; it'd be like having a 780 Ti right now that managed 4K60 (while really running the game at like 720p).
I see *that* tech, in either implementation, as a massive deal going forward.
I doubt that. As you may know, having cards age gracefully is not in the interest of tech companies. They need to sell new hardware, not support old hardware. So right now they'll give you DLSS 2.0 support for current games, and next year's games will get DLSS 3.0 support, but sooner or later older cards won't support future DLSS versions, so current cards will be limited to the DLSS versions of their corresponding generation.
 
As I said, it remains to be seen. Right now you are just making assumptions.
Fact is, Nvidia lied to us. And they keep lying to us when they say RTX 3000 availability is not a supply issue.
Another fact is, it's not the first time they broadly lied to their customers (remember the GTX 970?).

For AMD, at least they have a record of not falsifying their bar graphs. As for availability, well, we'll see. But looking at their choice to use regular GDDR6 and make up for the lack of bandwidth with cache lets me hope that there will be more cards on launch day.

Nvidia's share of the dGPU market is 80%. Of course the demand is much higher for Nvidia than AMD; it's not a fair comparison. Moreover, Nvidia created so much hype around the Ampere launch that demand was unprecedented. Gamers Nexus' Steve investigated in one of his videos whether it's a supply issue or a demand issue. According to him, it's a demand issue.
 
Marketing rule number 1: always show your best.

Yesterday's Radeon presentation clearly indicates they have matched Ampere's raster performance. They did not show RT numbers, and Super Resolution is something they are still working on. If the RT were as good as Ampere's, they would have shown numbers. Simple deduction.
I don't think anyone here is arguing that RDNA 2 will match Ampere in RT performance, but also going by simple deduction: given that the RT performance of RDNA 2 is good enough for both major console manufacturers, it can't be that far behind. Console makers are fully aware of the RT capabilities of Turing (and the issues with these), so I would expect them to demand at least parity with Turing for RT on a console to be even remotely viable.
So DXR works in all RTX titles? I assume no one really knows yet, but these games have "RTX: On" not "DXR: On"
That's because RTX has so far been the only existing implementation of DXR, so Nvidia has had free rein to market it as they see fit. RTX is in effect nothing more than a sticker on top of DXR.
I expect the same for the 6800 XT; $649 will be another lie.
There's little reason to expect this - while this is no doubt a very large die, unlike the GA102 it's made by the best foundry around and on a tried and tested process node. Yields should be much better than for Nvidia, and while there's still no fab capacity to spare on TSMC 7nm, AMD has likely bought up all the wafer starts they possibly could in the run-up to RDNA 2 and Ryzen 5000 (not to mention the new consoles). Of course a lot of capacity also opened up a while back when Apple moved to 5nm, which AMD likely grabbed a decent chunk of.

Will availability be tight? No doubt - the gaming market has grown massively in the last few years, so interest in and demand for these products is higher than ever, and production capacity hasn't really increased to match (no wonder, seeing how building a fab costs billions of dollars and takes at least 5 years). It stands to reason that supply will be tighter than before. But I don't expect supply for these to be as tight in relation to demand compared to the RTX 3000-series.
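For what it's worth, you can back-of-the-envelope the yield argument with the classic Poisson yield model, yield ≈ exp(−area × defect density). The die areas below are the commonly reported figures; the defect densities are purely illustrative assumptions, not published numbers:

#include <cmath>
#include <cstdio>

// Poisson yield model: fraction of defect-free dies = exp(-area * D0).
double poissonYield(double dieAreaMm2, double defectsPerCm2)
{
    return std::exp(-(dieAreaMm2 / 100.0) * defectsPerCm2);  // mm^2 -> cm^2
}

int main()
{
    // Reported die areas; the D0 values are guesses for illustration only.
    printf("Navi 21, TSMC N7   (~520 mm^2, D0 = 0.10/cm^2): %.0f%% good dies\n",
           100.0 * poissonYield(520.0, 0.10));
    printf("GA102, Samsung 8nm (~628 mm^2, D0 = 0.30/cm^2): %.0f%% good dies\n",
           100.0 * poissonYield(628.0, 0.30));
}

With those assumptions you'd get roughly 59% vs. 15% defect-free dies, which is the "yields should be much better" point above put into (made-up but plausible) numbers.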
At least DLSS allows cards to age more gracefully; it'd be like having a 780 Ti right now that managed 4K60 (while really running the game at like 720p).
I see *that* tech, in either implementation, as a massive deal going forward.
That's an excellent point, but it also needs a universal driver-side implementation (one that doesn't require developer intervention) to work like that. AMD is promising their alternative to be more in that direction (though I'd be surprised if it was as good as Nvidia's in terms of image quality), and even cross-platform (so any developer effort would be applicable also to competing GPUs, making adoption more likely). It'd also be really interesting to see if consoles would adopt this.


By the way, looking forward to when the new consoles are out and the RX 6700 series appear: it will be really cool to see someone do a comparative review with the more traditional wide VRAM buses of the consoles vs. the narrow bus+cache solution for RDNA2 dGPUs.
 
DLSS is absolutely overhyped. First of all, there is only a handful of games that support it. Then, the difference to native rendering is absolutely visible even with 2.0; it's just a blur.

You clearly don't have a card supporting it, because your assumptions are absolutely wrong.
DLSS works very well, and it is definitely NOT just blur.

The only true point is the limited support in games.
 
The problem is you can't get DLSS in every game. Even after 5 years, DLSS will be implemented only in famous games.

Edit: I forgot to mention it: Hitman 2.

Famous games have the most players, and more players means Nvidia can sell more cards. Which card do you think a Fortnite player will choose: a card without DLSS or a card with DLSS??
 
Famous games have the most players, and more players means Nvidia can sell more cards. Which card do you think a Fortnite player will choose: a card without DLSS or a card with DLSS??
You are missing the point. If NV implements DLSS only in some games, that means DLSS is harder to implement, and even though it is such a good feature, not all games will have it. That is not good from a player's perspective. If AMD's Super Resolution can be implemented more easily, support for it will be better, and no matter which game you play, you will have that feature. Development will also be easier for game developers, meaning support will be better.
 
Famous games have the most players, and more players means Nvidia can sell more cards. Which card do you think a Fortnite player will choose: a card without DLSS or a card with DLSS??
And with a game that doesn't support DLSS, do you think a player chooses a card with DLSS or a card with more raw power? What I wanted to say is that DLSS support is limited.
AMD left a lot of clock headroom for AIB cards. I'm sure RDNA2 AIB cards can reach 2400 MHz or more with water cooling.
 
The problem is you can't get DLSS in every game. Even after 5 years, DLSS will be implemented only in famous games.

Edit: I forgot to mention it: Hitman 2.
DLSS 2.0 (the previous version was quite crap) was released in March 2020...
 