
AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

Here are the ray-tracing numbers. So far the 6900 XT is probably about equal to an RTX 3070.
Rumours do point to 50% better than a 2080 Ti.
I'm not sure how the 3070 stands here.
 
No mention of DXR performance at all. For all we know, you flick that setting on in Cyberpunk 2077 and it'll drop down to 2060 performance LOL!
According to Digital Foundry, PS5's raytracing performance is similar to RTX 2060 Super.
 
Hardware encoding
I like Turing NVENC. I use it very often to offload encoding from the CPU, with low GPU utilization and results that are usually good enough for my needs. I don't think AMD plans to push hardware encoders; they already market their multi-core CPUs as the CPUs for content creators, and software encoding has similar or better results than NVENC. Those multi-core AMD processors are probably also the reason NVIDIA kept the Turing NVENC on the Ampere cards.
You clearly haven't seen streams that use NVENC. It looks as good as anything a CPU could put out, and I doubt even a 12-core CPU using 8 of its cores for encoding would look any better.
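If anyone actually wants to settle this instead of arguing, a minimal sketch along these lines produces a GPU encode and a CPU encode of the same clip for a side-by-side look. It assumes ffmpeg is on the PATH and was built with NVENC support; "input.mp4" is just a placeholder file.

```python
# Rough sketch, not a benchmark: encode the same clip once on the GPU and
# once on the CPU so the outputs can be compared directly.
import subprocess

def encode(video_codec: str, out_file: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", video_codec, "-b:v", "6M",  # same bitrate for a fair fight
         "-c:a", "copy",                     # leave audio untouched
         out_file],
        check=True,
    )

encode("h264_nvenc", "out_gpu.mp4")  # NVIDIA's hardware encoder
encode("libx264", "out_cpu.mp4")     # software x264 on the CPU
# (AMD users: "h264_amf" plays the same role as h264_nvenc.)
```

Comparing the two outputs by eye, or with a metric like VMAF, at the same bitrate is a fairer test than comparing streams from different people.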

Games are designed with consoles in mind first and ported to PC later. With RDNA 2 being the backbone of the next-gen consoles, I highly doubt Nvidia will have better RT performance. Devs don't like doing extra work, especially for proprietary software solutions.
It's called DXR, and it's the DirectX standard that is being used. It's not proprietary anything, and this is likely just another AMD excuse for being bad at it, like tessellation, which AMD was bad at forever even though it was part of the DX standard.
 
I agree that some PC hardware prices are ridiculous.
But basically, prices for an average gaming PC have not changed too much. Remember that the 3080 and 6800 XT are not average; these are enthusiast products. You can get a capable gaming PC for 1000 bucks, just like you could 20 years ago.

But you can't really compare that to consoles. On PC, games are usually cheaper since you have a lot more ways to buy them. On a PC you can work, you can upgrade it and sell the parts you don't use anymore... There are endless possibilities. On PC you can play brand-new games and games from 40 years ago...
On consoles you can just play games and watch Netflix.

I agree to some extent. But the Xbox Series X is a 4K/60 fps capable console. How much do I have to pay today to get the same level of performance? An RDNA2/Ampere 2080 Super equivalent is not going to be cheaper than $400 if we're lucky. Add an 8C/16T 3700X or 10700 to the mix for $300 (not even considering the $450 8C Zen 3), 16 GB of RAM $70, a fast 1 TB NVMe SSD $120, a Blu-ray drive $60, a decent case $70, a decent PSU $80... That's an $1,100 PC that will probably perform worse in gaming than the console, because most devs don't give a f... about code optimization when porting games from consoles to PC. I accept paying a 50% premium, as you get a fully functioning workstation and the freedom that a PC brings, but not 110%. That's just too much. AMD/Intel/Nvidia have become too greedy when it comes to the DIY PC market.
 
RT
RDNA2 is the first generation with hardware-accelerated RT, and RT games and drivers must be optimized first. It is logical that they didn't present RT performance now; it will take months. We will see, but I think NVIDIA will still be better than AMD, since they started with RT sooner.

Memory subsystem
We will see in tests whether the more expensive G6X memory on a 384-bit bus, or Infinity Cache with G6 memory on a 256-bit bus, is better or the same. But if Infinity Cache turns out to be an advantage, then it is a plus for AMD: they can move to Infinity Cache plus G6X memory if needed.

Hardware encoding
I like Turing NVENC. I use it very often to offload encoding from the CPU, with low GPU utilization and results that are usually good enough for my needs. I don't think AMD plans to push hardware encoders; they already market their multi-core CPUs as the CPUs for content creators, and software encoding has similar or better results than NVENC. Those multi-core AMD processors are probably also the reason NVIDIA kept the Turing NVENC on the Ampere cards.
Memory subsystem
Big NAVI's Infinity Cache is based on Zen's L3 cache.

Hardware encoding
RX 5700 has AMF & VCE
 

There will be games with AMD RT support on release date. AMD is using DX ray tracing, so every game that has it implemented will work. Games with more Nvidia-tailored RTX code will probably need extra work to get running.

G6X is not much faster than G6, but it runs hotter and needs more cooling. They don't need G6X if they are currently matching it, and it's cheaper to have a smaller bus with G6.

AMD has a hardware encoder in their GPUs, but it uses standard formats rather than a self-made one. They do have to improve support for it if they want to be an alternative for streamers, but it is good enough for the regular Joe.
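To put rough numbers on that G6X-vs-G6 comparison, here's a back-of-the-envelope sketch. The 16 Gbps and 19 Gbps data rates are illustrative, and the Infinity Cache hit rate is a pure assumption, since AMD hasn't published one.

```python
# Back-of-the-envelope memory bandwidth math; all figures are illustrative.
def raw_bw(bus_bits: int, gbps_per_pin: float) -> float:
    """Raw bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

g6_256bit  = raw_bw(256, 16.0)  # plain GDDR6 at 16 Gbps -> 512 GB/s
g6x_384bit = raw_bw(384, 19.0)  # GDDR6X at 19 Gbps      -> 912 GB/s

# Crude model: the fraction of traffic served from on-die cache never touches
# GDDR6, so effective bandwidth scales as 1 / (1 - hit_rate).
hit_rate = 0.5                           # assumed, not a measured figure
effective = g6_256bit / (1 - hit_rate)   # -> 1024 GB/s "effective"
print(g6_256bit, g6x_384bit, effective)
```

The point is just that a big on-die cache can close a raw-bandwidth gap on paper; whether the real hit rate at 4K actually does so is exactly what the reviews will have to show.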
 
95% of the whole dGPU market is sub-$300. Nothing AMD or Nvidia have announced in the last week is remotely relevant to market share.

Even Navi 22, presumably a 40 CU and 36 CU pair of cards somewhere in the $300-400 range, might actually be of relevance to the market-share numbers, but even then, that's still in the realm of "enthusiast".
The PS5 has a mainstream RDNA 2 design: a 40 CU part with 36 active CUs, clocked at up to 2.23 GHz, and without the Infinity Cache.

Navi 22 with 36 CUs would be the PC's equivalent of the PS5.
 
I was planning to upgrade my PC with Zen 3 and something from the 3000 series, but this... this has made me think. Maybe the GPU will actually be AMD again. The upgrade was planned for sometime next year anyway, so it's a perfect chance to see the reviews and decide.
But hopes are up. Worst case, it is at least comparable to the 3000 series, which is already good for AMD. They have crawled out of the pit.
 
Well, I can openly say this justifiably... AMD have not had a single win against anyone at the moment... their products, and their claims, deserve large doses of salt until such time as they're made available for independent testing and purchase. Claims of "we are faster than Intel or Nvidia" don't cut it when they're comparing their products against ones already in the marketplace; until theirs are available, no claims made by AMD are validated.
Let's see how their products stack up in the real world... I have purchased AMD products in the past and have been burned by the hype... never again.
Excuse me if I have good reason to want to see how they perform outside of AMD's presentations, or the hype, which in recent years has closely resembled Apple's Reality Distortion paradigm.
Arguing that they're better absent independent testing is simply absurd, as we have heard it from them many times before.
 
Eagerly waiting for TPU to review these cards when they are released!
I see a CPU / GPU upgrade combo in my stocking for Christmas!
 
Ok, so the only difference between the 6800 XT and 6900 XT is 8 Computer units... Am I missing something?

It seems very light for the extra $350; that's about $44 per unit... Hopefully it will be way more overclockable than the 6800 XT...

The fans will be bigger, I guess... Do we know if AIBs will produce the 6900 XT as well?
 
Do we know if AIBs will produce the 6900 XT as well?

Rumored to be no.
 
Hopefully it will be way more overclockable than the 6800 XT...
It may be another case of Vega 56 OC rivaling a stock Vega 64. I'm targeting an RX 6800 XT AIB OC and an RTX 3080 Ti for my two gaming PCs.
 
I believe they indicated the 6900 would be an AMD exclusive... which raises all sorts of concerns if true, given that their past exclusives have been shocking in comparison to the AIB partner versions. Maybe they have concerns about their top-end card, so much so that they want to keep control of how it functions. Regardless of the reason, I can't see any valid argument for not letting AIB partners extract every last drop from their products by making it available to them.
 
Ok, so the only difference between the 6800 XT and 6900 XT is 8 Computer units... Am I missing something?
Yes, it's Compute units.
 
Given the 6800's $579 pricing, should we expect 36 CU and 40 CU Navi 22 cards (clocked at 2250 MHz) priced at $399/$349, or will they even try to push to $449/$399 again? That would mean 1080 Ti/2080-level performance still costing the same amount as the RX 5700 (XT) did. Can this "RX 590" déjà vu refresh really happen to us again?
 
No mention of DXR performance at all. For all we know, you flick that setting on in Cyberpunk 2077 and it'll drop down to 2060 performance LOL!

What uhh, what titles out there support AMD's ray tracing right now?
 
I like gaming with all available quality settings maxed, regardless of outright fps, with the expectation that my experience will be fluid and will highlight the latest tech realism as the game developer intended. AMD deliberately left out several important pieces of current tech, or the performance thereof... that raises an eyebrow, and there's no denying that. On paper they claim their card is fast under their best ideal conditions, but even that claim is without validation by any third party.
 
According to Digital Foundry, PS5's raytracing performance is similar to RTX 2060 Super.
According to DF, they didn't really do an apples-to-apples comparison.

They took a guess.
 
And I remember how I purchased the GeForce 8800 GT (basically the RTX 3080 in today's terms) for $250, which was close to the very top. Nowadays $250 is what? Lower end?
That's not how inflation works.
Average wages in the U.S. have risen 65% since the 8800 GT's release, which means your $250 card would have to be priced around $412 ($250 × 1.65 ≈ $412) to be on par.
 
Yep... aligning themselves with Apple's reality distortion field... lmao.

It's real when it exists in real hands... everything else is vapour... we will know soon enough what the realities are.

Well, I'm old enough to remember that back in the GeForce 2 days, when Nvidia's reign was threatened, they released the Detonator drivers, which resulted in a 20-40 percent increase in performance... the history is there for Nvidia... sadly it's never been there for AMD. Regardless of speculation, AMD have never delivered strong driver performance increases... you can deny reality until it bites you, and I've been bitten by AMD's claims often... that's why I'm sceptical until it's independently tested.
 
Any idea why the 6900 XT and 6800 XT are both listed as 300 W cards, but the PSU recommendations differ?
The 6900 XT shows an 850 W recommendation while the 6800 XT shows a 750 W unit.
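For what it's worth, PSU recommendations are usually just a headroom calculation. A sketch like this (every figure is my assumption, not an AMD number) shows how the same 300 W card can yield different recommendations depending on how much transient/overclocking margin gets budgeted:

```python
# Toy PSU-sizing math; all wattages and margins here are assumptions.
def recommend_psu(gpu_w: float, cpu_w: float, rest_w: float, margin: float) -> float:
    """Total system draw, plus headroom for transients and overclocking."""
    return (gpu_w + cpu_w + rest_w) * (1 + margin)

print(recommend_psu(300, 150, 75, 0.40))  # ~735 W -> rounds up to a 750 W unit
print(recommend_psu(300, 150, 75, 0.60))  # ~840 W -> rounds up to an 850 W unit
```

So a bigger recommended PSU for the halo card could simply mean AMD budgeted more margin for it, rather than a higher rated board power.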
 
With a 375-watt maximum it's all academic... I doubt there's much headroom for manual overclocking, otherwise that would be the first thing AMD would be leading with, and they didn't. But that's not surprising given the dependencies they highlighted in their presentation.
 
I was planning to get the RTX 3070, but hello there, RX 6800 :)
 