AMD Radeon RX 6800 XT

The problem will be price, at least in Sweden. No store would make a profit at MSRP (according to Sweclockers, anyway), and it's unclear what prices we'll be looking at in the future.
You always have to add the VAT (moms) in Sweden on top of whatever price you see. I think the review sites in Sweden forget about this.
Also, sometimes Sweclockers are a bit too full of themselves, imho.

Looks like a $40-ish price premium here, even after VAT has been added.
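A quick worked example of the VAT point above; a minimal sketch, assuming Sweden's 25% VAT rate and an illustrative exchange rate (the SEK/USD figure is a placeholder, not the actual launch-day rate):

Code:
# Minimal sketch: Swedish shelf prices include 25% VAT (moms).
# The exchange rate below is an illustrative assumption.
SWEDISH_VAT = 0.25

def price_with_vat(pre_tax_sek: float) -> float:
    """Add the 25% Swedish VAT (moms) on top of a pre-tax price."""
    return pre_tax_sek * (1 + SWEDISH_VAT)

msrp_usd = 649        # RX 6800 XT launch MSRP
sek_per_usd = 8.6     # assumed rate, for illustration only
pre_tax_sek = msrp_usd * sek_per_usd
print(f"Pre-tax: {pre_tax_sek:.0f} SEK, with moms: {price_with_vat(pre_tax_sek):.0f} SEK")

This is why a US MSRP never maps one-to-one onto a Swedish shelf price: the quoted USD figure excludes sales tax, while the SEK price you see in a store already has moms baked in.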
 
TLDR:
  • Truly stellar rasterization performance per watt thanks to the 7nm node.
  • Quite poor RTRT performance (I expected something close to the RTX 2080 Ti; nope, far from it).
  • No DLSS.
  • Almost no attempt at creating decent competition. I remember when AMD used to fiercely compete with NVIDIA and Intel. AMD in 2020: profits and brand value (fueled by Ryzen 5000) first.
Overall: kudos to the RTG marketing machine, which never fails to overhype and underdeliver. In terms of being future-proof, people should probably wait for RDNA 3.0 or buy ... NVIDIA.

lmao. I like your insecure comments.
Well, as I predicted after reading the scientific article about Infinity Cache.
Hi Wizz, do you have a plan to bench this card alongside the Ryzen 5000 CPU series?
It's already there.
 
Not surprisingly, the ray-tracing performance is quite a bit behind. That makes the decision difficult for me, as I want a card for ray-traced Minecraft.
Hope they will catch up in the future.

If you need the best RT performance RIGHT NOW, then Nvidia is clearly the way to go. That being said, I won't be surprised if, after a year or two, the RT performance of the 6800 and 6800 XT is noticeably better than the 2080 Ti's.
 
So, did these actually go on sale yet, or did they also sell out in seconds?
 
thanks again for the amazing review @W1zzard .
AMD delivered exactly what they promised: a worthy competitor. In some games it beats the 3080 but loses in others, it consumes less power, and it costs less money. I'm not disappointed. I just hope AMD continues the trend and keeps improving, to keep NVIDIA in check and make us consumers the winners in this battle.
 


Shame about the RT performance, but that was expected since it's their first attempt. Not sure how they are going to handle their SAM marketing after Nvidia came out saying they're going to support a similar feature on BOTH Intel and AMD platforms if AMD is willing to play ball. They must have known it could be implemented at the driver level...
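For context, SAM is AMD's branding of PCIe Resizable BAR, which lets the CPU map the GPU's whole VRAM instead of a 256 MB window; that is why it can, in principle, be enabled on other vendors' platforms too. A minimal sketch of checking whether it is active on Linux by reading BAR0's size from sysfs (on AMD cards BAR0 is the VRAM aperture; the PCI address is a hypothetical example):

Code:
# Minimal sketch: read a GPU's BAR0 size on Linux. Without Resizable
# BAR (AMD SAM), BAR0 is typically a 256 MB window; with it enabled,
# BAR0 covers (most of) the VRAM.
# The PCI address below is a hypothetical example; adjust to your system.
PCI_RESOURCE = "/sys/bus/pci/devices/0000:03:00.0/resource"

with open(PCI_RESOURCE) as f:
    start, end, _flags = f.readline().split()  # first line describes BAR0
bar0_mib = (int(end, 16) - int(start, 16) + 1) / 2**20
print(f"BAR0 size: {bar0_mib:.0f} MiB")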
 
So subpar raytracing performance and near raster parity? Congratulations on getting out of the gutter, AMD, but this is with a node advantage (SS 8nm sucks) and without a proper DLSS alternative. The 3080 looks like better value. A short-lived victory before a 7nm Ampere refresh or Hopper restores the status quo.
 
I am getting this for the multi-monitor power consumption alone. That is just beautiful. Also, I hate Nvidia. I've been buying ATI and then AMD GPUs since the first 9700 Pro. Had to jump ship to the 970, then the 1080 Ti, then the RTX 2080 Super, because AMD didn't have what I needed at the time. Now I just have to wait for a water-cooled version.
 
Both impressed and disappointed.
 
thanks again for the amazing review @W1zzard .
AMD delivered exactly what they promised: a worthy competitor. In some games it beats the 3080 but loses in others, it consumes less power, and it costs less money. I'm not disappointed. I just hope AMD continues the trend and keeps improving, to keep NVIDIA in check and make us consumers the winners in this battle.

Yep, keeping Nvidia's imaginary cards in check with their own imaginary cards...

This year sucks for consumers.
 
Wow, really. :D
- 4-6% behind the 3080 without SAM
- costs $50 less
- 6 GB more VRAM
- significantly lower power consumption, and therefore it just crushes the 3080 in efficiency
- slightly better temperatures than the reference 3080
- much quieter than the 3080
- OCs MORE THAN TWICE AS WELL as the 3080 (WTF)

And note that four new titles (Dirt 5, AC: Valhalla, WD: Legion, and Godfall) are not yet included here, and in three of those AMD just crushes the 3080.

The only real gap is RT performance. But to be honest, given the quality of RT in most games (and even the number of games supporting it), it isn't a decisive factor next to the points above (a rough value comparison is sketched below).
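A minimal sanity check of the value claims in the list above, assuming launch MSRPs of $649 vs. $699 and typical gaming board power around 300 W vs. 320 W (the ~5% average deficit is the figure quoted above; measured power varies by review):

Code:
# Minimal sketch: normalized perf/$ and perf/W from the numbers above.
# Prices are launch MSRPs; power figures are approximate board power;
# the ~5% average performance deficit is the poster's estimate.
cards = {
    "RX 6800 XT": {"perf": 0.95, "price": 649, "power": 300},
    "RTX 3080":   {"perf": 1.00, "price": 699, "power": 320},
}
base = cards["RTX 3080"]
for name, c in cards.items():
    rel_value = (c["perf"] / c["price"]) / (base["perf"] / base["price"])
    rel_efficiency = (c["perf"] / c["power"]) / (base["perf"] / base["power"])
    print(f"{name}: perf/$ = {rel_value:.2f}x, perf/W = {rel_efficiency:.2f}x vs 3080")

Under these assumptions the 6800 XT comes out slightly ahead on both metrics; how far ahead depends entirely on the measured power draw and the game selection, which differ from review to review.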
 
with a node advantage (SS 8nm sucks) and without a proper DLSS alternative
Node (full node?) disadvantage, huh? Care to enlighten us: do you have Ampere's perf or perf/W numbers on 7nm? :rolleyes:

While SS might be inferior, by how much nobody really knows. Anyone claiming otherwise is just blowing hot air out the wrong end.

Yeah sure, people are dying to go DLSS when 4K native is already an option in most games :slap:
 
Impressive, AMD. Welcome back. I salute AMD for taking Intel and Nvidia head-on. Now a lot of people, including me, can finally say, head held high and proud: I have a full AMD system. I'm pretty sure the AIBs will have fun OCing this chip. I'm going full AMD. I'm sure patches will come for the ray tracing and the so-called super resolution solution ;)
 
I'm glad AMD has once again returned to competing at the high end of rasterization performance. The ray-tracing performance is unusable, however, at all but the lowest resolutions, and without a DLSS competitor I won't be purchasing an RX 6000 card. Guess it's back to waiting for NV to make 3000-series cards available in meaningful quantity.
 
What or who have they destroyed? I'm truly curious. As far as I can see, these cards are for hardcore AMD fans who will continue to say that RTRT is irrelevant. Yes, it is irrelevant on RX 6000 hardware. It's very much viable on the RTX 3000 series, which, thanks to DLSS, lets you enable RTRT and game at 4K.
I feel like this is why Nvidia rushed their RT with the 20 series, so people would say this.

Being first to market allowed them to set a certain benchmarking floor. The problem is that, like all nVidia tech, support is limited and generally requires their involvement or investment to implement in games.

The AMD implementation will be more limited in comparison, BUT every multiplatform game developed can easily utilize it in some capacity, which I think is good enough for most people. Consoles were always going to dictate how RT would be used, since very few devs make high-end PC-only games any more.

Personally, the "low" setting in RTX-optimized games looks best to me.

nVidia have mastered marketing limited-use tech as a selling point, knowing few games/software will actually take advantage of it. These replies confirm that.

This applies to DLSS as well. It's great tech, but purposely limited to marketable games. There would be no reason to buy a 3080 if a 3060 were leveraging DLSS in every game.
 
@W1zzard

So exactly why is Nvidia in trouble? I don't get the title. Nvidia has the faster product in both rasterization and raytracing (much faster in raytracing, in fact). There is no AI support on the AMD card that would be similar to DLSS, and no alternative to the Tensor Cores. All that on a worse 8nm process (really just an improved 10nm). I'd actually look at it the other way: even with the best available process technology and a lot of work spent on improving power efficiency, AMD was still unable to beat Nvidia. What happens when Nvidia moves to 5nm next gen? Where will AMD find any reserves for improvement? It's not like they can ditch GCN a second time and get the same gains (they've finally done it now, removing the last remains of that horrible architecture).

For the longevity of the cards, the Tensor Cores and DLSS are actually a perfect solution. You can play games at native 4K easily on an RTX 3080. Games that come out four years from now you might still be able to play comfortably with DLSS on. I agree with you that memory is not as important, simply because you run out of shading power before you run out of memory. But with DLSS, you can actually push back that effective end-of-life of a card.
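To make that longevity argument concrete: DLSS renders internally at a fraction of the output resolution and upscales. A minimal sketch of the internal render resolutions, using the commonly cited per-axis scale factors for DLSS 2.x (treat them as approximate, not an official spec):

Code:
# Minimal sketch: DLSS 2.x internal render resolution per mode.
# The per-axis scale factors are the commonly cited ones; treat
# them as approximate rather than an official specification.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"4K output, {mode}: renders at {w}x{h}")

Since shading cost scales roughly with pixel count, Performance mode shades about a quarter of the pixels of native 4K, which is where the extra years of headroom would come from.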

I also couldn't find any DLSS-on raytracing results, even though most previous Nvidia reviews had them. Why?

Otherwise thanks for a great review!
 
In a world without DXR, AMD's feat is GRAND!
But DXR is where everybody seems to be going these days... so nothing new. NV will still dominate, and the whole 10 GB vs 16 GB VRAM debate will be excellent food for endless, worthless forum wars.

If you don't care about DXR (I for one don't care at all), go get that 6800 XT. It is quite perfect.
If you do, then NV; no other choice.
 
For the longevity of the cards, the Tensor Cores and DLSS are actually a perfect solution.
You think RDNA2-based cards will not improve their performance over time? I know the "future potential uplift" dank memes about AMD and GCN on consoles, but do you really think the massive gains for games we saw on Zen 3 chips were just a coincidence?
 
Combining RT-on and RT-off results in one chart is one weird decision.
 
Great to see AMD back in the competition and on par with Nvidia's finest, but too bad there's no price competition anymore: the 6800 XT and RTX 3080 cost almost the same. I'd personally go with AMD this round just for the sake of the features/upgrades (SAM, more VRAM). I'll think twice about spending that much on a GPU this time around, and the prices in Europe will be through the roof anyway. 4K 60 fps gaming is still quite expensive.
 
Good performance; however, it seems the Infinity Cache approach to accelerating memory performance can't keep up at 4K.

Based on the reviews I've seen, it loses a lot of performance in most cases at 4K and loses ground to Nvidia.

I mean, it could be a lack of pure grunt, but this is the case even in games where it outperforms at 1440p.

I hope a board partner attempts a GDDR6X version just to see if my theory is correct (a rough model of the idea is sketched below).

AMD needs to get their upscaling tech out ASAP though.
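A minimal model of that theory: treat effective bandwidth as a hit-rate-weighted blend of cache and DRAM bandwidth, with the 128 MB Infinity Cache's hit rate falling as resolution (and working set) grows. The hit rates below are rough assumptions (AMD's own material cited roughly 58% at 4K), and the cache bandwidth is AMD's marketing figure, so take the outputs as illustrative only:

Code:
# Minimal sketch: effective memory bandwidth as a hit-rate-weighted
# blend of Infinity Cache and GDDR6 bandwidth. The hit rates and the
# cache bandwidth figure are rough assumptions, not measurements.
GDDR6_BW = 512    # GB/s: 256-bit bus @ 16 Gbps (6800 XT spec)
CACHE_BW = 1660   # GB/s: roughly AMD's claimed effective figure

def effective_bw(hit_rate: float) -> float:
    return hit_rate * CACHE_BW + (1 - hit_rate) * GDDR6_BW

# Assumed hit rates: higher at 1440p, lower at 4K.
for res, hit in [("1440p", 0.75), ("4K", 0.58)]:
    print(f"{res}: hit rate {hit:.0%} -> ~{effective_bw(hit):.0f} GB/s effective")

As the hit rate drops, the effective figure slides back toward the raw 512 GB/s, while the 3080's GDDR6X delivers a flat 760 GB/s at any resolution; that would explain exactly the 4K behavior described above.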
 
The problem is that, like all nVidia tech, support is limited and generally requires their involvement or investment to implement in games. [...] This applies to DLSS as well. It's great tech, but purposely limited to marketable games.

NVIDIA's RTRT implementation is bog standard, built on the standardized DX12 DXR and Vulkan RT extensions. Xbox Series X uses D3D12 DXR as well, so there's nothing NVIDIA'esque about RTRT. Not sure about Sony, as they use their own graphics API, but it's very similar RDNA 2.0 hardware.

DLSS, on the other hand, is 100% proprietary - that's true.
 
Almost 100 W less on average, even with that insane chunk of power-hungry cache: confirmation that Samsung's node is utter crap. My God, how the tables have turned. And all of that just so Nvidia could save probably a couple of bucks per chip.
Then you really have to wonder what AMD is going to do when Nvidia moves to TSMC 5nm. I don't think AMD is looking forward to that.
 