Tuesday, October 18th 2022

AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669

In the run-up to the November 3 reveal of the next-generation RDNA3 architecture, and with the 43% faster RTX 4090 mauling away at its appeal to the enthusiast crowd, the AMD Radeon RX 6900 XT has received another round of price cuts and can now be had for as low as $669. Prices are down on both sides of the big pond, with European retailers listing it for as low as 699€. Although not technically AMD's flagship graphics card (that would be the RX 6950 XT, which starts at $869), the RX 6900 XT is a formidable 4K gaming graphics card with high performance-per-dollar at its new price (roughly 35% higher than the RTX 4090's). AMD's latest round of official price cuts came around mid-September, as the company braced for the RTX 4090 "Ada."
Source: VideoCardz

131 Comments on AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669

#51
Makaveli
fevgatosAnd I've jumped from the 5th floor and landed in a Ferrari F40...


Your math is....wrong
The difference is that I've actually owned all those Radeons and I'm speaking from personal experience; you have never done that. Nice try, though.

You can search my post history on this very forum, going back years, for some of the Radeons I've owned.

Where can I see that Ferrari you landed in?
#52
HTC
tvshackerHere in Portugal we can barely get an RX 6800 (non-XT) for that price.

Screenshots taken this morning from a price comparison website, and they're ordered from cheapest.


Cheapest 6900XTs just for reference.
Those are the cheapest.

Here are the typical prices from one store, here in Portugal:



And before you think "but that's only one store", think again:



Only a "slight" price variance between the models on offer, and i don't mean between the stores ...
#53
Daven
TheinsanegamerNWoah, stop the kool aid party there man.

www.tomshardware.com/reviews/catalyst-13.8-frame-pacing-crossfire,3595-2.html
years ago, did AMD "fix" their frame pacing issues? The frame pacing issues that forum users complained about for years with no resolution?

Navi black screen

www.techspot.com/news/84228-latest-amd-radeon-drivers-meant-address-black-screen.html

AMD driver causing BSODs

www.techspot.com/news/89612-microsoft-removes-problematic-amd-driver-causing-bsods-windows.html

AMD users ditching cards over driver issues

www.techspot.com/news/84005-gamers-ditching-radeon-graphics-cards-over-driver-issues.html

This doesn't get into the RDNA downclocking, R300 black screen issues, the Vegas burning out their PCBs, etc. How far do you want to go down this rabbit hole? Until very recently, AMD drivers were complete trash; this is why the Fermi series still sold so well. After RDNA came out, AMD finally buckled down and fixed their garbage, after yet more negative media attention.
Nvidia has had all these types of issues as well with their drivers going back to Vista.

www.theregister.com/2008/03/28/nvidia_vista_drivers/

Some issues have even been bad hardware choices, like the capacitor guidance when the 3000 series first launched. EVGA didn't leave Nvidia because Nvidia is too awesome, and the vast majority of the tech industry protested the ARM merger because they knew Nvidia would hurt everyone with its business practices.

I've owned AMD/ATI and Nvidia cards, alternating back and forth, for decades. There has been nothing significantly different between the driver quality of the two companies for a while now. There are different feature sets, and performance differs for similar features.

Most internet myths are a product of cognitive dissonance.
en.wikipedia.org/wiki/Cognitive_dissonance?wprov=sfti1

It is easier to justify irrational love of something if you can create an environment where opposing forces are bad and unworthy of your love. So stop spreading the myth of AMD driver issues. It doesn’t help ANYONE.
#54
swirl09
Good GPU, not a bad price.
#55
DemonicRyzen666
ThomasKAMD is supporting RT just for the sake of saying "look, we've got that functionality too". Their implementation is poor though, at least compared to nVidia's. Also, very few games support RT, and many new ones don't have any form of RT at all.

And if you're planning to compare Radeon drivers to Intel ARC ones, just to prove your point, like some other folks are doing 'round here, go ahead :)
135 out of 240 DX12 games support some type of ray tracing, so how is that "very few" when it's about 56% of the games on that API?
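
A quick check of that arithmetic, using only the two figures given in the post (135 ray-traced titles out of 240 DX12 titles), bears the quoted percentage out:

\[
\frac{135}{240} = 0.5625 \approx 56\%
\]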
#56
GreiverBlade
Chrispy_As someone who is generally favourable towards AMD and hates Nvidia for anti-consumer bullshit, it pains me to make the above post, but it's the cold hard truth.
Good thing I disliked Nvidia because of driver issues, then :laugh: (/joke)

On a more serious note, I've owned an equal number of Nvidia and ATI/AMD cards since late 1997 (starting with the Riva 128 and Rage LT Pro, aka Mach64 LT), and truthfully I've only had one major issue with the red team (which was corrected with the next driver), while I had to roll back drivers on an almost regular basis with Nvidia ... six years with a GTX 1070 and I could never run the latest driver without issues :oops:

Enjoying my current GPU on the latest driver to date is ... refreshing (also given it cost me 75 CHF less than the GTX 1070: 450 vs. 525 CHF :laugh: )
#57
ARF
DemonicRyzen666135 out of 240 DX12 games support some type of ray tracing, so how is that "very few" when it's about 56% of the games on that API?
I hope AMD will fix the poor RT performance come RDNA 3. :(
#58
Lianna
JAB CreationsHm, 4090 is only 31% stronger than a 3090ti. Someone must be upset that they didn't get FOUR TIMES PERFORMANCE. :laugh::laugh::laugh:
fevgatosYour math is....wrong
Their math is wrong, but +45%, i.e. 1.45x average performance, is still far from the 2x-4x over the 3090 Ti that was promised:
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090) (2nd image at 15:17 UTC)
Even if that was with DLSS 3, 2x should be attainable in some games.
I checked all TPU results at 4K, and the best I could find was, AFAICR, 1.75x.
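
To put the post's percentages and multipliers on the same scale (a short worked conversion, using only the figures quoted above, where an "Nx" claim corresponds to an (N-1)×100% uplift):

\[
+45\% \Rightarrow 1.45\times, \qquad 1.75\times \Rightarrow +75\%, \qquad 2\times\ \text{(promised)} \Rightarrow +100\%
\]

so even the best single 4K result found still falls well short of the low end of the promised range.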
#59
LFaWolf
JAB CreationsHm, 4090 is only 31% stronger than a 3090ti. Someone must be upset that they didn't get FOUR TIMES PERFORMANCE. :laugh::laugh::laugh:
How did you get 31% from either of the graphs?
#60
dragontamer5788
Chrispy_Comparing performance/$ to a 4090 seems disingenuous; if AMD want to compare performance/$ against Nvidia, they should use a card at the same performance/capability level.

At best it competes with the $800 3080Ti in raster performance. In DXR titles it's barely keeping up with a $530 3070.

Additionally, it can't do DLAA; DLSS has better game support at the moment than FSR; and then there's NVENC, which is vastly superior. If you do anything other than gaming, you'll also appreciate CUDA support, as so many applications support CUDA but not OpenCL.

The 6900XT is a good card, but it's not a 4090. Comparing it to the true competition shows that it's still awful value for money, like any flagship always is. Just buy a 3070/3070Ti/3080/3080Ti instead for better API support, equivalent performance, and much better features.

As someone who is generally favourable towards AMD and hates Nvidia for anti-consumer bullshit, it pains me to make the above post, but it's the cold hard truth.
AMD's RX 6800+ series all have 16 GB of RAM though, which is very useful for compute applications.

That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
#61
ARF
I don't think that the vast majority of users care about CUDA and NVENC. These are marketing bullet points for the die-hard Nvidia fanbase.
#62
neatfeatguy
DavenNvidia has had all these types of issues as well with their drivers going back to Vista.

www.theregister.com/2008/03/28/nvidia_vista_drivers/

Some issues have even been bad hardware choices, like the capacitor guidance when the 3000 series first launched. EVGA didn't leave Nvidia because Nvidia is too awesome, and the vast majority of the tech industry protested the ARM merger because they knew Nvidia would hurt everyone with its business practices.

I've owned AMD/ATI and Nvidia cards, alternating back and forth, for decades. There has been nothing significantly different between the driver quality of the two companies for a while now. There are different feature sets, and performance differs for similar features.

Most internet myths are a product of cognitive dissonance.
en.wikipedia.org/wiki/Cognitive_dissonance?wprov=sfti1

It is easier to justify irrational love of something if you can create an environment where opposing forces are bad and unworthy of your love. So stop spreading the myth of AMD driver issues. It doesn’t help ANYONE.
Let's not forget these Nvidia blunders:
www.tomshardware.com/news/Nvidia-196.75-drivers-over-heating,9802.html

www.destructoid.com/dont-install-the-newest-nvidia-driver-its-breaking-pcs/

Wait....what's this? AMD did it at least once, too?!?!
news.softpedia.com/news/warning-new-amd-crimson-driver-is-heating-and-killing-gpus-users-report-496867.shtml

Oh man! Is it true? Neither company is perfect?
#63
ARF
neatfeatguyNeither company is perfect?
Yeah. When will Nvidia update its software user interface? They forgot about that control panel from the Windows 95 days :D
#64
Vader
Even though it's not a big change in the US, since cards were already at that price as some have mentioned, it's a welcome discount in other countries where they don't get the typical US discounts. Hopefully they cut prices on the 6700s and 6600s too!
#65
Guwapo77
HenrySomeone...the RX 6900 XT is a formidable 4K gaming graphics card...
Not really...Somewhat ironically (for being their almost top card), it's actually best suited for high refresh rate 1080p and 1440p gaming without RT:



That won't happen anytime soon, now that both will be on the same process...
I'm an owner of a 6900 XT and I agree wholeheartedly with your take. I don't know why you're catching so much heat... The 6900 XT could be half off with a second one thrown in for free and it still wouldn't be a formidable 4K card. The 6900 XT is a beast at 1440p, and even then there are games like Cyberpunk that will bring it to its knees at 1440p. I'm someone who loves to play at max settings above 90 FPS, and I wouldn't dare pair this card with a 4K monitor. However, for those who don't mind playing games at 30-60 FPS at low to medium settings, plus office work, it will do the job.
TheinsanegamerNWoah, stop the kool aid party there man.

www.tomshardware.com/reviews/catalyst-13.8-frame-pacing-crossfire,3595-2.html
years ago, did AMD "fix" their frame pacing issues? The frame pacing issues that forum users complained about for years with no resolution?

Navi black screen

www.techspot.com/news/84228-latest-amd-radeon-drivers-meant-address-black-screen.html

AMD driver causing BSODs

www.techspot.com/news/89612-microsoft-removes-problematic-amd-driver-causing-bsods-windows.html

AMD users ditching cards over driver issues

www.techspot.com/news/84005-gamers-ditching-radeon-graphics-cards-over-driver-issues.html

This doesn't get into the RDNA downclocking, R300 black screen issues, the Vegas burning out their PCBs, etc. How far do you want to go down this rabbit hole? Until very recently, AMD drivers were complete trash; this is why the Fermi series still sold so well. After RDNA came out, AMD finally buckled down and fixed their garbage, after yet more negative media attention.
I can't speak for everyone... I haven't had any of the AMD driver issues UNLESS I used their non-WHQL drivers back in the day (those did suck). I've been using AMD cards since the 9700 PRO.
#66
evernessince
ARFThe RTX 4090 is 133% (2.33x) faster in RT games:


NVIDIA GeForce RTX 4090 Founders Edition Review - Impressive Performance - Ray Tracing & DLSS | TechPowerUp
The 4090 wins big over both last-gen AMD and Nvidia products at 4K with RT enabled. It's the best-case scenario for the card compared to last gen.

Meanwhile, at 1080p it's a mere 16% faster. At lower resolutions, Nvidia's 4090 suffers from driver overhead, as mentioned in the 4090 review.

Realtime RT is still very much in its infancy. Games have at best 1-2 RT effects, picked because they are light on performance, and those effects are limited in scope. The 4090 finally brings acceptable RT performance for what games have now, but when game devs start adding more, it will essentially make the 4090 obsolete for those who care about realtime RT. Realtime RT is a blessing for Nvidia, as it makes it extremely easy for them to tout massive gains in performance each generation and cash in.
#67
ARF
evernessinceMeanwhile at 1080p it's a mere 16% faster.
Is this because of broken games and their code, or will we blame the "slow" CPUs, which never see more than 20% utilisation and load?
#68
evernessince
ARFThe scalpers are still lurking around? How about they lower the price to 50% of a brand-new card?
Who knows if that "second-hand" card actually works properly and for how many mining hours it was tortured.
I bought four RX 580s from a mining farm in 2018 for $80 each. Every one of them is still in operation in other people's rigs, as they were given as gifts, and they still work flawlessly.

The chance of getting a bad mining card is the same as with any other used card.
ARFIs this because of broken games and their code, or will we blame the "slow" CPUs which never see utilisation and load more than 20%?
According to TPU's review, the 4090 has additional driver overhead.
#69
wheresmycar
I haven't really been following prices, but it's nice to see the 6900 XT dropping in price (or at least in an official capacity). Looks like AMD is making space for RDNA3 at more competitive prices, or justifying potentially inferior performance compared to the 40 series.

About all this free brand sentiment/free pessimism - it's just freely boring!! Both NVIDIA and AMD are extremely fortunate to have free, unemployed, non-commissioned members of a free society who, out of their own free will and exhausting free determination, provide a free service to staunchly and freely support or market each brand, for free. Regardless of "fair play", no matter the cost, the free patriots always come out for free with their expensive sharpened swords and free patriotic flags - and the craziest thing of all, they fund the war too (yep, no freebies there). I'm still trying to wrap my head around where all this freeotic behaviour comes from... I stopped praising GPU manufacturers when the cost of GPUs was no longer acceptable. I'd be more than happy for this segment of the market to hit an all-time decline, even if it challenges the very existence of these companies - well, hopefully they will stay put and deliver more reasonable asking prices, which we can all blindly and patriotically chant about whilst surfing the freeotic patriot ship of contentment.
#70
Vayra86
fevgatosSure, there aren't a lot of RT games, but those games are a big chunk of the main reason you upgrade your GPU. I mean - I only played 4-5 games with RT on my 3090, but then again the number of games that even required a 3090 to play properly was 10 at most. The rest I could have played perfectly fine with my older card (like Apex, Warzone, Stray, etc.). So 50% of the games I upgraded for do in fact have RT. And in those, the 6900 XT is a 3070 competitor.
A very good point. If you're really interested in RT, you'll want to upgrade. Exactly working as Nvidia intended ;)
#71
Chrispy_
ARFThe scalpers are still lurking around? How about they lower the price to 50% of a brand-new card?
Who knows if that "second-hand" card actually works properly and for how many mining hours it was tortured.
I sold 24 mining GPUs on eBay in March, and I sold them with a no-quibble 12-month warranty backed at my own risk and enforced by eBay.
So far, seven months in, zero returns - nobody has even contacted me. I reckon I sold each card for £25 more than other listed cards simply because I had the confidence to offer a warranty.
I sell my gaming cards without warranty.

ETH mining (done by careful miners who cared about the hardware, efficiency, and resale value) looked after the cards way better than any gamer would: regular dust cleaning, careful 24/7 thermal monitoring, open frames for exceptionally low operating temperatures, undervolting, GDDR6 temps lower than when gaming, and stable, unchanging temperatures that meant the card wasn't thermal-cycled like a gaming GPU is. My gaming cards get put in a box, never cleaned until they're replaced, and their workload is bursty, resulting in frequent temperature and power spikes from idle, in a hotbox, with tons of thermal cycling on the GPU die, the VRAM, and of course all the thermal pads.

The only caveat with a well-treated mining card is that the fans have been running their whole life. Given that the 24/7 TDP was around half of a gaming load, the fans were spinning fairly slowly, so they're unlikely to be worn out, and you can replace GPU fans affordably and easily for most models.
dragontamer5788AMD's RX 6800+ series all have 16 GB of RAM though, which is very useful for compute applications.
Which "compute applications"? It's nice in theory and I wish CUDA would die a horrible, proprietary death in favour of Opensource APIs, but the reality is that most software developers hook into CUDA.

I've been building and maintaining both CPU and GPU render farms at work for nearly two decades (well, one decade in the case of GPU rendering), and support for OpenCL is hot garbage. The API may be okay, but most software wants CUDA, so it doesn't matter how much RAM your Radeon has when the application only offers CUDA or CPU compute options. I'm coming from the 3D rendering/animation/modelling side, so perhaps financial/physics simulation software does actually have decent OpenCL support. I can only comment on the stuff my company does.
#72
NinjaQuick
dragontamer5788AMD's RX 6800+ series all have 16 GB of RAM though, which is very useful for compute applications.

That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
People keep forgetting most games are console games ported to PC.
Want to see your PC brought to its knees? Play a game like Star Citizen; there should be a free-fly event. Just be aware of the TOS - they enforce beyond what they claim they'll enforce, and some people on their "player safety" team are political activists.
#73
Guwapo77
Vayra86A very good point. If you're really interested in RT, you'll want to upgrade. Exactly working as Nvidia intended ;)
Remember back when AA and its later implementations were a problem for GPUs? Now even the lowest-end GPUs handle it flawlessly. We can't stay stagnant; I (we) want all the graphical eye candy. I hope to live long enough to not be able to tell the difference between computer-generated imagery and real life (in motion).
#74
GreiverBlade
dragontamer5788That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
Well, if my RX 6700 XT 12 GB's price of 450 CHF/$ weren't due to a promotion ... (650 CHF/$ regular price), that one would be the price/performance king ... :laugh:
But yeah, at $350 the A770 is almost adequately priced (if it were actually priced like that, and not the $450+ I will see :D )
Vayra86A very good point. If you're really interested in RT, you'll want to upgrade. Exactly working as Nvidia intended ;)
Given the RT application ... heck, I even ran the CP2077 benchmark at 1440p and 1620p60 with RT on, and strangely it wasn't a slideshow ... a few drops under 30 FPS but nothing unplayable, though of course the benchmark might not reflect (hehe, reflect ... ) normal gameplay :oops:
Although I've seen prettier reflection and lighting work in Morrowind :laugh: (joke again, but OpenMW is really awesome!)
Guwapo77Remember back when AA and its later implementations were a problem for GPUs?
Oh, I do remember ... but the happiest thing for me is that running a monitor at 3K doesn't need AA (yay, a free performance increase, because nearly all AA algorithms bring a performance drop, even if a small one for some).
#75
NinjaQuick
Also, SC is proof you don't need RT to make a gorgeous game. RT is really nice, sure, but it's mostly nice on the dev end: way less time spent baking the lighting perfectly to get good effects - just place the lights and enjoy the show in real time.