
AMD Radeon RX 6950 XT Now as Low as $600, Poses a Juicy Alternative to RTX 4070

btarunr

Editor & Senior Moderator
At the height of the crypto-mining GPU shortage of 2021-22, the Radeon RX 6900 XT and its refresh, the RX 6950 XT, were scalped and resold for upward of $2,000. Today, you can get one for as low as $600: the ASRock Radeon RX 6950 XT Phantom Gaming OC is listed on U.S. retailer Newegg.com at $630, with a coupon code that shaves $30 off, bringing the effective price down to $600, the MSRP of the recently announced NVIDIA GeForce RTX 4070 "Ada."

Our testing shows that the RTX 4070 offers performance on par with the previous-generation RTX 3080 and the Radeon RX 6800 XT. The RX 6900 XT is about 6% faster than the RTX 4070 at 1440p (averaged over our test suite), and the RX 6950 XT, going by our older reviews, is about 6-7% faster than the RX 6900 XT at 1440p. This is, however, performance in raster 3D graphics (which makes up the majority of gaming graphics); the ray tracing performance of the RX 6950 XT is closer to that of the RTX 3070 Ti, or about 23% slower than the RTX 4070. The RTX 4070 is also the more efficient GPU, and offers next-gen features such as DLSS 3 Frame Generation.



View at TechPowerUp Main Site | Source
 
I'm seeing $650 at my local Micro Center. $600 from Newegg.

Seems like a good time to think about a new rig! Zen4, cheaper GPUs, a DDR5 upgrade. It's just the motherboards that feel overpriced (but PCIe 5 / DDR5 are better, so maybe it's worth the extra price on the mobo).
 
This price seems amazing, but I'm nervous about buying these high-end cards because of both the power needed and the problems reported.
 
Since energy prices are going insane everywhere, I wonder how much performance we can get out of the 6950 XT if we undervolt it with a 200 W limit?

It would be pretty annoying to make sure it is actually 200 W though, since AMD reports wattage for the core only, and we'd need something to measure the rest of the board.
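If you want to sanity-check that, here's a rough sketch of the arithmetic; note the board-overhead figure is my own guess, not an official AMD number:

```python
# Rough estimate of total board power for an RDNA2 card, given that the
# driver reports core (chip) power only. The overhead below is an
# assumed ballpark for memory + VRM losses + fans, not an official spec.

ASSUMED_BOARD_OVERHEAD_W = 50.0  # guess, varies per board

def estimated_board_power(reported_core_power_w: float) -> float:
    """Add an assumed fixed overhead to the core-only telemetry reading."""
    return reported_core_power_w + ASSUMED_BOARD_OVERHEAD_W

def core_limit_for_board_target(board_target_w: float) -> float:
    """Core power limit to set so the whole board lands near the target."""
    return board_target_w - ASSUMED_BOARD_OVERHEAD_W

# To keep the whole card near 200 W, the core limit would need to be
# around 150 W under this assumption.
print(core_limit_for_board_target(200.0))  # -> 150.0
```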
 
...and the extra VRAM. 12 GB is a bad joke.
 
This price seems amazing, but I'm nervous about buying these high-end cards because of both the power needed and the problems reported.
The power can be controlled to an extent by limiting the TDP/temps, etc. But the biggest issue IMO with the previous gen is the ridiculous size of some of them! No way I'm getting a 3-4 slot GPU, ever.
 
Pretty appealing from a price/performance perspective, but it's too big and I don't think my PSU could handle it. If they get this aggressive with pricing on the 6800 XT though, I'm in.
 
This price seems amazing, but I'm nervous about buying these high-end cards because of both the power needed and the problems reported.
Power is an issue. Other problems... are you thinking of anything specific? Might be able to shed some light; as far as I know, RDNA2/3 are in a good place now. I'm living the RDNA3 dream right now, no problems whatsoever ;)
 
The enormous difference in power draw makes all the difference: it's pay now and keep it forever, no more payments, vs. pay now and then keep paying monthly instalments for life.

It would be a disastrous choice where I live.
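To put a rough number on those "instalments" (purely illustrative figures: a 100 W gaming-draw gap, 4 hours of gaming a day, and €0.40/kWh, ballpark for the pricier EU markets):

```python
# Back-of-the-envelope yearly electricity cost of a power-draw gap.
# Every input here is an illustrative assumption, not a measured value.

draw_gap_w = 100       # assumed gaming power-draw gap between the two cards
hours_per_day = 4      # assumed daily gaming time
price_per_kwh = 0.40   # assumed electricity price in EUR, varies by country

extra_kwh_per_year = draw_gap_w / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"{extra_kwh_per_year:.0f} kWh/year -> {extra_cost_per_year:.2f} EUR/year")
# -> 146 kWh/year -> 58.40 EUR/year
```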
 
I don't agree; the 4070 is way more energy efficient and has stuff like DLSS 3.0.
If the RX 6950 ends up being able to run FSR 3.0 and it turns out to be worth a damn, then maybe.

Right now AMD just needs to DROP the F out of their card prices, everything down 150-200 dollars/euro, AND release the replacements already ><
 
I don't agree; the 4070 is way more energy efficient and has stuff like DLSS 3.0.
If the RX 6950 ends up being able to run FSR 3.0 and it turns out to be worth a damn, then maybe.

Right now AMD just needs to DROP the F out of their card prices, everything down 150-200 dollars/euro, AND release the replacements already ><

People trash Nvidia, but AMD is selling an end-of-life card with similar performance and almost double the power draw for a similar price; it's insanity. :kookoo:
I guess the only reason anyone would buy this is if they went by that brainless HU video about VRAM; you'd almost think it was tailor-made exactly for that purpose. "We don't need to lower the price, because VRAM" o_O
 
The enormous difference in power draw makes all the difference: it's pay now and keep it forever, no more payments, vs. pay now and then keep paying monthly instalments for life.

It would be a disastrous choice where I live.
Not so much power draw, but power spikes. And the resulting instability on below-spec PSUs.
You can undervolt just fine and use as much power as any other high-end GPU - 250-300 W give or take, and you can go lower too, at the cost of a single-digit performance loss.

This is no different from any other high-end GPU from the last gen, or even the current one.

[Attachment: gaming power draw chart]

^ Gaming typical draw.

[Attachment: power spikes chart]

So please... let's keep talking about reality, and take note of the Ampere cards, and even the mighty efficient Ada-based 4090 at 508 W.
Similarly, the efficiency gap between Ampere and RDNA2, or Ada and RDNA3, is negligible.
 
Not so much power draw, but power spikes. And the resulting instability on below-spec PSUs.
You can undervolt just fine and use as much power as any other high-end GPU - 250-300 W give or take, and you can go lower too, at the cost of a single-digit performance loss.

This is no different from any other high-end GPU from the last gen, or even the current one.

View attachment 292867

^ Gaming typical draw.
So please... let's keep talking about reality.

I haven't followed the 4070, because it's an idiotic product, so I don't know its undervolting capabilities, but in general all cards can be undervolted for a single-digit performance loss, that's true. But if you are going to do it (and 99% don't), wouldn't you prefer to start with the one that draws almost half the power at similar performance?
 
I haven't followed the 4070, because it's an idiotic product, so I don't know its undervolting capabilities, but in general all cards can be undervolted for a single-digit performance loss, that's true. But if you are going to do it (and 99% don't), wouldn't you prefer to start with the one that draws almost half the power at similar performance?
It doesn't draw almost half; the gap is 100 W in a worst-case scenario only, and <50 W if you undervolt RDNA2, while the 4070 doesn't gain nearly as much efficiency from UV.
Puts a rather different perspective on your supposed 'insanity' of buying a 16 GB GPU at 600 bucks, doesn't it?

You really need to check what you're saying against the actual numbers. The 4070 FE does 4.5 W/frame; it's not even the most efficient Ada.

[Attachment: energy efficiency (W/frame) chart]
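For anyone wondering where a figure like 4.5 W/frame comes from, it's just average gaming power draw divided by average frame rate; a toy calculation with invented numbers:

```python
# Energy-per-frame metric: average gaming power divided by average fps.
# The wattage and fps figures below are invented for illustration only.

def watts_per_frame(avg_power_w: float, avg_fps: float) -> float:
    return avg_power_w / avg_fps

print(round(watts_per_frame(193, 43), 1))  # hypothetical 4070-like card -> 4.5
print(round(watts_per_frame(330, 46), 1))  # hypothetical 6950 XT-like card -> 7.2
```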
 
It doesn't draw almost half; the gap is 100 W in a worst-case scenario only, and <50 W if you undervolt RDNA2, while the 4070 doesn't gain nearly as much efficiency from UV.
Puts a rather different perspective on your supposed 'insanity' of buying a 16 GB GPU at 600 bucks, doesn't it?

You really need to check what you're saying against the actual numbers. The 4070 FE does 4.5 W/frame; it's not even the most efficient Ada.

View attachment 292869

Sure, there are ifs and buts, but those numbers are almost double or almost half, depending on your perspective.

[Attachment: Furmark power draw comparison]
 
Sure, there are ifs and buts, but those numbers are almost double or almost half, depending on your perspective.

View attachment 292870
Furmark, dude.

Please. Just admit your earlier comment was blowing things out of proportion; don't make a further fool of yourself, seriously.
 
Furmark, dude.

Please. Just admit your earlier comment was blowing things out of proportion; don't make a further fool of yourself, seriously.

That's the card's power draw, period; that's a fact. Whether you undervolt or not (and 99% don't), whether you play game x or y with feature z or x enabled, competitive gaming or with an fps cap, whether you fold, mine, or do whatever, whether you get better or worse luck in the lottery and your card undervolts better or worse, etc... that's another story.

Not to mention needing a bigger PSU.
 
That's the card's power draw, period; that's a fact. Whether you undervolt or not (and 99% don't), whether you play game x or y with feature z or x enabled, competitive gaming or with an fps cap, or whether you fold, mine, or do whatever - that's another story.
Ah, so the 4090's 'power draw' is 666.6 W?

You need to get your head examined, fast. Stop grasping at straws, count to 10, and admit you're spouting BS and got corrected. All is well. I have a persistent allergy to nonsense, and this is nonsense. Furmark hasn't been good for anything in years; it's called a power virus for a reason.
 
Ah, so the 4090's 'power draw' is 666.6 W?

You need to get your head examined, fast. Stop grasping at straws, count to 10, and admit you're spouting BS and got corrected. All is well. I have a persistent allergy to nonsense, and this is nonsense. Furmark hasn't been good for anything in years; it's called a power virus for a reason.

Similar situation: one draws 200 W, the other draws 377 W. If your point is that in other scenarios one won't draw 377 W, then the other won't draw 200 W either.
 
Similar situation: one draws 200 W, the other draws 377 W. If your point is that in other scenarios one won't draw 377 W, then the other won't draw 200 W either.
Not sure if you're joking or actually serious now :D
I hope you're joking. Again: Furmark is no basis to indicate ANYTHING. All you get to see is maximum board power, not what GPUs actually do in any typical load. Furmark doesn't even get boost clocks going.

Just stick to the gaming power draw; it's much safer for you, since interpreting numbers clearly isn't your strong suit. TPU has nice bar charts that represent the different types of power draw clearly. There is also a Furmark one. Do a little comparison, and you might figure it out.
 
Not sure if you're joking or actually serious now :D
I hope you're joking. Again: Furmark is no basis to indicate ANYTHING. All you get to see is maximum board power, not what GPUs actually do in any typical load. Furmark doesn't even get boost clocks going.

Just stick to the gaming power draw; it's much safer for you, since interpreting numbers clearly isn't your strong suit. TPU has nice bar charts that represent the different types of power draw clearly. There is also a Furmark one. Do a little comparison, and you might figure it out.

what's "gaming power draw"? what games? what settings? do you cap fps? is the optimized or not? do you max fps or cap fps? does the game prefer team red or blue? what etc etc .... ???!!!
 
what's "gaming power draw"? what games? what settings? do you cap fps? is the optimized or not? do you max fps or cap fps? does the game prefer team red or blue? what etc etc .... ???!!!
Those metrics are a good 6 posts above this one. You take averages across a large benchmark suite, and you get those nice bar charts.
Or you can cherry-pick games and pick out the biggest offenders in peak power draw.

NEITHER will show you anything close to Furmark numbers, and both show all GPUs much closer together than their peak power draw suggests.

But you know this already; you're just trying to make a case where there isn't one. Just stahp
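To make the "averages vs. peaks" point concrete, a toy illustration (the per-game wattages are invented, not from any review):

```python
# Why average gaming draw, the worst single game, and Furmark tell
# three different stories. Per-game figures below are invented.

from statistics import mean

per_game_draw_w = {
    "Game A": 290, "Game B": 310, "Game C": 255,
    "Game D": 300, "Game E": 270,
}
furmark_draw_w = 377  # stress-test ceiling, not a typical load

print(f"average gaming draw: {mean(per_game_draw_w.values()):.0f} W")  # 285 W
print(f"worst game in suite: {max(per_game_draw_w.values())} W")       # 310 W
print(f"Furmark:             {furmark_draw_w} W")  # far above any real game
```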
 
Great, actually.

Regarding the discussion about RX 6000 and RTX 30-series power spikes: W1zzard noted that the current gen of cards softens things in this regard and is less critical.

Then there's the matter of picking the correct PSU for your power consumption.

Do PSU reviews answer the question "will my PC work with this particular PSU" - with, say, a 6900 XT + 5800X3D? Since PSU reviews are full of self-proclaimed certifications and other BS: no, they are weak at giving you that practical info. You usually will not get the answer unless you try it yourself. OCP triggering is a sketchy topic, really.
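Since reviews won't answer that directly, one crude sanity check is to add a transient-headroom multiplier on top of the rated component draws; the multiplier and the CPU/system wattages below are assumptions on my part, not any standard:

```python
# Crude PSU sizing check against GPU power spikes. Figures other than
# the GPU spec are illustrative assumptions; real transient behaviour
# varies per card, and per PSU (OCP tuning especially).

gpu_board_power_w = 335   # RX 6950 XT reference board power (spec value)
gpu_spike_factor = 1.8    # assumed short-transient multiplier on the GPU
cpu_w = 140               # assumed 5800X3D gaming draw
rest_of_system_w = 75     # assumed fans, drives, board, RAM

worst_case_w = gpu_board_power_w * gpu_spike_factor + cpu_w + rest_of_system_w
print(f"transient worst case: ~{worst_case_w:.0f} W")  # -> ~818 W

# A PSU whose OCP trips below this transient figure may shut off even
# though the average draw is far lower.
```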
 