
AMD Radeon RX 9070 16 GB Graphics Cards Drop Below MSRP in Europe; Only Temporarily

T0@st

News Editor
AMD's Radeon RX 9070 16 GB graphics card design launched three months ago, alongside its more capable and popular XT sibling. Since then, the first wave of RDNA 4 desktop gaming products has hovered above suggested price baselines, much to the chagrin of brand champions. Yesterday, Germany's ComputerBase highlighted a brief dip in these elevated trends. Team Red's general European MSRP for Radeon RX 9070 cards is €629, including VAT. The outlet's price report observed (on June 3): "Alternate.de is currently selling an XFX QuickSilver RX 9070 OC Gaming Edition for the first time at €613 (inc. VAT), thus below the MSRP. The (ComputerBase) editorial team was alerted to this by the community, and the bot for prices and availability for Radeon RX 9000 now also shows this offer." Naturally, graphics connoisseurs will scoff at this unusually low offer; after all, a mild upcharge grants access to the superior Radeon RX 9070 XT tier (MSRP: €689). The slightly cheaper option does have its supporters, mainly due to its more energy-efficient operation.

Members of the HotUKDeals community have become obsessed with finding deep graphics card discounts; a lucky few have boasted about acquiring current-gen AMD-based flagships well below recommended price points. Several discerning customers have taken advantage of anomalous listings and the roundabout use of various eBay promotion codes. Pleasingly, a dual-fan Sapphire PULSE RX 9070 16 GB model floated just below the British MSRP (£569.99, inc. VAT); Amazon UK's stock of this barebones option was quickly depleted, thanks to a tempting £10 reduction. Until the emergence of a current-gen Great Radeon Edition (GRE) design, AMD's Radeon RX 9070 model is generally considered an odd duck. A permanent price cut could raise its profile in the future.



View at TechPowerUp Main Site | Source
 
Now the same needs to happen in the People's Republic of Canadistan.
 
The 9070 XT now has a €699 model in Germany; I think that is also below MSRP, or just very slightly above it. Unfortunately, it's from Acer.
 
Technically, the 9070 non-XT is an odd duck, mainly because it's a 9070 XT with 8 fewer CUs.

But obviously we get why: these are "failed" dies that did not meet the quality bar to be a 9070 XT in the first place (maybe some of the CUs didn't pass validation, or similar).
 
Great, but who cares about the non-XT model? And it's also still too expensive.
 
Nothing special when you consider the Pulse and PowerColor Reaper were under £540 at launch. Most of them have been over £80 more than a 5070 for months; not sure why anyone is buying them when they're at the launch price of the 9070 XT. I wonder if AMD is going to pull the same trick with the 9060 XT and just increase it by 10-20% a week later.
 
It's Germany. Some shops will not sell to neighbouring countries.

I think the cards are still overpriced. It's summer.
 
Great, but who cares about the non-XT model? And it's also still too expensive.
The main issue is that it's significantly slower than the XT model while barely costing any less.
 
Great, but who cares about the non-XT model? And it's also still too expensive.
For example, me, and anyone else who focuses first on the highest possible energy efficiency, silence and elegance of their PC, and only then, within those constraints, on pure rendering performance. I am not interested in cards with a TDP above 300 W at all. The RX 9070, with its 220 W TDP, offers significantly higher energy efficiency than its more powerful XT sibling with a 304 W TDP.

My current RX 6800 16 GB (7 nm, 250 W TDP) already offers high energy efficiency from the factory, and after a manual UV even much more. Although I got a GPU with only average UV potential, in the end, at the cost of reducing the GPU clock by 5%, the voltage by 9% and the power limit by 8%, I was able to significantly increase the already high factory energy efficiency. The GPU now draws only 70-140 W in 1440p (GPU only, no VRAM). This is a full-size Gigabyte card with a large heatsink and three fans. The result: silence under full load, something beautiful.
I wonder what the RX 9070 will show after UV.
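As a rough sanity check of how much a mild undervolt like that can shave off, here is a back-of-the-envelope Python sketch using the common dynamic-power approximation P ≈ f·V² (it ignores static/leakage power, so treat the result as an estimate rather than a measurement):

# Back-of-the-envelope estimate for the undervolt described above, using the
# usual dynamic-power approximation P ~ f * V^2. Static/leakage power is
# ignored, so this is a rough sketch, not a measurement.
clock_scale = 0.95    # -5% GPU clock
voltage_scale = 0.91  # -9% voltage

power_scale = clock_scale * voltage_scale ** 2
print(f"estimated dynamic power: {power_scale:.2f}x of stock "
      f"at {clock_scale:.0%} of the stock clock")
# Prints roughly 0.79x power for ~95% of the clock, which lines up with the
# sizeable efficiency gain described in the post.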
 
The main issue is that it's significantly slower than the XT model while barely costing any less.
This is actually my concern about the 9070. At the previous MSRPs, it would only be a $50 difference for quite a bit more performance with the 9070 XT. Might as well spend that extra $50 for the extra +12% to +15% it nets you, which has been shown to reach the RTX 5080 at times.
 
This is actually my concern about the 9070. At the previous MSRPs, it would only be a $50 difference for quite a bit more performance with the 9070 XT. Might as well spend that extra $50 for the extra +12% to +15% it nets you, which has been shown to reach the RTX 5080 at times.
I think that's a feature, not a bug. It's a deliberate upsell on AMD's part. The 9070 will probably mostly find its way into pre-built systems to save costs, and maybe be used as a heavily discounted attempt to grab some market share later down the line when the generation is halfway done.
 
For example, me, and anyone else who focuses first on the highest possible energy efficiency, silence and elegance of their PC, and only then, within those constraints, on pure rendering performance. I am not interested in cards with a TDP above 300 W at all. The RX 9070, with its 220 W TDP, offers significantly higher energy efficiency than its more powerful XT sibling with a 304 W TDP.

My current RX 6800 16 GB (7 nm, 250 W TDP) already offers high energy efficiency from the factory, and after a manual UV even much more. Although I got a GPU with only average UV potential, in the end, at the cost of reducing the GPU clock by 5%, the voltage by 9% and the power limit by 8%, I was able to significantly increase the already high factory energy efficiency. The GPU now draws only 70-140 W in 1440p (GPU only, no VRAM). This is a full-size Gigabyte card with a large heatsink and three fans. The result: silence under full load, something beautiful.
I wonder what the RX 9070 will show after UV.
Why not undervolt a 9070 XT? You can UV a 9070 XT down close to the power consumption of a 9070 while still getting higher performance.
 
For example, me, and anyone else who focuses first on the highest possible energy efficiency, silence and elegance of their PC, and only then, within those constraints, on pure rendering performance. I am not interested in cards with a TDP above 300 W at all. The RX 9070, with its 220 W TDP, offers significantly higher energy efficiency than its more powerful XT sibling with a 304 W TDP.

My current RX 6800 16 GB (7 nm, 250 W TDP) already offers high energy efficiency from the factory, and after a manual UV even much more. Although I got a GPU with only average UV potential, in the end, at the cost of reducing the GPU clock by 5%, the voltage by 9% and the power limit by 8%, I was able to significantly increase the already high factory energy efficiency. The GPU now draws only 70-140 W in 1440p (GPU only, no VRAM). This is a full-size Gigabyte card with a large heatsink and three fans. The result: silence under full load, something beautiful.
I wonder what the RX 9070 will show after UV.
Let me introduce you to the Radeon Chill feature. What would you say to sub-200 W average power consumption while running games maxed out, ray tracing and all, on an RX 9070 XT with a 340 W TDP? That's the magic of Radeon Chill.

I have it set to a 100 fps lower limit and a 238 fps upper limit (I have a 240 Hz display, so the upper limit doubles as a frame limiter to avoid tearing). I also use AFMF, which further blurs the line between the lower and upper limits. The card has the grunt to push higher framerates when needed, but when there's no need, it simply won't, saving power while not really affecting how games play, because it's entirely seamless.

Marvel Rivals, Overwatch 2, Oblivion Remastered, Doom: The Dark Ages, Dying Light 2: I play all of these maxed out with Radeon Chill and I genuinely can't tell whether it's on or off. It's wild how underrated the feature is. Granted, it works best with first-person shooters (or other mouse-controlled action games), because the framerate is dictated by how intense the mouse movement is, so side-scrollers, racing games and possibly strategy games may not benefit as much, but in action games where the mouse is used for aiming it works amazingly well. Seriously, I urge all Radeon users to check it out.

Even if the RTX 5070 Ti has the upper hand in efficiency by a few tens of watts in most games, it loses heavily to the RX 9070 XT in power consumption because of this "simple" software feature. Which is why I think it's a shame reviewers don't really talk about it. If they have to cover all of NVIDIA's frame-gen and DLSS features, why not cover this too? It's a real advantage, and for me at least a newly discovered requirement in a graphics card, which means Radeon will be the preferred choice next time. That's kind of wild considering I've been on NVIDIA for the last 10 years.
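For anyone curious what Radeon Chill is doing conceptually, here is a toy Python sketch of the idea (this is not AMD's actual implementation, and the input-intensity function is a made-up stand-in): drop the frame cap toward the lower limit when input is calm, and raise it toward the upper limit when the mouse is busy.

import math
import time

# The two limits quoted in the post above.
FPS_MIN, FPS_MAX = 100, 238

def input_intensity(t: float) -> float:
    """Stand-in for 'how busy is the mouse right now' on a 0..1 scale."""
    return 0.5 + 0.5 * math.sin(t)  # pretend the player alternates idle/action

start = time.time()
for frame in range(300):  # simulate a few seconds of frames, then stop
    t = time.time() - start
    target_fps = FPS_MIN + (FPS_MAX - FPS_MIN) * input_intensity(t)
    frame_budget = 1.0 / target_fps

    frame_start = time.time()
    # ... the game would render the frame here ...
    elapsed = time.time() - frame_start
    if elapsed < frame_budget:
        # Capping the framerate when little is happening is where the
        # power saving comes from.
        time.sleep(frame_budget - elapsed)

The real feature reacts to actual mouse and keyboard input per game, but the shape of the behaviour is the same: a high cap during action, a low cap when idle.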
 
Let me introduce you to the Radeon Chill feature. What would you say to sub-200 W average power consumption while running games maxed out, ray tracing and all, on an RX 9070 XT with a 340 W TDP? That's the magic of Radeon Chill.

I have it set to a 100 fps lower limit and a 238 fps upper limit (I have a 240 Hz display, so the upper limit doubles as a frame limiter to avoid tearing). I also use AFMF, which further blurs the line between the lower and upper limits. The card has the grunt to push higher framerates when needed, but when there's no need, it simply won't, saving power while not really affecting how games play, because it's entirely seamless.

Marvel Rivals, Overwatch 2, Oblivion Remastered, Doom: The Dark Ages, Dying Light 2: I play all of these maxed out with Radeon Chill and I genuinely can't tell whether it's on or off. It's wild how underrated the feature is. Granted, it works best with first-person shooters (or other mouse-controlled action games), because the framerate is dictated by how intense the mouse movement is, so side-scrollers, racing games and possibly strategy games may not benefit as much, but in action games where the mouse is used for aiming it works amazingly well. Seriously, I urge all Radeon users to check it out.

Even if the RTX 5070 Ti has the upper hand in efficiency by a few tens of watts in most games, it loses heavily to the RX 9070 XT in power consumption because of this "simple" software feature. Which is why I think it's a shame reviewers don't really talk about it. If they have to cover all of NVIDIA's frame-gen and DLSS features, why not cover this too? It's a real advantage, and for me at least a newly discovered requirement in a graphics card, which means Radeon will be the preferred choice next time. That's kind of wild considering I've been on NVIDIA for the last 10 years.
Uhm, no. Just no. Please oh god no.
 
Technically, the 9070 non-XT is an odd duck, mainly because it's a 9070 XT with 8 fewer CUs.

But obviously we get why: these are "failed" dies that did not meet the quality bar to be a 9070 XT in the first place (maybe some of the CUs didn't pass validation, or similar).
The 9070 non-XT is the GPU to buy. You get 220 W power consumption instead of 315-330 W, and it is only around 10% slower for 7% less money. You can also get one of the smaller cards, so it doesn't have to be a brick.
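Taking those numbers at face value (they are the rough figures from this post, not benchmarks), a quick Python check shows where each card wins:

# Relative performance, board power and price are the rough figures from the
# post above (assumptions, not measurements).
perf = {"RX 9070": 0.90, "RX 9070 XT": 1.00}   # relative performance
power = {"RX 9070": 220, "RX 9070 XT": 315}    # board power in watts
price = {"RX 9070": 0.93, "RX 9070 XT": 1.00}  # relative price (~7% cheaper)

for card in perf:
    print(f"{card}: perf/W = {perf[card] / power[card] * 1000:.2f}, "
          f"perf/price = {perf[card] / price[card]:.2f}")
# With these inputs the non-XT comes out roughly 25-30% ahead on
# performance-per-watt, but slightly *behind* on performance-per-dollar.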
 
Uhm, no. Just no. Please oh god no.
Ah, one of those "no fake pixels, no fake frames, god forbid any framerate optimization that would result in lower power consumption" types. I guess the year 2001 called, right before we got pixel shaders, because those might already be too impure...
 
Let me introduce you to the Radeon Chill feature. What would you say to sub-200 W average power consumption while running games maxed out, ray tracing and all, on an RX 9070 XT with a 340 W TDP? That's the magic of Radeon Chill.

I have it set to a 100 fps lower limit and a 238 fps upper limit (I have a 240 Hz display, so the upper limit doubles as a frame limiter to avoid tearing). I also use AFMF, which further blurs the line between the lower and upper limits. The card has the grunt to push higher framerates when needed, but when there's no need, it simply won't, saving power while not really affecting how games play, because it's entirely seamless.

Marvel Rivals, Overwatch 2, Oblivion Remastered, Doom: The Dark Ages, Dying Light 2: I play all of these maxed out with Radeon Chill and I genuinely can't tell whether it's on or off. It's wild how underrated the feature is. Granted, it works best with first-person shooters (or other mouse-controlled action games), because the framerate is dictated by how intense the mouse movement is, so side-scrollers, racing games and possibly strategy games may not benefit as much, but in action games where the mouse is used for aiming it works amazingly well. Seriously, I urge all Radeon users to check it out.

Even if the RTX 5070 Ti has the upper hand in efficiency by a few tens of watts in most games, it loses heavily to the RX 9070 XT in power consumption because of this "simple" software feature. Which is why I think it's a shame reviewers don't really talk about it. If they have to cover all of NVIDIA's frame-gen and DLSS features, why not cover this too? It's a real advantage, and for me at least a newly discovered requirement in a graphics card, which means Radeon will be the preferred choice next time. That's kind of wild considering I've been on NVIDIA for the last 10 years.
Unfortunately, Radeon Chill has been useless for me for the last 5 years. Probably because I always actively move my mouse, especially in FPS games. Even in Oblivion Remastered I am always trying to bunny hop (Acrobatics 100) so this is probably a me problem. LOL

Anti-Lag (including Anti-Lag 2) and FRTC, though, are extremely useful, especially with FreeSync Premium off (along with any in-game VSync settings) for eSports/competitive shooters. FRTC is also very useful for 2D games and boomer shooters, especially if there is no in-game frame limiter.
 
Ah, one of those "no fake pixels, no fake frames, god forbid any framerate optimization that would result in lower power consumption" types. I guess the year 2001 called, right before we got pixel shaders, because those might already be too impure...
It's useless. It lowers your framerate, stutters when your fps suddenly skyrockets due to sudden movement, increases input latency, and increases it twice over because you can't use Anti-Lag with it. And then on top of that you're using AFMF! Come on, man...

If you are playing a lightweight game and want to save power, just power limit the thing. I'm doing it on my 4090: when I'm not playing the latest AAA game, I have it at 220 W.
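For reference, on NVIDIA cards that kind of cap can be applied from the stock nvidia-smi CLI; a minimal Python wrapper might look like this (it needs admin/root rights, and 220 is just the wattage mentioned above, not a recommendation):

import subprocess

def current_power_limit() -> str:
    """Read the current board power limit via nvidia-smi's query interface."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.limit", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

def set_power_limit(watts: int) -> None:
    """Apply a board power limit (nvidia-smi -pl <watts>); needs elevated rights."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print("current limit:", current_power_limit())
    set_power_limit(220)  # the 220 W figure from the post above
    print("new limit:", current_power_limit())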
 
Great, but who cares about the non-XT model? And it's also still too expensive.
Smart people care.

They buy it and flash the BIOS from the XT version. At the end of the day you have the XT version for less money; the same was done back in the day with the Vega 56, flashing the BIOS from the Vega 64.
 
For example, me, and anyone else who focuses first on the highest possible energy efficiency, silence and elegance of their PC, and only then, within those constraints, on pure rendering performance. I am not interested in cards with a TDP above 300 W at all. The RX 9070, with its 220 W TDP, offers significantly higher energy efficiency than its more powerful XT sibling with a 304 W TDP.

My current RX 6800 16 GB (7 nm, 250 W TDP) already offers high energy efficiency from the factory, and after a manual UV even much more. Although I got a GPU with only average UV potential, in the end, at the cost of reducing the GPU clock by 5%, the voltage by 9% and the power limit by 8%, I was able to significantly increase the already high factory energy efficiency. The GPU now draws only 70-140 W in 1440p (GPU only, no VRAM). This is a full-size Gigabyte card with a large heatsink and three fans. The result: silence under full load, something beautiful.
I wonder what the RX 9070 will show after UV.
You know you can just limit clocks to get a power usage you like, no?

I run my 7900 XT at around 250 W. A wider/bigger GPU has more options to reduce power without losing a lot of performance.

So I'm not seeing this argument in favor of a 9070 that is missing shaders.
 
You know you can just limit clocks to get a power usage you like, no?

I run my 7900 XT at around 250 W. A wider/bigger GPU has more options to reduce power without losing a lot of performance.

So I'm not seeing this argument in favor of a 9070 that is missing shaders.
Common sense ain't that common. When I said I'm running my 4090 at 320 W, they told me I should have bought a 4080, lol.
 
It's useless. It lowers your framerate, stutters when your fps suddenly skyrockets due to sudden movement, increases input latency, and increases it twice over because you can't use Anti-Lag with it. And then on top of that you're using AFMF! Come on, man...

If you are playing a lightweight game and want to save power, just power limit the thing. I'm doing it on my 4090: when I'm not playing the latest AAA game, I have it at 220 W.
Lol, it does none of what you've described. I'm literally playing competitive online games like Marvel Rivals and Overwatch 2 with Radeon Chill enabled. Anti-Lag makes next to no difference, AFMF doesn't induce any latency I can notice, and Anti-Lag 2 doesn't negate it in any way I could feel either.

This is with the latest Windows 11; there was some choppy behavior with Radeon Chill a few months ago, but not anymore. I really can't sense the framerate fluctuations at all, and I'm running an OLED monitor where every imperfection is visible instantly.
 