
AMD Radeon RX 9070 XT Tested in Cyberpunk 2077 and Black Myth: Wukong

This seems too good to be true... but if TPU and HWU say this is legit, I'd be very happy to buy this GPU. Like everyone else here, I'm curious to see what the MSRP will be.
 
This seems too good to be true... but if TPU and HWU say this is legit, I'd be very happy to buy this GPU. Like everyone else here, I'm curious to see what the MSRP will be.
This is pretty much my view. Once W1zzard gives me the good numbers, I'll be recommending it to my friends looking to upgrade. I'll be keeping my 7900 XTX myself, but my recommendations will (hopefully) be this and one of the Battlemage cards.
 
Now to see what excuses ensue. :nutkick:
I can think of three:

1. It doesn't have DLSS.

2. Power consumption sucks.

3. The classic "AMD drivers suck".

Pick your poison.
 
AMD says no one has access to the finished driver. How accurate is this information then?

Is the "finished" driver supposed to increase performance by a significant / noticeable margin?

I doubt that Nvidia will speed up the 2-to-2.5-year release cycle they have now. I'm wondering, however, what the 3 nm GPU lineup will look like. In theory, it should be much faster, but with TSMC hiking prices, we'll probably get even more cut-down chips for any tier below the xx90.


2 nm at $30,000 and 3 nm at $18,000 per wafer are prices that will stop both AMD and Nvidia from releasing. Given also that the 3 nm process is not suitable for GPUs at all, stagnation is here, real, and here to stay.
Goodbye, progress in the graphics department!
 
I hope they pit this against the 5070, because then it has a very nice VRAM advantage. If they try to pit it against the 5070 Ti, the VRAM advantage is gone, and then it's just down to performance and software.

Nearly 4080 performance for an expected price of around $550 is a winner in my books. Let's not forget about the 16 GB of VRAM, either; the 5070 will only have 12.
I have some bad news, though: the leaks (grain of salt, of course) suggest that AMD is pitting this against the 5070 Ti, not the 5070. The regular non-XT 9070 will go against the 5070. I hope that's false.
 
I think it is Path Tracing per the third picture, and I could run RT Ultra on my 7800 XT fine... I forget the FPS and settings, but it was playable. There are two more RT levels above Ultra, though, IIRC, which kind of defeats the term.
They're just heavily overselling the very same reflections I was seeing yesterday in my playthrough of rasterized Hitman 1-3 levels. Strangely, without using RT. Cars, even razor-sharp reflections of moving assets and NPCs, are all there. High-density environments, tons of light sources... it looks fine.

But... oh no, they don't appear when the actual NPC is off screen! ...But... it does run a locked 144 FPS with all settings maxed... and the 7900 XT was barely using 150 W doing so, lol. The gap is just so immense, it's insane. And the visual change? Practically interchangeable.
 
I hope they pit this against the 5070, because then it has a very nice VRAM advantage. If they try to pit it against the 5070 Ti, the VRAM advantage is gone, and then it's just down to performance and software.


I have some bad news, though: the leaks (grain of salt, of course) suggest that AMD is pitting this against the 5070 Ti, not the 5070. The regular non-XT 9070 will go against the 5070. I hope that's false.
What are you talking about with "pit"?

The only thing that matters is perf/price.

5070 @ $550.
5070 Ti @ $800.

The 9070 XT will be <= $550, so worst case at 5070 price (and it might be up to $100 cheaper). That's what it is pitted against, not the much more expensive 5070 Ti.

The 9070 will be even cheaper, probably fighting at 5060 Ti price.

Now, what about the performance? We only have some leaks for the 9070 XT, and nothing at all for the 5070 AFAIK, so let's wait.
I guess the 9070 XT will be in between the 5070 and 5070 Ti in terms of performance.
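To make the perf/price framing concrete, here's a toy sketch. The prices are the ones quoted above, but the performance index values are purely made-up placeholders (there are no real 5070 or 9070 XT benchmarks yet), so treat the output as an illustration of the math, not a prediction:

```python
# Toy perf-per-dollar comparison. Prices are from the thread;
# the "perf" index numbers are invented placeholders just to
# show the arithmetic (5070 = 100 baseline, 9070 XT assumed
# between the 5070 and 5070 Ti as speculated above).
cards = {
    "RTX 5070":    {"price": 550, "perf": 100},
    "RTX 5070 Ti": {"price": 800, "perf": 125},
    "RX 9070 XT":  {"price": 550, "perf": 110},
}

# Rank by performance per dollar, best value first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["perf"] / kv[1]["price"],
                reverse=True)

for name, c in ranked:
    print(f"{name}: {c['perf'] / c['price']:.3f} perf per dollar")
```

With these placeholder numbers the 9070 XT comes out on top simply because it matches the 5070's price with assumed higher performance; move the assumed price or perf and the ranking shifts, which is exactly the "adjust the price accordingly" game described in the thread.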
 
I hope they pit this against the 5070, because then it has a very nice VRAM advantage. If they try to pit it against the 5070 Ti, the VRAM advantage is gone, and then it's just down to performance and software.


I have some bad news, though: the leaks (grain of salt, of course) suggest that AMD is pitting this against the 5070 Ti, not the 5070. The regular non-XT 9070 will go against the 5070. I hope that's false.
What 5070 Ti?
 
What are you talking about with "pit"?

The only thing that matters is perf/price.

5070 @ $550.
5070 Ti @ $800.

The 9070 XT will be <= $550, so worst case at 5070 price (and it might be up to $100 cheaper). That's what it is pitted against, not the much more expensive 5070 Ti.

The 9070 will be even cheaper, probably fighting at 5060 Ti price.

Now, what about the performance? We only have some leaks for the 9070 XT, and nothing at all for the 5070 AFAIK, so let's wait.
I guess the 9070 XT will be in between the 5070 and 5070 Ti in terms of performance.
I don't think this will be below $550, though. If it were, they'd have told us at CES, man.
 
I have some bad news, though: the leaks (grain of salt, of course) suggest that AMD is pitting this against the 5070 Ti, not the 5070. The regular non-XT 9070 will go against the 5070. I hope that's false.

5070 @ $550.
5070 Ti @ $800.
The 9070 XT will be <= $550, so worst case at 5070 price (and it might be up to $100 cheaper). That's what it is pitted against, not the much more expensive 5070 Ti.

I don't think this will be below 550 though. If it was, they'd tell us at CES man.

It will be bad news for AMD - goodbye sales, and welcome shrinking market share.
Sooner or later AMD will reach 5% market share, then 0.5%, and then they will stop releasing new GPUs that no one buys.
 
Is the "finished" driver supposed to increase performance by a significant / noticeable margin?
According to Frank Azor, performance will be higher.

I just don’t know how much more.
 
It will be bad news for AMD - goodbye sales and welcome shrinking market share.
Sooner or later AMD will reach 5% market share, and then 0.5%, and then they will stop releasing new GPUs that no one buys.
AMD suffers more from the poor reception of the PS5 Pro than from not making competitive dGPUs for the PC market these days.

Their gaming GPU division floats on consoles, not us.
 
bad reception of the PS5Pro
These go for as much as $1,150 over here. I wouldn't be surprised if that's the reason why.

An i5-12400F + 32 GB DDR4-3200 CL16 + 4070 Super / RX 7900 GRE + 1 TB NVMe build will cost you about as much. You don't need to be a rocket scientist to tell which one plays games better.
 
2 nm at $30,000 and 3 nm at $18,000 per wafer are prices that will stop both AMD and Nvidia from releasing. Given also that the 3 nm process is not suitable for GPUs at all, stagnation is here, real, and here to stay.
Goodbye, progress in the graphics department!
I'd say that in the next 10 years dGPUs to play games will go extinct as it becomes unfeasible to release them.
 
I don't think this will be below $550, though. If it were, they'd have told us at CES, man.
Well, they didn't say at CES because they want to see the performance of the 9070 (XT) vs. the 5070 (Ti), and they will adjust the price accordingly.

We know they are targeting as cheap a GPU as possible, and their aim is to undercut Nvidia as much as possible on raster perf/price.

The leaks point to $450-550, which seems reasonable to me. Also, the very naming 9070 (XT) points at the upper mainstream segment, not the enthusiast (high-price) level, so $600 would be the highest possible for such a segment [$400-600].

Hence, the card will be positioned below the 5070 in price, with likely higher raster performance.
How much lower in price / how much faster will probably go together (if performance is very good, they will price higher on the curve, ~$550; if it's barely faster than the 5070, they will price much lower, at $450).

We'll just have to wait!
 
With relatively decent ray-tracing performance, perhaps they can carry any gained mindshare into RDNA 5 (that unified-architecture card) when path tracing starts becoming mainstream. Like someone here said, "it's all math", so it's interesting to see how companies start to diversify beyond strictly visuals with their "suites", a la "Project Amethyst" / DirectX Neural Rendering / Nvidia ACE, etc.
 
So based on those benchmark numbers, how does this card compare to, say, the 7900 XT, theoretically?
 
AMD suffers more from the poor reception of the PS5 Pro than from not making competitive dGPUs for the PC market these days.

Their gaming GPU division floats on consoles, not us.

Bad reception could be influenced by several things:
1. Economic recession - people have no money to spend;
2. No interesting new games;
3. Covid and other disruptive circumstances that shift interest in other directions.

I'd say that in the next 10 years dGPUs to play games will go extinct as it becomes unfeasible to release them.

RDNA 4 in this form is a mistake. Instead, they should have tried to design a large monolithic die with 7,000-8,000 shaders, a 512-bit memory bus, and improved ray tracing.
 
AMD announced improved H.264 encoding, and Anti-Lag 2 is picking up traction... I'm not looking for a GPU right now, but next cycle I'll definitely consider it, especially as they're likely to offer better VRAM and pricing in one go.
On the other hand, this will no doubt force Nvidia not to inflate their GPU prices as much. I have a bit of hope for this market.
 
About the same to slightly faster in raster, but much faster in RT.
"DLSS 4, along with all the multi-frame generation AI chicanery appears to be all set to shake up the way we measure performance, making things even more complicated."
 
This round I might go Nvidia: that 5080 with dual decoders is a godsend for video editing, and then there's the amazing support for CUDA in all Adobe products and all the plugins, which are optimized for CUDA first and OpenCL second.
One Nvidia RTX 5000-series decoder can do 8 streams of 4K 60 FPS, and the RTX 5080 has two; that's just nuts. Nvidia killed Intel Quick Sync and all the Apple M chips.
AMD is just dead. I can't see how they can compete with this at any price: Nvidia has better upscaling, better professional support, better ray tracing, better AI, so why would anyone buy AMD for $50-100 less?
I hope they can survive this so we have some competition, but this might be over.
 
This round I might go Nvidia: that 5080 with dual decoders is a godsend for video editing, and then there's the amazing support for CUDA in all Adobe products and all the plugins, which are optimized for CUDA first and OpenCL second.
One Nvidia RTX 5000-series decoder can do 8 streams of 4K 60 FPS, and the RTX 5080 has two; that's just nuts. Nvidia killed Intel Quick Sync and all the Apple M chips.
AMD is just dead. I can't see how they can compete with this at any price: Nvidia has better upscaling, better professional support, better ray tracing, better AI, so why would anyone buy AMD for $50-100 less?
I hope they can survive this so we have some competition, but this might be over.
Sorry I couldn’t hear you over all the Nvidia brand loyalists saying the exact same thing.

"DLSS 4, along with all the multi-frame generation AI chicanery appears to be all set to shake up the way we measure performance, making things even more complicated."
Only in Huang’s wet dreams.
 