
Sapphire Radeon RX 9070 XT Nitro+

Green cope is funny. :laugh:

I honestly struggle to find any instance where a 5070 Ti is going to give you a tangibly better experience. I keep looking, haven't found it yet.
Hell, even the 5080 in most cases isn't going to net you anything majorly worthwhile, especially if you factor in an OC to get you over any potential bump.

Honestly, scroll through the min-FPS pages, look at your resolution/game, and tell me where either of those is a good option compared to this, at this price. And if you find one, do you expect that to be the rule or the exception?

You can play Wukong at 1080p on either, or it can suck at 1440p on either. I am confused. I am looking at them. Are you? I see almost zero difference. 0.2fps at 1440p. Zero point two.
We've already seen it can upscale from 1080p->4k and keep 60fps with FG...That's pretty dope. So that's probably at least playable/decent as well.
I haven't seen how nVIDIA does in that instance. Big DLSS4 perf hit on that level of upscale, so IDK. This is why stuff like that needs to be tested/compared on a consistent basis (with lows/mins shown).

You do you...I'm just saying. There is real, real cope and reality distortion for some reason. Everyone should be happy, especially since FSR4 is apparently better than DLSS3, which is AWESOME NEWS.

It's still 10-20% slower in RT than a 5070 Ti; it's improved for sure, but nothing to celebrate imo.
Also look at the RT performance numbers in Wukong...
 
You do you...I'm just saying. There is real, real cope and reality distortion for some reason.
Same old, same old. It's a constant back and forth between the AMD and NVIDIA clubs. The only people not murdering each other are the Intel fans (all 4 of them)
Everyone should be happy, especially since FSR4 is apparently better than DLSS3, which is AWESOME NEWS.
I wonder how it compares to DLSS4, more specifically how close the FSR4 vs DLSS4 comparison is vs the FSR3 vs DLSS3 comparison. I think FSR4 is gonna close the gap even more now (not that I expect AMD to surpass NVIDIA anytime soon, but it's still really nice for AMD fans).
 
Getting one of these just to spite Nvidia, assuming availability is a non-issue, and a 9800X3D tomorrow. All-new build year for me. Full AMD at that. Haven't done that since the Phenom days.

I didn't notice it mentioned: are those two shunts on the power socket?
It's two fuses, plus shunts. Looks like either 2 or 4 shunts.
 
Well, I'll give it to them. I didn't think they'd actually hit the 7900 XTX numbers. But they did. Makes me really want that 9080 XT now.

Can't wait for an FSR4 deep dive.
LMAO we gonna see all the AMD fanboys love the 12V2x6 connector now!

Trolling aside, I like the design of this card. I do not like that it is 3 slots, nor that Sapphire wants $130 over MSRP for a different cooler and some magnets (how do they work?), but scalpers gonna scalp (and that includes AMD).

Power consumption is still bad compared to NVIDIA but I reckon a 9070 XT undervolted to 9070 levels will perform nicely. At least they fixed non-gaming power.


That would be true if the NVIDIA cards were available in any quantity... but they're not.
Man, you think it's bad now, wait for the first one to melt.
 
Well, I'll give it to them. I didn't think they'd actually hit the 7900 XTX numbers. But they did. Makes me really want that 9080 XT now.

Can't wait for an FSR4 deep dive.

DF has a video up on it and it looks promising. It absolutely murders FSR3, and I'd say in the games they used it's slightly better than DLSS3 but still behind DLSS4, though by a smaller margin than FSR2/3 was behind DLSS2/3.

It's now more than adequate, and going by my impression on a large 4K screen (based on a YouTube video), it's finally worth using.

Glad they are finally targeting their own hardware to make something much better.
 
Thanks for the review @W1zzard . I wish AMD had sent you at least one card with the regular MSRP of $599. In the conclusion, you write:

As fabrication process, AMD is using TSMC's 4 nanometer N4P node, while Blackwell is still on the same 5 nm process as Ada.

At least for the biggest Blackwell die, Nvidia says it's built using "4NP", which looks like a typo for N4P: the names are nearly identical, and N4P is a 4 nm process. In any case, the 5 nm and 4 nm processes aren't significantly different in feature size.
 
Where did you get this info on the new drivers and CS2? Card looks great, and if they can improve CS2 performance they can take my money!
It is just common sense:
Source 2 is an important game engine, so it is a safe bet that AMD will fix it in short order.
:toast:
 
So after everything, it seems like it was AMD who did wonders with the arch and the new 'midrange' is overall faster than their previous top end.

Meanwhile nVidia did jack all for two years and released cards that are ~10% faster in every tier except the unbalanced (read: lack of load balancing), piece of shit melting 5090 which I won't touch with a two foot pole and sold the one I got. F that

This is what a launch should be like. And it's weird saying that about an AMD launch lol
What midrange? This is the best they can do; there will be no stronger cards this generation. This is their top, no matter how they market it.
 
I wonder how it compares to DLSS4, more specifically how close the FSR4 vs DLSS4 comparison is vs the FSR3 vs DLSS3 comparison. I think FSR4 is gonna close the gap even more now (not that I expect AMD to surpass NVIDIA anytime soon, but it's still really nice for AMD fans).
Digital Foundry has a video answering that question
 
Every frame matters when your monitor is 480hz/500hz. The input lag difference is quite noticeable.
No one can realistically perceive a difference between 400 and 500 fps, and anyone who claims they do is lying. CS2 has a tick rate of 64; at that point the frame rate is 7-8 times higher than the rate at which the game logic updates, which is absurd.
 
What midrange? This is the best they can do; there will be no stronger cards this generation. This is their top, no matter how they market it.
This is AMD's midrange, they said as much, they are not making anything higher.

and nvidia's "high end" card is 21% faster average at 4k while costing nearly double at MSRP.
 
Man, you think it's bad now, wait for the first one to melt.
Unlikely, given the 9070 XT doesn't get anywhere near the 5090's insane power draw. That's why one 12V2x6 makes a lot more sense than multiple 8-pin power on midrange cards.

No one can realistically perceive a difference between 400 and 500 fps, and anyone who claims they do is lying. CS2 has a tick rate of 64; at that point the frame rate is 7-8 times higher than the rate at which the game logic updates, which is absurd.
The placebo effect is strong with FPS gamers.
 
Unlikely, given the 9070 XT doesn't get anywhere near the 5090's insane power draw. That's why one 12V2x6 makes a lot more sense than multiple 8-pin power on midrange cards.
We'll wait and see if they screw up. Pulling 300w instead of 600w over one pin is still sufficient to melt said pin.

Now if they have proper load balancing and sensing implemented, it shouldn't be a problem, but everyone thought nvidia had that figured out too....
 
Green cope is funny. :laugh:

I honestly struggle to find any instance where a 5070 Ti is going to give you a tangibly better experience. I keep looking, haven't found it yet.
Hell, even the 5080 in most cases isn't going to net you anything majorly worthwhile, especially if you factor in an OC to get you over any potential bump.

Honestly, scroll through the min-FPS pages, look at your resolution/game, and tell me where either of those is a good option compared to this, at this price. And if you find one, do you expect that to be the rule or the exception?

You can play Wukong at 1080p on either, or it can suck at 1440p on either. I am confused. I am looking at them. Are you? I see almost zero difference. 0.2fps at 1440p. Zero point two.
We've already seen it can upscale from 1080p->4k and keep 60fps with FG...That's pretty dope. So that's probably at least playable/decent as well.
I haven't seen how nVIDIA does in that instance. Big DLSS4 perf hit on that level of upscale, so IDK. This is why stuff like that needs to be tested/compared on a consistent basis (with lows/mins shown).

You do you...I'm just saying. There is real, real cope and reality distortion for some reason. Everyone should be happy, especially since FSR4 is apparently better than DLSS3, which is AWESOME NEWS.
No cope here; what nvidia has pulled this month left me with a sick stomach: horrible pricing and zero availability.
In the end only price matters, but pricing the 9070 XT near a 5070 Ti isn't something I'd be excited about.

Knowing how it goes in Europe, the 9070 XT's price would mean a €900-€1000 street price here, but we'll see tomorrow I guess.
 
Judging from the relative performance charts, a 96 CU die would have beaten the 4090 and a 128 CU die would have landed closer to the 5090 than the 5080. The performance per mm^2 is encouraging; hopefully, AMD will try and ship at least a 500 mm^2 die next time to entice the whales.
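For what it's worth, the bigger-die speculation above is just linear CU scaling. Here's a minimal sketch of that arithmetic, assuming the 9070 XT's 64 CUs as a 100% baseline; real scaling is always sub-linear (bandwidth, power, and clock limits), so treat these as optimistic upper bounds:

```python
# Naive linear CU scaling from the 9070 XT (64 CUs) as a 100% baseline.
# Purely illustrative; actual GPUs scale sub-linearly with CU count.
base_cus = 64
base_perf = 100.0  # 9070 XT = 100% reference

for cus in (96, 128):
    perf = base_perf * cus / base_cus
    print(f"{cus} CU die: ~{perf:.0f}% of 9070 XT (upper bound)")
```

So a 96 CU die tops out around 1.5x and a 128 CU die around 2x before any real-world losses, which is roughly where the 4090/5090 comparisons come from.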
 
Same old, same old. It's a constant back and forth between the AMD and NVIDIA clubs. The only people not murdering each other are the Intel fans (all 4 of them)
Is this really still a thing? Because I have been on nvidia Reddit subs for a month and it's raining negativity.
I don't think anyone will try to defend Nvidia anymore after the stunts they pulled recently.

I think it's only AMD fanboys left who think we're all nvidia fanboys the moment we start to criticize their product.
That isn't the case: I fucking hate nvidia, the company disgusts me, and I'm also not a fan of AMD, so there's that.
Why not AMD? Because of their "Nvidia minus $50" pricing policy.
 
Is this really still a thing? Because I have been on nvidia Reddit subs for a month and it's raining negativity.
I don't think anyone will try to defend Nvidia anymore after the stunts they pulled recently.
It's not that just anyone is defending NVIDIA; it's a vocal minority. AMD has the same kind of people. It really is the same old, same old: a vocal minority on both sides will always complain about the other.
I think it's only AMD fanboys left who think we're all nvidia fanboys the moment we start to criticize their product.
That isn't the case: I fucking hate nvidia, the company disgusts me, and I'm also not a fan of AMD, so there's that.
Yeah, no reason to be a fan of either unless you really enjoy their products that much; just don't become delusional about it like some people do.
Why not AMD? Because of their "Nvidia minus $50" pricing policy.
Which, in complete fairness, nobody likes; at least AMD is upping the ante a bit by increasing just how much of a discount it is. We should have gotten stuff like the RX 7000 price cuts after launch AT LAUNCH. The 9070 XT, to an extent, delivers on that front.
 
Sounds decent. I'm interested in seeing what they actually sell for though.
 
No one can realistically perceive a difference between 400 and 500 fps, and anyone who claims they do is lying. CS2 has a tick rate of 64; at that point the frame rate is 7-8 times higher than the rate at which the game logic updates, which is absurd.
You don't play CS2, I can tell. If someone fires on the client side, the server's subtick system still accepts the input; the 64 updates a second aren't the point of limitation.

240 to 500 fps can be perceived if the motion in the game happens fast enough, i.e., panning the camera quickly. 500 also isn't the frame limit. The input lag difference can be noticed because the first person to fire usually gets the kill, which is the whole point.
 
You don't play CS2, I can tell. If someone fires on the client side, the server's subtick system still accepts the input; the 64 updates a second aren't the point of limitation.

240 to 500 fps can be perceived if the motion in the game happens fast enough, i.e., panning the camera quickly. 500 also isn't the frame limit. The input lag difference can be noticed because the first person to fire usually gets the kill, which is the whole point.
Right, but the shot won't be applied until the next tick. Rendering 37 extra frames before that tick isn't going to make the shot land earlier.

Ultra-high FPS is in the same realm as diamond-plated audio cables for audiophiles.
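For anyone curious where the "7-8 times" and "37 extra frames" figures come from, it's simple arithmetic on frame time vs. tick interval (quick Python sketch; the 64 Hz tick rate is the only input, and it ignores the subtick timestamping mentioned above):

```python
# Frame time vs. CS2's 64 Hz server tick interval
tick_rate = 64                 # game-logic updates per second
tick_ms = 1000 / tick_rate     # ~15.6 ms between ticks

for fps in (240, 400, 500):
    frame_ms = 1000 / fps
    frames_per_tick = tick_ms / frame_ms
    print(f"{fps} fps: {frame_ms:.2f} ms/frame, "
          f"~{frames_per_tick:.1f} frames rendered per tick")
```

At 500 fps you render roughly 7.8 frames per 15.6 ms tick, i.e. about 7 "extra" frames between logic updates; the counter-argument above is that subtick timestamps your input anyway, so the tick interval isn't the whole story.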
 
This generation is so frustrating. No MSRP cards for the Radeons either, huh?
The Nitro+ is Sapphire's top-of-the-line card and has always carried a big premium.
 