
AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

well i sure as fuck am not spending $1200 on an RTX 3080. so here's to hoping i can beat the fucking bots on December 13th.

if i do, i'm retiring from the hobby for a solid 10 years. fuck the noise, time to game.
 
you won't be able to buy one. bots and third-party scalpers will get them all. wait and see.

Just wait for stock to stabilize, prices will drop

edit: yeah, if you want to build right now it's tough, but waiting out the first wave is bound to save money and stress

2 x 8-pin
They actually pointed that out as a selling point.. and it made me laugh.
They do catch up with the news.

Ahah, they already announced that a couple of weeks ago, right after the first problems with the 12-pin connector appeared.
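
For what it's worth, the old connectors do cover it on paper. A quick check against the PCIe spec limits (355 W TBP is AMD's announced XTX figure):

```python
# Maximum power available through the classic delivery path,
# per the PCIe specification:
EIGHT_PIN_W = 150  # max per 8-pin PCIe power connector
SLOT_W = 75        # max drawn through the PCIe slot

available = 2 * EIGHT_PIN_W + SLOT_W
print(f"2x 8-pin + slot: {available} W")  # 375 W vs. the XTX's 355 W TBP
```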
 
A 70% performance increase for the same price sounds good... at $1699 MSRP, do people think the 4090 is 70% faster than the 7900 XTX, to justify the 70% increase in price?

All the games in the slide show a 50% increase except Cyberpunk. That means it's a 50% increase, not 70. And the 4090 is $1,600, not $1,700.
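
Quick sanity check on the price side of that (MSRPs as above):

```python
# Price ratio between the two flagships at MSRP.
xtx = 999        # RX 7900 XTX (USD)
rtx_4090 = 1599  # RTX 4090 (USD)

premium = rtx_4090 / xtx - 1
print(f"4090 premium over the XTX: {premium:.0%}")      # ~60%

# To merely match the XTX's perf-per-dollar, the 4090 would
# need to be ~60% faster:
print(f"break-even perf ratio: {rtx_4090 / xtx:.2f}x")  # 1.60x
```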
 
Well, that $999 card will be about 1,537 here.

That's a hard pass. Gonna have to go back to my other hobbies since I am being priced out of this one.
 
Well, that $999 card will be about 1,537 here.

That's a hard pass. Gonna have to go back to my other hobbies since I am being priced out of this one.

well said, good sir.

yep. my gtx 1070 laptop doesn't do it for me, so i mostly just read and go for nature walks this year. it's been enlightening to realize that these are just hobbies at the end of the day, which is one reason i'm eccentric with them sometimes, 'cause a little fun is what it's all about lmao
 
I’m officially excited. Most likely going full AMD for my next build (hopefully the 7800X3D will be out by then), after using Intel for over a decade and nVidia on desktop for just as long. My secondary desktop (built for a family member) currently has an older AMD card, and I vastly prefer the Radeon Software post-processing options and results to the nVidia Experience overlay options too.
 
Where is endnote 816? It seems to be missing for the perf/watt claim.
RX-816 – Based on AMD internal analysis November 2022, on a system configured with a Radeon RX 7900 XTX GPU, driver 31.0.14000.24040, AMD Ryzen 9 5900X CPU, 32 GB DDR4-7200MHz, ROG CROSSHAIR VIII HERO (WI-FI) motherboard, set to 300W TBP, on Win10 Pro, versus a similarly configured test system with a 300W Radeon 6900 XT GPU and driver 31.0.12019.16007. System manufacturers may vary configurations, yielding different results.
 
Well, that $999 card will be about 1,537 here.

That's a hard pass. Gonna have to go back to my other hobbies since I am being priced out of this one.
And how much for the $1,600 nVidia card?
 
can my EVGA GD 700W PSU handle a 13600K (stock, I won't be overclocking; in fact I may do a very light undervolt) and a 7900 XT? i know it prob can't handle the XTX, but if i opt for the XT i should be ok, ya?
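
Rough back-of-the-envelope on my own question (the PL2/TBP figures are approximate public specs, and the rest-of-system number is a guess):

```python
# Back-of-the-envelope PSU budget. The PL2/TBP figures below are
# approximate public specs; "rest of system" is a rough estimate.
cpu_peak = 181        # i5-13600K PL2 (W), stock
gpu_tbp = 315         # RX 7900 XT typical board power (W), listings vary
rest_of_system = 75   # board, RAM, fans, drives (W), rough estimate

total = cpu_peak + gpu_tbp + rest_of_system
print(f"sustained worst case: ~{total} W")  # ~571 W

# GPUs spike above TBP for milliseconds, so a common rule of thumb
# is to keep ~20% headroom on the PSU's rating:
print(f"700 W unit @ 20% headroom: {700 * 0.8:.0f} W budget")  # 560 W
```

On those assumptions it's tight at stock, which is where a light undervolt earns its keep.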
 
This aged REALLY WELL.

Now, we'll need to see if the flagship really trades blows with the 4090, at least in raster.
 
And how much for the $1,600 nVidia card?
About $2,506 :laugh:

That's the converted price plus 14% tax.. brutal, man.
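
The math, roughly. A minimal sketch assuming an illustrative 1.35 exchange rate (the quoted ~2,506 implies a slightly different rate):

```python
# How the local price balloons: USD MSRP x exchange rate, then tax.
# The fx rate below is invented for illustration; the real figure
# depends on the day's rate.
def local_price(usd, fx=1.35, tax=0.14):
    """USD MSRP -> local price after conversion and 14% tax."""
    return usd * fx * (1 + tax)

print(f"$999 card:  ~{local_price(999):,.0f}")   # ~1,537, as above
print(f"$1599 card: ~{local_price(1599):,.0f}")  # ~2,461 at this rate
```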
 
RX-816 – Based on AMD internal analysis November 2022, on a system configured with a Radeon RX 7900 XTX GPU, driver 31.0.14000.24040, AMD Ryzen 9 5900X CPU, 32 GB DDR4-7200MHz, ROG CROSSHAIR VIII HERO (WI-FI) motherboard, set to 300W TBP, on Win10 Pro, versus a similarly configured test system with a 300W Radeon 6900 XT GPU and driver 31.0.12019.16007. System manufacturers may vary configurations, yielding different results.

they went sneaky on that. Disappointing.
 
they went sneaky on that. Disappointing.
Not really, the reference 6900 XT is a 300 W card.
If they had compared to the 375 W 6950 XT, it would actually have made AMD look better, because the 6950 XT is outside its efficiency curve.
 
I wouldn't actually mind getting an AMD GPU, but is there a way to access stuff like this on the AMD side?

Because i often use that for older games, from 2010 and before.


[screenshot: Nvidia Inspector anti-aliasing settings]
 
Does AMD have something similar to Nvidia in terms of Nvidia Inspector? To access hidden Anti-Aliasing options for older games (2010 and older) such as Sparse Grid Supersampling, 4X4 SS etc, cuz if so i may move to AMD instead.
There used to be RadeonMod, but you don't have to worry: ATi has the best AA in the industry (at least for older games); AF is a bit worse than nvidia's, but still good. Just set it to 4xAAA (adaptive AA w/ supersampling) with 8xAF or, better, 8xAAA with 16xAF. Smooooothvision. :)
 
Not really, the reference 6900 XT is a 300 W card.
If they had compared to the 375 W 6950 XT, it would actually have made AMD look better, because the 6950 XT is outside its efficiency curve.

The reference 6950XT is 335W. The AIB 6950XTs are 375W.

Perf/w for reference 6950XT is about the same as the 6900XT, maybe a fraction worse.

The sneaky part is that they reduced the TBP of the test 7900 XTX to 300 W rather than having it run at stock settings, which is what AMD has done in prior launches when measuring perf/watt.
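
To see why capping the test card to 300 W flatters perf/watt, here's an illustrative sketch; the fps figures are invented, not measurements:

```python
# Perf scales sub-linearly with power, so pulling a card back toward
# its efficiency sweet spot raises fps-per-watt. Invented numbers:
def perf_per_watt(fps, watts):
    return fps / watts

stock  = perf_per_watt(fps=100, watts=355)  # hypothetical stock TBP
capped = perf_per_watt(fps=92,  watts=300)  # capped, small fps loss

print(f"stock:  {stock:.3f} fps/W")                    # ~0.282
print(f"capped: {capped:.3f} fps/W")                   # ~0.307
print(f"capping alone gains: {capped/stock - 1:.0%}")  # ~9%
```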
 
Imagine what's going to appear below this $899 price point. It's looking like I'm about to go for a dye change.

:rockout:
 
There used to be RadeonMod, but you don't have to worry: ATi has the best AA in the industry (at least for older games); AF is a bit worse than nvidia's, but still good. Just set it to 4xAAA (adaptive AA w/ supersampling) with 8xAF or, better, 8xAAA with 16xAF. Smooooothvision. :)

i think he's asking whether you can find those settings you just mentioned in the radeon driver. so where do you find "adaptive AA w/ supersampling"? i'd like to know as well, for older games.
 
Where is the 4GHz potential indication?
2.3GHz game clock for the RX 7900 XTX and 2.0GHz for the RX 7900 XT on 5nm, versus a 2.61GHz game clock for the RX 6500 XT on 6nm and 2.5GHz for the RX 6750 XT on 7nm.
It seems unlikely that highly OC'd RX 7900 XTX models on air will be able to hit more than 3GHz/2.8GHz front-end/shader clocks.
Up to 1.7X vs the 6950 XT at 4K, and up to 1.6X with raytracing, doesn't mean 1.7X on average, and probably doesn't mean 1.6X for the average 4K raster difference either.
(It can probably reach 1.6X with a specific game testbed selection and CPU, but not on the current TPU 5800X games testbed.)
The RX 7900 XTX will be slower than the RTX 4090 at 4K, but at $999 it doesn't matter: great value, relatively speaking.
A little less value for the 7900 XT, since the difference between them should be around 15%.
They will pressure the higher Ampere lineup price-wise for sure, and all the cards from Nvidia & AMD will drop a little, gradually, but the effect will shrink as you move down to the lower-priced models until it disappears entirely. A quick illustration of the "up to" point is below.
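
A best case is not an average. Per-game uplifts approximated from the slide (most titles ~1.5X, Cyberpunk ~1.7X; illustrative values only):

```python
from statistics import geometric_mean

# Per-game uplifts approximated from AMD's slide, illustrative only.
uplifts = [1.5, 1.5, 1.5, 1.7]

print(f"best case ('up to'): {max(uplifts)}X")
print(f"geomean (closer to an average): {geometric_mean(uplifts):.2f}X")  # ~1.55X
```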
 
What's the reason for them not showing benchmarks vs the 4090, or at least the 3090 Ti? They really enjoyed showcasing the 6800 vs Nvidia GPUs.
 
Might be the first time i go RADEON since my Radeon 7970 GHz Edition!
 
Might be the first time i go RADEON since my Radeon 7970 GHz Edition!

those were some good times. i loved my 7950 and 7970.

and my 6950 before that. good times indeed
 
Why fuck RT?
Because a graphics bell and whistle that tanks your framerate while producing effects so minute that even in still frames people can't see the difference is totally worthless to everyone but specwhores. To those of us who PLAY games, RT is functionally useless except for ambient occlusion, which can be done far more easily with shader tricks, without requiring the power of a nuclear sub to operate. The number of newer games coming out with impressive lighting effects that don't need any form of RT should indicate that RT, at least in its current form, will go down the same road as HairWorks and PhysX.

Well, that $999 card will be about 1,537 here.

That's a hard pass. Gonna have to go back to my other hobbies since I am being priced out of this one.
Why not just... not go for the halo card? I've been building gaming PCs for over 15 years at this point, and never have I owned the big-dog card, with the exception of the Vega 64s I bought for $250 used. The upper mid-range has been the best bang for the buck for as long as I can remember.
 
We all know Nvidia elevated the pricing of its 40-series to increase sales of its 30-series surplus stock. AMD coming in at this price with such a slight performance difference puts a lot of pressure on Nvidia's sales. As someone already said, "bring on the price wars!"
 