
AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

Thing is, they attacked Nvidia, but when it came to making a real show of how good the 7900 XTX is, it was a fail performance-wise. On top of that, they went chiplet to make it cheaper, but the price is still hiked up.

They should be ashamed of the level of detail in the card(s)' release; they acted like kids.
It wasn't the best performance preview, but it was an architectural PR release. I would also say that both companies tend to make quick and big strides in driver development over the first few months after a new architecture launches.
It might make sense to let later reviews on newer drivers speak directly to its performance, on better drivers, from AMD's point of view.

I believe they went chiplets to make their cards viable while also advancing their knowledge of 2.5D and 3D packaging, and they clearly were not ready for side-by-side GPU tiles, so a baby step with MCDs and a GCD was produced. Imagine the same chip monolithic: it would have been big, expensive and in the same performance band anyway, but it would also likely have been £1,600, a harder sell.
 
- Throwing shade at Nvidia every chance they got and making their fanboys think the 7900 XTX was actually super fast, only to then not even compare it with Nvidia's GPUs.
Because there's no direct competition to Nvidia this time around. I thought we'd already established that.

- Ridiculously stupid presentation: 8K gaming, the DisplayPort 800 Hz claim, throwing 7900 XTX FPS numbers out and making viewers guess what they were meant to be (max? avg?).
It's a technological presentation aimed at DP 2.0. If you've seen a product launch video from any company that specifies whether demonstrated FPS numbers are avg or max, please show me. I haven't.

- That monitor advertisement :roll:
They needed that to justify DP 2.0. You don't care about it, I don't care about it, we're free to move on and look at other things.

- Not bothering to show their cherry-picked benchmarks against any of their old flagships or Nvidia's.
Because they're not aiming against old flagships. They're releasing a product on its own merits. Besides, if they compared against old Nvidia products, that would only signal that they're behind, which is not necessarily the case (at least technologically), regardless of raw performance data.

Edit: But they actually did show comparisons against the 6950 XT.

Thing is, they attacked Nvidia, but when it came to making a real show of how good the 7900 XTX is, it was a fail performance-wise. On top of that, they went chiplet to make it cheaper, but the price is still hiked up.

They should be ashamed of the level of detail in the card(s)' release; they acted like kids.
How is it a fail? Do you know something that the rest of us don't?

This time Nvidia will not know the performance of these cards until 2 weeks before Xmas.
This is probably why they didn't give more detailed performance numbers. They didn't want Nvidia to gain the upper hand by knowing what they're up to. Nvidia still hasn't released the 4080 after all. Imagine if the 7900 XTX ends up being faster than the $1200 4080. I'm not saying that it will be, but it might be.
 
- HDMI 2.1 is still there. No DP 2.1 monitors are on the market just yet; they will be released in 2023 at the earliest. DSC is still there, with zero visible difference.
- Such heavy GPUs have existed before. I've not heard of any mass reports about issues related to their weight.
- It's not. The Founders Edition runs at around 67 °C.
- Capped to ~315 W (70% of its 450 W TDP), it loses less than 5% of its performance.
- The 90-series cards have always been super expensive. They are not for the average gamer.
- I don't give a .... about its UI. It works. At the same time, I get lost in AMD's UI. Which tab do I need? Where is the option I'm looking for? Where's digital vibrance? People have been asking AMD for that option for years. Competitive CS:GO players do not touch AMD cards because of that.
- EVGA, what? Who the .... cares? 95% of the world has never seen an EVGA card.
- Out of literally tens of thousands of cards sold, fewer than a few dozen people have reported issues. And it looks like all of them likely did something wrong, such as bending the cable too much or not inserting the adapter properly. Again, a card for the rich or for professionals.

Literally not a single argument.

It's a free market. AMD is all yours. Remember the Radeon 9800? Should I remind you of its release price? It was $400. Corporations are not your friends, even though you want to love AMD so much.
I'm inclined to agree here.

The 4090 isn't a bad product. It's just pushed too far in every metric: size/dimensions, wattage, price, and marketing. It's just another gen with just another Nvidia card on top of the stack; meanwhile, the price moves further away from reality every generation post-Pascal. It's literally overinflated, and it's also literally a better product in every way once you dial it back a little. Remarkably similar to Intel's offerings - so here we have two companies with a similar grasp on the market over their direct competition, using the same approach to keep generational leaps 'intact'.

Meanwhile, we've seen new Ryzen chips with similar, but still more conservative, ideas and more sensible limits - even if 95 °C is similarly uncanny, it's still less, as is peak TDP, as is out-of-the-box tuning. And the GPU side now confirms a similar thing for AMD.

The trend is there... but the approaches still differ, as do the results. Time will tell which was the best way forward...
 
You can keep your wccftech where the sun doesn't shine; don't even remotely try to make an argument with me based on that source, or on YouTube. All I will point out is your own sheep mentality; scroll a few pages back for proof. Or zoom in on my avatar and consider what it tries to convey.

Your laughing smilies also don't suit you or your responses. No need to make a fool of yourself.
 
You can keep your wccftech where the sun doesn't shine; don't even remotely try to make an argument with me based on that source, or on YouTube. All I will point out is your own sheep mentality; scroll a few pages back for proof.

The RTX 4090 is not a bad product; it's an awful product which should not exist at all. :D

The NVIDIA GeForce RTX 4090 is the newest graphics card from the company. Its physical size and power were said to be unmatched. However, since its release, the graphics card has been reported to overheat its power connection, melting the connector port and the cable. A recent post on Reddit now shows that a native ATX 3.0 power supply using the 12VHPWR power connector is having the same melting issues.

:D
 
I'm inclined to agree here.

The 4090 isn't a bad product. It's just pushed too far in every metric: size/dimensions, wattage, price, and marketing. It's just another gen with just another Nvidia card on top of the stack; meanwhile, the price moves further away from reality every generation post-Pascal. It's literally overinflated, and it's also literally a better product in every way once you dial it back a little. Remarkably similar to Intel's offerings - so here we have two companies with a similar grasp on the market over their direct competition, using the same approach to keep generational leaps 'intact'.

Meanwhile, we've seen new Ryzen chips with similar, but still more conservative, ideas and more sensible limits - even if 95 °C is similarly uncanny, it's still less, as is peak TDP, as is out-of-the-box tuning. And the GPU side now confirms a similar thing for AMD.
I watched der8auer load a VBIOS that allowed the 4090 to draw 1,000 watts at 1.35 volts. It's insane for a GPU to be able to pull that amount of power, but the performance increase was underwhelming outside of benchmark scores.
 
The RTX 4090 is not a bad product; it's an awful product which should not exist at all. :D



:D
Right, let's go back to where we were: you overinflated the negatives, and I'm bringing some nuance to that comparison by saying it was pushed too far. I'm fully aware of the 12VHPWR issue, but it's like Intel CPUs: a power-limited chip makes for a highly performant piece of silicon with great efficiency.

What are you looking for, exactly? I don't get it.

I watched der8auer load a VBIOS that allowed the 4090 to draw 1,000 watts at 1.35 volts. It's insane for a GPU to be able to pull that amount of power, but the performance increase was underwhelming outside of benchmark scores.
Yeah, it's the same shit we've been seeing since GPU Boost 3.0: the overclock is done out of the box. Any extra investment is futile.

But now we've crossed the barrier where not only is it effectively OC'd out of the box (or just pushed out of its efficiency curve), you also get to invest in specific cooling to keep the whole thing from running way below advertised speeds. These top-end products aren't really the price they list. They're way, way more expensive, all to get that last 5% of performance that gets eclipsed just a year later :p

Like I said elsewhere... we're in silly land with the top end of every consumer chip stack right now. And it will continue until the penny drops, collectively, and we stop boasting about burning north of 1 kW to play a game. Especially in this age.
 
I didn't even think about that. Didn't they increase the Infinity Cache size? That alone should have benefits.
They actually reduced the Infinity Cache amount compared to the RX 6000 series (128 MB → 96 MB) for the top GPUs.
Yet it is way faster and increases performance because of the new interconnect architecture between the dies.


Basically, we are talking about up to a few TB/s of actual effective bandwidth. That's why bus width hasn't meant much on AMD cards since the introduction of Infinity Cache.

So they can always increase it beyond 96 MB later, maybe even double it...
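As a rough illustration of why a modest bus can still behave like a multi-TB/s link, here is a minimal sketch of an effective-bandwidth model. The 960 GB/s figure is just the arithmetic for a 384-bit bus at 20 Gbps GDDR6; the cache bandwidth and hit rates are made-up assumptions for illustration, not AMD's numbers:

```python
def effective_bandwidth(hit_rate, cache_bw_gbs, vram_bw_gbs):
    """Bandwidth the GPU 'sees': requests that hit the cache are served at
    cache speed, misses fall through to VRAM. A deliberately simple model."""
    return hit_rate * cache_bw_gbs + (1.0 - hit_rate) * vram_bw_gbs

vram_bw = 384 / 8 * 20   # 384-bit bus x 20 Gbps GDDR6 = 960 GB/s raw
cache_bw = 5000          # GB/s, assumed on-package cache bandwidth (illustrative)

for hit_rate in (0.0, 0.40, 0.58, 0.75):   # assumed hit rates, e.g. at 4K
    eff = effective_bandwidth(hit_rate, cache_bw, vram_bw)
    print(f"hit rate {hit_rate:.0%}: ~{eff:,.0f} GB/s effective")
```

With those assumed numbers, a ~58% hit rate already lands in the "few TB/s" range, which is why raw bus width alone says very little.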
 
Let's say the 4090 is made for "hall of fame" benchmarking and record scores by users called K|NGP|N and the like.
But for the regular market, the risk of the card turning into a real fire hazard is something like 50-50, and it's close to impossible for the average user to keep the card alive.

I am not "overinflating" the negatives - the negatives do exist, and this time they are extremely serious - maybe the most serious since the original Fermi GTX 480 launch 13 years ago.
I will not support your or anyone else's "political correctness" and underestimation of the serious risks.
 
Can someone confirm or deny whether "the 4090 is only 33% faster in Lumen than the 3090 Ti" is true?

If you're wondering what the heck that is, here it is (an Unreal Engine 5 demo using Lumen):


Cool, eh? That's using "software ray tracing". Now, "hardware RT" in Lumen should be faster, shouldn't it?
Let this sink in:


Lumen also comes with hardware ray tracing, but most developers will be sticking to the software path, as the hardware path is 50% slower than the SW implementation even with dedicated hardware such as RT cores. Furthermore, you can't have overlapping meshes with hardware ray tracing, or masked meshes either, as they greatly slow down the ray traversal process. Software ray tracing basically merges all the overlapping meshes into a single distance field, as explained above.
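To make the "merges overlapping meshes into a single distance field" idea concrete, here is a generic sketch of a signed-distance-field union - not Lumen's actual code, just the textbook technique, with made-up sphere primitives standing in for the "meshes":

```python
import numpy as np

def sphere_sdf(points, center, radius):
    # Signed distance from each sample point to one sphere's surface.
    return np.linalg.norm(points - np.asarray(center), axis=-1) - radius

def merged_sdf(points, spheres):
    # "Merging" overlapping objects into one field is a pointwise minimum:
    # at every sample the closest surface wins, so overlaps cost nothing extra.
    return np.min([sphere_sdf(points, c, r) for c, r in spheres], axis=0)

# Sample a coarse 3D grid and merge two overlapping spheres into one field.
axis = np.linspace(-2.0, 2.0, 32)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
field = merged_sdf(grid, [((0.0, 0.0, 0.0), 1.0), ((0.6, 0.0, 0.0), 0.8)])
print(field.shape, round(float(field.min()), 3))  # one field, traced as a whole
```

A ray marcher then steps through that single merged field instead of testing each mesh separately, which is the gist of why overlapping geometry is cheap on the software path.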


a better product in every way once you dial it back a little
Indeed.
Such as the 4080 12 GB after seeing what AMD is up to.

You dial it back a little:

nVidia "unlaunches" 4080 12GB

and suddenly it's a better product than before. :D
 
It wasn't the best performance preview, but it was an architectural PR release. I would also say that both companies tend to make quick and big strides in driver development over the first few months after a new architecture launches.
It might make sense to let later reviews on newer drivers speak directly to its performance, on better drivers, from AMD's point of view.

I believe they went chiplets to make their cards viable while also advancing their knowledge of 2.5D and 3D packaging, and they clearly were not ready for side-by-side GPU tiles, so a baby step with MCDs and a GCD was produced. Imagine the same chip monolithic: it would have been big, expensive and in the same performance band anyway, but it would also likely have been £1,600, a harder sell.

Not the best? It was terrible. I wasn't expecting it to be better or as good; I just wanted something from them that didn't read like it was written up by kids.

As for side-by-side GPU tiles, I never expected them, since they've only just started with chiplets - so step by step, and make some extra money at the very least.

I'll buy one in December if I get the chance. I would have been happy with the 6900 XT, but AMD seems to cut support after 6-8 years or so, and I was thinking it might possibly be cut sooner.
 
Because there's no direct competition to Nvidia this time around. I thought we'd already established that.

This is actually the thing people are arguing about. AMD has pretty clearly ceded the high end and will have nothing to compete with beyond the 4080 16 GB - and maybe not even that.

For this reason, the price comparisons against the 4090 are also fallacious. A $999 AMD flagship card probably lands below the $1,199 4080 16 GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium.

Not that it matters to 99% of folks, who are not getting a 4090 anyway. Most of the people arguing here have older mid-range or upper mid-range GPUs (now low end) and probably aren't in the market for a new one anyway, so they're just arguing about something they aren't buying from either corporation.
 
This is actually the thing people are arguing about. AMD has pretty clearly ceded the high end and will have nothing to compete with beyond the 4080 16 GB - and maybe not even that.

For this reason, the price comparisons against the 4090 are also fallacious. A $999 AMD flagship card probably lands below the $1,199 4080 16 GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium.

Not that it matters to 99% of folks, who are not getting a 4090 anyway. Most of the people arguing here have older mid-range or upper mid-range GPUs (now low end) and probably aren't in the market for a new one anyway, so they're just arguing about something they aren't buying from either corporation.
How do you come up with that conclusion? Almost nothing of what you wrote here adds up...
Everything so far suggests that the $900 7900 XT will be the rival of the $1,200 4080 16 GB, and the $1,000 7900 XTX will be just short of the $1,600 4090 - things that can also change with AIB OC variants.
All this in rasterization, just to be clear. RT on the new AMD GPUs is a full step behind RTX 40 (3090/Ti territory).
 
How do you come up with that conclusion? Almost nothing of what you wrote here adds up...
Everything so far suggests that the $900 7900 XT will be the rival of the $1,200 4080 16 GB, and the $1,000 7900 XTX will be just short of the $1,600 4090 - things that can also change with AIB OC variants.
All this in rasterization, just to be clear. RT on the new AMD GPUs is a full step behind RTX 40 (3090/Ti territory).

Did you not read my post?

I said "A $999 AMD flagship card that probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium."

The rest of your post is gibberish, as if Nvidia doesn't have AIB OC variants as well.
 
I said "A $999 AMD flagship card that probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium."
Yes, I did read it, and none of it makes sense. Just like your mention of the low bus width of the new (or future) GPUs in another thread(?)

The gap between the 4090 and the 4080 16 GB is big, and at least the 7900 XTX will land between them - if not the 7900 XT too - if the 6950 XT vs 7900 XTX (1.5x) figure holds up.

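A quick back-of-the-envelope sketch of the arithmetic behind "it will land between them": everything is relative to the 6950 XT at 1.00, the 1.5x figure and the $999 / $1,199 / $1,599 list prices come from this thread, and the Nvidia multipliers are hypothetical placeholders you would swap for actual review data:

```python
# Hypothetical relative 4K raster performance, 6950 XT = 1.00 baseline.
# The 1.5x XTX figure and the list prices come from the thread; the
# 4080/4090 multipliers are placeholder assumptions, NOT measured numbers.
assumed_perf = {
    "RX 7900 XTX":   1.50,   # AMD's "up to 1.5x" claim
    "RTX 4080 16GB": 1.35,   # placeholder - replace with review data
    "RTX 4090":      1.80,   # placeholder - replace with review data
}
msrp = {"RX 7900 XTX": 999, "RTX 4080 16GB": 1199, "RTX 4090": 1599}

for name, perf in assumed_perf.items():
    perf_per_dollar = perf / msrp[name] * 1000   # relative perf per $1,000
    print(f"{name:14s} {perf:.2f}x of a 6950 XT, ~{perf_per_dollar:.2f} per $1000")
```

Under those placeholder multipliers the XTX sits above the 4080 and below the 4090 while costing the least; the conclusion flips around entirely depending on which multipliers the actual reviews show.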
 
Yes, I did read it, and none of it makes sense. Just like your mention of the low bus width of the new (or future) GPUs in another thread(?)

The gap between the 4090 and the 4080 16 GB is big, and at least the 7900 XTX will land between them - if not the 7900 XT too - if the 6950 XT vs 7900 XTX (1.5x) figure holds up.



It's pretty well known at this point - at least by those who have not deeply imbibed the AMD Kool-Aid and are not blindly making some kind of excuse (albeit for what, I know not) - that the 7900 XT, and possibly the 7900 XTX, are competitors to the 4080 16 GB.

But you keep thinking what you want to think. I'm sure you'll have some reason or other later on for being blind to the obvious...

[The Radeon RX 7900 XTX] is designed to go against the 4080, and we don't have benchmark numbers on the 4080. That's the primary reason why you didn't see any NVIDIA comparisons. […] A $999 card is not a 4090 competitor, which costs 60% more; this is a 4080 competitor.

— Frank Azor to PCWorld


 
It's pretty well known at this point - at least by those who have not deeply imbibed the AMD Kool-Aid and are not blindly making some kind of excuse (albeit for what, I know not) - that the 7900 XT, and possibly the 7900 XTX, are competitors to the 4080 16 GB.

But you keep thinking what you want to think. I'm sure you'll have some reason or other later on for being blind to the obvious...



I'm well aware of that statement. AMD can be very conservative in their statements at this point, for their own reasons, and people can think what they want and ignore the numbers.
 
I'm well aware of that statement. AMD can be very conservative in their statements at this point, for their own reasons, and people can think what they want and ignore the numbers.

Uh-huh... the chief marketing guy at AMD is being conservative about performance...
 
Uh-huh... the chief marketing guy at AMD is being conservative about performance...
Yes, I can understand your frustration...
 
@W1zzard What's the actual HDMI version of the 7900 XT & 7900 XTX?
The latest 2022 HDMI 2.1a at 48 Gbps, or the 2020 HDMI 2.1 at 40 Gbps like that of the 6900 & 6950 XT?
 
The problem is that the budget cards will cost $500-600 too.

Yep, exactly this - and they are what the majority of people usually buy.
 
Yep, exactly this - and they are what the majority of people usually buy.

No.

Current pricing in Germany:

Radeon RX 6400 - €169.00
Radeon RX 6500 XT - €199.00
Radeon RX 6600 - €261.99

Radeon RX 6600 XT - €339.00
Radeon RX 6650 XT - €339.00
Radeon RX 6700 XT - €439.00
Radeon RX 6750 XT - €499.90
Radeon RX 6800 - €559.00
Radeon RX 6800 XT - €635.90
Radeon RX 6900 XT - €748.00

Radeon RX 6950 XT - €899.00

The majority of people will buy up to the RX 6650 XT, which goes for €339 as of now, but its price should keep spiralling downward because it's only good for 1080p.
 
No.

Current pricing in Germany:

Radeon RX 6400 - €169.00
Radeon RX 6500 XT - €199.00
Radeon RX 6600 - €261.99

Radeon RX 6600 XT - €339.00
Radeon RX 6650 XT - €339.00
Radeon RX 6700 XT - €439.00
Radeon RX 6750 XT - €499.90
Radeon RX 6800 - €559.00
Radeon RX 6800 XT - €635.90
Radeon RX 6900 XT - €748.00

Radeon RX 6950 XT - €899.00

The majority of people will buy up to the RX 6650 XT, which goes for €339 as of now, but its price should keep spiralling downward because it's only good for 1080p.
What about the 6700 non-XT?
 
nothing to compete with beyond the 4080 16 GB - and maybe not even that.
In which la-la land will AMD have "nothing to compete" with a 40% cut-down of the 4090?
Tell us more about the "unlaunching" of the other 4080...

:D
 