
AMD Radeon RX 7900 XTX to Lead the RDNA3 Pack?

I really hope they stop this trend of making the cards longer and longer for each generation.
 
I find it funny that although AMD bought ATI, today's AMD looks more like ATI: they dropped green for red, and now the naming scheme too. They should've dropped the AMD name for ATI.
As silly as it is, I remember when XTX was a statement. Anyone who has been around the block remembers the X1950 XTX
Yes, the queen of DX9.
 
As silly as it is, I remember when XTX was a statement. Anyone who has been around the block remembers the X1950 XTX

I have fond memories of my X1950XTX crossfire setup. Think I have the cards in boxes upstairs somewhere.

Nostalgia is real if they go for the name.
 
Really curious about the performance, and the power draw that comes with it, but mostly about the price, as that will tell whether AMD is following NV's greedy behavior or is trying to reach for dominance, or at least a higher market share.
 
There is only one "logic die" (i.e. GCD) and six MCDs on Navi 31. Confirmed long ago after it was reported by Angstronomics.
 
Looks bad, I agree. Interesting to know under what circumstances it happened and whether it was running under spec.

Guy had literally just bought it and had it running in his system for a few hours. Not sure how you'd expect someone to run it out of spec.

But we've also had burned 8-pin and many other connector types as well, so nothing new here.
The 12VHPWR is new and somewhat controversial, so any failure with it makes the news in a flash. No one is interested in burned 8-pins, but rest assured they happened - LMGTFY.

Every single one of those (I checked) was either due to a bad power supply or continuous mining for a couple of years (probably the PSU again). So yeah, people aren't going to talk about a failure that occurs after years of heavy use/abuse.
 
So I guess the AMD GPUs will never deal with CUDA? If I understand correctly, that is the main difference between Nvidia and AMD GPUs.
 
I'm not certain that's the best metric to try to infer performance from.

The 3090 has 189.8/556 and the 6900XT has 359.7/899.2 - and the 6900XT certainly didn't perform over and above the 3090 relative to those numbers.

In fact, comparing the 6900XT vs the 3090, it had 188% of the GPixel/s and 161% of the GTexel/s.

This time it's 7900XTX vs 4090 at 130% GPixel/s and 184% GTexel/s - relatively weaker in pixel fill rate, significantly so versus the older comparison, but relatively stronger in texture fill rate.

My takeaway? Who knows, but I doubt it's a metric you could hang your hat on for comparison.
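For what it's worth, those percentages can be reproduced straight from the spec-sheet fill rates quoted above (small rounding differences from the quoted 188%/161% presumably come from slightly different source numbers):

```python
# Fill-rate ratio check using the (GPixel/s, GTexel/s) spec values
# quoted above -- spec-sheet throughput, not measured game performance.

def ratios(card_a, card_b):
    """Return card_a's fill rates as rounded percentages of card_b's."""
    return tuple(round(a / b * 100) for a, b in zip(card_a, card_b))

rtx_3090 = (189.8, 556.0)
rx_6900xt = (359.7, 899.2)

print(ratios(rx_6900xt, rtx_3090))  # (190, 162) -- close to the 188%/161% quoted
```

As the thread notes, the 6900XT's big on-paper advantage here never translated into a matching real-world lead, which is the whole point about this metric.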


XFX Radeon RX 7900XTX Merc Speedster 6969 XXX edition :roll:
Actually, the 6900XT has superior raster performance in a good many titles compared against both the 3090 and 3090 Ti, and it does so while sucking down 150W less, with 8 GB less of slower RAM and an MSRP $500 lower. Mostly they trade blows, depending on the title, until you get into ray tracing, where Nvidia has a clear lead simply because they have more mature tech, heavily aided by their proprietary DLSS.

If I were going to buy a card for CAD or video editing, I'd snap up a 3080/90/Ti in a heartbeat. But I don't think anyone should be that impressed by what Nvidia, or Intel for that matter, is having to do to cling to their crowns. The planet is dying, and Nvidia is there trying to normalize 600W for a video card so they can market themselves as the leader in 8K gaming performance. Intel is once again pushing CPU power consumption, up to 350W this time, so they can claim a 10% performance increase (sometimes), and all they had to do was DOUBLE the power draw. This isn't the direction things are supposed to be moving in.
 
X1950XTX's MSRP was USD450 according to this.

That's what I'd most like to see return with that moniker: being top of the line for its generation without making me choose between staying relatively well fed and gaming at the highest settings available at the time.
 
Another big brick that draws 350-400W and costs a fortune. I'll pass.
I mean, they could make a card that tops out at 230W, but then people would complain about AMD not being competitive at the high end. This new gen of hardware opened a can of worms: you either find a way to design an arch that is 50% more efficient than the competition on the same node for the same perf, or you have to push the limits to match them.
 
@btarunr I call BS on that image, because Jayz2cents and others have said with confidence that RDNA3 will be using the new power connector the 4090 uses, and that image shows two older-style power connectors (if my zoom is correct).

heh, we will see soon enough.
 
Actually, the 6900XT has superior raster performance in a good many titles compared against both the 3090 and 3090 Ti, and it does so while sucking down 150W less, with 8 GB less of slower RAM and an MSRP $500 lower.
In what titles does a 6900XT have superior raster performance (by a non-insignificant margin, shall we say 10%+?) while consuming 200W to a 3090's 350W? I don't think I've ever seen that.

More materially to my point, the 6900XT enjoyed advantages in the metrics I quoted to the tune of 88% and 61%. I don't think it enjoys a single win over the 3090 to the tune of even 61%, let alone 88%, though I'm sure if you dig hard enough you might find an unrealistic niche example or two where that might be the case.

From what I know, the 6900XT enjoyed a minor lead at 1080p (less than 10% on average), roughly parity at 1440p, and the 3090 enjoyed a minor lead at 4K (less than 10% on average).
Mostly they trade blows, depending on the title, until you get into Ray tracing, where nvidia has a clear lead simply because they have more mature tech, heavily aided by their proprietary DLSS.
That's the reality I remember, except most publications don't really test DLSS, at least not in like-for-like testing, because then it wouldn't be like for like... so the 3090 trounces a 6900XT in RT, and then you have DLSS to help even more.

This isn't the direction that things are supposed to be moving in.
If I were you, I'd brace for AMD being all too happy to follow this trend; hell, it's already started.
 
Two logic tiles will be very interesting to see. If it has the same effect Zen did, we might see actual NV-AMD market changes toward a more 50:50 situation.
Hopefully they can match the 4090 with less power (30W less?), but not going 12VHPWR is the wrong way, I think. We will see 4x 8-pin on OC 3rd-party cards...
The size looks like a follow-up to the 4090's volume, and that's not a good thing at all.
Also, the render pic only has 2x 8-pin (maybe it's for the 7900XT, not the XTX), so I corrected it* :)



*for 3rd party cards
AMD being the good guy and eliminating the need for squid connectors that light on fire *toast*.

The 12VHPWR connector is unnecessary, frankly. 8-pins are larger and clunky, but they work well, and have for 15 years.

X1950XTX's MSRP was USD450 according to this.

That's what I'd most like to see return with that moniker: being top of the line for its generation without making me choose between staying relatively well fed and gaming at the highest settings available at the time.
Well, that card would be $662 today. Oh, and don't forget the Core 2 at the time was a pricey bunch; a lower-end Core 2 was over $300, or $441 today.

$600 still buys you a whole lotta GPU today.
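The inflation figures above are consistent with a roughly 1.47x multiplier from 2006 to 2022 dollars (the multiplier is my inference from the quoted numbers; the post doesn't state its source):

```python
# Inferred 2006 -> 2022 inflation multiplier (~1.47), illustrative only;
# back-solved from the $450 -> $662 and $300 -> $441 figures quoted above.
INFLATION_2006_TO_2022 = 1.47

def adjust(msrp_2006):
    """Convert a 2006 USD price to approximate 2022 dollars."""
    return round(msrp_2006 * INFLATION_2006_TO_2022)

print(adjust(450))  # roughly 662, the X1950XTX figure quoted above
print(adjust(300))  # roughly 441, the lower-end Core 2 figure
```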
 
Modern GPUs and PSUs shouldn't catch fire, especially ones costing an eye-watering $1600. That's all I'm saying.
 
Very nice! All I need is news like this on the 7700 XT (including release date) before I click "buy" on the 6750 XT that I have in my basket.
 
It's not so far-fetched really; some companies, unlike nVidia, care about their reputation - take EVGA for example.

Right, but there's a flip side to that. If you position yourself as the cheaper alternative and sell for less than you could get away with, then you cement your reputation as the crappier, budget version in the market, and that hurts you. Conversely, if you are the market leader, you want to overcharge for your product to cement your reputation as the leader and the better product.
 
GPU-L claims this beast will have 576 GP/s and 2304 GT/s vs the 4090's 440ish/1250ish. If the drivers are sound, we might have a good game here!
With Infinity Cache, AMD was able to compete on par while on a much narrower bus. I don't see why it would be different this time.
 
With Infinity Cache, AMD was able to compete on par while on a much narrower bus. I don't see why it would be different this time.
Except that the bus is also much wider this time. It may be difficult to compare these to any AMD card before them. They seem fundamentally different from RDNA2, for sure.
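The intuition behind the Infinity Cache point is that a large on-die cache raises *effective* bandwidth beyond what the VRAM bus alone provides. A toy model (my own sketch; the hit rate and bandwidth numbers are made-up illustrative values, not real RDNA2/RDNA3 specs):

```python
# Toy effective-bandwidth model for a GPU with a big last-level cache.
# All numbers below are illustrative assumptions, not real card specs.

def effective_bandwidth(hit_rate, cache_gbps, vram_gbps):
    """Blend cache and VRAM bandwidth weighted by the cache hit rate."""
    return round(hit_rate * cache_gbps + (1 - hit_rate) * vram_gbps, 1)

# Hypothetical: 512 GB/s VRAM bus, 2000 GB/s cache, 60% hit rate
print(effective_bandwidth(0.60, 2000, 512))  # 1404.8 (GB/s)
```

Under this kind of model, a narrow bus plus a high hit rate can land in the same effective-bandwidth ballpark as a much wider bus, which is why the bus width alone is a poor comparison point.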
 
If they retain the $999 price tag for the 7900XT, that will already be generous. Their competition is pricing the 4080 16GB at $1200, and everyone knows it will be slower than the 7900XT, even in RT.
If AMD does this, I would be flabbergasted. Last gen, they priced their cards based on performance relative to the Nvidia cards. I hope like hell they hit that $1,000 MSRP, but I just don't see it.
 