
AMD's Radeon RX 7900 GRE Gets Benchmarked

TheLostSwede

News Editor
AMD's China-exclusive Radeon RX 7900 GRE has been put through its paces by Expreview, and in short, the US$740-equivalent card shouldn't carry the 7900-series moniker. In most of the tests, the card performs like a Radeon RX 6950 XT or worse, and it's even beaten by the Radeon RX 6800 XT in 3DMark Fire Strike, if only by the tiniest amount. Expreview has done a fairly limited comparison, mainly pitting the Radeon RX 7900 GRE against the Radeon RX 7900 XT and NVIDIA's GeForce RTX 4070, and it loses by a mile to AMD's higher-end GPU, which is by no means unexpected as this is a lower-tier product.

However, when it comes to the GeForce RTX 4070, AMD struggles to keep up at 1080p, where NVIDIA takes home the win in games like The Last of Us Part 1 and Diablo 4. In games like F1 22 and Assassin's Creed Valhalla, AMD is only ahead by a mere percentage point or less. Once ray tracing is enabled, AMD only wins in F1 22, again by less than one percent, and in Far Cry 6, where it's almost three percent faster. Moving up in resolution, the Radeon RX 7900 GRE ends up being a clear winner, most likely partially due to having 16 GB of VRAM, and at 1440p the GeForce RTX 4070 also falls behind in most of the ray-traced game tests, if only just in most of them. At 4K the NVIDIA card can no longer keep up, but the Radeon RX 7900 GRE isn't really a 4K champion either, dropping under 60 FPS in more resource-heavy games like Cyberpunk 2077 and The Last of Us Part 1. Considering the GeForce RTX 4070 Ti only costs around US$50 more, it seems like it would be the better choice, despite having less VRAM. AMD appears to have pulled an NVIDIA with this card, which, at least performance-wise, seems to belong in the Radeon RX 7800 segment. The benchmark figures also suggest that the actual Radeon RX 7800 cards won't be worth the wait, unless AMD prices them very competitively.
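To make the sub-one-percent figures above concrete, here is a minimal sketch of how such relative deltas are calculated; the FPS values below are made-up placeholders, not Expreview's actual results:

```python
# Hypothetical FPS numbers, for illustration only -- not Expreview's data.
fps_rx_7900_gre = 101.0  # placeholder result for the RX 7900 GRE
fps_rtx_4070 = 100.0     # placeholder result for the RTX 4070

# The lead, expressed as a percentage of the other card's result.
lead_pct = (fps_rx_7900_gre - fps_rtx_4070) / fps_rtx_4070 * 100
print(f"RX 7900 GRE leads by {lead_pct:.1f}%")  # 1.0% with these placeholders
```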

Update 11:45 UTC: [Editor's note: The official MSRP from AMD appears to be US$649 for this card, which is more reasonable, but the performance still places this in a category lower than the model name suggests.]



View at TechPowerUp Main Site | Source
 
This sentence is tough to read -

However, when it comes the GeForce RTX 4070 AMD struggles at 1080p, where NVIDIA takes home the win in games like The Last of US Part 1 and Diablo 4.
 
This sentence is tough to read -

However, when it comes to the GeForce RTX 4070, AMD struggles at 1080p, where NVIDIA takes home the win in games like The Last of US Part 1 and Diablo 4.

This is what you're looking for.

EDIT: LQ'ing own post to remove clutter.
 
and the US$740 equivalent card
I am reading that's the China price

US price is $649.
 
And we still don't know what a Golden Rabbit is (GRE).
 
1. The words "gold" or "golden" do not appear at all in the article you linked.
2. I can't educate myself on random words thrown at me without any reference, hence my question.

Thanks anyway. :)

Edit: Before you suggest, Google pointed me towards "Golden Rabbit Enamelware" and "The Golden Rabbit", a 1962 British comedy. Which of these has got anything to do with the 7900 GRE?
 
It's the Chinese Year of the Rabbit. The golden prefix is often used in branding, for luck or fortune (I think).
 
It's the Chinese Year of the Rabbit. The golden prefix is often used in branding, for luck or fortune (I think).
Ah, thanks. :) Now to figure out why AMD chose that name. :D
 
Basically an RX 7800 XT; even the price is the same as the release price of the 6800 XT, just released half a year later.
 
How does the 4070Ti cost only $50 more when its MSRP is $800 and this goes for $649?
 
Doesn't look that bad: lower TDP than the other 79xx-class GPUs, and performance at 1440p looks like what it should be, especially against the 4070.
The author of this article has some "interesting" opinions on this card and the rumored 7800 XT... let's wait and see what AMD comes up with and what prices in the market will be, first.
 
How does the 4070Ti cost only $50 more when its MSRP is $800 and this goes for $649?
Converting the China pricing makes it $740.
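For what it's worth, here is a rough sketch of that conversion; the RMB list price and the exchange rate below are my own assumptions (roughly what was being reported around launch), not official figures:

```python
# Back-of-the-envelope conversion; both inputs are assumptions, not
# official figures: ~5,299 RMB was the widely reported Chinese list
# price, and ~7.16 CNY/USD was roughly the exchange rate at the time.
china_list_price_cny = 5299
cny_per_usd = 7.16

converted_usd = china_list_price_cny / cny_per_usd
print(f"Converted China price: ~${converted_usd:.0f}")  # ~$740

# Against the RTX 4070 Ti's $799 US launch MSRP, that's the roughly
# $50-60 gap from the article; against AMD's $649 US MSRP it's ~$150.
rtx_4070_ti_msrp_usd = 799
print(f"Gap vs converted price: ${rtx_4070_ti_msrp_usd - converted_usd:.0f}")
print(f"Gap vs $649 US MSRP:    ${rtx_4070_ti_msrp_usd - 649}")
```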
 
Converting the China pricing makes it $740.
The pricing of what? The GRE? What does that matter anyway? You're supposed to compare the MSRP, not current pricing in a given country. Imagine if people reviewed tech based on its prices in Brazil lol
 
The RTX 4070 Ti was selling for $700 USD in Canada during Amazon day. I'm not sure a $650 card that can lose to the 4070 non-Ti in games is gonna sell well. We never get good pricing for AMD cards in Canada.
 
The RTX 4070 Ti was selling for $700 USD in Canada during Amazon day. I'm not sure a $650 card that can lose to the 4070 non-Ti in games is gonna sell well. We never get good pricing for AMD cards in Canada.
And it's selling for 850-950 euro in the EU. Country-specific pricing is all over the place, which is why you compare MSRPs...
 
Chiplets don't seem to be working out so great for GPUs... hope they can sort out whatever the issues are.
 
The signal has to hop between the GCD and the MCDs, with an added latency of one clock cycle at 2.5 GHz, and around 0.5 GHz worth of GPU is wasted on hopping. There is no way to fix that other than going monolithic again. The funny thing is that they created the exact configuration of the 6950 XT just to show the weak points; even though it has dual-issue shaders, it still falls behind.
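To put that one-cycle figure in perspective, a quick back-of-the-envelope sketch, taking the 2.5 GHz clock and one cycle per hop at face value (the round-trip assumption is mine):

```python
# One extra clock cycle per GCD<->MCD hop at an assumed 2.5 GHz clock.
clock_hz = 2.5e9
cycle_ns = 1 / clock_hz * 1e9
print(f"One cycle at 2.5 GHz: {cycle_ns:.2f} ns")  # 0.40 ns

# Assuming a request has to cross the interconnect twice (GCD -> MCD
# and back again), the added latency per transaction would be:
round_trip_ns = 2 * cycle_ns
print(f"Added latency per round trip: {round_trip_ns:.2f} ns")  # 0.80 ns
```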
 
The signal has to hop between the GCD and the MCDs, with an added latency of one clock cycle at 2.5 GHz, and around 0.5 GHz worth of GPU is wasted on hopping. There is no way to fix that other than going monolithic again. The funny thing is that they created the exact configuration of the 6950 XT just to show the weak points; even though it has dual-issue shaders, it still falls behind.
Genuine question here: why does it work for CPUs? Is there a structural reason, or something about how CPUs work, that makes chiplets work for them?
 
Genuine question here: why does it work for CPUs? Is there a structural reason, or something about how CPUs work, that makes chiplets work for them?
It works in a very similar way, actually. The chiplet design always adds latency to processing. But Ryzen is on its fourth generation, whereas this is the first attempt. The first-gen Ryzen was similarly problematic, but it followed a design that was straight-up garbage, so it was still much better than that, and it was much cheaper than the competition (which the Radeons aren't).
 