
AMD Radeon RX 9070 GRE Tested, Fills Gap Between RX 9060 XT and RX 9070

btarunr

Editor & Senior Moderator
AMD released the China-exclusive Radeon RX 9070 GRE in May, and ComputerBase.de got hold of a Sapphire Pulse-branded RX 9070 GRE card to test. While the RX 9060 XT's spec sheet reads as exactly half of the RX 9070 XT, the RX 9070 GRE is configured as three quarters of it. It's based on the same 4 nm "Navi 48" silicon as the rest of the RX 9070 series, but is configured with 48 of the 64 compute units present, and comes with 12 GB of memory across a 192-bit wide GDDR6 memory bus, in place of the 256-bit wide one that the RX 9070 and RX 9070 XT come with. With 48 CU, the RX 9070 GRE has 3,072 stream processors, 96 AI accelerators, 48 RT accelerators, 192 TMUs, and 96 ROPs. The Infinity Cache size is reduced to 48 MB. The card comes with the same 220 W TBP as the RX 9070.
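Those shader counts follow directly from the CU count. A minimal sketch, assuming the per-CU ratios implied by the figures above (64 stream processors, 2 AI accelerators, 1 RT accelerator, and 4 TMUs per compute unit):

```python
# Deriving the RX 9070 GRE shader configuration from its CU count, using the
# per-CU ratios implied by the article's figures (not from AMD documentation).
cus = 48
print("Stream processors:", cus * 64)  # 3,072
print("AI accelerators:  ", cus * 2)   # 96
print("RT accelerators:  ", cus * 1)   # 48
print("TMUs:             ", cus * 4)   # 192
```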

Testing by ComputerBase.de finds that despite the reduction in compute units and memory, the RX 9070 GRE is still a 1440p-class GPU, and a significant upgrade over the RX 9060 XT 16 GB and the NVIDIA GeForce RTX 5060 Ti 16 GB. Averaged across 13 game tests at 1440p, the RX 9070 GRE comes out 28.4% faster than the RX 9060 XT 16 GB, 22% faster than the RTX 5060 Ti 16 GB, 11% faster than the previous-gen RX 7800 XT, and 5% faster than the RTX 4070. The current-gen RTX 5070 is 9% faster, the RX 9070 is 14% faster, and the current flagship RX 9070 XT is 29% faster. This makes the RX 9070 GRE an interesting SKU that sits at the intersection of various price-performance combinations within the 1440p class. In the Chinese domestic market, the RX 9070 GRE is priced slightly higher than the RTX 5060 Ti 16 GB, but lower than the RTX 5070, making it a good value proposition. Find more test results and insights in the source link below.
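For readers who want the quoted percentages on a single scale, they can be normalized to the RX 9070 GRE = 100. A rough sketch derived only from the numbers above (rounding makes the results approximate):

```python
# Normalizing ComputerBase.de's 1440p averages to RX 9070 GRE = 100
# (derived only from the percentages quoted above).
gre = 100.0
cards = {
    "RX 9060 XT 16 GB":  gre / 1.284,  # GRE is 28.4% faster
    "RTX 5060 Ti 16 GB": gre / 1.22,   # GRE is 22% faster
    "RX 7800 XT":        gre / 1.11,   # GRE is 11% faster
    "RTX 4070":          gre / 1.05,   # GRE is 5% faster
    "RX 9070 GRE":       gre,
    "RTX 5070":          gre * 1.09,   # 9% faster than GRE
    "RX 9070":           gre * 1.14,   # 14% faster than GRE
    "RX 9070 XT":        gre * 1.29,   # 29% faster than GRE
}
for name, score in cards.items():
    print(f"{name:18} {score:6.1f}")
```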



View at TechPowerUp Main Site | Source
 
a 12GB 7900 GRE with better ray tracing capability... price dependent, but I suspect the 9070 non-XT is a better buy here
 
The card looks fine. I'm just having issues with the GRE naming. It can confuse people into thinking the GRE is better than the 9070 (non-XT) just because it has some letters at the end.
 
a 12GB 7900 GRE with better ray tracing capability... price dependent, but I suspect the 9070 non-XT is a better buy here
It all depends on pricing: if it goes for around the RX 7800 XT's price (~450€), then it's a decent GPU to choose. Raster is good, but RT is closer to the RX 9060 XT.
 
This card only exists as a price segmentation tool so that the 9070/XT doesn't go down in price. AMD started this crap almost six years ago with the 5700/XT, when the RX 5600 XT came out.

Do they understand that they helped turn the computer into a luxury item? I used to spend thousands of dollars per year on tech parts. Not anymore.

And it is a proven fact that AMD has lost more market share than ever before through this marketing practice.
 
And it is a proven fact that AMD has lost more market share than ever before through this marketing practice.
Not sure about the RX 9060 XT, but the RX 9070 XT is selling very well by AMD's standards.

The RX 9070 XT is above MSRP because people are buying it in volume. Look at the RX 9060 XT 16GB: it already dropped to its MSRP much sooner, probably because people aren't buying it due to its poor p/p ratio. You get 16GB of VRAM but weak performance in general, even for its price, and the RTX 5060 Ti 16GB is a similar story.

In this generation:

50-class GPUs are weak: the RTX 5050 is the same as the RTX 4060 for the same price; two wasted years, zero gains.
60-class GPUs are weak: the RTX 5060 Ti 16GB and RX 9060 XT 16GB offer weak performance for the money.
70-class GPUs are only average: the RTX 5070 12GB, RX 9070 XT, and RX 9070 are only average, nothing more (the RTX 5070 Ti is the worst of those three).
80-class GPUs are weak: the RTX 5080's performance gain is extremely poor.
90-class GPUs are weak: the RTX 5090 offers an average performance jump, but the price... It's not a gaming GPU, more like Titan crap, where you're paying for the name, not for actual performance.
 
I have read it there.

Note: benchmark names and numbers are a universal language.

It seems to be, as of now, a China-only graphics card. Not worth bothering with for Radeon 7800 XT owners, for example.

I'm not interested in any AMD 9000-series GPU anymore. I see the issues with this card, especially the poor performance and the not-future-proof 12 GiB of VRAM. A good 720p, in some titles 1080p, graphics card; 1440p for older titles. It came sooner than expected that I saw an unannounced graphics card for the 9000 series. Typical AMD move: release an expensive part, wait, keep stuff expensive, then offer worse parts, or offer worse parts with a much better price-per-performance ratio.
 
The card looks fine. I'm just having issues with the GRE naming. It can confuse people into thinking the GRE is better than the 9070 (non-XT) just because it has some letters at the end.
Pretty much. The naming is extremely weird. AMD (and NV) have for several gens now trained the consumer that “letters after model name = good”, and suddenly going the other way just confirms that their marketing team is staffed by imbeciles. Of course, it’s not yet a given that this one will get a global release, but assuming it does - terrible product naming and segmentation.
 
Pretty much. The naming is extremely weird. AMD (and NV) have for several gens now trained the consumer that “letters after model name = good”, and suddenly going the other way just confirms that their marketing team is staffed by imbeciles. Of course, it’s not yet a given that this one will get a global release, but assuming it does - terrible product naming and segmentation.
It's confusing. The 7900 GRE was the weakest of the 7900s, but there was no non-suffix 7900 (only the XT and the XTX). If anything, Radeon's GRE now stands somewhat like GeForce's LE of old.
 
I'd buy that. I think that 12GB/192-bit at this power envelope is a sweet spot for 1440p. Fingers crossed for wider availability & a reasonable price.
 
I'd buy that. I think that 12GB/192-bit at this power envelope is a sweet spot for 1440p. Fingers crossed for wider availability & a reasonable price.
A better version of the RX 7800 XT/7900 GRE, but the price is a big question.
 
The gap between the 5060 Ti 16GB and the 5070 is around $100, so most people will either just buy a 5070 or, if they're dead set on buying AMD, stretch to the 9070 non-XT. The 9060 XT would need to drop to its MSRP for this to become a viable option somewhere in the stack.
 
It all depends on pricing: if it goes for around the RX 7800 XT's price (~450€), then it's a decent GPU to choose. Raster is good, but RT is closer to the RX 9060 XT.

It probably won't be at that price, given the 9060 XT 16GB is currently at around €400-500 (tax included) in some European countries, so they'll either need to drop the 9060 XT 16GB prices or sell this 9070 GRE for €550-600, and you can easily guess which GPU is at the €550-650 price point.
 
It probably won't be at that price, given the 9060 XT 16GB is currently at around €400-500 (tax included) in some European countries, so they'll either need to drop the 9060 XT 16GB prices or sell this 9070 GRE for €550-600, and you can easily guess which GPU is at the €550-650 price point.
Right now the lowest price for the RX 9060 XT 16GB is 370€, which is exactly its MSRP; the RX 7800 XT is 456€. Depends on location. At €550-600 it's the same crap as the RTX 5050 = RTX 4060 situation: the same price/performance after two years (wasted time). Technology works best when it's improving over time, not sitting in the same spot! Just changing the names and prices won't help, but less savvy people may not notice it behind this smoke screen.
 
It's an awkward sale. 48 CU, 12 GB, and limited to 432 GB/s of bandwidth.

It's trying to directly compete against the 5070 per spec, but it will end up a bit weaker overall, especially in esports titles where NVIDIA typically has lower driver overhead when CPU-bound.

I would say a ~$450 (USD) price point if it ever gets launched. Not sure if that includes kickbacks from AMD.
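For reference, the 432 GB/s figure follows from the 192-bit bus and a GDDR6 data rate of 18 Gbps; the data rate is an assumption not stated in the post, but it is the only one consistent with 432 GB/s on 192 bits:

```python
# Memory bandwidth from bus width and per-pin data rate (18 Gbps is assumed).
bus_width_bits = 192
data_rate_gbps = 18                                  # assumed GDDR6 speed
print(bus_width_bits * data_rate_gbps / 8, "GB/s")   # 432.0
```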
 
It's an awkward sale
It's even worse than that. It doesn't beat the 4070 (a last-gen card of terrible value), consumes more power, and offers fewer features for professional use. Why would anyone in their right mind consider buying that?
 
Look at the RX 9060 XT 16GB: it already dropped to its MSRP much sooner, probably because people aren't buying it due to its poor p/p ratio
9060 XT 16GB (Pulse) is currently sitting at $360 on Amazon.
The 9070 is 150% of the 9060 XT's performance, but at 186% ($670) of its price.
The 9070 XT is 164% of the 9060 XT's performance, but at 200% ($720) of its price.

If the 9070s were at their MSRP, they would be much closer and probably the cards to go for, but they're not beating the 9060 in p/p right now. The reason the 9060 is closer to MSRP might be that AMD can produce them in higher quantities.
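A quick sketch of the arithmetic behind those ratios (the prices and the 150%/164% performance multipliers are the poster's figures, not independent measurements):

```python
# Price and perf-per-dollar relative to the RX 9060 XT 16GB, from the post's figures.
prices = {"RX 9060 XT 16GB": 360, "RX 9070": 670, "RX 9070 XT": 720}
perf   = {"RX 9060 XT 16GB": 1.00, "RX 9070": 1.50, "RX 9070 XT": 1.64}
base_price = prices["RX 9060 XT 16GB"]
for card in prices:
    price_pct = prices[card] / base_price        # 1.00, ~1.86, 2.00
    ppd = perf[card] / price_pct                 # relative perf per dollar
    print(f"{card}: {price_pct:.0%} of the price, {ppd:.2f}x relative perf/$")
```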
 
9060 XT 16GB (Pulse) is currently sitting at $360 on Amazon.
The 9070 is 150% of the 9060 XT's performance, but at 186% ($670) of its price.
The 9070 XT is 164% of the 9060 XT's performance, but at 200% ($720) of its price.

If the 9070s were at their MSRP, they would be much closer and probably the cards to go for, but they're not beating the 9060 in p/p right now. The reason the 9060 is closer to MSRP might be that AMD can produce them in higher quantities.
That's your personal scenario, not mine. Best prices for both:

RX 9060 XT 16GB - 369€
RX 9070 XT 16GB - 694€

The RX 9070 XT is 92% faster @ 4K for 88% more money.
[chart: average FPS at 3840×2160]
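The same arithmetic applied to this post's euro prices; with these 4K figures the perf/price gap essentially closes (both inputs are the poster's):

```python
# Perf per euro relative to the RX 9060 XT, from the post's prices and 4K claim.
price_ratio = 694 / 369   # ~1.88 -> "88% more money"
perf_ratio = 1.92         # "92% faster @ 4K"
print(f"{perf_ratio / price_ratio:.2f}x perf per euro vs the 9060 XT")  # ~1.02
```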
 
So, wait, it's 150% of the hardware of a 9060 XT, uses 140% of the power of a 9060 XT, but delivers only 122% of the performance of a 9060 XT.

Is this supposed to be good or something?
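Reducing those quoted ratios to per-unit scaling makes the complaint concrete (all three inputs are the poster's figures, with the RX 9060 XT as the 100% baseline):

```python
# Per-unit scaling from the post's hardware/power/performance ratios.
hardware, power, performance = 1.50, 1.40, 1.22
print(f"performance per unit of hardware: {performance / hardware:.0%}")  # ~81%
print(f"performance per watt (relative):  {performance / power:.0%}")     # ~87%
```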
 
This card only exists as a price segmentation tool so that the 9070/XT doesn't go down in price. AMD started this crap almost six years ago with the 5700/XT, when the RX 5600 XT came out.
Cards exist at price points and new ones are introduced to fill gaps? WHAAAAAAA@?@?@!?!!?@?!?

Bruh these have existed since the dGPU was created.
Do they understand that they helped turn the computer into a luxury item? I used to spend thousands of dollars per year on tech parts. Not anymore.
This feels like the manifestation of cognitive dissonance. If you were spending thousands of dollars per year on tech parts, those were, by definition, luxury items.

Much as nobody wants to admit it, the current high prices are the result of rampant, runaway inflation from nearly doubling the money supply in the last decade. Adjusted for inflation, the majority of computer hardware has either gone down in price or remained relatively stable.

A GeForce GTX 580 would cost in excess of $800 today, according to the highly conservative US inflation calculator. We all know real inflation is higher than the CPI will admit to. And the process nodes those GPUs are made on are over 10x the price they were in 2011. The wafers are also bigger now, but the same size of silicon would easily cost at least 5x more today. So the $30 GPU dies that used to make up the likes of the 550 Ti cost in excess of $150 each now.
And it is a proven fact that AMD has lost more market share than ever before through this marketing practice.
This has far more to do with AMD's utter inability to make competitive hardware consistently for over a decade, as opposed to any change in pricing.
 
Cards exist at price points and new ones are introduced to fill gaps? WHAAAAAAA@?@?@!?!!?@?!?

Bruh these have existed since the dGPU was created.

This feels like the manifestation of cognitive dissonance. If you were spending thousands of dollars per year on tech parts, those were, by definition, luxury items.

Much as nobody wants to admit it, the current high prices are the result of rampant, runaway inflation from nearly doubling the money supply in the last decade. Adjusted for inflation, the majority of computer hardware has either gone down in price or remained relatively stable.

A GeForce GTX 580 would cost in excess of $800 today, according to the highly conservative US inflation calculator. We all know real inflation is higher than the CPI will admit to. And the process nodes those GPUs are made on are over 10x the price they were in 2011. The wafers are also bigger now, but the same size of silicon would easily cost at least 5x more today. So the $30 GPU dies that used to make up the likes of the 550 Ti cost in excess of $150 each now.

This has far more to do with AMD's utter inability to make competitive hardware consistently for over a decade, as opposed to any change in pricing.

That's also ignoring modern EE design requirements (high-wattage cards) and PCB layer requirements for PCIe 5.0 signaling (it needs to hit the correct SNR).

Wafer size is the same, 300 mm or 12 inches, if you're talking about TSMC 40 nm. 580s were big dies at 520 mm². Cost scales linearly with current silicon, bar yield rate.

Both NVIDIA and AMD are keeping tier-2 dies in the 350-400 mm² range, GB202 being a behemoth at 750 mm²...

AMD is giving kickbacks to reach MSRP on both the 9060s and 9070s. They sort of have to price this at a $450 "MSRP" or else it just looks dumb being limited to 12 GB.
 
Cards exist at price points and new ones are introduced to fill gaps? WHAAAAAAA@?@?@!?!!?@?!?

Bruh these have existed since the dGPU was created.
Cards like this, which fill in gaps without lowering the prices of existing ones or just making the existing ones better, are a tactic copied from NVIDIA, bruh.
This feels like the manifestation of cognitive dissonance. If you were spending thousands of dollars per year on tech parts, those were, by definition, luxury items.
It depends: if they're building more than one gaming PC, spending thousands of dollars used to be possible, but not anymore.
A GeForce GTX 580 would cost in excess of $800 today
Except the x80 tier used to be the flagship; now anything below $1,000 is either a compromise or a renamed version of an existing card, but maybe slightly better.
This has far more to do with AMD's utter inability to make competitive hardware consistently for over a decade, as opposed to any change in pricing.
AMD making competitive hardware isn't the issue; trying to compete with a near-monopoly that has influenced the tech press and AAA gaming studios is the problem.
 
Except the x80 tier used to be the flagship; now anything below $1,000 is either a compromise or a renamed version of an existing card, but maybe slightly better.

That's why it's more logical to just look at the die size and calculate the percentage of SMs/CUs enabled relative to legacy models.

The 5070 Ti, for example, is more in line with the older base 70-class from the GTX 600-1000 series generations.

70/84 = 83.33% of die. 378 mm².

GTX 970 = 13/16 = 81.25% of die. 398 mm².

Trying to base things off the current 5090 is pretty silly. Just tech-tube brain rot.

Edit: There are exceptions, like the 20 series being overly large and moved around a bit, i.e. the 2070 having a full TU106 die @ 36/36 SM, but the price increase in MSRP was obvious: $380 → $500... They went from 314 mm² with 15/20 SM (75% of die) to 445 mm², on TSMC 16 → 12 (a similar node).

The 30 series was a bit strong, relatively speaking, but it diverged from TSMC and was manufactured on Samsung 8. There were likely incentives there.

The 3080 was moved up to the GA102 flagship die, but with quite a few SMs disabled: 68/84, or 80% of die. The 30 series hosted the first xx90 GPU.
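The enabled-fraction method described above is easy to reproduce. A sketch using the SM counts and die sizes quoted in the post; the GA102 die size of 628 mm² is not in the post and is added from NVIDIA's published figure:

```python
# Enabled-SM fraction per card, from the post's figures (GA102 size added).
cards = {
    "RTX 5070 Ti (GB203)": (70, 84, 378),
    "GTX 970 (GM204)":     (13, 16, 398),
    "GTX 1070 (GP104)":    (15, 20, 314),
    "RTX 2070 (TU106)":    (36, 36, 445),
    "RTX 3080 (GA102)":    (68, 84, 628),  # 628 mm2 per NVIDIA, not the post
}
for name, (enabled, total, die_mm2) in cards.items():
    print(f"{name}: {enabled}/{total} SM = {enabled / total:.1%} enabled, {die_mm2} mm^2")
```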
 