
AMD Radeon RX 9070 GRE 12 GB Graphics Cards Allegedly in the Pipeline

T0@st

News Editor
AMD and its board partners cleared "phase one" of RDNA 4 earlier in March with the launch of Radeon RX 9070 Series graphics cards. At the tail end of special introductory events, Team Red representatives—on both sides of the Pacific—teased a second-quarter release of lower-end Radeon RX 9060 Series models. A handful of AIBs have registered multiple custom Radeon RX 9060 XT 16 GB and 8 GB SKUs, so expectations have been set for an imminent arrival. A fresh insider leak suggests that AMD has something else Navi 48 GPU-related in the pipeline, possibly scheduled to launch before the rumored Radeon RX 9060 XT cards. Earlier today, IT Home picked up on chatter regarding a mysterious Radeon RX 9070 GRE model. Apparently, Zhongzheng Computer (note: machine-translated name) posted an intriguing tidbit on its official WeChat account—the March 9 bulletin stated: "friends who don't have enough budget for Radeon RX 9070 XT can wait for RX 9070 GRE, which will have a better price-performance ratio. Radeon RX 9060 XT will have to wait for a while."

Based on this news, VideoCardz believes that Chinese market stock of custom Radeon RX 9070 16 GB (non-XT) cards was not topped up last week. Local sources have observed almost zero regional availability—conjecture points to Team Red's local office "deliberately" paving the way for "gap-filling" Radeon RX 9070 GRE 12 GB options. As reported by TechPowerUp on multiple occasions, AMD's "GRE" (aka Golden Rabbit Edition) nomenclature debuted with the introduction of the Radeon RX 7900 GRE 16 GB model back in 2023—the Year of the Rabbit. This (now) very out-of-date naming scheme was revised earlier this year, with a modernized expansion of "Great Radeon Edition." Benchlife.info weighed in on rumors regarding a new-generation GRE package: "(it) uses the same Navi 48 die, that is, the RDNA 4 GPU architecture, as the Radeon RX 9070 XT and Radeon RX 9070 currently on sale, but the memory will be reduced to 12 GB and the memory interface will be 192-bit. Our sources have informed us that the Radeon RX 9070 GRE 12 GB is currently being planned by AIB partners and is ready to enter mass production." VideoCardz has kindly assembled a relevant comparison chart—see below. Naturally, these theorized specifications place the incoming GRE somewhere between the already-released Radeon RX 9070 16 GB cards and a rumored Radeon RX 9060 XT class.



View at TechPowerUp Main Site | Source
 
Just a 9060XT with a 9070 name to charge more, you know because they can. Nvidia taught them well.
 
Just a 9060XT with a 9070 name to charge more, you know because they can. Nvidia taught them well.
If Nvidia does it and the public reward Nvidia with more sales at ridiculously high prices, then AMD should do the same, because the public approved that method with their wallets.
 
They can put whatever they want in the pipeline but if nobody can buy them anywhere on earth then what's the point?
 
Since it is the year of the snake, I wish they would just call it the RX 9070 GSE.

Golden Snake Edition :p
 
Isn't this the role the 9060 cards are supposed to fill?
RDNA4 has two chips - a 64 CU, 256 bit memory bus version used in the 9070XT (and cut down to 56 CU for the 9070) and an upcoming 32 CU, 128 bit chip that will end up in the 9060 lineup. That of course leaves a pretty big performance gap; a further cut down 9070 chip with say ~48 CU, 192 bit memory bus to use up any partially defective chips would fill that gap almost perfectly, while not requiring much work on AMD's part.
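The bus widths above map directly to theoretical memory bandwidth. A minimal sketch, assuming 20 Gbps GDDR6 on every SKU (true for the RX 9070/XT; the GRE and 9060 XT figures are purely speculative, based on the rumored bus widths):

```python
# Theoretical memory bandwidth for the rumored RDNA 4 configurations.
# Bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate in Gbps.
# Assumes 20 Gbps GDDR6 across the board -- an assumption for the
# unreleased SKUs, matching the shipping RX 9070 cards.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "RX 9070 XT (256-bit)": 256,
    "RX 9070 (256-bit)": 256,
    "RX 9070 GRE, rumored (192-bit)": 192,
    "RX 9060 XT, rumored (128-bit)": 128,
}

for name, bus in configs.items():
    print(f"{name}: {bandwidth_gbs(bus, 20):.0f} GB/s")
```

Under that assumption, a 192-bit GRE lands at 480 GB/s, squarely between the 9070's 640 GB/s and a 128-bit 9060 XT's 320 GB/s — mirroring the CU-count gap described above.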
 
RDNA4 has two chips - a 64 CU, 256 bit memory bus version used in the 9070XT (and cut down to 56 CU for the 9070) and an upcoming 32 CU, 128 bit chip that will end up in the 9060 lineup. That of course leaves a pretty big performance gap; a further cut down 9070 chip with say ~48 CU, 192 bit memory bus to use up any partially defective chips would fill that gap almost perfectly, while not requiring much work on AMD's part.
Yep there is a lot of space between 9060XT and a 9070LE/GRE. It all comes down to the pricing.
 
Just a 9060XT with a 9070 name to charge more, you know because they can. Nvidia taught them well.
Why stop with GRE? Why not flatten the entire numbering scheme to just 9070 + some incomprehensible acronym at the end?
9070 pb < 9070 sempron < 9070 GRE < 9070 < 9070 xt < 9070 xtx < 9070 ti < 9070 ti super < 9070 fx
 
Why stop with GRE? Why not flatten the entire numbering scheme to just 9070 + some incomprehensible acronym at the end?
9070 pb < 9070 sempron < 9070 GRE < 9070 < 9070 xt < 9070 xtx < 9070 ti < 9070 ti super < 9070 fx
And the final boss, 9070 USD.
 
A 9070 GRE would be cool at $450.
I'd take that over a 128-bit 9060 XT.
I doubt the 9060 XT beats the 7800 XT, so the 9070 GRE has its place.
 
Just a 9060XT with a 9070 name to charge more, you know because they can. Nvidia taught them well.

Dramatic much? The 7900 GRE was the same deal, it's just defective silicon that doesn't make the cut to be sold as 9070. Instead of trashing it all, you just bin and sell as a low end product. Nothing wrong with that.
 
Why buy a new CPU/GPU when a new PS/Xbox will be released in a few years?
Especially when the performance of CPUs/GPUs hasn't really seen a major leap lately.
I built my PC for almost 2K about 4 or 5 years ago with an R9 5900X and an RX 6800. At 2560x1080 resolution, I still get over 90 FPS in all the games I play, usually by tweaking settings (medium/high, no RT for the most demanding titles). I won't upgrade until the next PS/Xbox gen arrives.
There's a right time to upgrade, and plenty of YouTube etc. reviews to help make the right choice.
 
Dramatic much? The 7900 GRE was the same deal, it's just defective silicon that doesn't make the cut to be sold as 9070. Instead of trashing it all, you just bin and sell as a low end product. Nothing wrong with that.
I'm not quite sure the GRE and M were 'defective silicon' in the way we'd come to expect.
Navi 31 XL is an odd duck.

The RX 7900M Navi 31 XL and RX 7900 GRE Navi 31 XL are unique in the Navi 3x lineup (the GRE being the 'fuller fat' SKU).
Both operate on a considerably lower voltage curve than the 7900 XT and XTX. Both share a packaging form factor with Navi 32 (RX 7800).
Both appear to be purpose-engineered, efficiency-binned assemblies of Navi 31's GCD and MCDs, made to 'solder in' wherever a Navi 32 would otherwise sit.

This 9070 GRE, though, is probably exactly like the 'salvaging' of old. The rumored specs 100% look like an RX 9070 with part of its memory controller cut/disconnected.

Why stop with GRE? Why not flatten the entire numbering scheme to just 9070 + some incomprehensible acronym at the end?
9070 pb < 9070 sempron < 9070 GRE < 9070 < 9070 xt < 9070 xtx < 9070 ti < 9070 ti super < 9070 fx
Someone hasn't been in this game for long... :laugh:
 
I have no issue with a 12 GB Radeon 9070/9060 GRE moving the mid-range tier higher than 8 GB for 1080p and 1440p. Nvidia surely won't do it.
 
Dramatic much? The 7900 GRE was the same deal, it's just defective silicon that doesn't make the cut to be sold as 9070. Instead of trashing it all, you just bin and sell as a low end product. Nothing wrong with that.

Nothing wrong with naming a product properly either. These companies are taking the piss out of us customers, and loyalists defend them like sheep.

Sad, actually, because then the companies see they can get away with these shady practices.
 
Nothing wrong with naming a product properly either. These companies are taking the piss out of us customers, and loyalists defend them like sheep.

Sad, actually, because then the companies see they can get away with these shady practices.

What's... wrong with the naming this time? AMD has been doing "GRE" for a low quality bin GPU for 2 generations now. :wtf:
 
What's... wrong with the naming this time? AMD has been doing "GRE" for a low quality bin GPU for 2 generations now. :wtf:

Same as I disagree with the Nvidia BS: the 5070 Ti should be a 5070, and the 5070 should have been a 5060 Ti. They're up-ranking their cards so they can charge more for them.

The 9070 at only $50 below the 9070 XT is already a piss-poor showing from AMD. Now this, really? Keep defending these companies, then cry later and ask yourself why they get away with charging the prices they do.
 
Same as I disagree with the Nvidia BS: the 5070 Ti should be a 5070, and the 5070 should have been a 5060 Ti. They're up-ranking their cards so they can charge more for them.

The 9070 at only $50 below the 9070 XT is already a piss-poor showing from AMD. Now this, really? Keep defending these companies, then cry later and ask yourself why they get away with charging the prices they do.

GPU "shrinkflation" has been a real thing, slowly creeping up since the Kepler era (the first generation where an 80-class SKU carried the midrange chip: the GTX 680 had the GK104, while the GF104 was used in the GTX 460 and the GF114 in the GTX 560). I'm sure you watched GN's video released yesterday; that is something I've been going on about for years at this point, and nobody really cared, telling me that it was the performance and not the GPU core underneath that mattered. The market made its choice, so this isn't a "defense" of anything. Besides, I did purchase an RTX 5090 ;)

AMD's been doing the same, albeit much less aggressively due to the many issues they've been facing with drivers, performance consistency and market adoption rates. The 9070 XT was a step in the right direction, IMHO. They just weren't apt to compete, and needed a solid product with a low cost to drive up market adoption rates, and it fills that role quite nicely.

I can't call the 9070 XT piss-poor, because it never pretended to be more than what's advertised on the tin: a "solid midranger." And even though it is still affected by market conditions, that is beyond the vendors' control.
 