
Rumor: AMD RDNA2 6X50 Series Refresh With 18 Gbps VRAM Expected Around April 20th

Raevenlord

News Editor
The rumor mill is attempting to nail down the specific launch date of AMD's purported RDNA2 series refresh. According to well-known leaker Enthusiastic Citizen over at the Chiphell forums, AMD is now planning to launch updated versions of its RDNA2 graphics cards around April 20th or 21st. AMD appears to be updating three different SKUs based on the RDNA2 silicon, perhaps to increase their market attractiveness (and competitiveness) against both NVIDIA's lineup and Intel's upcoming Arc Alchemist series, which is also expected to launch in the next several weeks.

The new cards, which are expected to carry updated model names, are currently expected to be the RX 6950XT (a response to NVIDIA's oft-delayed RTX 3090 Ti graphics card), the RX 6750XT (likely meant to compete against Intel's upcoming Arc Alchemist A700 series), and the RX 6650XT. The only available details pertain to the RX 6950XT, which is expected to not only carry upgraded 18 Gbps GDDR6 VRAM, but also an increased power limit of 350 W (up from the 300 W of the reference RX 6900XT). The other two GPU updates should follow suit with similar memory-frequency and power-limit increases.
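For context, the bandwidth gain from the rumored memory upgrade is simple arithmetic: per-pin data rate times bus width, divided by 8 to get bytes. The sketch below assumes the refresh keeps the 6900XT's 256-bit bus, which the rumor does not explicitly confirm.

```python
def bandwidth_gbs(rate_gbps: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate (Gbps) * bus width (bits) / 8."""
    return rate_gbps * bus_bits / 8

# Reference RX 6900XT: 16 Gbps GDDR6 on a 256-bit bus
print(bandwidth_gbs(16, 256))  # 512.0 GB/s
# Rumored refresh: 18 Gbps on the same (assumed) 256-bit bus
print(bandwidth_gbs(18, 256))  # 576.0 GB/s
```

That works out to a 12.5% raw bandwidth uplift, before any effect from Infinity Cache.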



View at TechPowerUp Main Site | Source
 
April 20th, about 30-35 intel Arc leaks away, then! Hell, intel might start "leaking" info on the next Arc, by then! Let's hope the new Radeons will be able to compete with intel's leaks. I hear they're going to be very competitive.
 
Let's take a poll on the launch price of the 6950XT.

I'm guessing $2100, ±$75
 
This is not Nvidia
No, it's not. The average price for the 6900XT is $1600, so $2100 for such a new release is a close guess.
 
Looks like I'm going to die of old age before RDNA3 sees the light of day.
I'll buy a Ouija board so I can tell you all about it when you pass.
 
Things we don't need:

- Insane amounts of VRAM; it will only make devs lazy with optimization and make old cards obsolete very fast
- More SKUs and refreshes when there isn't enough supply of the models that exist
- MSRP increases; don't be cunts and try to suck the blood out of gamers

Things we need:

- More cards at affordable MSRPs, sold in a way that everybody can get one, not just bots, scalpers, and miners
 
Things we don't need: insane amounts of VRAM; more SKUs and refreshes; MSRP increases. Things we need: more cards at affordable MSRPs, not just for bots, scalpers, and miners.
Also a fixed 6550XT with PCIe x8, better decoding/encoding, and more display outputs.
 
Moar $$$, more power, business as usual. Pointless battle.
 
Things we don't need: insane amounts of VRAM; more SKUs and refreshes; MSRP increases.

AKA "ways to spin PR for our barefaced naked greed"

Hell, I couldn't care less if AAA GPU-pushing games died too. Boring iterative trash IMO, even before the poor optimization and microtransactional nonsense.
 
AKA "ways to spin PR for our barefaced naked greed" ... I couldn't care less if AAA GPU-pushing games died too.
I mostly agree; the only interesting-looking title that came out recently is Elden Ring, and apparently even that's locked at 60 FPS.
On the other hand, if you don't care about 4K, you can be perfectly fine with an older card (like my 290X I got for 100€) and avoid chasing after the latest shiny hardware.
 
Things we don't need: insane amounts of VRAM; more SKUs and refreshes; MSRP increases. Things we need: more cards at affordable MSRPs, not just for bots, scalpers, and miners.
Games are going to use more VRAM. Sorry, but your 512 MB 8800 needs replacing at some point.

You really think you can do 4k textures with 2GB of VRAM?
 
Things we don't need: insane amounts of VRAM; more SKUs and refreshes; MSRP increases. Things we need: more cards at affordable MSRPs, not just for bots, scalpers, and miners.
I feel 8GB, 12GB and 16GB are decent amounts of VRAM for low-, mid- and high-end cards. After all, the higher the resolution and graphics quality settings, the higher the VRAM requirements. An insane amount of VRAM would be the RTX 3090 with 24GB, but then again, that card is not exactly meant for gamers. Think of it as a Titan-class card from the past, used more for professional purposes/content creation and less for gaming. But because it is sold as a GeForce card, that does not deter gamers with deep pockets from buying it.
 
At this point ETH's proof of stake merge can't come fast enough. Let the market overflow with used GPUs and may TPU forum flourish with new members asking how to fix those GPUs.
 
At this point ETH's proof of stake merge can't come fast enough. Let the market overflow with used GPUs and may TPU forum flourish with new members asking how to fix those GPUs.

The thing that will be delayed for the 100th time? Because money, right.
 
No, it's not. The average price for the 6900XT is $1600, so $2100 for such a new release is a close guess.
In the EU it's better: 1300-1400€ for the 6900XT. I don't expect the 6950XT to cost much past 1600€, considering the small upgrade.
 
Things we don't need: insane amounts of VRAM; more SKUs and refreshes; MSRP increases. Things we need: more cards at affordable MSRPs, not just for bots, scalpers, and miners.

For a high-end card, 16GB of VRAM is exactly what you want. Not only does it make the card more future-proof, there's a second reason: allocated VRAM versus actually used VRAM. The more you can keep in the GPU's memory, the less it has to pull from system memory, which gives extra performance. I don't think you have a clue why AMD chose a 256-bit bus given the RAM chips available now.

As for supplies: it's obvious that a run on gamer cards was made by miners and also scalpers. People who have no business buying gamer cards suddenly started buying them, and everyone is limited by what TSMC can produce. As for MSRPs: component pricing has obviously been driven up, so it's normal that a card offering far more performance than one from 10 years ago rises in price. You'll get something that will last you quite some time.
 
The thing that will be delayed for the 100th time? Because money, right.
I mean, I'm not too into the cryptosphere, but aren't they trying to fast-track the merge because of money?
Wasn't PoS the reason so many old-money institutions are finally giving a shit about crypto?

If I was a rich old guy with falling hair up my bum, I wouldn't wanna give transaction fee to a filthy peasant miner living in his parents' basement ballooning their electricity cost. Not even a penny. Move out and get a real job for minimum wage you lazy bum!
 
There are still more coins to mine after ETH, but hopefully the majority gives up and moves on.

There will be quite a few more such bubbles in the future. NFTs are just another money-laundering scam if you ask me.
 
this card is not exactly meant for gamers.
By saying this, you're implying the 3090 Ti isn't meant for gamers either.

If both were not meant for gamers, someone would have said so in the PR. So who are they meant for? :kookoo:
 
Things we don't need: insane amounts of VRAM; more SKUs and refreshes; MSRP increases. Things we need: more cards at affordable MSRPs, not just for bots, scalpers, and miners.
I think the MSRPs are decent from AMD. It's just that not even the retailers or e-tailers are following them.


You really think you can do 4k textures with 2GB of VRAM?
Close if it’s coded/optimized well with a good game engine

I know old game but 4k not impossible on low ram.
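To put some rough numbers on the 4K-textures-on-2GB question, here's a back-of-the-envelope sketch. The figures are illustrative only: an uncompressed RGBA8 texel is 4 bytes, a full mip chain adds roughly a third on top, and block compression like BC7 brings it down to about 1 byte per texel.

```python
def texture_mib(width: int, height: int, bytes_per_texel: float, mipmaps: bool = True) -> float:
    """Approximate texture memory in MiB; a full mip chain adds ~1/3 over the base level."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / 2**20

# A single 4096x4096 uncompressed RGBA8 texture with mips
print(texture_mib(4096, 4096, 4))  # ~85.3 MiB
# The same texture BC7-compressed (~1 byte/texel)
print(texture_mib(4096, 4096, 1))  # ~21.3 MiB
```

So a 2 GB card could hold on the order of a couple dozen uncompressed 4K textures, or around a hundred compressed ones, before counting framebuffers, geometry, and everything else; compression and streaming are what make it even "close".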
 
Close if it’s coded/optimized well with a good game engine
Very few newer games are in a position to say they're optimized, and fewer than five engines would be called good. Many engines are designed for one game, maybe two, but that's rare. Many games use heavily modified tried-and-true engines, like Unreal 4, and with modifications come more issues. The developers' saying goes: change one thing and something else breaks, and that's very true. A lot of less-than-perfect code is overcome by having a better-than-average GPU, but hiding shit code only lasts so long.
 