
AMD RDNA2 Based Radeon RX Graphics Cards Launching This September

Be that as it may, it IS true that RDNA2 is architecturally different... once again... from RDNA. See my previous post on the subject. Vega is also not getting the long-term support it should be getting. Fury X was abandoned quite rapidly and its performance is abysmal in newer titles. The list goes on... every time they change to a new setup/product line, it's like they hit that reset button. Except the long-term issues do seem to return.

The days of long-term AMD support, like on the HD 6xxx/7xxx for example, are behind us. I hope, and honestly also think, that RDNA2 will be 'saved' because the consoles will carry something similar-ish.


Fury X with its 4 GB... no way it's going to work properly in newer titles anyway.
Vega is a compute-oriented architecture, with many compromises for the normal user.

RDNA2 will be a true new architecture; RDNA is just a hybrid, halfway between GCN and RDNA2.
 
It's the same 7 nm unless they managed to get the density up to 62 MTr/mm², where it should be, instead of the current 41 MTr/mm², which is more 10nm-like. Then 5 nm (+60%) and 3 nm (another +60%) are just around the corner, so 7 nm is pretty much obsolete. So if you are in the market for video cards, $300 is the maximum reasonable price for something like a 5700 +60%. Yeah, that $349 is too much for PS5 console-like graphics; even $200 is too much for 2304 stream processors.
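
As a rough back-of-envelope, here is a minimal sketch of that density argument. The 41 and 62 MTr/mm² figures and the +60%-per-node scaling are the claims from the post above, not confirmed TSMC/AMD numbers.

```python
# Back-of-envelope for the density argument above.
# All inputs are the poster's claims, not confirmed foundry figures.

current_density = 41.0   # MTr/mm^2, claimed density of today's 7 nm Navi parts
target_density = 62.0    # MTr/mm^2, claimed density "where 7 nm should be"
node_scaling = 1.60      # assumed +60% density gain per full node shrink

print(f"7 nm shortfall vs. target: {target_density / current_density:.2f}x")

density = target_density
for node in ("5 nm", "3 nm"):
    density *= node_scaling
    print(f"Projected {node} density: {density:.0f} MTr/mm^2")
```

Under those assumptions, today's 7 nm parts land about 1.5x short of the claimed target, and the projected 5 nm and 3 nm nodes come out around 99 and 159 MTr/mm² respectively.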

These new RDNA2 / Ryzen 4000 chips, from what I know, are using an EUV process, so the density should be different. IPC gains and performance per watt should increase by a lot.
 
It's the same 7 nm unless they managed to get the density up to 62 MTr/mm², where it should be, instead of the current 41 MTr/mm², which is more 10nm-like. Then 5 nm (+60%) and 3 nm (another +60%) are just around the corner, so 7 nm is pretty much obsolete. So if you are in the market for video cards, $300 is the maximum reasonable price for something like a 5700 +60%. Yeah, that $349 is too much for PS5 console-like graphics; even $200 is too much for 2304 stream processors.


"5700 + 60%" should be Navi 23 with its 250 sq. mm die size.
Navi 22 with its 350 sq. mm and Navi 21 with 505 sq. mm have to be faster.
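
For a sense of scale, here is a quick comparison of those rumored die sizes against Navi 10's known 251 mm². This is only a sizing comparison (performance does not scale linearly with area), and the Navi 2X areas are the rumored figures from the post above:

```python
# Relative die areas of the rumored Navi 2X parts vs. Navi 10 (RX 5700 XT).
# The Navi 2X sizes are rumors quoted above; Navi 10's 251 mm^2 is the shipping die.

navi10_area = 251  # mm^2
rumored = {"Navi 23": 250, "Navi 22": 350, "Navi 21": 505}

for name, area in rumored.items():
    print(f"{name}: {area} mm^2 ({area / navi10_area:.2f}x Navi 10)")
```

So Navi 23 would be essentially Navi 10-sized, while Navi 22 and Navi 21 would be roughly 1.4x and 2.0x the area.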
 
These are "client-segment" RX series graphics cards to launch September 2020. That usually does not mean "Consumer/Gaming-segment cards" or at least not yet. I'd say first "enthusiast orientated Consumer/Gaming-card" will be 3-month after, while a replacement for the RX 5700XT (Navi 10) would be at best Q1 2021. So that would make Navi 10 that released July 7th 2019, having something like a 19-month tour of duty, which is about normal.
If it's named "RX", then it's for gaming. Server gpu are called "Instinct", and worksation are simply called "Radeon pro".
AMD line up doesn't follow the same order as nvidia were it's : Big server gpu first, and then cut down gaming gpu. RDNA was first used on a gaming gpu, then on workstation. The server side is still vega on 7 nm.
 
If it's named "RX", then it's for gaming. Server gpu are called "Instinct", and worksation are simply called "Radeon pro".
AMD line up doesn't follow the same order as nvidia were it's : Big server gpu first, and then cut down gaming gpu. RDNA was first used on a gaming gpu, then on workstation. The server side is still vega on 7 nm.


And not only that, but compute will be Arcturus, while the topic here is Navi 2X RDNA2.
Arcturus doesn't come with RDNA2.

And the RX 5700 XT is nothing but a small pipe-cleaner type of card for the N7-class process.
 
Both Nvidia and AMD releasing new GPUs in September? This is going to be fun. Here's hoping they'll duke it out in regards to pricing and performance.

Finally a good reason to upgrade my 980 Ti
 
It's the same 7 nm unless they managed to get the density up to 62 MTr/mm², where it should be, instead of the current 41 MTr/mm², which is more 10nm-like. Then 5 nm (+60%) and 3 nm (another +60%) are just around the corner, so 7 nm is pretty much obsolete. So if you are in the market for video cards, $300 is the maximum reasonable price for something like a 5700 +60%. Yeah, that $349 is too much for PS5 console-like graphics; even $200 is too much for 2304 stream processors.
AMD already uses 62 MTr/mm² density in their high-clocked Renoir APUs, and those run at 4x the clock speed of Ampere.
Why not ask Nvidia to do that? And, like with Turing, they're launching the new-generation GPUs first. If Nvidia prices their GPUs reasonably, AMD will have to do the same.

5700 XT: 1-year lifespan

That was fast.
Better than buying a $2000+ laptop with a Turing GPU now, when Ampere is just around the corner.
 
The mobile GPUs usually show up a few months later though. Four months later the last time.
Well, you can change a desktop GPU easily, but can you do the same in one of those $2000+ laptops?
 
Well, you can change a desktop GPU easily, but can you do the same in one of those $2000+ laptops?
Nooo, I'm just saying that "around the corner" doesn't really apply universally to both desktop and mobile. The new laptops might not be quite as "around the corner" as the new 3000-series graphics cards.

I'm not up to date on gaming laptop upgradeability, but I assume MXM cards aren't really as common as they should be anymore.
 
For consoles, AMD has a volume advantage, and they also have nearly guaranteed sales. Margins can be very low and still profitable, which I reckon is also part of the reason AMD is still supplying these chips. Cost effectiveness is what consoles still need, even if the specs are better.

That's not true, and it never was. Yet some people keep recycling "the consoles" argument over and over again.

There have been 106 million PS4 and 46.9 million Xbox One consoles sold in total since their release in 2013. That's 7 years and counting, and roughly 21 million per year. NVIDIA alone has sold more than 15 million of their RTX GPUs since their release less than 2 years ago. And that's a higher-end product (and an even more expensive one than previous generations). Combined with lower-end cards and those of AMD, the PC hardware market seems just as big as consoles, if not bigger. It is certainly more profitable (Nvidia's 65% margin speaks for itself).

Consoles are not an advantage for AMD. They help AMD survive, barely. They also suck up a big portion of AMD's wafer allocation that could otherwise have gone to the more profitable PC or server market (if AMD were competitive, which it is not outside of CPUs, unless RDNA2 changes that somehow).
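
For what it's worth, the run rates being compared in that post work out roughly like this. This is a minimal sketch using only the sales figures quoted above; the flat per-year averaging is a simplification.

```python
# Rough run-rate comparison using the figures quoted in the post above.
# Console totals cover late 2013 to 2020 (~7 years); the RTX figure covers
# roughly the two years since Turing launched.

ps4_units = 106.0        # million, lifetime sales
xbox_one_units = 46.9    # million, lifetime sales
console_years = 7.0

rtx_units = 15.0         # million RTX GPUs sold
rtx_years = 2.0

console_rate = (ps4_units + xbox_one_units) / console_years
rtx_rate = rtx_units / rtx_years

print(f"Console APUs: ~{console_rate:.1f} million/year")
print(f"RTX cards:    ~{rtx_rate:.1f} million/year")
```

That's roughly 22 million console APUs per year against 7-8 million RTX cards per year, which is the gap both sides of this argument are reading differently.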
 
That's not true, and it never was. Yet some people keep recycling "the consoles" argument over and over again.

There have been 106 million PS4 and 46.9 million Xbox One consoles sold in total since their release in 2013. That's 7 years and counting, and roughly 21 million per year. NVIDIA alone has sold more than 15 million of their RTX GPUs since their release less than 2 years ago. And that's a higher-end product (and an even more expensive one than previous generations). Combined with lower-end cards and those of AMD, the PC hardware market seems just as big as consoles, if not bigger. It is certainly more profitable (Nvidia's 65% margin speaks for itself).

Consoles are not an advantage for AMD. They help AMD survive, barely. They also suck up a big portion of AMD's wafer allocation that could otherwise have gone to the more profitable PC or server market (if AMD were competitive, which it is not outside of CPUs, unless RDNA2 changes that somehow).

Missing the point completely...

- AMD has produced 150+ million of the exact same GPUs. That is volume. Turing comprises a whole range of cards, even two radically different designs, the 20 and 16 series, with fundamentally different blocks in them. And yet, Nvidia still reaches about 1/5th of AMD's console chip sales.

- AMD has no limit on wafer allocation; they buy in, and the price is factored into the product cost. But consoles do offer a long-term contract that lets AMD project demand way ahead of everyone else. Win-win: the fab gets work, AMD gets a lower price. And the design is always the same too.
 
Anyone curious to see what the new Radeon Pro cards are like in CrossFire? Despite still being Vega-based, I believe, the Infinity Fabric link is certainly intriguing. If they scale well for gaming, they'll actually be rather neat dual-purpose work-and-play cards. Expensive, sure, but not so much relative to the current lineup of Quadros.
 
Fury X with its 4 GB... no way it's going to work properly in newer titles anyway.
Vega is a compute-oriented architecture, with many compromises for the normal user.

RDNA2 will be a true new architecture; RDNA is just a hybrid, halfway between GCN and RDNA2.

LMAO, yes, that is exactly what we've heard a half dozen times before, right?
 
The question is: will they be readily available worldwide with this virus doing the rounds?
 
The 5700 [XT] will be made obsolete by RDNA2, simply because of ray tracing. Will it still have good performance in most games? Yeah, of course. But if it's lacking a feature that even the consoles are going to have, it's a bad decision to buy one from here on out at anything more than $200. So champsilva's point actually holds true.
 
Which Navi 2X GPU was this:
[attached benchmark screenshot]

 
beat nvidia?
Looks like it, for now at least, in that particular benchmark. By the time it's actually released, though, that might not be the case. Still, if it gets closer to a 2080 Ti, it'll at least lower the barrier a bit for the lower performance tiers of cards. Some competition is better than none, at least. Gotta also wonder how the power efficiency is; that comes into play too, if for no other reason than the power supply constraints people may have in their systems.
 
beat nvidia?


Imagine if it is the 250 sq. mm Navi 23.

Looks like it, for now at least, in that particular benchmark. By the time it's actually released, though, that might not be the case. Still, if it gets closer to a 2080 Ti, it'll at least lower the barrier a bit for the lower performance tiers of cards. Some competition is better than none, at least. Gotta also wonder how the power efficiency is; that comes into play too, if for no other reason than the power supply constraints people may have in their systems.

How is it faster in the benchmark, but would become slower over time?
 
Probably not; it will likely be among the lower-end parts next time round, so it will retain active support. RDNA2 might only be in the big new Navi, with everything below being made up of Navi 1 chips in the 6### lineup.


Complaining about new products being better is probably one of the most pointless types of complaints. Progress is good; if you can't deal with it, don't buy anything. :cool:
It's not like back in the late 90s, when buying a $2500 high-end PC would barely run (if at all) the coolest games released the following year, and technologies and APIs were deprecated left and right.

I didn't write anything besides my short rant, did I? :D
I mean a short lifespan in my PEG slot, and that's it. You guys are watching too many conspiracy theories :D
 
I didn't write anything besides my short rant, did I? :D
I mean a short lifespan in my PEG slot, and that's it. You guys are watching too many conspiracy theories :D
Nah, no way, mate.
If you can't control the itch, that's on you; stop moaning :D.
And chips are directly linkable to aliens; we didn't have chips before Bob Lazar's mates' reverse-engineering efforts on alien ships ;) :p :D
 
Nah, no way, mate.
If you can't control the itch, that's on you; stop moaning :D.
And chips are directly linkable to aliens; we didn't have chips before Bob Lazar's mates' reverse-engineering efforts on alien ships ;) :p :D
:laugh: Regardless of aliens or not, I still marvel at how we have come from the 1980s with 3 MB hard drives to today's 7 nm transistors and 8 TB NVMe drives.
 