
AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness

Popular games have the most players, and more players means Nvidia can sell more cards. Which card do you think a Fortnite player will choose: a card without DLSS or a card with DLSS?
Because Fortnite is all about eye-candy... Bad example.
my bad... confused with RTRT

I still believe this tho...
I like eye-candy games, but right now both RTRT and DLSS 2.0 are still very limited. Maybe in a year or two.

And with a game that doesn't support DLSS, do you think the player chooses a card with DLSS or a card with more raw power? What I wanted to say is that DLSS support is limited.
AMD left a lot of clock headroom for AIB cards. I'm sure RDNA2 AIB cards can reach 2400 MHz or more with water cooling.
AIBs don't need water coolers to push 350 W on these cards. I had a 375 W card in the past on an air (2-fan) cooler and it was fine, as long as you can get that heat out of the case.
 
So now I have to hate LG for not implementing FreeSync on C9 and B9 OLEDs.
 
NV's problem is that they push their own standard and force developers to go with it. Look what happened with G-Sync: it's still around, but now NV cards can also use the open Adaptive-Sync standard that AMD's FreeSync is based on. Same with DLSS. It seems NV wants to cut down the competition by tying developers' hands with their proprietary product. Game developers went along with it since there was no alternative. Now there is one, and it will be much easier to implement across a variety of games. AMD is going to have another presentation about the RDNA2 architecture with more about the features and how exactly they work. It may be worth a look.

So now I have to hate LG for not implementing FreeSync on C9 and B9 OLEDs.
Hate NV. They forced it upon LG. Besides, there can still be screens with FreeSync.
 
Moreover, Nvidia created so much hype around the Ampere launch that demand was unprecedented.
Yeah, does Nvidia have interns doing their market research?
 
Which card do you think a Fortnite player will choose: a card without DLSS or a card with DLSS?

Are you serious? Your average Fortnite player is probably like 14 and hardly knows what planet he's on; hell, he probably plays it on a phone or console. DLSS is the last thing he would think or care about. You think that the more popular the game, the more important these things are. It couldn't be further from the truth. It's the other way around: the more popular a game, the wider the audience, meaning most people won't know or care to know about some niche technology.

It's the 25+ year old enthusiast buying crazy expensive hardware who thinks about whether or not a GPU has a certain feature, and he's in the minority.
 
So now I have to hate LG for not implementing FreeSync on C9 and B9 OLEDs.
This might turn out to be wrong, but AFAIK these GPUs should still support VRR on C9 and B9 OLEDs, as those (at least on paper) support HDMI 2.1 VRR, which these GPUs also do. The proprietary part is Nvidia's "HDMI 2.1 VRR" support on their non-HDMI 2.1 GPUs. But at least in theory, you should be able to enable "FreeSync" (i.e. HDMI 2.1 VRR) on these GPUs regardless of explicit FreeSync support.
 
I'm curious to see how Smart Access Memory turns out. The 3070 is still the best value card as of today, and there will likely be a 3080 Ti in the $1000-1200 range to challenge the 6900 XT soon.
 
This might turn out to be wrong, but AFAIK these GPUs should still support VRR on C9 and B9 OLEDs, as those (at least on paper) support HDMI 2.1 VRR, which these GPUs also do. The proprietary part is Nvidia's "HDMI 2.1 VRR" support on their non-HDMI 2.1 GPUs. But at least in theory, you should be able to enable "FreeSync" (i.e. HDMI 2.1 VRR) on these GPUs regardless of explicit FreeSync support.
Actually, VRR is supported by these screens, but it is not a FreeSync equivalent. Maybe it will work, but it may not necessarily behave the way a FreeSync monitor or TV would.
 
The GPU industry hasn't seen anything like this for a decade or more.
Let's not kid ourselves, AMD could well be promising that jump relative to the worst-performing RDNA card out there, as opposed to, say, the most efficient one, which by the way is inside a Mac, not a PC :nutkick:

Yeah, does Nvidia have interns doing their market research?
With a major launch? Great job, JHH, learning on the job at the consumer's expense :laugh:
 
DXR is hardware agnostic; AMD just has to provide a back-end implementation.
I was told multiple times that the actual implementation in green-sponsored games uses green proprietary extensions.

I just don't get why people think that is a good thing, as it basically spells death for RT.

The 3070 is still the best value card as of today,
No it's not, 6800 is.

6800 => $80 (16%) more for +18% performance and +8GB of VRAM.
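
A quick back-of-the-envelope check of those numbers (assuming the $499/$579 MSRPs and AMD's ~18% performance figure, not independent benchmarks):

$579 - $499 = $80, and 80 / 499 ≈ 16% more money.
Performance per dollar: 1.18 / 1.16 ≈ 1.02, i.e. roughly 2% more performance per dollar, plus double the VRAM.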
 
No it's not, 6800 is.

6800 => $80 (16%) more for +18% performance and +8GB of VRAM.
I have the same impression. Of course reviews are needed to show the bigger picture, but I don't understand why people see the 3070 as the better value here and bring it up as if it's obvious at this point.
 
I doubt that. As you may know, having cards age gracefully is not in the interest of tech companies. They need to sell new hardware, not support old hardware. So right now they'll give you DLSS 2.0 support for current games, next year's games will get DLSS 3.0 support, but sooner or later older cards won't support future DLSS versions, so current cards will be limited to the DLSS versions of their corresponding generation.
Polaris and the 1080 Ti said what?!
 
I was told multiple times that the actual implementation in green-sponsored games uses green proprietary extensions.

I just don't get why people think that is a good thing, as it basically spells death for RT.

It doesn't really matter if they use proprietary extensions, because every manufacturer has to get DXR working through their own extensions anyway. But when you actually write applications using DXR, you can only use the API calls Microsoft wrote.
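
To make that concrete, here is a rough sketch (mine, not from anyone in this thread) of the application-facing side of DXR in C++. Every call below is plain D3D12; whatever vendor-specific work is needed to make it run happens inside the driver, underneath this API:

// Minimal sketch, assuming a D3D12 device and the DXR-capable Windows SDK headers.
#include <d3d12.h>

// Ask the runtime whether the installed driver (AMD, Nvidia, Intel, ...)
// exposes raytracing. The query itself is vendor-neutral.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Dispatch rays for a frame. SetPipelineState1 and DispatchRays are
// Microsoft's API calls; how the GPU executes them is the driver's problem.
void TraceFrame(ID3D12GraphicsCommandList4* cmdList,
                ID3D12StateObject* rtPipeline,
                const D3D12_DISPATCH_RAYS_DESC& rayDesc)
{
    cmdList->SetPipelineState1(rtPipeline);
    cmdList->DispatchRays(&rayDesc);
}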
 
I was honestly expecting more, especially from the 6800, which was my target. I mean, 16 GB of VRAM is highly unnecessary (10 would have been perfect).
I think AMD knows better than you how much VRAM is the right amount. Especially for long-term users. Nobody wrote a comment about this:
12:10PM EDT - RDNA3 by the end of 2022
@Anandtech liveblog. There must be a reason to break the 15-18 month development cycle.
 
Performance is good, price not so much, sigh.

Like, take the RX 6800: it beats the 2080 Ti, so it will be a bit faster than an RTX 3070, but it also costs 580 dollars vs 500 dollars for the RTX 3070, so not really a clear winner in the "what to buy" discussion.

AMD is a clear winner to me. I'll buy AMD just to "not buy Nvidia"; their performance will be more than enough. If I can "not buy Nvidia" and get similar performance, even if I pay 100 dollars more, or in the case of the 6900 XT 500 dollars less, I'll take it, just to punish Nvidia. They milked everyone for so long and for so much, all in the name of profit.
 
Are you serious? Your average Fortnite player is probably like 14 and hardly knows what planet he's on; hell, he probably plays it on a phone or console. DLSS is the last thing he would think or care about.
Do you have ANY data to support your claim?
Or are you just making assumptions because you don't like the game?
 
It doesn't really matter if they use proprietary extensions, because every manufacturer has to get DXR working through their own extensions anyway. But when you actually write applications using DXR, you can only use the API calls Microsoft wrote.

When I say using extensions, I mean calling vendor-specific ops.

Do you have ANY data to support your claim?
Do you have other double standards to apply to arguments?
 
I think AMD knows better than you how much VRAM is the right amount.
Not necessarily.
Or do you think AMD has better knowledge than Nvidia, which put 8 GB on the 3070?

Did you ever check VRAM usage on your card while playing at 1440p?

Do you have other double standards to apply to arguments?
Also to you: do you have ANY data about the average Fortnite player's age?

I don't know, and I'm quite sure you don't either.
 
Or are you just making assumptions because you don't like the game?

It's funny because I played Fortnite for about a year with a friend of mine and his little brothers and their friends who were all 14.
 
Also to you: do you have ANY data about the average Fortnite player's age?
I have plenty of anecdotal data, including my kid and his friends, and my friends' kids.
No, it's in no way scientific, and it's good to see that a person who applied the same high standard to their own words had time to set @Vya Domus and myself straight.
 
No it's not, 6800 is.

6800 => $80 (16%) more for +18% performance and +8GB of VRAM.

So you already have a comparison between the 6800 and the 3070?
Quite strange, since AMD didn't show one yesterday...

I have plenty of anecdotal data

That answers my question: you don't have ANY data.
 
Not necessarily.
Or do you think AMD has better knowledge than Nvidia, which put 8 GB on the 3070?

Did you ever check VRAM usage on your card while playing at 1440p?
It's not engineering knowledge, it's marketing knowledge. Just like there was going to be a 3080 with 20 GB; there was a memory shortage and the idea was scrapped.
So you already have a comparison between the 6800 and the 3070?
Quite strange, since AMD didn't show one yesterday...
AMD did show slides comparing the 3070 with the 6800. The only thing is that AMD used the 2080 Ti, which is basically the same performance. Surprised you missed it.
 
16 GB of VRAM is very attractive, and that sends a clear message to Nvidia; now Nvidia needs to respond with something similar or superior to 16 GB of GDDR. Competition is good for us.
 
That answers my question: you don't have ANY data.

Neither do you; all the studies I found required a minimum age of 18. But you'd have to be pretty dense to believe for a second that this game isn't played mostly by children, given the memes, the plethora of complaints from parents, and all the media attention. That's enough evidence.
 
AMD did show slides comparing the 3070 with the 6800. Surprised you missed it.
Hardly surprising that people in the green reality distortion field missed it.
There are 3 typical reactions:

1) Missed it completely
2) Didn't catch that the 6800 is 18% faster than the 2080 Ti
3) But muh DLSS, muh useless RT checkboxes in a handful of games, but muh CUDA, but muh drivers
 