
RTX 2060 VS RX 5700 (July-August 2020)

RTX 2060 VS RX 5700

  • RTX 2060

    Votes: 14 48.3%
  • RX 5700

    Votes: 15 51.7%

  • Total voters
    29
If you build a PC with an RTX 2060 or an RX 5700 now, which one would you use?
For me:
RTX 2060:
Pros
-Ray Tracing and DLSS, most importantly DLSS
Cons
-6GB VRAM
RX 5700:
Pros
-8GB VRAM, and 5-15% more powerful than the RTX 2060
Cons
-It does not support Ray Tracing or DLSS

What would you pick? What's your opinion? Thanks for answering
 
This depends. They are two different cards performance-wise: the 5700 XT is faster than the 2060 by around 15% on average. Note that a 2060 Super is a more apples-to-apples comparison on performance and price.

With the 5700 XT, as with other Navi-based AMD cards, you have a greater chance of running into the driver issues that have been ongoing since release.

Why is 6GB of VRAM a con? That is a 1080p card and 6GB is fine.

Tough call, honestly. For my needs, since I play at 2560x1440 and 144 Hz, I'd take the AMD card and cross my fingers that I don't hit a driver issue. Everyone is going to have a different opinion that applies to their own needs. At the same cost, between a 2060 Super and a 5700 XT, I'd likely go Nvidia.

I assume you're in the market for one of these cards and trying to pick one based on others and how they use the card?
 
For gaming, the 5700, or more precisely the 5700 XT, because where I live it is the same price as the 2060.
For video content creation and gaming, the 2060 KO.

But I wouldn't build now, I would wait 4 months.

DLSS is a very promising technology, but does it work for any of the games that you want to play? Does RTX work for the games that you want to play, and are the framerates acceptable?

Anyway, as I said, I wouldn't build now; in a few months I could buy a whole console for the price of these cards, and it will beat them both in raw performance AND RT.
 
The 2060 (S) is far too weak for proper RT. Lowest I'd use for RT is a 2070 Super, but even the 2080 Ti can't do enough in my opinion. Maybe RT will get better with the 3000 series and / or new AMD GPUs.
 
The 2060 (S) is far too weak for proper RT. Lowest I'd use for RT is a 2070 Super, but even the 2080 Ti can't do enough in my opinion. Maybe RT will get better with the 3000 series and / or new AMD GPUs.
DLSS 2.0 on a 2070S in Control was the first time I felt like I was playing a proper RTX game.

Is there some limitation to AMD's FidelityFX that warrants mentioning Nvidia's DLSS 2.0 but not AMD's solution?
Yes: it's a sharpening filter, not image reconstruction.
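To make that distinction concrete, here's a toy Python sketch. The 3x3 kernel is a generic unsharp-mask example, not AMD's actual RIS filter: a sharpening filter only re-weights pixels that already exist, so it can't create detail that was never rendered.

```python
import numpy as np

# Standard 3x3 unsharp-mask kernel -- a generic example for illustration,
# not AMD's actual RIS filter.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def sharpen(img):
    """Convolve with the sharpening kernel, clamping at the edges."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + 3, x:x + 3] * SHARPEN)
    return out

# A region with no detail stays that way: sharpening creates nothing new.
flat = np.full((4, 4), 7.0)
print(np.allclose(sharpen(flat), flat))  # True -- a constant image is unchanged
```

A reconstruction pass like DLSS 2.0, by contrast, feeds in extra information (previous frames and motion vectors), which is why it can add detail rather than just boost edge contrast.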

If you build a PC with an RTX 2060 or an RX 5700 now, which one would you use? [...]
You may want to reconsider that performance gap; AMD mostly optimizes well for the games that are covered by reviews.
Look at all those recent titles: AMD just doesn't seem to care.

 
If you build a PC with an RTX 2060 or an RX 5700 now, which one would you use? [...]

If you plan to play Cyberpunk 2077 a lot, then you want the RTX 2060, because DLSS 2.0 is a game changer and 2077 is promised to support it on day one. It basically increases both your FPS and your image quality by an insane amount; look at the DLSS 2.0 benchmarks for Control. It's insane how good DLSS 2.0 is. More and more games are supporting it too. It's the future, I have no doubt about it.
 
I went with the AMD RX 5700 (non-XT) and am very happy with it.
Between Radeon Image Sharpening and Radeon Boost (movement-based dynamic resolution), I don't seem to have any problems running games on ultra settings, even though I only have a 1080p monitor. I do use AMD's super resolution modes, namely 2160p downsampled to 1080p, which looks great. I don't really need RTX, as it's not really performing where it should for good FPS. And DLSS, meh: I don't stand still staring at my surroundings long enough for it to make a difference.
 
Radeon Chill is the most interesting thing about Radeons in my opinion, as it has the potential to save a lot of energy and decrease heat output. The RX 5700 is also the stronger performer, not only on paper but in real life as well. All games get optimized by both companies; I'm a long-time Nvidia user myself, but I will talk facts here. Cyberpunk is a mainstream AAA game; it will 100% be optimized for any recent Radeon card, even ones as old as 2012.

DLSS is interesting, but Radeon Image Sharpening (RIS) has a similar potential, more or less. IMO, simply use 1440p and do not bother with DLSS or RIS. 1440p has crisp picture quality, there's no need for tricks. I used DSR when I had a 1080p monitor, to emulate 2x 1080p resolution, which is higher than 1440p. But after getting a 1440p monitor I never used it again, as there is absolutely no need.

It should also be said that DLSS is a companion to RTX: it raises picture quality back up when a lower render resolution is used to make the resource-heavy ray tracing viable. There is not much use for it otherwise. But ray tracing is a hype of its own, and I think it is not where it should be yet. Give it another 1-2 generations.
 
In fact, my friend will buy a 2060/5700, but he will use it for at least 4 years. Sorry, I couldn't write that in the topic.
 
In fact, my friend will buy a 2060/5700, but he will use it for at least 4 years. Sorry, I couldn't write that in the topic.
8 GB of VRAM is more useful over 4 years. Clearly, and history has shown it: HD 7970 vs GTX 680, for example.
 
8 GB of VRAM is more useful over 4 years. Clearly, and history has shown it: HD 7970 vs GTX 680, for example.
DLSS 2.0 will be a bigger benefit.
By that I mean there are, and will be, more games that use DLSS 2.0 than games bottlenecked by 6GB; far more.
And Turing's memory management is different from first-gen Kepler's: you don't see the 2060 losing to 8GB cards more at 4K than at 1440p.
The 2060 is a better overclocker, too.
 
If you build a PC with an RTX 2060 or an RX 5700 now, which one would you use? [...]
It depends on the planned usage scenario. If RTX and DLSS are important, the 2060 is the clear choice. If those features are not important, then the 5700 is the clear choice.

The difference in VRAM will be minimal unless gaming in 4K, but neither of those cards is a contender at that resolution. So 1080p is the target for those two cards, and 1080p will not benefit from 8GB VRAM vs 6GB VRAM. 1440p would feel a difference, but only just. If planning on 1440p or 4K gaming, save some more money and buy a 5700 XT, 2070S or 2080S.
In fact, my friend will buy a 2060/5700, but he will use it for at least 4 years. Sorry, I couldn't write that in the topic.
This is more of a reason to save more money and buy a better card as a better card will age better over 4 years.
 
DLSS 2.0 will be a bigger benefit.
By that I mean there are, and will be, more games that use DLSS 2.0 than games bottlenecked by 6GB; far more.
And Turing's memory management is different from first-gen Kepler's: you don't see the 2060 losing to 8GB cards more at 4K than at 1440p.
The 2060 is a better overclocker, too.
DLSS is just an upscaler supported via cloud compute, comparable to RIS or any other upscaler. Nothing too special. 2 GB of VRAM is 2 GB of VRAM; it has nothing to do with that. More VRAM enables higher possibilities; it is not replaceable via software gimmicks.

2060 Super loses to RX 5700 now, and it will especially lose to it in 4 years. The supporting data is overwhelming.

The 2060 is not a better overclocker if you unlock 5700 overclocking via regedit. Then the 5700 is far better, as there is more benefit to be had from a more powerful GPU than from a weaker one. But overclocking is not part of the discussion, as it is unknown whether said person will even do it. If it is not specifically stated that someone will, I will always assume no. OC in general is a moot point in an argument if talking in general. And this is general.

The difference in VRAM will be minimal unless you're gaming in 4K, but neither of those cards is a contender at that resolution. So 1080p is your target for those two cards, and 1080p will not benefit from 8GB VRAM vs 6GB VRAM. 1440p would feel a difference, but only just. If you plan on 1440p or 4K gaming, save some more money and buy a 5700 XT, 2070S or 2080S.
The difference in VRAM is significant if a GPU is used for 4 years. Do not limit the argument to today. Yes, today it is barely beneficial, but it will be beneficial in the coming years, and for lower resolutions than 4K as well. Again, history has proven this.
 
2060, no-brainer, because it is actually stable, it does some light RT, and the performance gap won't take you from playable to unplayable or vice versa between these two.

That and the other feature set differences, such as DLSS, Freestyle, Ansel... lots of nice little tools to play around with. There's just quite simply more fun to be had from a Geforce card.

VRAM is irrelevant at this perf level between these cards, 6GB will do fine, I have yet to find an exception. So far the 1080 with 8GB does nothing better against its 6GB near-equivalent. And the 1080 has some age to it. That also goes for 1440p, and you don't really want to push higher with this card anyway.

The difference in VRAM is significant if a GPU is used for 4 years. Do not limit the argument to today. Yes, today it is barely beneficial, but it will be beneficial in the coming years, and for lower resolutions than 4K as well. Again, history has proven this.

The 7970 is very much obsolete and has been for quite a while now, 3GB or not. It did last a bit longer, but you can't compare this apples to apples. VRAM capacities exploded shortly after that card released (Maxwell onwards: 2GB doubled to 4GB, and one gen later doubled again to 8GB for an x80 card). This is not something that will keep happening; we have stabilized at 6-8GB now for sub-4K resolutions, and it really is fine. In the meantime, delta compression was also improved on Nvidia cards. The proof of that can be found in the next Ampere release: you won't see 12 or 16GB x104 cards pop up (or x106, if the latest rumor is to be believed), and it's likely we will see 11GB or close to it as a cap once again, same as the past two gens.
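For what it's worth, the general idea behind delta compression can be shown with a toy Python example (just the concept, not Nvidia's actual hardware scheme): neighbouring pixels in rendered frames tend to be similar, so their differences are small and compress far better than the raw values.

```python
import zlib
import numpy as np

# Build a smoothly varying "scanline": each pixel differs from its
# neighbour by only 0-2 -- crude stand-in for rendered image data.
rng = np.random.default_rng(1)
steps = rng.integers(0, 3, 4096)           # small pixel-to-pixel changes
row = np.cumsum(steps).astype(np.uint8)

# Compress the raw values vs the pixel-to-pixel deltas.
raw_size = len(zlib.compress(row.tobytes()))
deltas = np.diff(row, prepend=row[:1]).astype(np.uint8)
delta_size = len(zlib.compress(deltas.tobytes()))

print(delta_size < raw_size)  # the delta stream compresses far smaller
```

Real delta color compression in GPUs works on tiles in hardware and is lossless; the point here is only why storing differences instead of absolute values saves bandwidth.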
 
The difference in VRAM is significant if a GPU is used for 4 years. Do not limit the argument to today. Yes, today it is barely beneficial, but it will be beneficial in the coming years, and for lower resolutions than 4K as well. Again, history has proven this.
Not at 1080p. Whether today or in 4 years the 2GB extra will not make a difference at that resolution and neither of those cards will perform on an acceptable level at 4k so the argument is academic. The VRAM should not play a factor in the decision for 1080p. If going up from there, a better GPU is required.
 
Not at 1080p. Whether today or in 4 years the 2GB extra will not make a difference at that resolution and neither of those cards will perform on an acceptable level at 4k so the argument is academic. The VRAM should not play a factor in the decision for 1080p. If going up from there, a better GPU is required.
Nobody talked about 1080p on a 1440p card. Even the 5600 XT is beyond 1080p capable, despite being marketed as a 1080p card.

VRAM is very relevant, especially in 4 years, do not let yourself be fooled.
 
2060 Super loses to RX 5700 now
It doesn't.

An AIB 2060 Super and an AIB 5700 XT are about as fast, and the 5700 is slower. And Navi cards are spent at factory clocks: a 2% OC is all you get, while even my 2070S overclocks easily from 1950 to 2100 MHz.
DLSS is just an upscaler supported via cloud compute, comparable to RIS or any other upscaler.
lol, and RIS is just sharpening.
If you're comparing image reconstruction to image sharpening, you're just wrong. Watch any DLSS 2.0 commentary: DLSS 2.0 adds detail, while dropping the resolution scale loses detail and adds artifacts, like any sharpening filter.

2 GB of VRAM is 2 GB of VRAM; it has nothing to do with that. More VRAM enables higher possibilities; it is not replaceable via software gimmicks.
And memory compression is memory compression; it's a hardware feature. Sorry, 2 GB more VRAM on the 5700 vs the 2060 + DLSS 2.0 is a hands-down victory for Nvidia.

2060 Super loses to RX 5700 now, and it will especially lose to it in 4 years. The supporting data is overwhelming.
lol, what evidence?
That you lie about the 5700 being faster than the 2060S? The fact that Navi doesn't support DX12 Ultimate features like VRS and mesh shaders, doesn't support simultaneous FP+INT execution, doesn't support DLSS?
Give us a break with Kepler and the 600 series as your reference; those are a decade old.

Seems like you're ignoring what I posted about non-triple-A games. There's solid evidence that AMD just doesn't optimize for them.
 
Nobody talked about 1080p on a 1440p card. Even the 5600 XT is beyond 1080p capable, despite being marketed as a 1080p card.
The known and readily available benchmarks say otherwise, very definitively.
VRAM is very relevant, especially in 4 years, do not let yourself be fooled.
Not at 1080p.

DLSS is just an upscaler supported via cloud compute
No it isn't. There is nothing "cloud" about it. DLSS is done real-time, on the GPU, local to the system running it.

If you're comparing image reconstruction to image sharpening, you're just wrong. Watch any DLSS 2.0 commentary: DLSS 2.0 adds detail, while dropping the resolution scale loses detail and adds artifacts, like any sharpening filter.
This is correct.
 
No it isn't. There is nothing "cloud" about it. DLSS is done real-time, on the GPU, local to the system running it.
DLSS 2.0 is done on the GPU via the driver.

DLSS 2.0 works as follows: the neural network is trained by Nvidia on supercomputers, using "ideal" ultra-high-resolution images of video games together with low-resolution images of the same games. The result is stored in the video card driver. It is said that Nvidia uses DGX-1 servers to perform the training of the network.
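The split described there (training happens offline, only inference ships in the driver) can be sketched in Python. The linear "network" and its weights below are made-up stand-ins, not the real DLSS model, which also consumes previous frames and motion vectors:

```python
import numpy as np

def upscale_2x(img, weights):
    """Upscale HxW -> 2Hx2W: map each pixel's 3x3 neighbourhood to a
    2x2 output block with a learned linear map (9 inputs -> 4 outputs)."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros((2 * h, 2 * w))
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + 3, x:x + 3].reshape(9)
            out[2 * y:2 * y + 2, 2 * x:2 * x + 2] = (weights @ patch).reshape(2, 2)
    return out

# "Weights" stand in for what the real network learns offline on Nvidia's
# servers; here they simply replicate the centre pixel (nearest neighbour).
weights = np.zeros((4, 9))
weights[:, 4] = 1.0

small = np.arange(9.0).reshape(3, 3)   # a tiny "low-res frame"
big = upscale_2x(small, weights)       # runs entirely locally, per frame
print(big.shape)  # (6, 6)
```

The per-frame call touches no network at all, which is the point of the "nothing cloud about it at runtime" correction above: the cloud is only involved in producing the weights.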
 
VRAM is very relevant, especially in 4 years, do not let yourself be fooled.

You're not wrong, but it's too much of a blanket statement when you put it like this and apply it to everything that has ever existed in GPU land. Each card is about the right balance, and about meaningful differences. When you speak of 3GB vs 4GB or 6GB mid-range cards, like the 1050 Ti / 1060 / 1060 6GB example, then YES: the larger capacity immediately translates to a higher life expectancy, just like the 2GB vs 3GB example in 2013-2015. But for this topic's comparison, most certainly not.
 
That you lie about the 5700 being faster than the 2060S? The fact that Navi doesn't support DX12 Ultimate features like VRS and mesh shaders, doesn't support simultaneous FP+INT execution, doesn't support DLSS?
Give us a break with Kepler and the 600 series as your reference; those are a decade old.
Boy, keep yourself in check; nobody is lying here. I referenced the 5700 unlocked / overclocked. It is a point, but not the one I wanted to make. Seems I remembered wrong; shit happens.
But if you don't know about VRAM being useful over the years, as historically proven, you don't know much about GPUs anyway.
You're not wrong, but it's too much of a blanket statement when you put it like this and apply it to everything that has ever existed in GPU land. Each card is about the right balance, and about meaningful differences. When you speak of 3GB vs 4GB or 6GB mid-range cards, like the 1050 Ti / 1060 / 1060 6GB example, then YES: the larger capacity immediately translates to a higher life expectancy. But for this topic's comparison, most certainly not.
Ah, I do not agree, sorry. Historically it has been proven many times that "stronger" GPUs, or barely stronger ones, like the GTX 680, 780 and 780 Ti, can come falling down when hitting their limits over the years. In every case, the Radeon GPU that was 1-5% slower, as Nvidia intended, later overtook it, and the Nvidia card is now very much behind. The same can happen with the 2060 Super vs the 5700.

However, that being said, I do not recommend the 5700; it is only better if unlocked / overclocked, which I do not expect here. Fact is, I wouldn't buy either of these; the 5700 XT is best.
 
Boy, keep yourself in check; nobody is lying here. I referenced the 5700 unlocked
That is a 5700 XT.
So get your facts right.

And again, your "historic evidence" is Kepler vs first-gen GCN, a case of AMD taking too long to get their drivers right.

None of this happened in the case of Maxwell vs GCN 1.3, or Vega vs Pascal.

And if anything, Turing has the capabilities to gain on Navi 10 over time, with DX12 Ultimate, FP+INT and DLSS.

So yeah, keep bringing up Kepler.
 
DLSS 2.0 is done on the GPU via the driver.
You are actually both wrong. DLSS uses Tensor cores in RTX cards, which is also why the GTX 1080 Ti, for example, has no way of using DLSS.

That is a 5700 XT.
So get your facts right.
Bullshit, an unlocked 5700 is not a 5700 XT; again you have proven that you have limited knowledge about GPUs. Talk less and read more. Regedit unlocks overclocking for 5700 cards; this has nothing to do with the 5700 XT.
 
You are actually both wrong. DLSS uses Tensor cores in RTX cards, which is also why the GTX 1080 Ti, for example, has no way of using DLSS.


Bullshit, an unlocked 5700 is not a 5700 XT; again you have proven that you have limited knowledge about GPUs. Talk less and read more. Regedit unlocks overclocking for 5700 cards; this has nothing to do with the 5700 XT.
loooool, but you mixing up the 2060 and 2060S is fine.
I'm done here.
Reported, troll.

And yes, "on the GPU" means "on Tensor cores". How am I "wrong"?
 