
RTX 2060 VS RX 5700 (July-August 2020)

RTX 2060 VS RX 5700

  • RTX 2060

    Votes: 14 48.3%
  • RX 5700

    Votes: 15 51.7%

  • Total voters
    29
Joined: Apr 30, 2011
Messages: 2,652 (0.56/day)
Location: Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
I don't need to; you provided that proof for me. Context is important. What do those videos show that applies to this thread? What you tried to do is called "spin-doctoring": you are attempting to twist information to fit your narrative. You failed.
No problem at all. I am sure others who watch the thread to get useful info will understand what I proved with the screenshots and videos in my previous post (DX12 and Vulkan games need more VRAM at the same settings and resolution, and almost all AAA games made for both next-gen consoles and PC will use those APIs and require more VRAM than most games today).
 
Joined: Sep 17, 2014
Messages: 20,952 (5.97/day)
Location: The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
When AMD themselves tell us through their marketing that 4GB isn't enough anymore even at 1080p for today's games, think what is coming in 1-2 years, when console games will have heavier hardware requirements than anything we have ever seen. History repeats itself 1-2 years after a new console generation arrives. Simple logic, methinks.

4GB is not 6GB and also not 8GB, last I checked.

It's easy to escalate to 16GB while we're at it, but then even you would say 'yeah, that ain't gonna happen anytime soon'... So where exactly is this artificial barrier between cards that age well and cards that don't?

Note, I'm a VRAM capacity advocate, as I've always been. Skimping on it is not a good idea if you want cards to last, and on that we agree, I think. But as I pointed out before, supported by evidence from past generations, we've already had our major surge in VRAM capacity upgrades over the past few generations, and it won't continue - if it did, the next gen would cap out at 16GB, a number I know you know is ridiculous for the foreseeable future. It's also not like the new consoles will push native 4K into your face, while PC resolutions actually are capable of doing so - and the best cards doing that have capped out at 11GB for two gens in a row now.

The discussion between 6 and 8GB is really a non-discussion. When it's 3 or 4 versus 6, absolutely correct: you want 6 even if it costs more. But if your resolution is 1080p, 6GB will last you the lifetime of the card in a 2060's case - or a 1060's, or even a 980 Ti's - a card that is STILL relevant with 6GB, whereas a Fury X is not, despite similar release quarters.

Throughout this whole topic, multiple people have stressed that exact difference in multiple ways. If you want to believe 8GB is necessary, then by all means believe it, but between the two cards on show here it's a complete and utter non-issue. The 5700/5700 XT is more obsolete than a 2060 just for lacking any RT capability, especially because the very console gen that might utilize 8GB at some point... will utilize it along with RT, which the 8GB card doesn't have.

Show me the logic in that madness, because I don't see it - realistic AMD marketing would tell you that along with >4GB you would need an RT-capable card to be any sort of future-proof to begin with... but guess what... And on top of that, let's not even mention the overall DirectX 12 feature-level capability gap. Navi is simply lacking features all over the place, and those gaps influence longevity in a much bigger way than a few percent of raw performance.

No problem at all. I am sure others who watch the thread to get useful info will understand what I proved with the screenshots and videos in my previous post (DX12 and Vulkan games need more VRAM at the same settings and resolution, and almost all AAA games made for both next-gen consoles and PC will use those APIs and require more VRAM than most games today).

You might be surprised to find that game developers optimize around the mainstream spec, not the high-end spec. If the vast majority of cards have 4-6GB, you will find drivers and engines optimized around that number - which is what has actually happened so far. 8GB is not becoming the norm within the next few years. With the decent sales of 2060s and 1660s, and 1060s before them, you can rest assured the norm will linger around 6GB for quite some time, and if you have to cut back, it won't be at 1080p but at substantially higher resolutions and with bigger feature sets (such as RT). That is exactly why a 3GB card is generally no longer recommended and a 4GB one is sub-optimal unless it's bottom mid-range.
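To make that budgeting point concrete, here is a minimal Python sketch of how an engine might pick a default texture-quality preset from the detected VRAM budget; the thresholds, the headroom reserve and the function name are illustrative assumptions, not taken from any real engine.

```python
# Minimal sketch (hypothetical thresholds, not from any real engine):
# picking a default texture-quality preset from the detected VRAM budget,
# which in practice gets tuned around the mainstream 4-6 GB cards.

def pick_texture_preset(vram_mb: int) -> str:
    """Return a default texture-quality preset for a given VRAM size (in MB)."""
    usable_mb = vram_mb - 512           # reserve ~0.5 GB for OS/compositor (assumption)
    if usable_mb >= 7_000:
        return "ultra"                  # 8 GB-class cards and up
    if usable_mb >= 5_000:
        return "high"                   # 6 GB-class cards (1060 6GB, 2060, ...)
    if usable_mb >= 3_500:
        return "medium"                 # 4 GB-class cards
    return "low"                        # 2-3 GB cards


if __name__ == "__main__":
    for vram in (3072, 4096, 6144, 8192):
        print(f"{vram} MB -> {pick_texture_preset(vram)}")
```

The exact cut-offs differ per engine, but the shape of the logic is why a 6GB card tends to land on the same defaults as the mainstream for years.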

Also of note is that consoles generally utilize more VRAM because they can offload resource requirements that way. PC games borrow that tech, but they don't really need it, as multiple levels of volatile storage are available, usually in bigger capacities and with much lower latencies. It was only by the END of the PS4/X1 era that 6GB became mainstream - as in, last year - but the consoles had access to it for many years. The new gen will bring some parity in that sense, because the SSD is going to become a big thing as it gets used as a form of system RAM, albeit a very slow one, a bit like a RAM disk.
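As a rough sketch of that storage-hierarchy argument, the following Python places an asset in the fastest tier that can still hold it; the free sizes and the latency notes are ballpark assumptions for a mid-range 2020 PC, not measurements.

```python
# Sketch of the "multiple levels of volatile storage" point: put an asset in
# the fastest tier that still has room for it. Tiers are listed fastest-first;
# latencies are rough orders of magnitude, not benchmarks.
TIERS = [
    ("VRAM",       "~hundreds of ns"),
    ("system RAM", "~100 ns locally, PCIe transfer for the GPU"),
    ("NVMe SSD",   "~100 microseconds"),
]

def place_asset(size_bytes: int, free_bytes: dict) -> str:
    """Pick the fastest tier with enough free space for an asset (sketch only)."""
    for tier, _latency in TIERS:
        if free_bytes.get(tier, 0) >= size_bytes:
            return tier
    return "stream on demand"

MiB, GiB = 1024**2, 1024**3
free = {"VRAM": 512 * MiB, "system RAM": 8 * GiB, "NVMe SSD": 200 * GiB}
print(place_asset(300 * MiB, free))   # fits in the remaining VRAM
print(place_asset(2 * GiB, free))     # spills over to system RAM
```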
 
Last edited:
Joined: Apr 30, 2011
Messages: 2,652 (0.56/day)
Location: Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
4GB is not 6GB and also not 8GB, last I checked.

It's easy to escalate to 16GB while we're at it, but then even you would say 'yeah, that ain't gonna happen anytime soon'... So where exactly is this artificial barrier between cards that age well and cards that don't?

Note, I'm a VRAM capacity advocate, as I've always been. Skimping on it is not a good idea if you want cards to last, and on that we agree, I think. But as I pointed out before, supported by evidence from past generations, we've already had our major surge in VRAM capacity upgrades over the past few generations, and it won't continue - if it did, the next gen would cap out at 16GB, a number I know you know is ridiculous for the foreseeable future. It's also not like the new consoles will push native 4K into your face, while PC resolutions actually are capable of doing so - and the best cards doing that have capped out at 11GB for two gens in a row now.

The discussion between 6 and 8GB is really a non-discussion. When it's 3 or 4 versus 6, absolutely correct: you want 6 even if it costs more. But if your resolution is 1080p, 6GB will last you the lifetime of the card in a 2060's case - or a 1060's, or even a 980 Ti's - a card that is STILL relevant with 6GB, whereas a Fury X is not, despite similar release quarters.

Throughout this whole topic, multiple people have stressed that exact difference in multiple ways. If you want to believe 8GB is necessary, then by all means believe it, but between the two cards on show here it's a complete and utter non-issue. The 5700/5700 XT is more obsolete than a 2060 just for lacking any RT capability, especially because the very console gen that might utilize 8GB at some point... will utilize it along with RT, which the 8GB card doesn't have.

Show me the logic in that madness, because I don't see it - realistic AMD marketing would tell you that along with >4GB you would need an RT-capable card to be any sort of future-proof to begin with... but guess what... And on top of that, let's not even mention the overall DirectX 12 feature-level capability gap. Navi is simply lacking features all over the place, and those gaps influence longevity in a much bigger way than a few percent of raw performance.

You might be surprised to find that game developers optimize around the mainstream spec, not the high-end spec. If the vast majority of cards have 4-6GB, you will find drivers and engines optimized around that number - which is what has actually happened so far. 8GB is not becoming the norm within the next few years. With the decent sales of 2060s and 1660s, and 1060s before them, you can rest assured the norm will linger around 6GB for quite some time, and if you have to cut back, it won't be at 1080p but at substantially higher resolutions and with bigger feature sets (such as RT). That is exactly why a 3GB card is generally no longer recommended and a 4GB one is sub-optimal unless it's bottom mid-range.

Also of note is that consoles generally utilize more VRAM because they can offload resource requirements that way. PC games borrow that tech, but they don't really need it, as multiple levels of volatile storage are available, usually in bigger capacities and with much lower latencies. It was only by the END of the PS4/X1 era that 6GB became mainstream - as in, last year - but the consoles had access to it for many years. The new gen will bring some parity in that sense, because the SSD is going to become a big thing as it gets used as a form of system RAM, albeit a very slow one, a bit like a RAM disk.
A simple switch in Ghost Recon Wildlands from high to ultra texture quality sends the VRAM usage at 1080p from 2.5GB to over 4GB. So game devs use options to let low- and mid-tier GPUs work with their games, and that will continue, but in the next generation of games made for next-gen consoles and PCs this setting will go from under 6GB to under 8GB of VRAM usage, and whoever doesn't have that amount of VRAM will be forced to live with washed-out textures even at 1080p. Just check what VRAM size the upcoming consoles will have and you'll understand what I'm saying. It is very common for game devs to raise their baseline requirements once the consoles allow them to do so.
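For a sense of why one texture-quality notch can swing VRAM usage by gigabytes, here is a small worked example in Python; the compression rate, mip overhead and texture count are generic assumptions, not Wildlands' actual asset data.

```python
# Why one texture-quality notch can cost gigabytes: doubling texture
# resolution quadruples memory per texture. Assumes a BC7-style format at
# 1 byte/texel and a full mip chain adding ~1/3; the counts are illustrative.

def texture_mib(width: int, height: int, bytes_per_texel: float = 1.0) -> float:
    """Approximate size of one mipmapped texture, in MiB."""
    return width * height * bytes_per_texel * 4 / 3 / 2**20

per_2k = texture_mib(2048, 2048)      # ~5.3 MiB each at "high"
per_4k = texture_mib(4096, 4096)      # ~21.3 MiB each at "ultra"

n_materials = 300                     # assumed unique textures resident at once
print(f"high  (2K): {n_materials * per_2k / 1024:.1f} GiB")   # ~1.6 GiB
print(f"ultra (4K): {n_materials * per_4k / 1024:.1f} GiB")   # ~6.2 GiB
```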
 
Last edited:
Joined: Sep 17, 2014
Messages: 20,952 (5.97/day)
Location: The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
A simple switch in Ghost Recon Wildlands from high to ultra texture quality sends the VRAM usage at 1080p from 2.5GB to over 4GB. So game devs use options to let low- and mid-tier GPUs work with their games, and that will continue, but in the next generation of games made for next-gen consoles and PCs this setting will go from under 6GB to under 8GB of VRAM usage, and whoever doesn't have that amount of VRAM will be forced to live with washed-out textures even at 1080p. Just check what VRAM size the upcoming consoles will have and you'll understand what I'm saying. It is very common for game devs to raise their baseline requirements once the consoles allow them to do so.

We will see, but your response misses a lot of the considerations I mentioned above regardless. Time will tell. The market is changing in terms of game software: it is becoming more unified, and with Sony taking a stake in Epic you can rest assured they will aim for parity between platforms, because it's simply cheaper to do so and your target market becomes a whole lot bigger. Money always talks.

But most of all, you missed the RT capability gap between a 5700 and a Turing card. And your Ghost Recon example underlines my point: if you can achieve 1080p ultra with all those effects in that game (which I know is easy), and the game looks pretty marvelous even today, also on the texture front, with 4GB, why would 6 not be sufficient going forward? The differentiator, even if you include the switch to DX12 with a higher VRAM requirement, is going to be RT.

TL;DR: focusing on the VRAM capacity difference between these cards is the wrong focus, because it won't make or break your experience in any way over the card's lifetime.

Neither of these cards is going to last that long. Each lacks one of the most important features of the upcoming consoles, which will have both at least 8GB VRAM and hardware-accelerated ray-tracing. Going forward we are going to see game developers targeting their engines at these consoles, which means we can expect some newer games to require these features.

That said, I don't expect that the number of such games will be high. If you target 8GB VRAM, you exclude every GPU that has less memory, and there are still a lot of those around (and being manufactured and sold right now). If you target RT, you exclude every GPU that can't do RT, which is currently every GPU except the RTX Turing cards. If you go for both options, you exclude every GPU except the 2060S, 2070/S, 2080/S, and 2080 Ti.
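Here is a hedged little Python sketch of that exclusion math, using a hand-picked sample of cards relevant to this thread rather than any real hardware database:

```python
# Which cards survive each requirement target? Sample data only, hand-picked
# for the cards discussed in this thread; not a complete hardware database.
GPUS = {
    # name: (VRAM in GB, hardware ray tracing support)
    "GTX 1060 6GB": (6, False),
    "RTX 2060":     (6, True),
    "RTX 2060S":    (8, True),
    "RX 5700":      (8, False),
    "RX 5700 XT":   (8, False),
}

def meets(vram_gb: int, has_rt: bool, need_vram_gb: int, need_rt: bool) -> bool:
    """True if a card satisfies a (VRAM, RT) requirement pair."""
    return vram_gb >= need_vram_gb and (has_rt or not need_rt)

for label, need_vram, need_rt in [("8GB only", 8, False),
                                  ("RT only", 0, True),
                                  ("8GB + RT", 8, True)]:
    passing = [name for name, (v, rt) in GPUS.items()
               if meets(v, rt, need_vram, need_rt)]
    print(f"{label}: {passing}")
```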

So... at the end of the day, it's up to your own personal crystal ball. If you believe 8GB VRAM will be more important going forward, choose the 5700. If you believe RT will be more important, choose the 2060.

My personal recommendation: don't buy anything now. Wait until Ampere launches and see how much faster it is in RT, and whether that performance will make it a better long-term buy. Wait until RDNA2 launches, and see how fast or slow AMD's first-gen implementation is, which will dictate how likely game devs are to implement RT for console games and thus how important it will be going forward. As a bonus, when the new cards launch, the prices of the current-gen cards will drop.

If your friend must upgrade now, the 2060 Super is the slowest card that supports RT while also having 8GB of VRAM, so it ticks both boxes. Unfortunately it's going to be quite a bit more expensive, but that is the price you pay if you want to (try to) be future-proof.

Best advice in the whole topic IMO
 
Last edited:
Joined: Feb 5, 2017
Messages: 44 (0.02/day)
Location: North Carolina, USA
System Name ASUS Z97-PRO Gamer
Processor Intel Core i7-4790K (Delidded) with Rockit 88 100% Copper IHS
Motherboard ASUS Z97-PRO Gamer
Cooling Noctua NH-D15
Memory 32GB (8GB x4) GSkill TridentX DDR3 2400 (XMP Mode #1) Timings: 10-12-12-31
Video Card(s) EVGA 1070 FTW + ACX 3.0 (8GB GDDR5)
Storage 512GB Samsung 970 PRO (NVMe), 2TB WD Gold WD2005FBYZ-01YCBB1 SATA-3 Data drive
Display(s) LG 27UL500-W 27-Inch UHD (3840 x 2160) IPS Monitor with Radeon Freesync Technology and HDR10, White
Case Fractal Design Define R5 (Black, no Window)
Audio Device(s) Onboard Optical (TOSLINK) from MB to VIZIO soundbar
Power Supply EVGA Supernova G2 650
Mouse Logitech M215
Keyboard Logitech K330
Software Windows 10 Pro Secured by Emsisoft Internet Security & Malwarebytes Premium Version 3
Benchmark Scores http://www.userbenchmark.com/UserRun/2157372
8GB of VRAM is more useful over a four-year span. Clearly, history has shown this: HD 7970 vs GTX 680, for example.

I agree; my EVGA GTX 1070 FTW has been running strong since purchase. I never installed the optional cooling pads sent to me, rather I used the upgraded BIOS EVGA supplied to control heat. BTW, I'm not a gamer, but I love video at 4K (this card is rated for 8K!), so it's perfect for my needs. I run a small OC, nothing that would cause the card to overheat, plus I keep the card blown out with air whenever I have to go inside the PC. I haven't replaced the thermal paste, mainly because I don't see the need to: 40-50C is reasonable for browsing and watching YouTube, as well as Hulu movies and Pluto, among other TV apps, and watching via the browser. There are lots of 4K nature demo videos on YouTube, and if needed one can drop down to whatever their monitor/GPU will handle.


Until then, my EVGA GTX 770 Classified (4GB GDDR5), which I still own, was enough, so much so that I skipped the 970 release. Limited to 4K, it is still a decent card for that purpose in a secondary system. Given the controversy over the 970 cards (only 3.5GB of full-speed memory, not 4), I'm glad I passed it over; those cards weren't exactly cheap. Last year I purchased a second 770 Classified for $65 on eBay; the seller had the P/N wrong in the listing, but a pic of the 770 Classified was there. So, knowing I had both PayPal and eBay protections, I purchased it and saved $40-50.


8GB of GDDR5 has served me well since at least early 2016 and most likely will until an HDMI 2.1 and DisplayPort 2.0 card is released. I don't know why the only concerns are beefing up memory, adding bandwidth, and widening the bus without bringing the video standards up to date; it's kind of a half-baked approach. With many HDMI 2.1 TVs already on the market, such a GPU could drive 4K at 120 Hz over HDMI, something unheard of before this revision, so there's no reason not to include the standard. It would make 2K monitors obsolete; in fact, I have seen only one TV of that standard at Walmart, of all places. Every 4K TV LG has released this year has HDMI 2.1, and soon the value brands such as VIZIO will follow. In fact, they already are!
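For context on why 4K at 120 Hz specifically needs HDMI 2.1, here is a back-of-the-envelope calculation in Python; it ignores blanking intervals, link encoding and DSC, so treat the numbers as rough lower bounds.

```python
# Rough link-rate math: raw uncompressed RGB pixel data only, ignoring
# blanking, encoding overhead and DSC, so these are approximate lower bounds.

def video_gbps(width: int, height: int, hz: int, bits_per_channel: int = 8) -> float:
    """Raw RGB pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_channel * 3 / 1e9

print(f"4K @  60 Hz, 8-bit: {video_gbps(3840, 2160, 60):.1f} Gbit/s")   # ~11.9
print(f"4K @ 120 Hz, 8-bit: {video_gbps(3840, 2160, 120):.1f} Gbit/s")  # ~23.9
# HDMI 2.0 tops out at 18 Gbit/s on the wire (about 14.4 Gbit/s of video
# data), while HDMI 2.1 raises the ceiling to 48 Gbit/s - hence 4K120 needs 2.1.
```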


If EVGA could release a firmware update to support 4K at 120 Hz on the GTX 1070 FTW, I would stick with it; it has run butter smooth since purchase. I get the latest drivers from here, which aren't shown on the NVIDIA site, and install only standard drivers; I don't care for DCH. Temps rarely surpass 60C, and that's only if many web pages are open while watching videos (or a 4K demo). I had the chance a few years back, when GPUs were in shortage, to accept $1,100 for the card and declined; no regrets. Why purchase an expensive card which doesn't meet current video specs, or support eARC, the latest audio spec, which is 48x faster than HDMI ARC?

Cat
 

Attachments

  • Capture (GPU Z (1).PNG
  • Capture (GPU Z (2).PNG
  • Capture (GPU Z (3).PNG
  • Capture (Speccy Graphics Only).PNG