
NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
The majority buys AAA games like Origins, Odyssey, Wolfenstein 2, DOOM, Witcher 3 and others just fine...

What I am doing as an enthusiast is enabling what Ultra used to mean 15-20 years ago. :) Nothing more, just a return to how things used to be. I do not think people have to do that since I think the "Ultra or nothing" mentality in PC Gaming is insane.

Those requirements do not reflect reality, so the reason they are there is not objectively sound, and it has not been for the past decade. Show me at least 2-3 times in the last 5 years when they meant something and I will give you some props. As it is, they are often just lies.
There are AAA titles that can bring a GPU down...

Good for you! I am glad you dig down and edit ini files. Just saying most people don't.

How about you show me where it doesn't give a general reference point to start from (minimum). I'm not the one pushing the rock uphill and in need of support for my assertions. The recs are a GUIDELINE, not a rule...but you are just saying they are lies, LOL.
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
There are plenty of AAA titles that can bring a GPU to its knees.

Good for you! I am glad you dig down and edit ini files. Just saying most people don't.

How about you show me where it doesn't give a general reference point to start from (minimum). I'm not the one pushing the rock uphill and in need of support for my assertions. The recs are a GUIDELINE, not a rule.

And yet they sell by the millions and average gamers play them. Hell, my poor countrymen play these games on their low-end PCs without issues...

I know most people don't dig through the ini files, but as long as you agree that true Ultra is there, 'tis fine. We agree.
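For anyone wondering what "digging through the ini files" actually involves: in The Witcher 3 it means editing user.settings in your Documents folder and raising values past what the menu's Ultra preset caps. A hypothetical snippet, with the values and some of the names illustrative rather than guaranteed for your game version, so back the file up first:

[Foliage]
GrassDensity=6000              ; the menu's Ultra preset stops well below this
FoliageDistanceScale=3.0       ; draw grass and trees farther out
[Rendering]
CascadeShadowmapSize=4096      ; sharper shadow maps than the stock preset

Two minutes in a text editor, no real modding.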

If you define things super generally, I guess you can stretch everything enough to win. My old ATI 5770 finished Witcher 3 at Low settings, 30 fps locked, 900p, just fine. It was under the minimum requirements, had locked fps, and was not even using the lowest possible settings (the resolution wasn't as low as it would go). The i5 750 didn't bottleneck it. So obviously this page is a lie to me:
https://www.systemrequirementslab.com/cyri/requirements/the-witcher-3-wild-hunt/12446

Read the OpenGL part here:
https://www.hardocp.com/article/2014/05/21/wolfenstein_new_order_performance_review

Hell, for a recent example see this:

The R9 380 is much slower than the R9 290... and yet it runs well. We can do this for almost all games, ultimately, but what matters is that the requirements are made up. I can give better requirements than the developers, and that is sad.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
And yet they sell by the millions and average gamers play them. Hell, my poor countrymen play these games on their low-end PCs without issues...
because they meet the minimum requirements, or chances are the gaming experience is poor (not as the dev wants it).

I know most people don't dig through the ini files, but as long as you agree that true Ultra is there, 'tis fine. We agree.
With respect, I honestly don't care. This has nothing to do with anything here, really.

Your examples seem like exactly what I have explained, do they not? I also don't believe there is a standard for minimums/recs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here... not less. I wouldn't call having to lower the resolution to make 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
because they meet the minimum requirements, or chances are the gaming experience is poor (not as the dev wants it).

With respect, I honestly don't care. This has nothing to do with anything here, really.

Your examples seem like exactly what I have explained, do they not? I also don't believe there is a standard for minimums/recs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here... not less. I wouldn't call having to lower the resolution to make 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)

I actually am shocked to see someone talking about system requirements in late 2018. I thought people stopped looking at these years ago and would never have guessed enthusiasts use them. No offence, this is just a shock to me, as I seriously have not seen this happen in a very long time (years). And no ;p, most don't meet the requirements.

I understand your argument about the ini file manipulation, but do know that when you say you run X game on Ultra I will say that you do not actually run it at Ultra.

So your idea of minimum is literally twice the minimum playable fps that the majority of gamers tolerate at a resolution that is much higher than the minimum supported by the actual application? Your idea of minimum is literally my idea of recommended requirements...

Words have meanings in languages. Minimum should be minimum, not some standard above what the majority of console gamers (which is like half of gaming) can achieve. And JayZ's R9 380 was doing a LOT better than that. An R9 290 can literally almost max the game at 60+ fps at 1080p. What kind of minimum is that???

I would prefer actual rules with meaning over things I can toss aside and laugh at, with API lies (as proven by The New Order) tossed in. Solid, dependable and correct rules with clear meanings and definitions. That is something I think all humans love.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I will say that you do not actually run it at Ultra.
Ultra is what the devs say it is through their preset. Anything else is adding on top. Ultra is what they say it is. Just because you go to fuel cutoff and past redline.... :p

your idea of minimum is literally twice the minimum playable fps that the majority of gamers tolerate at a resolution that is much higher than the minimum supported by the actual application? Your idea of minimum is literally my idea of recommended requirements...
Maybe? I just know that, generally, many wouldn't consider 30 fps an enjoyable gaming experience on PC. I'm part of that group.
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Ultra is what the devs say it is through their preset. Anything else is adding on top. Ultra is what they say it is. Just because you go to fuel cutoff and past redline.... :p

Maybe? I just know that, generally, many wouldn't consider 30 fps an enjoyable gaming experience on PC. I'm part of that group.


So Nightmare settings in DOOM do not exist :p? I mean, I am all for authorial intent, but there is an argument to be made for Death of the Author, especially when, a decade ago, things were more logical and Ultra really meant "as high as it will go before breaking". That makes sense.

Many and majority are not synonymous. I don't consider 30 fps playable either, but beggars can't be choosers, and if it's good enough for some rich Americans on their consoles, it's good enough for me.
 
Joined
Oct 29, 2018
Messages
127 (0.06/day)
C'mon guys, anything below 30 fps is a disaster, anything below 20 fps unplayable. We are talking about a MIDRANGE GPU in 2019 here, not low-end stuff like the 560/1050, but one with performance somewhere between the 1070 and 1070 Ti, given the same number of CUDA cores as the 1070 and slightly better IPC. So a 1440p/60+ fps capable GPU. Pairing it with 3 GB of VRAM is a sin.
 
Joined
Jul 10, 2015
Messages
839 (0.26/day)
Location
Romania
System Name Comet Lake
Processor Intel® Core™ i5-10600K CPU @ 5.0GHz
Motherboard MSI MPG Z490 GAMING PLUS
Cooling Arctic Freezer 34 eSports Duo
Memory CORSAIR LPX 32GB DDR4 3200 CL16 B-die
Video Card(s) GeForce® RTX 3060Ti™
Storage Samsung 970 Evo Plus M2 1TB & Seagate 2TB ST2000DM008-2UB102
Display(s) Dell 24 Gaming Monitor - S2422HG 165 Hz Curved
Case AQIRYS Arcturus
Audio Device(s) Realtek® ALC1200 Codec | Logitech Z533
Power Supply RMx White Series™ RM750x
Mouse Genesis Krypton 770
Keyboard Logitech G910
VR HMD N/A
Software W10 Pro x64
Benchmark Scores XXX
3 GB or 3.5 GB of VRAM, so much debate about it )))) For 1080p medium to high it is OK.

Let's hope and expect that it will come at a competitive price for us. AMD has been rebranding with the crappy 590, and the Vega 56 and 64 are a joke, sadly.
 

bug

Joined
May 22, 2015
Messages
13,163 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Do not look at minimum and recommended system requirements. They are made up and make no sense at all. I do not even look at them these days and have not for over a decade.

The difference between Medium and Ultra settings... is overrated most of the time. Plus, most of the time, Ultra is not even true Ultra these days. Witcher 3's settings menu does not compare to what even a pleb like me can do in the ini file in 2 minutes without any real modding. That is real Ultra settings :).

If you can afford an RX 560/GTX 1050 then you are already quite a bit above the base PS4 and Xbox One. The PS4 Pro is about matched by the 1050 Ti, and the Xbox One X always loses in games vs the GTX 1060 6 GB.

I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.
You do, however, possess an uncanny ability to mix together all sorts of things barely related to the subject.

The simple truth is that games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and on who coded it and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I have met people who delayed playing a game because they were planning an upgrade and thought they'd make the best of it.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
because they meet the minimum requirements, or chances are the gaming experience is poor (not as the dev wants it).

With respect, I honestly don't care. This has nothing to do with anything here, really.

Your examples seem like exactly what I have explained, do they not? I also don't believe there is a standard for minimums/recs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here... not less. I wouldn't call having to lower the resolution to make 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)

They don't quite think 30FPS is unplayable; they think that if they say 30FPS is unplayable, they'll look cool.
More than 120 million people in the world are currently using consoles and enjoying their games at 30FPS.
Don't get me wrong, 30FPS is not ideal; it's far less enjoyable than 60FPS and up, but it's not unplayable by any means.
I'll proudly choose that "unplayable" 30FPS Red Dead Redemption 2 over 95 percent of the games at 60FPS+.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Many and majority are not synonymous. I don't consider 30 fps playable either, but beggars can't be choosers, and if it's good enough for some rich Americans on their consoles, it's good enough for me.
Oh, I'd bet good money a majority wouldn't consider 30 fps to be playable in most genres/titles. RTS I can do... FPS... I'd cry and likely get a headache...

Playable and enjoyable I'm kind of using interchangeably. I mean... 15 fps is playable. The game plays... but as for the experience and it being enjoyable, the majority tend to agree 30 fps isn't for PCs.

But... this is all a bit OT. I'd love a thread to get down to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
I don't trust a single word from people who say that what they bought is the best.
So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..
And also, 3 GB is NOT enough.
Sure it is when the settings are properly configured.
You are biased as hell.
The word you're looking for is experienced. I own a PC shop and we build every kind of system for every budget, from bleeding-edge gaming to economy-minded gaming. From Ryzen or Intel to Radeon and GeForce. A 2060 3GB will be bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.
Of course you have to defend that you have been milked in a crazy way.
Interesting perspective. Another is that I'm sharing actual experience and that it is positive despite the price increase.
Those who try to defend this piece of crap can't understand our concern is not its performance
While I haven't used a 2060 yet, I have built systems with 2080 Ti's, 2080's (and own one), 2070's, Vega 64's, Vega 56's, RX 580's and so on. Each has its pros and cons. Calling one or the other "crap" is so baselessly nonobjective as to effectively sound like drivel. The rest of your points only matter to people looking for reasons to whine and nitpick.
This is the worst price-performance GPU family ever.
That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of its GTX 10XX/Titan counterparts, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait an extra bit of time to save money for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Note: I didn't read the comments, skimmed the article, and didn't see a reference to tensor cores. BUUUUTTTT, if this has any number of tensor cores, I want one. It will never see a 3D scenario, so I don't care how it performs in games.

But... this is all a bit OT. I'd love a thread to get down to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).

Doesn't it have to do with the fact that they (movies) are a constant value?

I own a PC show

Really, what channel and time slot? :laugh:
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
I'd have to imagine you'd be in a minority saying AA isn't needed at 1080p.
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p. At 1080p, 1440p and up, pixel density mostly eliminates the need for AA, as the "pixel laddering" effect isn't as noticeable or pronounced as it was at lower resolutions, so AA simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.
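To put numbers on the pixel-density point, here's a quick back-of-the-envelope check; a Python sketch, with the panel sizes as illustrative examples only:

import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1080, 24)))  # ~92 PPI: 24" 1080p
print(round(ppi(2560, 1440, 27)))  # ~109 PPI: 27" 1440p
print(round(ppi(3840, 2160, 27)))  # ~163 PPI: 27" 4K

The denser the pixels at the same viewing distance, the less visible the stair-stepping that AA exists to hide.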

Really, what channel and time slot? :laugh:
LOL! typo corrected..
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p.


I would assume most people who turn AA down on Steam are playing Insurgency or CS:GO.

EDIT:

For me, I don't turn anything down as long as I am over 60 fps (75 now, while I have this monitor). I can't stand no AA. It looks like crap. Although, I still need to try rendering above my native resolution and downscaling to see if the visual vs. perf tradeoff is better/worse.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p. At 1080p, 1440p and up, pixel density mostly eliminates the need for AA, as the "pixel laddering" effect isn't as noticeable or pronounced as it was at lower resolutions, so AA simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.


LOL! typo corrected..
links plz... :)

I recall that poll and walked away with a somewhat different takeaway.

Yeah... let's be clear here. Nobody said max AA... but you said "AA off" and that "it wasn't needed at 1080p and higher". We concluded from that poll that an overwhelming majority used AA, be it maxed or somewhere in between. A single-digit % turned it off, while 47% said it depends on game settings, which could be on or off.

But the reality for most was that they use it when they can.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.
That would be a big assumption. I personally doubt it, but who knows..
Although, I still need to try rendering above my native resolution and downscaling to see if the visual vs. perf tradeoff is better/worse.
Oh turn that on and leave it on. Then turn your AA down or off. You'll like the steady framerates much better.
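For context on what "rendering above native" costs, the pixel math is simple; a Python sketch (the scale factors are illustrative):

def render_cost_multiplier(scale_per_axis):
    # Supersampling scales both axes, so the pixel count grows with the square
    return scale_per_axis ** 2

# 1.5x per axis at 1920x1080 renders 2880x1620, i.e. 2.25x the pixels,
# so expect very roughly 2.25x the GPU load before any AA savings.
print(render_cost_multiplier(1.5))  # 2.25
print(render_cost_multiplier(2.0))  # 4.0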
links plz... :)
https://store.steampowered.com/stats/
https://www.techpowerup.com/forums/...ings-for-running-games-and-benchmarks.250063/
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
See edit above for your poll.

I'm mobile and can't dig down on the Steam stats link... feeling saucy enough to post an image of it?
 
Joined
Oct 13, 2016
Messages
165 (0.06/day)
Location
Nea Makri, Greece.
System Name Dark Sith Lord / Fat Boy
Processor AMD Ryzen7 2700X / AMD FX 8350
Motherboard Asus Rog Strix B450-F Gaming / Gigabyte 990XA UD3 R5
Cooling Coolermaster Hyper 212 Turbo LED (2x120mm in push-pull) / Coolermaster HyperX (2x120mm in push-pull)
Memory 2x8gb G.Skill TridentZ RGB DDR4 3200MHz / 4x4gb Kingston HyperX DDR3 1866MHz
Video Card(s) Gigabyte GTX 1060 Windforce OC 6gb / Gainward GTX 950 2gb
Storage Crucial P1 NVMe 1tb, 2x Barracuda 2TB, 1x IronWolf NAS 2TB / Samsung 840 Evo 500gb + Samsung F1 1tb
Display(s) BenQ GL2450 24'' / Samsung Syncmaster 20'' 2043nw
Case CM HAF 932 (CM 200mm [Red Led] front & side intake, Shark Evil Black 140mm exhaust) / NZXT Trinity
Audio Device(s) Realtek ALC1220 on Creative Inspire S2 2.1 / Realtek 889 on Creative SBS Vivid 60 2.0
Power Supply EVGA Supernova B2 750w / Thermaltake Toughpower 700w
Mouse Steelseries Rival 110 / Sharkoon Light² 100
Keyboard Bloody B975 (LK brown optical switches) / Philips SPK84 (blue switches) with silicone o'rings mod
Software Win10 Pro x64 / Win7 Ultimate x64
So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..

Sure it is when the settings are properly configured.

The word you're looking for is experienced. I own a PC shop and we build every kind of system for every budget, from bleeding-edge gaming to economy-minded gaming. From Ryzen or Intel to Radeon and GeForce. A 2060 3GB will be bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.

So now, mentioning our past tech experience gives credibility to what we're saying? That's flawed logic, not mine.
Should I start mentioning how many systems I have built in the 25 years I have worked as an IT specialist? Should I? Really?
I don't trust people who buy something which obviously has some flaws and then defend it like there is no tomorrow. That translates a bit as "butthurt" to me.

3 GB of VRAM is NOT enough for MANY games. Not all, but many (AAA titles mostly). I can name a few; I already DID. I have said that 2 times already. Those are the games we're buying GPUs for, not lightweight games. You don't agree with that?

Do you think what I am saying is coming out of my a**? You think I haven't run benchmarks myself to see what's what? Of course, with the "right settings" VRAM usage could go below 3 GB, but that's not the point, because for the games I am speaking of, going below 3 GB of VRAM usage usually also means going to medium/low settings. GTA V, for example, takes about 3.5 GB for medium/high settings at 1080p, NOT even Ultra. Now, if you start editing .ini files, then we're talking about a very customized experience, and that's not normal for the average user. And still, that method cannot change the performance hits or gains beyond the capabilities of the GPU. It is just a more customized method, and you can sacrifice quality for speed (and vice versa) exactly as you want, because sometimes the in-game settings do not satisfy all tastes. That's all. But the performance of a GPU with 3 GB of VRAM will be what it is.

You are saying all that about your past experience, yet you are ready to defend a NEW GPU for 2019 with 3 GB of VRAM. And you call me ignorant!!!

P.S. And please don't start with NVIDIA's "magic" compression.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
So now, mentioning our past tech experience gives credibility to what we're saying? That's flawed logic, not mine.
Should I start mentioning how many systems I have built in the 25 years I have worked as an IT specialist? Should I? Really?
No, this isn't a contest.
Do you think what i am saying is coming out of my a**?
Yes? Mostly because you're not taking into account most real-world usage scenarios. For 1080p, a 2060 with 3GB will work and perform very well in most games out today. Very few AAA titles cannot be made to run well on such a card. How do I know this, you ask? Because it can be done with a 1060 with 3GB. A 2060 is a better-performing card, so, naturally, what a 1060 can do, a 2060 will do better. Simple deductive reasoning is all that is required to arrive at that conclusion.
for medium/high settings at 1080p, NOT even Ultra.
Most people tinker with their settings, so the "medium/high" argument is irrelevant as it ends up being customized.
And you call me ignorant!!!!!!!!!!
Though you took it out of context, that's what I said. And your statements above continue to lend merit to that conclusion.
 
Joined
Oct 13, 2016
Messages
165 (0.06/day)
Location
Nea Makri, Greece.
System Name Dark Sith Lord / Fat Boy
Processor AMD Ryzen7 2700X / AMD FX 8350
Motherboard Asus Rog Strix B450-F Gaming / Gigabyte 990XA UD3 R5
Cooling Coolermaster Hyper 212 Turbo LED (2x120mm in push-pull) / Coolermaster HyperX (2x120mm in push-pull)
Memory 2x8gb G.Skill TridentZ RGB DDR4 3200MHz / 4x4gb Kingston HyperX DDR3 1866MHz
Video Card(s) Gigabyte GTX 1060 Windforce OC 6gb / Gainward GTX 950 2gb
Storage Crucial P1 NVMe 1tb, 2x Barracuda 2TB, 1x IronWolf NAS 2TB / Samsung 840 Evo 500gb + Samsung F1 1tb
Display(s) BenQ GL2450 24'' / Samsung Syncmaster 20'' 2043nw
Case CM HAF 932 (CM 200mm [Red Led] front & side intake, Shark Evil Black 140mm exhaust) / NZXT Trinity
Audio Device(s) Realtek ALC1220 on Creative Inspire S2 2.1 / Realtek 889 on Creative SBS Vivid 60 2.0
Power Supply EVGA Supernova B2 750w / Thermaltake Toughpower 700w
Mouse Steelseries Rival 110 / Sharkoon Light² 100
Keyboard Bloody B975 (LK brown optical switches) / Philips SPK84 (blue switches) with silicone o'rings mod
Software Win10 Pro x64 / Win7 Ultimate x64
No, this isn't a contest.
Then why did you start it?

Yes? Mostly because you're not taking into account most real-world usage scenario's. For 1080p a 2060 with 3GB will work and perform very well in most games out today.
Yes, and also a 950 with 2 GB of VRAM will work and perform "well" in most games. But that's NOT the point. We're NOT talking about cards from 2016 (that's how old the 1060 is, in case you don't remember). And also, the VRAM requirements of games keep getting higher. You just cannot accept it. And I can tell you games that look like shit with custom settings for less than 3 GB of VRAM usage. Like Far Cry 4 or Shadow of Mordor.

Very few AAA titles can not be made to run well on such as card. How do I know this you ask? Because it can be done with 1060 with 3GB. A 2060 is a better performing card so, naturally, what a 1060 can do a 2060 will do better. Simply deductive reasoning is all that is required to arrive at that conclusion.
On the contrary, a faster core with memory as fast as a 2016 card's (I am referring to the GDDR5 versions) might be starved of data sooner and suffer from memory-bus bottlenecks, especially in VRAM-hungry games! But obviously you know how the card will perform without even seeing it first.
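For what it's worth, the bandwidth side of that question is simple arithmetic; a Python sketch, where the bus widths and data rates are illustrative rather than confirmed specs for any 2060 variant:

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width in bits x per-pin data rate in Gb/s / 8 bits per byte
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(192, 8))   # 192.0 GB/s, e.g. a 192-bit bus with 8 Gb/s GDDR5
print(bandwidth_gb_s(192, 14))  # 336.0 GB/s, the same bus with 14 Gb/s GDDR6

The same core with a narrower bus or slower memory gets fed proportionally less data per second, which is the bottleneck being described.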

Most people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.
Is that even an argument? Obviously most people customise their settings. So? We need points of origin in order to discuss this; otherwise we can say "yeah, you can customize X game to run with even 2 GB of VRAM usage". That's not the point! You're avoiding the point systematically.

Edit: Oh, and one more thing, by the way. My English is not great, but as far as I remember, calling someone "biased" is not an insult, but calling someone "ignorant" actually is. Especially in the way you've used it.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
Then why did you start it?
:wtf::kookoo:
Yes, and also a 950 with 2 GB of VRAM will work and perform "well" in most games.
No, it wouldn't.
But thats NOT the point.
Sure it is. Gaming performance is exactly the point here.
We're NOT talking about cards from 2016 (that's how old the 1060 is, in case you don't remember).
Oh gee wiz, thanks for reminding me...:rolleyes:
And I can tell you games that look like shit with custom settings for less than 3 GB of VRAM usage.
That's your opinion and a completely subjective one. You're welcome to it.
But obviously you know how the card will perform without even seeing it first.
Sure can, here's my premise for logic: I had a 1080 and upgraded to a 2080. The performance jump was significant. I have a 1070 in one of my other PCs, and it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do, a 2060 will do much better. I don't need to see it to be able to accurately conclude the general performance of such a card.

but as far as I remember, calling someone "biased" is not an insult
Depends on how you use it, but I digress..
 
Joined
Oct 13, 2016
Messages
165 (0.06/day)
Location
Nea Makri, Greece.
System Name Dark Sith Lord / Fat Boy
Processor AMD Ryzen7 2700X / AMD FX 8350
Motherboard Asus Rog Strix B450-F Gaming / Gigabyte 990XA UD3 R5
Cooling Coolermaster Hyper 212 Turbo LED (2x120mm in push-pull) / Coolermaster HyperX (2x120mm in push-pull)
Memory 2x8gb G.Skill TridentZ RGB DDR4 3200MHz / 4x4gb Kingston HyperX DDR3 1866MHz
Video Card(s) Gigabyte GTX 1060 Windforce OC 6gb / Gainward GTX 950 2gb
Storage Crucial P1 NVMe 1tb, 2x Barracuda 2TB, 1x IronWolf NAS 2TB / Samsung 840 Evo 500gb + Samsung F1 1tb
Display(s) BenQ GL2450 24'' / Samsung Syncmaster 20'' 2043nw
Case CM HAF 932 (CM 200mm [Red Led] front & side intake, Shark Evil Black 140mm exhaust) / NZXT Trinity
Audio Device(s) Realtek ALC1220 on Creative Inspire S2 2.1 / Realtek 889 on Creative SBS Vivid 60 2.0
Power Supply EVGA Supernova B2 750w / Thermaltake Toughpower 700w
Mouse Steelseries Rival 110 / Sharkoon Light² 100
Keyboard Bloody B975 (LK brown optical switches) / Philips SPK84 (blue switches) with silicone o'rings mod
Software Win10 Pro x64 / Win7 Ultimate x64
:wtf::kookoo:

No, it wouldn't.
Yes it would.


Sure it is. Gaming performance is exactly the point here.
We were not talking about performance in general; we were talking about whether 3 GB of VRAM is enough or not.


Oh gee wiz, thanks for reminding me...:rolleyes:
When arguments end, irony starts.


That's your opinion and a completely subjective one. You're welcome to it.
And so is yours. Stop presenting your subjective opinion as fact.


Sure can, here's my premise for logic: I had a 1080 and upgraded to a 2080. The performance jump was significant. I have a 1070 in one of my other PCs, and it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do, a 2060 will do much better. I don't need to see it to be able to accurately conclude the general performance of such a card.
So from the point of discussion, which was "3 GB of VRAM is not enough", you've reached the conclusion that "the next-gen card will be faster than the previous gen". No shit, Sherlock!? What a great discovery you've made. Of course it will be!

But I am speaking about a specific area of performance. How do you know, for example, whether the bus size for the GDDR5 models is the same or not, and therefore whether the bandwidth is similar to the previous gen or not? How do you know the GDDR5 models will not take a performance hit because of bandwidth? How do you know, especially, whether the 3 GB VRAM models will have enough data feeding the GPU? Your deductive logic is just as flawed as the rest of your arguments.
This is exactly what I mean: you are avoiding the point systematically.
 