Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down - or should we say, triple down - on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing leaked GIGABYTE regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes: 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upwards of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so with six variants times two ASIC classes there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
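For those keeping score, the arithmetic behind those counts is simple combinatorics. Here is a quick illustrative sketch (the capacity and memory-type lists reflect the leak, not confirmed specifications):

```python
from itertools import product

memory_sizes = ["3 GB", "4 GB", "6 GB"]   # rumoured capacities
memory_types = ["GDDR5", "GDDR6"]         # rumoured memory standards
asic_classes = ["A", "non-A"]             # existing TU106 ASIC binning

variants = list(product(memory_sizes, memory_types))
device_ids = list(product(memory_sizes, memory_types, asic_classes))

print(len(variants))    # 6 RTX 2060 variants
print(len(device_ids))  # up to 12 distinct device IDs
```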
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#101
ASOT
3 gigs or 3.5 gigs of VRAM, so much debate about it )))) For 1080p at medium-to-high settings it's OK.

Let's hope and expect that it will come at a competitive price for us. AMD has just been rebranding with the crappy 590, and sadly the Vega 56 and 64 are a joke.
Posted on Reply
#102
bug
CharcharoDo not look at minimum and recommended system requirements. They are made up and make no sense at all. I do not even watch them these days and have not for over a decade.

The difference between Medium and Ultra settings... is overrated most of the time. Plus, most of the time, Ultra settings is not even true Ultra these days. Witcher 3's settings menu does not compare to what even a pleb like me can do in the ini file in 2 minutes without any real modding. That is real Ultra settings :) .

If you can afford an RX 560/ GTX 1050 then you are already quite a bit above the PS4 and Xbox One basic. PS4 Pro is about matched by 1050 Ti and Xbox One X always loses in games vs the GTX 1060 6 GB.

I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.
You do, however, possess an uncanny ability to mix together all sorts of things barely related to the subject.

The simple truth is that games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and who coded it and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I have met people who delayed playing a game because they were planning an upgrade and thought they'd make the best of it.
Posted on Reply
#103
M2B
EarthDogbecause they meet the minimum requirements or chances are the gaming experience is poor (not as the dev wants it).

With respect, I honestly don't care. This has nothing to do with anything here, really.

Your examples seem like exactly what I have explained, are they not? I also don't believe there is a standard for minimums/recommendations set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here... not less. I wouldn't call having to lower the resolution to hit 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is that they are a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)
They don't really think 30FPS is unplayable; they think that if they say 30FPS is unplayable, they'll look cool.
More than 120 million people in the world are currently using consoles and enjoying their games at 30FPS.
Don't get me wrong, 30FPS is not ideal, it's far less enjoyable than 60FPS/60FPS+ but it's not unplayable by any means.
I'll proudly choose that "unplayable" 30FPS Red Dead Redemption 2 over 95 percent of the games at 60FPS+.
Posted on Reply
#104
EarthDog
CharcharoMany and majority are not synonymous. I don't consider 30 fps playable either, but beggars can't be choosers, and if it's good enough for some rich Americans on their consoles, it's good enough for me.
Oh, I'd bet good money a majority wouldn't consider 30 fps to be playable in most genres/titles. RTS, I can do... FPS...I'd cry and likely get a headache...

I'm kind of using playable and enjoyable interchangeably. I mean... 15 fps is playable. The game plays... but as for the experience and it being enjoyable, the majority tend to agree 30 fps isn't for PCs.

But... this is all a bit OT. I'd love a thread to get down to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).
Posted on Reply
#105
lexluthermiester
CandymanGRI don't trust a single word from people who say that what they bought is the best.
So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..
CandymanGRAnd also, 3gb are NOT enough.
Sure they are when the settings are properly configured.
CandymanGRYou are biased as hell.
The word you're looking for is experienced. I own a PC shop and we build every kind of system for every budget, from bleeding-edge gaming to economy-minded gaming. From Ryzen or Intel to Radeon and GeForce. A 2060 3 GB will be bare minimum but still doable as a gaming card, just like the Radeon 4 GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.
B-RealOf course you have to defend that you have been milked in a crazy way.
Interesting perspective. Another is that I'm sharing actual experience and that it is positive despite the price increase.
B-RealThose who try to defend this piece of crap can't understand our concern is not its performance
While I haven't used a 2060 yet, I have built systems with 2080 Tis, 2080s (and own one), 2070s, Vega 64s, Vega 56s, RX 580s and so on. Each has its pros and cons. Calling one or the other "crap" is so baselessly non-objective as to effectively sound like drivel. The rest of your points only matter to people looking for reasons to whine and nitpick.
B-RealThis is the worst price-performance GPU family ever.
That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of its GTX 10xx/Titan counterparts, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait a bit longer to save money for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.
Posted on Reply
#106
moproblems99
Note: I didn't read the comments, only skimmed the article, and didn't see a reference to tensor cores. BUUUUTTTT, if this has any number of tensor cores, I want one. It will never see a 3D scenario, so I don't care how it performs in games.
EarthDogBut....this is all a bit OT. I'd love a thread to get down to the bottom.of why 30 fps seems different on a console versus a PC (I know why movies can get away it).
Doesn't it have to do with the fact they are a constant value (movies)?
lexluthermiesterI own a PC show
Really, what channel and time slot? :laugh:
Posted on Reply
#107
lexluthermiester
EarthDogI'd have to imagine you'd be in a minority saying AA isn't needed at 1080p.
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there are Steam's own stats that show most people turn AA down or off, most of them running at 1080p. The pixel density at 1080p, 1440p and up mostly eliminates the need for AA, as the "pixel laddering" effect isn't as noticeable or pronounced as it was at lower resolutions. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.
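To put rough numbers on it, pixel density is just the diagonal pixel count divided by the panel's diagonal size; here's a quick sketch (the monitor sizes are just illustrative assumptions):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92 PPI on a 24" 1080p panel
print(round(ppi(2560, 1440, 27)))  # ~109 PPI on a 27" 1440p panel
print(round(ppi(3840, 2160, 27)))  # ~163 PPI on a 27" 4K panel
```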
moproblems99Really, what channel and time slot? :laugh:
LOL! typo corrected..
Posted on Reply
#108
moproblems99
lexluthermiesterNot based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p.
I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.

EDIT:

For me, I don't turn anything down as long as I am over 60 fps (75 now while I have this monitor). I can't stand no AA. It looks like crap. Although, I still need to try rendering above my native resolution and downscaling to see if the visual vs. perf trade-off is better or worse.
Posted on Reply
#109
EarthDog
lexluthermiesterNot based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p. Pixel density 1080p, 1440p and up mostly eliminates the need for AA as the "pixel laddering" effect of the past isn't noticeable or pronounced like it was in the past with lower resolutions and simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.


LOL! typo corrected..
links plz... :)

I recall that poll and walked away with a bit different of a meaning.

Yeah... let's be clear here. Nobody said max AA... but you said "AA off" and that "it wasn't needed at 1080p and higher". We concluded from that poll that an overwhelming majority used AA, be it maxed or somewhere in between. A single-digit % turned it off, while 47% said it depends on the game/settings, which could mean on or off.

But the reality was for most that they use it when they can.
Posted on Reply
#110
lexluthermiester
moproblems99I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.
That would be a big assumption. I personally doubt it, but who knows..
moproblems99Although, still need to try rendering above my native and downscaling to see if visual vs perf is better/worse.
Oh turn that on and leave it on. Then turn your AA down or off. You'll like the steady framerates much better.
EarthDoglinks plz... :)
store.steampowered.com/stats/
www.techpowerup.com/forums/threads/what-are-your-gpu-settings-for-running-games-and-benchmarks.250063/
Posted on Reply
#111
EarthDog
See edit above for your poll.

I'm mobile and can't dig down on the Steam stats link... feeling saucy enough to post an image of it?
Posted on Reply
#112
CandymanGR
lexluthermiesterSo you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..

Sure it is when the settings are properly configured.

The word your looking for is experienced. I own a PC shop and we build every kind of system for every budget from bleeding edge gaming to economy minded gaming. From Ryzen or Intel to Radeon and Geforce. A 2060 3GB will be bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.
So now mentioning our past tech experience gives credibility to what we're saying? That's flawed logic, not mine.
Should I start mentioning how many systems I have built in the 25 years I've worked as an IT specialist? Should I? Really?
I don't trust people who buy something that obviously has some flaws and then defend it like there's no tomorrow. That translates a bit as "butthurt" to me.

3 GB of VRAM is NOT enough for MANY games. Not all, but many (AAA titles mostly). I can name a few, I already DID. I've said that two times already. Those are the games we're buying GPUs for, not lightweight games. Don't you agree with that?

Do you think what I am saying is coming out of my a**? You think I haven't run benchmarks myself to see what's what? Of course, with the "right settings" the VRAM requirement could go below 3 GB, but that's not the point, because for the games I'm talking about, going below 3 GB of VRAM usage usually also means going to medium/low settings. GTA V, for example, takes about 3.5 GB at medium/high settings at 1080p, NOT even Ultra. Now, if you start editing .ini files, then we're talking about a very customized experience, and that's not normal for the average user. And that method still cannot change the performance hits or gains beyond the capabilities of the GPU. It is just a more customized method, and you can sacrifice quality for speed (and vice versa) exactly as you want, because sometimes the in-game settings do not satisfy all tastes. That's all. But the performance of a GPU with 3 GB of VRAM will be what it is.

You are saying all that about your past experience, yet you are ready to defend a NEW GPU for 2019 with 3 GB of VRAM. And you call me ignorant!!!

P.S. And please don't start with NVIDIA's "magic" compression.
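If anyone wants to verify those VRAM numbers on their own system, a rough sketch of one way to do it is to poll nvidia-smi while the game runs (the one-second interval here is just an arbitrary choice):

```python
import subprocess
import time

# Poll dedicated VRAM usage once a second via nvidia-smi (requires the NVIDIA driver).
while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode()
    # First line corresponds to GPU 0 on a single-GPU system.
    used_mb, total_mb = (int(v) for v in out.splitlines()[0].split(","))
    print(f"VRAM: {used_mb} / {total_mb} MiB")
    time.sleep(1)
```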
Posted on Reply
#113
lexluthermiester
CandymanGRSo now mentioning our tech past experience, gives credibility to what we're saying? That's flawed logic, not mine.
Should i start that mentioning how many systems i have build in the last 25 years i work as IT specialist? Should i? Really?
No, this isn't a contest.
CandymanGRDo you think what i am saying is coming out of my a**?
Yes? Mostly because you're not taking into account most real-world usage scenarios. For 1080p, a 2060 with 3 GB will work and perform very well in most games out today. Very few AAA titles cannot be made to run well on such a card. How do I know this, you ask? Because it can be done with a 1060 with 3 GB. A 2060 is a better-performing card, so, naturally, what a 1060 can do a 2060 will do better. Simple deductive reasoning is all that is required to arrive at that conclusion.
CandymanGRfor medium/high settings at 1080p, NOT even Ultra.
Most people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.
CandymanGRAnd you call me ignorant!!!!!!!!!!
Though you took it out of context, that's what I said. And your statements above continue to lend merit to that conclusion.
Posted on Reply
#114
CandymanGR
lexluthermiesterNo, this isn't a contest.
Then why did you start it?
lexluthermiesterYes? Mostly because you're not taking into account most real-world usage scenario's. For 1080p a 2060 with 3GB will work and perform very well in most games out today.
Yes, and a 950 with 2 GB of VRAM will also work and perform "well" in most games. But that's NOT the point. We're NOT talking about cards of 2016 (that's how old the 1060 is, in case you don't remember). Also, the VRAM requirements of games keep getting higher. You just cannot accept it. And I can tell you games that look like shit with custom settings for less than 3 GB of VRAM usage. Like Far Cry 4 or Shadow of Mordor.
lexluthermiesterVery few AAA titles can not be made to run well on such as card. How do I know this you ask? Because it can be done with 1060 with 3GB. A 2060 is a better performing card so, naturally, what a 1060 can do a 2060 will do better. Simply deductive reasoning is all that is required to arrive at that conclusion.
On the contrary, a faster core paired with memory as fast as a 2016 card's (I am referring to the GDDR5 versions) might be starved of data sooner and suffer from memory bus bottlenecks, especially in VRAM-hungry games! But obviously you know how the card will perform without even seeing it first.
lexluthermiesterMost people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.
Is that even an argument? Obviously most people customise their settings. So? We need a common point of reference in order to discuss this, otherwise we can say "yeah, you can customize game X to run with even 2 GB of VRAM usage". That's not the point! You're avoiding the point systematically.

Edit: Oh, and one more thing, by the way. My English is not great, but as far as I remember, calling someone "biased" is not an insult, but calling someone "ignorant" actually is. Especially in the way you've used it.
Posted on Reply
#115
lexluthermiester
CandymanGRThen why you've started it?
:wtf::kookoo:
CandymanGRYes, and also a 950 with 2gb vram will work and perform "well" in most games.
No, it wouldn't.
CandymanGRBut thats NOT the point.
Sure it is. Gaming performance is exactly the point here.
CandymanGRWe're NOT talking about cards of 2016 (thats how old 1060 is in case you dont remember).
Oh gee wiz, thanks for reminding me...:rolleyes:
CandymanGRAnd i can tell you games that look like shit with custom settings for less than 3gb vram usage.
That's your opinion and a completely subjective one. You're welcome to it.
CandymanGRBut obviously you know how the card will perform without even seeing it first.
Sure can, here's my premise for logic. I had a 1080 and upgraded to a 2080. The performance jump was significant. I have a 1070 in one of my other PCs, and it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do, a 2060 will do much better. I don't need to see it to be able to accurately conclude the general performance of such a card.
CandymanGRbut as far as i remember calling someone "biased" is not an insult
Depends on how you use it, but I digress..
Posted on Reply
#116
CandymanGR
lexluthermiester:wtf::kookoo:

lexluthermiesterNo, it wouldn't.
Yes it would.
lexluthermiesterSure it is. Gaming performance is exactly the point here.
We were not talking about performance in general, we were talking about whether 3 GB of VRAM is enough or not.
lexluthermiesterOh gee wiz, thanks for reminding me...:rolleyes:
When arguments end, irony starts.
lexluthermiesterThat's your opinion and a completely subjective one. You're welcome to it.
And yours also. Stop presenting your subjective opinion as fact.
lexluthermiesterSure can, here's my premise for logic. I had a 1080 and upgraded to a 2080. The performance jump was significant. I have a 1070 in one of my other PCs, and it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do, a 2060 will do much better. I don't need to see it to be able to accurately conclude the general performance of such a card.
So from the point of discussion, which was that 3 GB of VRAM is not enough, you've reached the conclusion that "the next-gen card will be faster than the previous gen". No shit, Sherlock!? What a great discovery you've made. Of course it will be!

But I'm talking about a specific area of performance. How do you know, for example, whether the bus width on the GDDR5 models is the same or not, and therefore whether the bandwidth is similar to the previous gen or not? How do you know the GDDR5 models won't take a performance hit because of bandwidth? How do you know, in particular, that the 3 GB models will have enough data feeding the GPU? Your deductive logic is as flawed as the rest of your arguments.
This is exactly what I mean: you are avoiding the point systematically.
Posted on Reply
#117
lexluthermiester
@CandymanGR
You're nitpicking and no longer offering merit based arguments. At this point it's obviously about ego for you, so I'm out.
Posted on Reply
#118
Xzibit
EarthDogSee edit above for your poll.

I'm mobile and cant dig down on the steam stats link... feeling saucy and post an image of it?
I was curious too as to why that reference was there and whether SHS added it. I did not find any.
Posted on Reply
#119
CandymanGR
lexluthermiester@CandymanGR
You're nitpicking and no longer offering merit based arguments. At this point it's obviously about ego for you, so I'm out.
I quote specific parts, as that is part of the rules of the forum. I cannot quote a whole paragraph every time. I am not nitpicking, I am trying to prove a point here.
Posted on Reply
#120
lexluthermiester
EarthDogI'm mobile and cant dig down on the steam stats link... feeling saucy and post an image of it?
I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT:
There are still these:
store.steampowered.com/hwsurvey
Posted on Reply
#121
GoldenX
So, 192-bit for the 3 GB and 6 GB variants, that's fine.
And 128-bit for the 4 GB one? That's some nice downgrade in performance.
Posted on Reply
#122
lexluthermiester
GoldenXSo, 192 bit for the 3 and 6GB variants, that's fine.
And 128 for the 4GB one? That's some nice downgrade in performance.
We don't have those specs yet. I hope not. Maybe the 4 GB 128-bit card will be the GDDR6 variant? That would even out the performance.
Posted on Reply
#123
GoldenX
lexluthermiesterWe don't have those specs yet. I hope not. Maybe the 4GB 128bit will be the GDDR6 variant? That would even out the performance..
Looking at that Gigabyte chart, there will be both GDDR6 and GDDR5 4 GB variants. So, a GDDR5 128-bit RTX; not nice.
Posted on Reply
#124
lexluthermiester
GoldenXLooking at that Gigabyte chart, there will be both GDDR6 and GDDR5 4GB variants. So, a GDDR5 128bit RTX, not nice.
Would have to agree unless the mem clocks are really high to make up for the difference.
Posted on Reply
#125
CandymanGR
lexluthermiesterWould have to agree unless the mem clocks are really high to make up for the difference.
I don't think the extra clock can cover a 1/3 bandwidth cut.
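Rough back-of-the-envelope numbers, assuming ~8 Gbps GDDR5 and ~14 Gbps GDDR6 (both of which are guesses until the specs are official):

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) / 8 * per-pin data rate (Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 8))   # 192.0 GB/s - 192-bit GDDR5 (GTX 1060-class)
print(bandwidth_gbs(128, 8))   # 128.0 GB/s - 128-bit GDDR5, a ~33% cut
print(bandwidth_gbs(128, 14))  # 224.0 GB/s - 128-bit GDDR6 would more than make up for it
```

A 128-bit GDDR5 card would need roughly 12 Gbps chips just to match a 192-bit card running at 8 Gbps, which plain GDDR5 doesn't realistically reach.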
Posted on Reply