Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core configurations. The company plans to double down - or should we say, triple down - on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes: 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types, the newer GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
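The arithmetic behind those counts can be sketched in a few lines of Python (the variable names are ours, purely for illustration):

```python
from itertools import product

# The two parameters known from the leak, plus the two ASIC classes
# of the "TU106" silicon mentioned above.
memory_sizes = ["3 GB", "4 GB", "6 GB"]
memory_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]

variants = list(product(memory_sizes, memory_types))
device_ids = list(product(memory_sizes, memory_types, asic_classes))

print(len(variants))    # 6 RTX 2060 variants
print(len(device_ids))  # up to 12 device IDs
```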
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#126
CandymanGR
lexluthermiester, post: 3966271, member: 134537
@CandymanGR
You're nitpicking and no longer offering merit-based arguments. At this point it's obviously about ego for you, so I'm out.
I quote specific parts, as it is part of the forum rules. I cannot quote a whole paragraph each time. I am not nitpicking; I am trying to prove a point here.
Posted on Reply
#127
lexluthermiester
EarthDog, post: 3966235, member: 79836
I'm mobile and can't dig down on the Steam stats link... feeling saucy enough to post an image of it?
I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT:
There is still this:
https://store.steampowered.com/hwsurvey
Posted on Reply
#128
GoldenX
So, 192-bit for the 3 and 6GB variants, that's fine.
And 128-bit for the 4GB one? That's some nice downgrade in performance.
Posted on Reply
#129
lexluthermiester
GoldenX, post: 3966279, member: 160319
So, 192-bit for the 3 and 6GB variants, that's fine.
And 128-bit for the 4GB one? That's some nice downgrade in performance.
We don't have those specs yet. I hope not. Maybe the 4GB 128-bit will be the GDDR6 variant? That would even out the performance.
Posted on Reply
#130
GoldenX
lexluthermiester, post: 3966281, member: 134537
We don't have those specs yet. I hope not. Maybe the 4GB 128-bit will be the GDDR6 variant? That would even out the performance.
Looking at that Gigabyte chart, there will be both GDDR6 and GDDR5 4GB variants. So, a GDDR5 128-bit RTX, not nice.
Posted on Reply
#131
lexluthermiester
GoldenX, post: 3966283, member: 160319
Looking at that Gigabyte chart, there will be both GDDR6 and GDDR5 4GB variants. So, a GDDR5 128-bit RTX, not nice.
Would have to agree, unless the mem clocks are really high to make up for the difference.
Posted on Reply
#132
CandymanGR
lexluthermiester, post: 3966284, member: 134537
Would have to agree, unless the mem clocks are really high to make up for the difference.
I don't think the extra clock can cover a 1/3 bandwidth cut.
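For a rough sense of the numbers being argued here: peak memory bandwidth is bus width (in bytes) times the per-pin data rate. A minimal sketch, assuming typical data rates of the era (8 Gbps for GDDR5, 14 Gbps for GDDR6) since the actual clocks of these cards are not yet known:

```python
def bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed data rates: 8 Gbps GDDR5, 14 Gbps GDDR6.
print(bandwidth_gbps(192, 8))   # 192.0 GB/s - GDDR5 on a 192-bit bus
print(bandwidth_gbps(128, 8))   # 128.0 GB/s - GDDR5 on a 128-bit bus (the 1/3 cut)
print(bandwidth_gbps(128, 14))  # 224.0 GB/s - GDDR6 on a 128-bit bus
```

Under those assumed rates, GDDR5 on 128-bit does lose a third of the bandwidth, while GDDR6 on 128-bit would more than make up the difference, as speculated above.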
Posted on Reply
#133
lexluthermiester
CandymanGR, post: 3966285, member: 167652
I don't think the extra clock can cover a 1/3 bandwidth cut.
It's been done on previous-gen cards from both AMD and NVIDIA. We'll see what happens once the review/information embargoes are lifted.
Posted on Reply
#134
Xzibit
lexluthermiester, post: 3966224, member: 134537
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p. The pixel density at 1080p, 1440p and up mostly eliminates the need for AA, as the "pixel laddering" effect isn't noticeable or pronounced like it was at lower resolutions, and AA simply isn't needed. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.
Still can't find it. As for pixel density, the survey doesn't specify what size monitor people use either.
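Xzibit's point here can be made concrete: pixel density depends on panel size as well as resolution, so a resolution stat alone doesn't tell you the PPI. A quick sketch (panel sizes chosen purely for illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 1080p resolution yields very different densities:
print(round(ppi(1920, 1080, 24)))  # ~92 PPI on a 24" panel
print(round(ppi(1920, 1080, 32)))  # ~69 PPI on a 32" panel
print(round(ppi(2560, 1440, 27)))  # ~109 PPI on a 27" panel
```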

lexluthermiester, post: 3966278, member: 134537
I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT:
There is still this:
https://store.steampowered.com/hwsurvey
They never have shown AA use in those surveys.
Posted on Reply
#135
EarthDog
lexluthermiester, post: 3966278, member: 134537
I know they've had those stats available. I just can't find it. Maybe they took it down?
EDIT:
There is still this:
https://store.steampowered.com/hwsurvey
I don't ever recall seeing AA stats there. I can't even fathom how they would keep track of something so variable anyway.

Thanks though. :)
Posted on Reply
#136
FordGT90Concept
"I go fast!1!11!1!"
The 6 GiB (11.16%) variant is obviously more popular than the 3 GiB (6.53%) and 5 GiB (1.86%) ones.
Posted on Reply
#137
Charcharo
bug, post: 3966216, member: 157434
You do, however, possess an uncanny ability to mix together all sorts of things barely related to the subject.

The simple truth is games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and who coded it and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I have met people who delayed playing a game because they were planning an upgrade and thought they'd make the best of it.
That you can't see the relation is not my problem. It is pretty obvious to me...

I have met people who won't play a game if they can't max it out, and also people who won't play a game because they will be getting an upgrade soon. I haven't met a person who unironically looks at system requirements since 2011.

EarthDog, post: 3966218, member: 79836
Oh, I'd bet good money a majority wouldn't consider 30 fps to be playable in most genres/titles. RTS, I can do... FPS... I'd cry and likely get a headache...

Playable and enjoyable I'm kind of using interchangeably. I mean... 15 is playable. The game plays... but as for the experience and it being enjoyable, the majority tend to agree 30 fps isn't for PCs.

But... this is all a bit OT. I'd love a thread to get to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).
Majority rule is not a valid argument. It never has been, but if you want to play with that - consoles as a whole (not individual platforms) are bigger than PC gaming. RTS is extremely demanding; what you are looking for seems to be turn-based strategy or tactics. Those you can tolerate at 30 fps.

PCs are not some ephemeral platform that is so different from consoles. The requirements in games do not mean much; that is the point. BTW, Wolfenstein: The New Order used to be playable (before a game update broke its wobbly engine) at 60 fps, 1080p on a 6850. Even using your wacky ideas for what system requirements should mean, this is illogical.


3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.
Posted on Reply
#138
Batailleuse
M2B, post: 3965809, member: 172252
You can easily run almost every single game on a 3GB card at 1080p just fine, but you're gonna lower the texture quality in some games and that's all. The 3GB version is not ideal but is not as useless as some people think; it actually might be a good value card for those needing a fairly powerful GPU who don't care about maximum-quality textures or VRAM-related settings.
If the 3GB RTX 2060 is going to be priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.
What's the point of buying a 3GB 2060 to run everything on low? You can buy a 4-6GB 1060, or even a 1070, that will do better for cheaper.
Posted on Reply
#139
bibob94
Well, all I can say is: this is bullshit.
Posted on Reply
#140
lexluthermiester
Xzibit, post: 3966306, member: 105152
They never have shown AA use in those surveys.
Yes they have. Those kinds of fine-grained details are available to the Steam client and get reported to the Steam telemetry service. The info is there and has been available; it just doesn't seem to be there anymore.
Posted on Reply
#141
Chloe Price
ShurikN, post: 3965783, member: 140585
3GB on a mid-range GPU in 2019... I'm at a loss for words.
Still kicking with a GTX 780 (I know, it was high-end back in the day).

BUT WHY?! Why can't there be just one or two models?
Posted on Reply
#142
Vayra86
This just goes to show that NVIDIA still cannibalizes the x50-x60 range in every possible way they can. They've been doing it since the 550 Ti and never stopped. I'm not that familiar with the generations prior when it comes to VRAM trickery, but maybe this was their thing for even longer.

Nothing new here, apart from the ever-increasing complexity of buying one of these GPUs. The takeaway is the same: when you're shopping in this territory, you will always be making tradeoffs from the moment you purchase. And those tradeoffs can range well into uncomfortable territory for even casual gamers. As for the rest of the four-page discussion about quality settings... not very relevant, I'd say.

For a meaty discussion on 3GB vs. more, we have live examples of the 780, 780 Ti and 1060 3GB already running into those limits at settings these GPUs can push comfortably. That says enough, because the 2060 will be faster on the core. The balance is meh, and 4GB and up is recommended. It's not even relevant what the specs on the game's box say in that regard; it's about how a GPU is balanced.

The key element is price, and knowing NVIDIA, they will price it far too high. So, TL;DR: these cards are going to be eclipsed by a more power-hungry AMD alternative. /thread, and Happy New Year :toast:
Posted on Reply
#143
goodeedidid
ShurikN, post: 3965783, member: 140585
3GB on a mid-range GPU in 2019... I'm at a loss for words.
Hey, they give you the choice; if you wanna go budget, then do it. Nobody is going to be rocking a 2060 with a 4K screen, so 3GB shouldn't be that bad - for FHD, that is.
Posted on Reply
#144
efikkan
Charcharo, post: 3966367, member: 162483
3GB is way too little. 4GB is pathetic too. It is not a good idea at all. VRAM matters, modding matters, the 1160/2060 is fast enough for it to matter even for people who misunderstand how VRAM works.
So how do you pick your threshold? Is it an arbitrary number? It's 2018, therefore cards need x amount of memory?
I base my conclusions on facts, and the fact is that the GTX 1060 3GB is still a good option for many buyers.

And as I've said many times already, just because some need more memory doesn't mean everyone needs it.
Modding and high-res texture packs are a niche thing - an edge case. If you're one of those who do it, buy a card with more memory; it's as simple as that.
Posted on Reply
#145
FordGT90Concept
"I go fast!1!11!1!"
4 is okay (though not so much in the RTX 2060's case because of the gimped bandwidth); >4 is preferred.
Posted on Reply
#146
efikkan
FordGT90Concept, post: 3966413, member: 60463
4 is okay (though not so much in the RTX 2060's case because of the gimped bandwidth); >4 is preferred.
Why is 4GB okay and 3GB not?
Different GPUs have different levels of compression and different ways of allocating and managing memory.
Posted on Reply
#147
Tsukiyomi91
I'd just rather take the full 6GB GDDR6 variant anyway, coz any smaller/slower memory type will tank the GPU a lot & more complaints will show up...
Posted on Reply
#148
Vayra86
efikkan, post: 3966414, member: 150226
Why is 4GB okay and 3GB not?
Different GPUs have different levels of compression and different ways of allocating and managing memory.
Call it the value of experience; lacking that, 3GB looks like a great choice for a mid-range GPU. Then you use one, learn it's not all as rosy as the reviews and benches told you, and make the better choice next time. Or you're thát casual a gamer that you never touch a game that hits a limit, or accidentally never use settings that hit one. In those edge cases, yes, 3GB is fine.

It's not like there aren't any alternatives at the same price point. Back when the 780 (Ti) and 7970 were hot, that was different. Today, it's silly to pick one. It's simply not the preferable choice, especially given the often tiny price gap to something better.
Posted on Reply
#149
efikkan
Vayra86, post: 3966416, member: 152404
Call it the value of experience; lacking that, 3GB looks like a great choice for a mid-range GPU. Then you use one, learn it's not all as rosy as the reviews and benches told you, and make the better choice next time. Or you're thát casual a gamer that you never touch a game that hits a limit, or accidentally never use settings that hit one. In those edge cases, yes, 3GB is fine.

It's not like there aren't any alternatives at the same price point. Back when the 780 (Ti) and 7970 were hot, that was different. Today, it's silly to pick one. It's simply not the preferable choice, especially given the often tiny price gap to something better.
In the real world, the GTX 1060 3GB works just fine as an entry mid-range card, and that's what experience tells us.

Comparing Kepler directly to Turing is not fair; newer GPU architectures are more efficient.
Posted on Reply
#150
HZCH
Confusing AF

What the hell were they thinking? And 3GB of VRAM for a mid-range card is an insult.

Hopefully for future buyers, it might be fake...
Posted on Reply