Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for splitting its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down - or should we say, triple down - on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing leaked GIGABYTE regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB, and each comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE alone could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
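For the curious, the combinatorics work out as follows (a minimal sketch; the sizes and types come from the leak, while treating each variant-by-ASIC-class combination as a distinct device ID is our extrapolation):

```python
# Quick enumeration of the rumored RTX 2060 matrix. The sizes and types
# come from the leak; pairing every variant with both TU106 ASIC classes
# ("A" and "non-A") to get device IDs is an assumption, not confirmed.
from itertools import product

memory_sizes = ("3 GB", "4 GB", "6 GB")
memory_types = ("GDDR6", "GDDR5")
asic_classes = ("A", "non-A")

variants = list(product(memory_sizes, memory_types))
print(len(variants))                       # 3 x 2 = 6 memory variants
print(len(variants) * len(asic_classes))   # 6 x 2 = 12 possible device IDs
```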
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#151
Vayra86
efikkan said:
In the real world, the GTX 1060 3GB works just fine as an entry mid-range card, and that's what experience tells us.

Comparing Kepler directly to Turing is not fair; newer GPU architectures are more efficient.
You said it right: entry midrange. Bottom of the barrel. And releasing a similar 3GB "next gen" means it has dropped even lower than that.

If that is something you feel comfy spending ~200 bucks on, by all means. I'd suggest spending $220-240 to get double the VRAM and more consistency alongside higher performance. These 2060s are going to present a choice along those lines, and we all know they will, so let's stop fooling each other. This is a typical penny-wise/pound-foolish trade-off.
#152
M2B
efikkan said:
Why is 4GB okay and 3GB not?
Different GPUs have different levels of compression, and different ways of allocating and managing memory.
4GB lets you use Ultra/High-quality textures in most games, where Ultra textures are out of reach on a 3GB card in newer AAA titles. Sometimes you need to drop texture quality to medium to avoid stuttering and other problems on a 3GB card.
That extra 1GB of memory makes a noticeable difference.

"Different GPUs have different levels of compression"
That's mostly about memory bandwidth and not the frame buffer.
You can't use better textures on a 3GB card vs another 3GB card and say "my GPU has better memory compression".
In real-world scenarios, it doesn't work like that.
#153
FordGT90Concept
"I go fast!1!11!1!"
efikkan said:
In the real world, the GTX 1060 3GB works just fine as an entry mid-range card, and that's what experience tells us.

Comparing Kepler directly to Turing is not fair; newer GPU architectures are more efficient.
The 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you? Now we're almost 2.5 years later. I wouldn't be surprised if it shifts to 4:1 or even 8:1 with the RTX 2060.

Why? I already explained it: the 32-bit barrier is gone. Games aren't treading lightly with their memory footprints anymore. Xbox One X ports have access to 11 GiB of RAM + VRAM. As the next generation of consoles launches, it's going to go even higher. 3 GiB is like a person with a broken leg: they'll live, but not well. Games want more memory; run out, and the framerate tanks.
#154
ShurikN
Buying 3GB in 2019 is buying for yesterday, not tomorrow. There is zero future-proofing in it, which would be fine if this were a $120 card; in reality it'll be around $300.
Some games will work fine, but most visually high-end titles will struggle.
Battlefield 5 is already unplayable at 1080p with less than 4GB VRAM.
#155
CandymanGR
Yes, but people here still try to persuade the rest of us that "3 GB is just fine," like we haven't seen for ourselves how far that is from the truth.
But none of these people would actually buy such a card. It's funny: people with 2080s telling other members that "3 GB is fine."
But hey... they are "experienced"; they know it all, and the rest of us are ignorant.

P.S. Yes, I am being sarcastic.
#156
Tatty_One
Senior Moder@tor
FordGT90Concept said:
The 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you? Now we're almost 2.5 years later. I wouldn't be surprised if it shifts to 4:1 or even 8:1 with the RTX 2060.

Why? I already explained it: the 32-bit barrier is gone. Games aren't treading lightly with their memory footprints anymore. Xbox One X ports have access to 11 GiB of RAM + VRAM. As the next generation of consoles launches, it's going to go even higher. 3 GiB is like a person with a broken leg: they'll live, but not well. Games want more memory; run out, and the framerate tanks.
To be fair, that was not just down to the increased memory but possibly also the 128 extra shader units the 6 GB version had; it was simply a faster card even when not using the additional memory, for some $40 or $50 more. The combination of both just made so much more sense for those who could stretch to the additional cost.
#157
EarthDog
Tatty_One said:
To be fair, that was not just down to the increased memory but possibly also the 128 extra shader units the 6 GB version had; it was simply a faster card even when not using the additional memory, for some $40 or $50 more. The combination of both just made so much more sense for those who could stretch to the additional cost.
Indeed... if anyone knew about it. That wasn't exactly advertised much; prospective buyers would have to know, or look up and compare specs. Most people looking at these see "6GB is more than 3GB" and think it's better. You give consumers too much credit. :p
#158
bug
This is off topic (if this thread ever had one), but I was just looking at some ads for laptops. The GPU model wasn't even mentioned; it was just "Nvidia graphics with 4GB of RAM". A bit scary, imho.
#159
phill
Has Nvidia gone mad?? So much effort for a middle-of-the-road card. I just don't get it...
#160
micropage7
Sometimes I feel like they're poisoning the market. I mean, they create a small gap to push consumers into adding a little more money to get the higher model.
#161
bug
micropage7 said:
Sometimes I feel like they're poisoning the market. I mean, they create a small gap to push consumers into adding a little more money to get the higher model.
Yeah, I hate them for that, too. I mean, before Nvidia, this tactic was almost unheard of. Oh, wait!
#162
lynx29
Can't wait for 7 nm AMD GPUs to make Nvidia cry. Sick of this crap.
#163
Vayra86
micropage7 said:
Sometimes I feel like they're poisoning the market. I mean, they create a small gap to push consumers into adding a little more money to get the higher model.
You can go to a random bar in town and find the same trickery in a simple price list of booze, or even a supermarket, where all that matters is where the product is positioned - at eye level or far below.
#164
efikkan
FordGT90Concept said:
The 6 GiB version of the GTX 1060 outsold the 3 GiB version 2:1. What does that tell you?
That means Nvidia sold more GTX 1060 3GB cards than AMD sold of the RX 400/500 series combined (according to the Steam survey), so that tells me more people want this card than Polaris.

And don't forget the 6 GB version of the GTX 1060 is about 5-6% faster, too.

FordGT90Concept said:
Why? I already explained it: the 32-bit barrier is gone.
Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.

ShurikN said:
Buying 3GB in 2019 is buying for yesterday, not tomorrow. There is zero future-proofing in it, which would be fine if this were a $120 card; in reality it'll be around $300.

Some games will work fine, but most visually high-end titles will struggle.

Battlefield 5 is already unplayable at 1080p with less than 4GB VRAM.
Ah, the eternal "future-proofing" argument.
I remember all those who bought GCN over Kepler because it was more "future-proof" in Direct3D 12. Then the R9 390(X) with 8 GB for "future-proofing". And then Fiji with HBM for "future-proofing", but then suddenly memory capacity didn't matter anymore, because HBM was so glorious. Then Polaris with 8 GB for "future-proofing", because memory capacity suddenly mattered again.

In the real world it's a balancing act: you have to guess your requirements for the immediate future, but taking "future-proofing" too far is wasted money in the end. History has shown that paying extra for a lot of "future-proofing" has never paid off.

lynx29 said:
Can't wait for 7 nm AMD GPUs to make Nvidia cry. Sick of this crap.
Then prepare yourself for disappointment.
#165
lynx29
efikkan said:
Then prepare yourself for disappointment.
Sure thing, buddy.
#166
Casecutter
Just asking: with a 192-bit bus, how would they access 4 GB with only three 64-bit channels?
#167
efikkan
Casecutter said:
Just asking: with a 192-bit bus, how would they access 4 GB with only three 64-bit channels?
Two options that I know of:
- Disable one memory controller and use a 128-bit bus, possibly compensating with faster memory.
- Use an imbalanced memory configuration, like the GTX 660/660 Ti.
#168
yakk
LOL at 3GB...

*Looks through closet for old 3GB 7970... yup... launched in 2012...*

I don't get it, but OK... guess the 2060 is the "Sucker's Edition"...
#169
bug
lynx29 said:
Can't wait for 7 nm AMD GPUs to make Nvidia cry. Sick of this crap.
Yeah, people have been waiting for that since AMD's nomenclature used four digits.

Edit: Oops, I forgot about the 285.
#170
Vayra86
efikkan said:
That means Nvidia sold more GTX 1060 3GB cards than AMD sold of the RX 400/500 series combined (according to the Steam survey), so that tells me more people want this card than Polaris.

And don't forget the 6 GB version of the GTX 1060 is about 5-6% faster, too.

Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.

Ah, the eternal "future-proofing" argument.
I remember all those who bought GCN over Kepler because it was more "future-proof" in Direct3D 12. Then the R9 390(X) with 8 GB for "future-proofing". And then Fiji with HBM for "future-proofing", but then suddenly memory capacity didn't matter anymore, because HBM was so glorious. Then Polaris with 8 GB for "future-proofing", because memory capacity suddenly mattered again.

In the real world it's a balancing act: you have to guess your requirements for the immediate future, but taking "future-proofing" too far is wasted money in the end. History has shown that paying extra for a lot of "future-proofing" has never paid off.

Then prepare yourself for disappointment.
You can believe whatever you want to believe, and if you think 3GB GPUs are the shit in 2019, more power to you. The sales records show a different picture, with higher-capacity cards vastly outselling 3GB models even in the midrange. Compare that to the Kepler days, when standard high-end VRAM was 2GB. Nobody in their right mind bought the 4GB 670 or 680, simply because games could barely allocate more than 1.5GB. Today they allocate 6GB and up without issues. Game development has changed a bit. There is a reason 4GB 970s were released when Maxwell popped up (a third more than the similar-core-power GTX 780 (Ti)), and that amount doubled for the next-gen equivalent, the GTX 1070. It's clear as day that the balance has completely shifted towards the console norm in terms of VRAM. There is a reason even the midrange RX 480 comes in 8GB flavors too.

Explain this: how does a 5-6% performance gap translate to half, or a third, less VRAM? Where is the balance in that? And why would you *not* suffer a performance hit from such a cut-down card when you push data over the same, rather narrow bus?

Common sense - use it, instead of gazing endlessly at performance summaries that reduce all detail to a single percentage and rarely base it on a fully comprehensive benchmark suite. Reviews are an indicator, not an absolute, all-encompassing truth. People apparently still didn't get that memo. It's the exception that proves the rule when it comes to VRAM, and you only need one edge case to kill the experience.
#171
Gasaraki
M2B said:
You can easily run almost every single game on a 3GB card at 1080p just fine; you'll just have to lower the texture quality in some games, and that's all. The 3GB version is not ideal, but it's not as useless as some people think. It might actually be a good-value card for those who need a fairly powerful GPU and don't care about maximum-quality textures or VRAM-related settings.
If the 3GB RTX 2060 is priced under $200, I don't see a problem, but anything higher than $200 is unacceptable.
8GB of VRAM on an $800 card (RTX 2080) is more disappointing to me than 3GB on a budget card.
OMG, I NEEDS 12GB of RAM!!!

The truth is that if most people game at 1080p (which they do), 3GB of VRAM should be enough for the vast majority of games. I game at 3440x1440 and most games don't break 4GB of VRAM.
#172
Tatty_One
Senior Moder@tor
Gasaraki said:
OMG, I NEEDS 12GB of RAM!!!

The truth is that if most people game at 1080p (which they do), 3GB of VRAM should be enough for the vast majority of games. I game at 3440x1440 and most games don't break 4GB of VRAM.
I struggle to comprehend what games you are playing, then. I moved from a 4GB 290X to a 1070 a few months back. I only play one game, World of Tanks, and when they completely updated their game engine, the 290X on ultra settings moved overnight from an average of 2.9 GB usage to 4.4 GB, and that game is hardly demanding, even on ultra at my 2560x1080.
#173
cyneater
I love how everyone is triggered by the 3, 4, and 6 GB variants.

What about the 11 GB flagship? It's the same as the last generation... :P And it's an odd number.
12 GB or 16 GB would be better. :P

Since there is no competition, Nvidia could stick their logo on a turd and market it.

Where is the blast processing?
#174
kanecvr
Nxodus said:
AMD = overheating, power-hungry, unstable, bad software, no innovation, BUT IT'S CHEAPER!!
NVIDIA = expensive

I really don't get the NVIDIA haters. AMD is pure shit; it's the Walmart variant of video cards.
You're kidding, right? The latest Nvidia software is garbage. AMD Adrenalin has better features, includes out-of-the-box OC and fan profiles, runs and feels smoother, is less taxing than GeForce Experience and launches faster, doesn't nag you to create an account to use some features, and has been bug-free for years - as opposed to Nvidia's drivers, which have had versions with critical bugs like "forgetting" to spin up your fans under load, mismanaging GPU voltage, failing to install over the stock signed Windows drivers, and so on.

And don't get me started on overheating. My 1080 FE would quickly hit 83°C and throttle down to 1300 MHz, causing it to perform WORSE than my old 1070. I took the FE card back and got an MSI card, which did pretty much the same thing. I had to buy and install a $100 cooler to get the card to stop throttling. Same for power usage - the 180 W TDP on the 1080 is pure fiction. Under load my 1080 draws 200-240 W on its own (tested with an ammeter on the second 12 V rail of the PSU, which only the video card uses: 20 amps x 12 V = 240 W). I thought the card draws 180 W if not allowed to boost, but at the stock 1530 MHz it draws almost 16 amps: 12 V x 16 A = 192 W. If you're referring to the 1060, then yeah - those are cool cards, 70-75°C even with cheap, crap coolers - and they are fast cards. They're OK for 1080p, but that's it. The 580 can do a lot better, especially overclocked versions like the XFX 580 GTS OC Black Edition (though that card does get pretty freaking hot).
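Those wattage figures are just the power identity P = V x I applied to the measured rail current - a minimal check, assuming a nominal 12 V rail:

```python
# P = V * I: reproducing the rail measurements quoted above.
# Assumes a nominal 12 V rail; real rails sag a little under load.
RAIL_VOLTAGE = 12.0

for amps, condition in ((16, "stock 1530 MHz"), (20, "boosting")):
    watts = RAIL_VOLTAGE * amps
    print(f"{amps} A ({condition}) -> {watts:.0f} W")  # 192 W and 240 W

print("rated TDP: 180 W")  # both measurements exceed the rating
```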

As for instability - you've never used an AMD card, right? I've had a 7950, then two, then a 280X, then bought a second one, and then a 290 (non-X) - they were all rock solid. I also played around with a Vega 64, and while the max FPS is not as high as on a 1080 (in some games), the minimum FPS and frame times are miles better on the Vega, so much so that most games are noticeably smoother, even though the framerate is a tiny bit lower. I've been trying to trade my 1080 for a Vega 64, but guess what - nobody wants to take the trade! The only reason I switched to Nvidia is the mining boom, which made AMD cards climb in price to a silly degree. A Vega 64 was twice the price I paid for my 1080, so I said screw that and bought what made sense at the time.

You are DEFINITELY RIGHT on the innovation part, though... AMD needs to get off their asses and release something competitive - not that 590 (i.e. overclocked 580 BULLSHIT). And that goes for both AMD and Nvidia fans. Left to their own devices, Nvidia will end up charging $5000 for a high-end GPU.

lexluthermiester said:
For 1080p screens(which most gamers are still using), 3GB is still reasonable.


That will depend on the settings level.
The 470 and 570 are great little cards, and so is the 580. I picked up a 4GB 580 in November for $150 for my living-room PC (i5 2500K @ 4 GHz, mATX form factor) and it runs 1080p @ ultra flawlessly. I even play some games in 4K (less demanding ones like Civ 6 and some oldies). For that price Nvidia was offering the 1050 Ti, which is significantly slower.
#175
CandymanGR
efikkan said:
Stop with this nonsense. 32-bit CPUs/OSes have NOTHING to do with memory capacity.
A 32-bit CPU cannot address a 64-bit memory space, and you need 64-bit addressing to access more than 4 gigabytes of memory from a single process.
Short of paging extensions like PAE (which still cap each process at a 4 GB virtual address space), memory above 4 GB is simply not addressable by a 32-bit x86 CPU.
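That ceiling is just the size of a 32-bit address space - a minimal arithmetic sketch, with the PAE caveat noted in the comments:

```python
# A 32-bit pointer can name 2^32 distinct bytes: the 4 GiB ceiling.
print(2**32)                     # 4294967296 bytes
print(2**32 // 2**30, "GiB")     # 4 GiB per 32-bit address space
# Caveat: x86 PAE widens *physical* addresses to 36 bits (64 GiB),
# but each 32-bit process still sees at most a 4 GiB virtual space.
print(2**36 // 2**30, "GiB")     # 64 GiB
```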


efikkan said:
Two options that I know of:
- Disable one memory controller and use a 128-bit bus, possibly compensating with faster memory.
- Use an imbalanced memory configuration, like the GTX 660/660 Ti.
There is NO way to cover a one-third cut to the memory bus with higher clocks, because GDDR5 has hard limits on the speeds it can reach - you would need memory roughly 50% faster to fully close the gap. That's why I think they are also using GDDR6 for the same model: it MATTERS for this generation, even in the mid-range. Probably the GDDR6 model will be much faster, maybe with a slightly different core config. I guess Nvidia knows GDDR5 is not enough for what the core needs, but they don't give a damn; milking the cow is the way for them.
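A rough sketch of that arithmetic - bandwidth scales with bus width times data rate; the 8 Gbps and 14 Gbps figures are typical GDDR5/GDDR6 data rates assumed for illustration, not confirmed RTX 2060 specs:

```python
# Peak bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

full = bandwidth_gbs(192, 8)   # 192 GB/s: 192-bit GDDR5 at 8 Gbps
cut  = bandwidth_gbs(128, 8)   # 128 GB/s: same memory on a 128-bit bus
print(full / cut - 1)          # 0.5 -> a 50% faster clock needed to match
print(bandwidth_gbs(128, 14))  # 224 GB/s: 128-bit GDDR6 at 14 Gbps recovers it
```

Since 12 Gbps is out of reach for GDDR5, a 128-bit card can only close that gap by switching to GDDR6, which is the point being made above.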

It's so funny seeing you guys trying to defend something that sucks so hard. Really, some people here should consider a new career in comedy (that was a joke).