Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core configurations. The company plans to double down, or should we say triple down, on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

At least two parameters differentiate the six variants (that we know of, anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB, and each of them comes in two memory types, the latest GDDR6 and the older GDDR5. Based on these six RTX 2060 variants, GIGABYTE alone could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core configurations also vary between the six variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#51
Vya Domus
FordGT90Concept said:
64-bit games (which most are these days) will use >4 GiB of VRAM if it is available.
Double precision is rarely, if ever, used for shading in games. Unless you meant something else? Also, just because your application is compiled for 64-bit machines doesn't mean there is a direct correlation between that and memory usage. You can have a program that uses a smidgen of 64-bit data types and yet uses just a couple of megabytes, or several gigabytes. It's about quantity, not just the data types used.

Recus said:
First AMD fanboys blaming Nvidia for holding AIBs on short leash. Then Nvidia let AIBs run loose... Still Nvidia's fault. :laugh:
You are completely out of touch with this subject.

Nxodus said:
AMD is pure shit
Nah, that'd be your comment.
Posted on Reply
#52
CandymanGR
What? No 5 GB version? 3-4-6... where is the 5? I guess they missed that one.

Nvidia has fucked it up, and no matter what fanboys tell you, the RTX launch was a fiasco.
And people who claim 3 GB is enough for 1080p gaming are twisting the truth. That is true only for esports games, which are designed to be lightweight. You CANNOT play a lot of modern AAA games, because they demand more than 3 GB of VRAM EVEN at 1080p/medium/high settings (GTA V, Far Cry 4, Shadow of Mordor, etc.). And this will happen more often as newer games come out. So let's cut the crap, shall we?

P.S. And before you call me an AMD fanboy, I'll tell you that of the more than 20 GPUs I've had, only two were AMD.
Posted on Reply
#53
Nxodus
Vya Domus said:
You are completely out of touch with the subject.

Nah, that'd be your comment.
Quality argument.

-Help! My game is crashing
-What's your setup?
-Cheap AMD with 9 billion GDDR
-OK, please wait two weeks until the devs fix the game so it runs properly on your crapstick
Posted on Reply
#54
lexluthermiester
CandymanGR said:
Nvidia has fucked it up, and not matter what fanboys are telling you, the RTX was a fiasco.
No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.
Posted on Reply
#55
Vya Domus
Your threadcrapping is getting annoying.
Posted on Reply
#56
CandymanGR
lexluthermiester said:
No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.
I don't trust a single word from people who say that what they bought is the best. And also, 3 GB is NOT enough. You are biased as hell.
Posted on Reply
#57
Vya Domus
The point isn't whether 3 GB is enough or not. It's just rather pathetic to ship a card that will likely cost at the very least $250+ with just 3 GB in 2019.
Posted on Reply
#58
CandymanGR
Vya Domus said:
The point isn't whether 3 GB is enough or not. It's just rather pathetic to ship a card that will likely cost at the very least $250+ with just 3 GB in 2019.
It is. But the practical side is also important. You cannot build a gaming GPU in 2019 with 3 GB of VRAM; it's stupid. But NVIDIA is milking the cow and doesn't care about anything else.
Posted on Reply
#59
Nxodus
Vya Domus said:
Your threadcrapping is getting annoying.
No, I'd say the NVIDIA bashing is getting out of hand. Also, I really don't understand the AMD love; it's just absurd. NVIDIA needs a proper competitor. AMD is dragging NVIDIA down.
Posted on Reply
#60
windwhirl
Wow... I like how Nvidia doesn't rely on market research and prefers to have a field day, poking the sleeping dragon /s
Posted on Reply
#61
Durvelle27
People saying 3 GB is enough: it is for some games, but that's not the case for all newer titles.

Plus, let's look at it like this:

The RTX 2070 is meant to replace the GTX 1070, which is an 8 GB GPU aimed at 1440p gaming with ultra details.

Would you really want to buy a 1070 replacement aimed at 1440p gaming with 3 GB of VRAM?

Also, consider the MSRPs of the current RTX lineup:

RTX 2080 Ti $999
RTX 2080 $699
RTX 2070 $499

NVIDIA will likely price the 2060 accordingly, around $299-$399. Are you really willing to pay that much for a 3 GB GPU that will become VRAM-limited before it even utilizes its full potential?
Posted on Reply
#62
Nxodus
Durvelle27 said:
around $299-$399. Are you really willing to pay that much for a 3 GB GPU that will become VRAM-limited before it even utilizes its full potential?
Can't you people see? NVIDIA is finally opening up to the cheapo market. The 3 GB ones will be much cheaper than $300; why do you assume they will charge $300 for them? Everyone knows RAM is expensive: less RAM, lower price.
Posted on Reply
#63
CandymanGR
Durvelle27 said:
People saying 3 GB is enough: it is for some games, but that's not the case for all newer titles.
I already mentioned that.
But you contradict yourself: in the first part you say 3 GB is enough for some games, then you say it WILL be a limiting factor. It's true that 3 GB isn't a limiting factor for esports titles, but someone who plays esports has NO reason to upgrade his GPU, since he can play just fine on the previous generation. We buy new GPUs to play new AAA titles with as much performance as possible, not to play LoL at 500 FPS.


Durvelle27 said:
NVIDIA will likely price the 2060 accordingly, around $299-$399. Are you really willing to pay that much for a 3 GB GPU that will become VRAM-limited before it even utilizes its full potential?
But... but... 3 GB of VRAM is ALREADY a limiting factor. Of course the card will never reach its full potential, because it is ALREADY limited by VRAM. What the...?
Posted on Reply
#64
B-Real
And some people were laughing at the RX 590. TBH, the RX 590 looks like a better product compared to this complete mess.

lexluthermiester said:
No it isn't. Whiners have made a bunch of needless noise. I have one. It rocks.
Of course you have to defend having been milked in a crazy way. Those who defend this piece of crap can't understand that our concern is not its performance. The BIGGEST PROBLEM is the smaller performance gain than the 700-to-900 series transition (that was around 40%, now it is 30%), and while the 900 series was $50-100 cheaper than its 700 series equivalents, the 20 series is $100-300 (in reality $130-500) more expensive than the 10 series... Moreover, the other great problem is that its most advertised feature is present in only one game in the cards' 3.5-month lifespan so far, one of the most optimized games/engines on the market, and even there a $1,100-1,200 GPU couldn't hold a fixed 60 FPS in FHD with RTX above the low setting. With the patch that lowered picture quality you can get that to a 77 FPS average, and only 45-55 average in 4K, and that was achieved after the developer announced in advance that they had to decrease the effect of RTX. Tomb Raider may hit the 30s in FHD with the 2080 Ti. The third game announced with RTX (FFXV) is now cancelled. This is the worst price/performance GPU family ever.

Nxodus said:
AMD = overheating, power hungry, unstable, bad software, no innovation BUT IT'S CHEAPER!!
NVIDIA = expensive

I really don't get the NVIDIA haters, AMD is pure shit, it's the Walmart variant of video cards
LOL. Butthurt guy.

Nxodus said:
Quality argument.

-Help! My game is crashing
-What's your setup?
-Cheap AMD with 9 billion GDDR
-OK, please wait two weeks until the devs fix the game so it runs properly on your crapstick
You know which company released a WHQL-certified driver that crashed an AAA title? It was NV, with Watch Dogs 2. Laughing at you so loud right now.
Posted on Reply
#65
FordGT90Concept
"I go fast!1!11!1!"
efikkan said:
They do, but this is still irrelevant.
Computation requires VRAM too.

Vya Domus said:
Double precision is rarely, if ever, used for shading in games. Unless you meant something else? Also, just because your application is compiled for 64-bit machines doesn't mean there is a direct correlation between that and memory usage. You can have a program that uses a smidgen of 64-bit data types and yet uses just a couple of megabytes, or several gigabytes. It's about quantity, not just the data types used.
Virtual memory address space.
Posted on Reply
#66
TheGuruStud
I see a paid shill showed up in short order to protect NVIDIA. C'mon, NVIDIA, you've got to hire Clinton levels of shills to have a chance (note: not a political remark, literally a reference to scale).
Posted on Reply
#67
Aquinus
Resident Wat-man
I guess nVidia's stock hasn't hit rock bottom yet.
Posted on Reply
#68
ArbitraryAffection
efikkan said:
I think one RTX 2060 is plenty to fill the market. But comparing GPUs and setting prices based on memory size is pure BS. 3GB is fine for an entry mid-range card as long as proper benchmarking shows it's fine.


RTX 2060 will perform far beyond RX 570, a low-end card. Brag all you want about 8GB, there is no way that card needs it.


Memory usage doesn't mean memory requirement; many applications allocate more memory than they need. What matters is performance, or the lack thereof. Stuttering might be an indicator of too little memory.
Wow, I'm reading a lot of hostility in your post; there's no need for that. And actually, there are situations where the 570 could use over 4 GB, so the 8 GB can make sense, especially if I want to run high-resolution textures. That's something a 3 GB 2060 will not be able to do, and hilariously it may even perform worse than the 570 in that situation.

The fact is, AMD is offering more value in the mid-range. And the 570 isn't low-end, it's lower-mid-range. :P But feel free to defend getting less hardware for more money. That seems to be basically the whole idea at Intel/NVIDIA these days.
Posted on Reply
#69
Vya Domus
FordGT90Concept said:

Virtual memory address space.
That's on the OS side of things and, again, has nothing to do with how much VRAM an application will use.
Posted on Reply
#70
EarthDog
Wow... just wow. Can't say I like this many variants... holy cow.

lexluthermiester said:
Not really. I have a 2080 and still turn off AA. No real need for it at 1080p and above. Most gamers prefer framerate VS AA.
I'd have to imagine you'd be in the minority saying AA isn't needed at 1080p. Even in FPS games I notice AA being off at that res. If I wanted to play Minecraft, I would. :p

4K is the only place I can disable it and barely notice.

Now, if it's an FPS issue, of course, shut it down... but at 1080p it's a big detriment to IQ.
Posted on Reply
#71
efikkan
ArbitraryAffection said:
And actually, there are situations where the 570 could use over 4 GB, so the 8 GB can make sense, especially if I want to run high-resolution textures. That's something a 3 GB 2060 will not be able to do, and hilariously it may even perform worse than the 570 in that situation.
Please note that I did not say there are no use cases for having more memory; just because some may need it doesn't mean everyone does. For many, the cheaper cards offer much more value. It's no accident that both AMD and Nvidia offer the RX 480/580 and GTX 1060, respectively, in low and high memory configurations.

And as I've mentioned, the fact that a game allocates more memory doesn't mean it needs it. To evaluate that, you need to look for reductions in performance and/or rendering quality.

ArbitraryAffection said:

The fact is, AMD is offering more value in the mid-range. And 570 isn't low-end, it's lower-mid-range. :p But feel free to defend getting less hardware for more money. Seems basically the whole idea of Intel / Nvidia these days.
If you keep expanding it, it's no longer the mid-range. ;)
In the mid-range, the GTX 1060 3GB/6GB has been the better choice over the RX 480/580 4GB/8GB. The only slice in there where AMD has no direct competition from Nvidia is the new RX 590, but the only argument for that one is if you can't afford a GTX 1070. The RX 590 is still a little odd, considering how close to the GTX 1060 and RX 580 it really is. In general, AMD is hardly competitive in the mid-range, and except possibly for the RX 590, they can't claim to offer better value. This is only going to get more challenging once the RTX 2060 arrives.
In the low-end, AMD has more compelling options, like the RX 570 vs. the GTX 1050 Ti.
Posted on Reply
#72
95Viper
Keep it on topic.
Stop the name calling.
Keep it civil and constructive.

Thank You and have a nice day.
Posted on Reply
#73
eidairaman1
The Exiled Airman
INSTG8R said:
I have a feeling RTRT will be all but useless at this level.
Good ol' NV adding to the confusion: first the 1060, now this.

If anything, skip the crap 3 GB GDDR5 variants and make the 2050 Ti 4 or 6 GB with GDDR6 only...
Posted on Reply
#74
FordGT90Concept
"I go fast!1!11!1!"
Vya Domus said:
That's on the OS side of things and, again, has nothing to do with how much VRAM an application will use.
It can't directly address anything outside its 32-bit virtual address space (which includes system RAM and VRAM). Like system RAM, VRAM can use a physical-address-extension-style mechanism to get around that, but performance is terrible, so it's generally not viable for games.

Point is, the reason 1, 2, or 3.5 GiB was okay for so long is that games were developed for the Xbox 360 and PlayStation 3, which had only 512 MiB of memory. Everything was kept 32-bit because AAA games had to fit in such a tiny footprint on consoles anyway. With the transition to the Xbox One and PlayStation 4 and their 8 GiB of memory, 64-bit games are everywhere, and with them a "smoke it if you got it" approach to memory usage. That's why the GTX 970 has aged poorly and why so many in this thread scoff at 3 GiB variants of graphics cards. 3 GiB was more than enough several years ago. It isn't anymore, because the 32-bit/512 MiB veil has finally been truly lifted.
Posted on Reply
#75
eidairaman1
The Exiled Airman
FordGT90Concept said:
It can't directly address anything outside its 32-bit virtual address space (which includes system RAM and VRAM). Like system RAM, VRAM can use a physical-address-extension-style mechanism to get around that, but performance is terrible, so it's generally not viable for games.

Point is, the reason 1, 2, or 3.5 GiB was okay for so long is that games were developed for the Xbox 360 and PlayStation 3, which had only 512 MiB of memory. Everything was kept 32-bit because AAA games had to fit in such a tiny footprint on consoles anyway. With the transition to the Xbox One and PlayStation 4 and their 8 GiB of memory, 64-bit games are everywhere, and with them a "smoke it if you got it" approach to memory usage. That's why the GTX 970 has aged poorly and why so many in this thread scoff at 3 GiB variants of graphics cards. 3 GiB was more than enough several years ago. It isn't anymore, because the 32-bit/512 MiB veil has finally been truly lifted.
Even the 3 GB 7970s/280s are getting that way; the 6 GB variants are doing better.
Posted on Reply