Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core configurations. The company plans to double down, or should we say triple down, on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB, and each comes in two memory types, the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
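The variant math above can be sketched with a quick enumeration. This is only an illustration: the memory-size and memory-type labels come from the reported leak, while treating the "A"/"non-A" ASIC classes as a freely combinable third axis is the article's own speculation.

```python
# Enumerate the rumored RTX 2060 matrix: 3 memory sizes x 2 memory types
# gives the six variants; crossing those with the two TU106 ASIC classes
# ("A" / "non-A") yields the twelve possible device IDs mentioned above.
from itertools import product

memory_sizes = ["3 GB", "4 GB", "6 GB"]
memory_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]  # speculative third axis

variants = [f"RTX 2060 {size} {mtype}"
            for size, mtype in product(memory_sizes, memory_types)]
device_ids = list(product(memory_sizes, memory_types, asic_classes))

print(len(variants))    # 6 variants
print(len(device_ids))  # 12 possible device IDs
```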
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#201
lexluthermiester
FordGT90Concept said:
Max (har har) except for MSAA (4x) and tessellation (off).
Try this: turn MSAA off, turn the shadows down to low, and turn tessellation back up to normal. Granted, you're running a Radeon, but that shouldn't matter too much where memory load is concerned. Post a screenshot with those settings to see if the memory load drops. The reason I ask is this: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3 GB cards are good gaming cards when settings are configured well.
#202
FordGT90Concept
"I go fast!1!11!1!"
lexluthermiester said:
Try this: turn MSAA off, turn the shadows down to low, and turn tessellation back up to normal. Granted, you're running a Radeon, but that shouldn't matter too much where memory load is concerned. Post a screenshot with those settings to see if the memory load drops. The reason I ask is this: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3 GB cards are good gaming cards when settings are configured well.
My card has 8 GiB. Why would I gimp the game to save VRAM? :roll: To be fair, the game mostly hovered around 2 GiB.

The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.
#203
GhostRyder
Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting just the different VRAM versions. It's going to be one heck of a confusing lineup, that's for sure.
#204
bug
GhostRyder said:
Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting just the different VRAM versions. It's going to be one heck of a confusing lineup, that's for sure.
I'm betting GDDR5 and GDDR6 support is baked in because memory availability couldn't be predicted. In real life we'll see one or the other taking the lion's share.
#205
efikkan
FordGT90Concept said:

The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.
The point wasn't to try to make a game use >3 GB of memory; that's easy. The point is that most games can be configured to run well within those constraints.
Just because you can find a use for >3 GB doesn't mean everyone needs it.
#206
FordGT90Concept
"I go fast!1!11!1!"
Nor was I trying. I just happened to notice it; the thread was getting derailed, so I posted it. :P
#207
EarthDog
efikkan said:
The point wasn't to try to make a game use >3 GB of memory; that's easy. The point is that most games can be configured to run well within those constraints.
Just because you can find a use for >3 GB doesn't mean everyone needs it.
Many users like to crank the settings to ultra to get what they paid for, too. ;)

We can turn things down to run on, and look like, a potato. :p
#208
lexluthermiester
EarthDog said:
Many users like to crank the settings to ultra to get what they paid for, too. ;)
Not everyone.
EarthDog said:
We can turn things down to run on, and look like, a potato. :p
Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning certain settings off or down to maximize performance on a lower-tier card isn't going to have that result either.
#209
bug
lexluthermiester said:
Not everyone.
I would expect a clueless user to just try various presets and settle on one. A more informed user will know how to tweak at least some of the settings. I wouldn't expect many users to max out everything and refuse to play any other way, any more than I expect drivers to get behind the wheel and just put the pedal to the metal. Users who do that probably don't buy a mid-range card to begin with. But I have seen stranger things.

lexluthermiester said:
Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning certain settings off or down to maximize performance on a lower-tier card isn't going to have that result either.
It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.
#210
lexluthermiester
bug said:
It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.
Good points all!
#211
EarthDog
lexluthermiester said:
Not everyone.

Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning certain settings off or down to maximize performance on a lower-tier card isn't going to have that result either.
I didn't say everyone. ;)

I didn't buy a PC to have it look worse than a console. Some need to turn things down, some choose to, and others like ultra. It is what it is.
#212
lexluthermiester
EarthDog said:
I didn't buy a PC to have it look worse than a console.
LOLOLOLOLOL!
#213
FordGT90Concept
"I go fast!1!11!1!"
I only tend to change graphics settings when a game looks atrocious on first launch (especially games defaulting to anything less than my monitor's native resolution). That was the case with Max Payne (it defaulted to 800x600). Naturally, the game didn't know what an RX 590 was, so it defaulted to medium/low settings. That's when I turned everything up to max (hehe), checked the framerate, which was noticeably terrible at around 35 fps, and adjusted MSAA and tessellation down. Obviously the game uses an NVIDIA-biased implementation of tessellation that hasn't been optimized for AMD in the last six years.

With newer games on newer cards, the defaults are usually good enough. It's older games that don't recognize the graphics card that need tweaking.
#214
GhostRyder
bug said:
I'm betting GDDR5 and GDDR6 support is baked in because memory availability couldn't be predicted. In real life we'll see one or the other taking the lion's share.
That's a fair point I hadn't considered. Personally, I just hope there really aren't that many versions lol.
#215
Divide Overflow
Cue the threads asking to "unlock" the extra memory in their lower-tier models.
#217
B-Real
lexluthermiester said:

That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of its GTX 10XX/Titan counterparts, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait an extra bit of time to save money for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.
So for you, it's enough to say it's not the worst price-performance GPU ever made, even though the 20 series can't beat the 10 series the way the 900 series beat the 700.
lexluthermiester said:

Here's a set of facts:
1. Every generation of new GPUs gets a price increase.
You are simply LYING. Comparing MSRP prices (which is what you have to do, not comparing the previous gen's GPUs after their price drops to the new gen's), there was a price decrease of $50-100 in the 700-to-900 switch (where there was a slightly bigger jump in performance), and a price increase of $50-100 in the 900-to-1000 switch, which brought a HUGE performance leap. There was minimal to no price jump in the 600-to-700 switch, except for the $150 increase on the 780. There was also minimal to no increase in the case of the 500-to-600 switch. And now we are talking about a $100-300 (which in reality was more like $500) price jump. Don't you feel how pathetic that is, or are you just an NV employee?

Plus, it wouldn't have been that bad if there had been only a minimal price increase for the RTX series, say $50 apiece for the 2070 and 2080. But whatever you say, just check TechPowerUp's poll from before the RTX release, check the stock market, check the general reception from potential customers, and you will know you are simply lying to yourself, too.
lexluthermiester said:

The 2080 cleanly beats out the 1080, and beats out the 1080 Ti if it doesn't merely match it. Also, RTX offers advancements Pascal cannot. The 2080/2080 Ti and RTX Titan are the best on the market. NVIDIA knows this and demands a premium price for it. If you don't want to pay that price, OK, don't buy one. Settle for less.
As some have already reacted to this, I have to as well: wow, it cleanly beats a 2.5-year-old card by nearly 30%. What a result! The 2080 is 1% faster than the 1080 Ti, which is practically equal performance, so it doesn't beat it out. Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 merely equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be OC'd better than the 2080.
lexluthermiester said:
Try this, turn the MSAA off. Turn the shadows down to low. Turn the tessellation back on to normal. Granted you're running a Radeon, but shouldn't matter too much concerning memory load. Post a screen shot with those setting to see if the memory load drops. The reason I ask is as follows, now that GPU prices have come down for used 1060's, I've been installing then in client PC's and tweaking driver settings to keep performance good. So I know that 1060 3gb cards are good gaming cards when settings are config'd well.
So you are advising people who want to buy a ~$350+ 3 GB 2060 (which is near the price of the 1070 with 8 GB) to lower settings at FHD. LOL. No other words needed. I hope you advise your customers Intel-NV rigs only. :D

The fact is that, objectively, the only really good points of the RTX series are the Founders Edition's solid cooling solution (in terms of noise and cooling performance) and its neat look (subjective).
#218
lexluthermiester
B-Real said:
So for you, it's enough to say it's not the worst price-performance GPU ever made, even though the 20 series can't beat the 10 series the way the 900 series beat the 700.
Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.
B-Real said:
You are simply LYING.
Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?
B-Real said:
As some already reacted to this, I have to do that too
Of course you would. Sure.
B-Real said:
Wow, cleanly beats out a 2,5 year old card by nearly 30%.
It would seem you know how to read like an expert...
B-Real said:
What a result! 2080 is 1% faster than the 1080Ti
So 30% is equal to 1%? Is that what you're saying?
B-Real said:
which is totally equal in performance
Your math skills are dizzying!
B-Real said:
so it doesnt' beat it out.
Ok, sure.
B-Real said:
Just to remind you: 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 equals the 1080Ti while costing the same. And as mentioned before, 1080Ti can be OCd better than the 2080.
Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How are things going on that?
#219
Vayra86
lexluthermiester said:
Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.

Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?

Of course you would. Sure.

It would seem you know how to read like an expert..

So 30% is equal to 1%? Is that what you're saying?

Your math skills are dizzying!

Ok, sure.

Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How we doing on that?
I'm not seeing much of a change. The topic went right back to shit the moment you started "moderating" everything posted.

Suffice to say, I'm out. Enjoy yourselves.
#220
vip3r011
$250... maybe I can dream of an RTX 2060 3 GB at that price.
#221
remixedcat
Nah, NVIDIA needs to have a card with over 30 variations like the Galaxy S4 lmao
#222
RichF
remixedcat said:
Nah nvidia needs to have a card with over 30 variations like the galaxy s4 lmao
Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

(eyeroll count not gratuitous)
#224
bug
RichF said:
Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

(eyeroll count not gratuitous)
Yeah, there aren't 40 cards in there. It's just an assumption based on what could vary between models.