NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

Mid-range gaming means FHD resolution, and for that 3GB of memory is more than enough, in case anyone doesn't know it.

That's the market the 3GB 2060 is aimed at, and it's still the fastest GPU at that price point with 3GB.

So if you're using an FHD monitor, the 2060 with 3GB of memory is the best choice, because you get close to 50 fps in almost all games for a low price.
 
Max (har har) except for MSAA (4x) and tessellation (off).
Try this: turn MSAA off, turn the shadows down to low, and turn tessellation back on to normal. Granted, you're running a Radeon, but that shouldn't matter too much as far as memory load is concerned. Post a screenshot with those settings to see if the memory load drops. The reason I ask is as follows: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3GB cards are good gaming cards when the settings are configured well.
 
Try this: turn MSAA off, turn the shadows down to low, and turn tessellation back on to normal. Granted, you're running a Radeon, but that shouldn't matter too much as far as memory load is concerned. Post a screenshot with those settings to see if the memory load drops. The reason I ask is as follows: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3GB cards are good gaming cards when the settings are configured well.
My card has 8 GiB. Why would I gimp the game to save VRAM? :roll: To be fair, the game mostly hovered around 2 GiB.

The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.
 
Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting only the different VRAM versions. It's going to be one heck of a confusing lineup, that's for sure.
 
Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting only the different VRAM versions. It's going to be one heck of a confusing lineup, that's for sure.
I'm betting GDDR5 and 6 support is baked in because availability couldn't be predicted. In reality, we'll see one or the other taking the lion's share.
 
The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.
The point wasn't that you can make a game use more than 3GB of memory, that's easy, but that most games can be configured to run well within those constraints.
Just because you can find a use for more than 3GB doesn't mean everyone needs it.
 
Nor was I trying. I just happened to notice it, the thread was getting derailed, so I posted it. :P
 
The point wasn't that you can make a game use more than 3GB of memory, that's easy, but that most games can be configured to run well within those constraints.
Just because you can find a use for more than 3GB doesn't mean everyone needs it.
Many users like to crank the settings to ultra to get what they paid for, too. ;)

We can turn things down to run on, and look like, a potato. :p
 
Many users like to crank the settings to ultra to get what they paid for, too. ;)
Not everyone.
We can turn things down to run on, and look like, a potato. :p
Really? Turning off settings that don't mean much will not result in a game looking like a "potato". Further, turning certain settings off or down to maximize performance on a lower-tier card isn't going to have that result either.
 
Not everyone.

I would expect a clueless user to just try various presets and settle on one. A more informed user will know how to tweak at least some of the settings. I wouldn't expect many users to max out everything and refuse to play any other way, any more than I expect drivers to get behind the wheel and just press the pedal to the metal. Users that do that probably don't buy a mid-range card to begin with. But I have seen stranger things.

Really? Turning off settings that don't mean much will not result in a game looking like a "potato". Further, turning certain settings off or down to maximize performance on a lower-tier card isn't going to have that result either.
It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.
 
It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.
Good points all!
 
Not everyone.

Really? Turning off settings that don't mean much will not result in a game looking like a "potato". Further, turning certain settings off or down to maximize performance on a lower-tier card isn't going to have that result either.
I didn't say everyone. ;)

I didn't buy a PC to have it look worse than a console. Some need to... some choose to, others like ultra. It is what it is.
 
I only tend to change graphics settings when the game looks atrocious at startup (especially games defaulting to anything less than my monitor's native resolution). That was the case with Max Payne (it defaulted to 800x600). Naturally, the game didn't know what an RX 590 was, so it defaulted to medium/low settings. That's when I turned everything up to max (hehe), checked the framerate, which was noticeably terrible at around 35 fps, and adjusted MSAA and tessellation down. The game obviously uses an NVIDIA-biased implementation of tessellation that hasn't been optimized for AMD in the last six years.

With newer games on newer cards, the defaults are usually good enough. It's older games that don't know what the graphics card is that need tweaking.
 
I'm betting GDDR5 and 6 support is baked in because availability couldn't be predicted. In reality, we'll see one or the other taking the lion's share.
That's a fair point I had not considered. Personally, I just hope there really aren't that many versions lol.
 
Cue the threads asking to "unlock" the extra memory in their lower-tier models.
 
That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of the GTX 10xx/Titan counterparts, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait a bit of extra time to save money for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.
So for you, the fact that the 20 series can't even beat the 10 series the way the 900 series beat the 700 series is enough to say it's not the worst price-performance GPU lineup ever made.
Here's a set of facts:
1. Every new generation of GPUs gets a price increase.
You are simply LYING. Comparing MSRPs (that's what you have to do, not comparing previous-gen GPUs after their price drops to the new generation), there was a price decrease of $50-100 in the 700-to-900 switch (where there was a slightly bigger jump in performance) and a price increase of $50-100 in the 900-to-1000 switch, which brought a HUGE performance leap. There was minimal to no price jump with the 600-to-700 switch except for the $150 increase on the 780, and minimal to no increase in the case of the 500-to-600 switch. And now we are talking about a $100-300 price jump (which in reality was more like $500). Don't you feel how pathetic that is, or are you just an NV employee?

Plus, it wouldn't have been that bad if there had been a minimal price increase for the RTX series, let's say $50 for the 2070 and 2080. But whatever you say, just check TechPowerUp's poll from before the release of RTX, check the stock market, check the general reception among potential customers, and you will know you are simply lying to yourself, too.
The 2080 cleanly beats out the 1080 and matches the 1080 Ti if it doesn't beat it. Also, RTX offers advancements Pascal cannot. The 2080/2080 Ti and RTX Titan are the best on the market. NVIDIA knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.
As some have already reacted to this, I have to do so too: Wow, it cleanly beats out a 2.5-year-old card by nearly 30%. What a result! The 2080 is 1% faster than the 1080 Ti, which is practically equal in performance, so it doesn't beat it out. Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be overclocked better than the 2080.
Try this: turn MSAA off, turn the shadows down to low, and turn tessellation back on to normal. Granted, you're running a Radeon, but that shouldn't matter too much as far as memory load is concerned. Post a screenshot with those settings to see if the memory load drops. The reason I ask is as follows: now that GPU prices have come down for used 1060s, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3GB cards are good gaming cards when the settings are configured well.

So you are advising people who want to buy a ~$350+ 3GB 2060 (which is near the price of a 1070 with 8GB) to lower settings at FHD. LOL. No other words needed. I hope you recommend only Intel-NVIDIA rigs to your customers. :D

The fact is that, objectively, the only really good point of the RTX series is the Founders Edition's solid cooling solution (in terms of noise and cooling performance) and its neat look (subjective).
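To put the price-performance comparison above into concrete numbers, here is a small illustrative sketch. The relative-performance figures are the ones cited in this thread, while the launch MSRPs are my own assumptions (Founders Edition pricing would shift the results), so treat it as a ballpark calculation rather than anything definitive:

```python
# Rough performance-per-dollar comparison using the figures cited in this thread.
# MSRPs are assumed launch prices; relative performance is normalized within each
# generation pair, so only compare cards that belong to the same pair.

pairs = [
    # ((older card, assumed MSRP, rel. perf), (newer card, assumed MSRP, rel. perf))
    (("GTX 980 Ti", 649, 1.00), ("GTX 1080", 599, 1.38)),    # "+38% while costing $50 less"
    (("GTX 1080 Ti", 699, 1.00), ("RTX 2080", 699, 1.01)),   # "1% faster ... costing the same"
]

for (old_name, old_price, old_perf), (new_name, new_price, new_perf) in pairs:
    old_ppd = old_perf / old_price          # performance per dollar, older card
    new_ppd = new_perf / new_price          # performance per dollar, newer card
    change = (new_ppd / old_ppd - 1) * 100  # generational change in value
    print(f"{old_name} -> {new_name}: performance per dollar {change:+.1f}%")
```

With those assumed prices, the 980 Ti to 1080 step works out to roughly +50% performance per dollar, while the 1080 Ti to 2080 step is roughly +1%, which is the point being argued here.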
 
So for you, the fact that the 20 series can't even beat the 10 series the way the 900 series beat the 700 series is enough to say it's not the worst price-performance GPU lineup ever made.
Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.
You are simply LYING.
Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?
As some have already reacted to this, I have to do so too
Of course you would. Sure.
Wow, it cleanly beats out a 2.5-year-old card by nearly 30%.
It would seem you know how to read like an expert..
What a result! The 2080 is 1% faster than the 1080 Ti
So 30% is equal to 1%? Is that what you're saying?
which is practically equal in performance
Your math skills are dizzying!
so it doesn't beat it out.
Ok, sure.
Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be overclocked better than the 2080.
Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How are things going on that?
 
Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.

Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?

Of course you would. Sure.

It would seem you know how to read like an expert..

So 30% is equal to 1%? Is that what you're saying?

Your math skills are dizzying!

Ok, sure.

Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How are we doing on that?

I'm not seeing much of a change. The topic went right back to shit the moment you started 'moderating' everything posted.

Suffice it to say, I'm out. Enjoy yourselves.
 
Nah, NVIDIA needs to have a card with over 30 variations like the Galaxy S4 lmao
 
Nah, NVIDIA needs to have a card with over 30 variations like the Galaxy S4 lmao
Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

(eyeroll count not gratuitous)
 
Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

(eyeroll count not gratuitous)
Yeah, there aren't 40 cards in there. It's just an assumption based on what could vary between models.
 