
NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

Joined
Aug 14, 2017
Messages
74 (0.03/day)
For mid-range gaming, meaning FHD resolution, 3GB of memory is more than enough, in case anyone didn't know.

And that 3GB 2060 is aimed at exactly that target. It would still be the fastest GPU for that job with 3GB.

So if you're using an FHD monitor, the 2060 with 3GB of memory is the best choice, because you get close to 50 fps in almost all games at a low price.
 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
Max (har har) except for MSAA (4x) and tessellation (off).
Try this: turn the MSAA off, turn the shadows down to low, and turn the tessellation back on at normal. Granted, you're running a Radeon, but that shouldn't matter too much as far as memory load is concerned. Post a screenshot with those settings so we can see if the memory load drops. The reason I ask is this: now that prices on used 1060s have come down, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3GB cards are good gaming cards when the settings are configured well.
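If you'd rather watch a number than compare screenshots, something like the little script below will log VRAM usage while you change settings. It's only a rough sketch and assumes an NVIDIA card with the driver's nvidia-smi tool on the PATH; on a Radeon you'd need a different utility.

# Rough VRAM monitor: polls nvidia-smi once a second and prints used/total memory.
# Assumes the NVIDIA driver's nvidia-smi utility is on the PATH (NVIDIA cards only;
# a Radeon would need a different tool, e.g. the driver's own overlay).
import subprocess
import time

def vram_usage_mib():
    """Return (used, total) VRAM in MiB for GPU 0, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.splitlines()[0].split(",")
    return int(used), int(total)

if __name__ == "__main__":
    # Leave this running while the game is up, change settings, and watch the number move.
    while True:
        used, total = vram_usage_mib()
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(1)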
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Try this: turn the MSAA off, turn the shadows down to low, and turn the tessellation back on at normal. Granted, you're running a Radeon, but that shouldn't matter too much as far as memory load is concerned. Post a screenshot with those settings so we can see if the memory load drops. The reason I ask is this: now that prices on used 1060s have come down, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3GB cards are good gaming cards when the settings are configured well.
My card has 8 GiB. Why would I gimp the game to save VRAM? :roll: To be fair, the game mostly hovered around 2 GiB.

The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting only the different VRAM configurations. It's going to be one heck of a confusing lineup, that's for sure.
 

bug

Joined
May 22, 2015
Messages
13,213 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Frankly, this seems like a bad idea in my book. I don't know if it's an experiment or what, but to me having that many versions is just odd, even counting only the different VRAM configurations. It's going to be one heck of a confusing lineup, that's for sure.
I'm betting GDDR5 and 6 support is baked in because availability couldn't be predicted. Irl we'll see one or the other taking the lion's share.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
The point I was trying to make is that you don't have to look hard to find a game that uses 3+ GiB VRAM these days.
The point wasn't that you can make a game use >3GB of memory, which is easy, but that most games can be configured to run well within those constraints.
Just because you can find a use for >3GB, doesn't mean everyone needs it.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Nor was I trying. I just happened to notice it, the thread was getting derailed, so I posted it. :p
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
The point wasn't that you can make a game use >3GB of memory, which is easy, but that most games can be configured to run well within those constraints.
Just because you can find a use for >3GB, doesn't mean everyone needs it.
Many users like to crank the settings/ultra to get what they paid for too. ;)

We can turn things down to run on, and look like, a potato. :p
 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
Many users like to crank the settings/ultra to get what they paid for too. ;)
Not everyone.
We can turn things down to run on, and look like, a potato. :p
Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning off or down certain settings to maximize performance on a lower-tier card isn't going to have that result either.
 

bug

Joined
May 22, 2015
Messages
13,213 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Not everyone.

I would expect a clueless user to just try the various presets and settle for one. A more informed user will know how to tweak at least some of the settings. I wouldn't expect many users to max out everything and refuse to play any other way, any more than I expect drivers to get behind the wheel and just press the pedal to the metal. Users who do that probably don't buy a mid-range card to begin with. But I have seen stranger things.

Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning off or down certain settings to maximize performance on a lower-tier card isn't going to have that result either.
It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.
 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
It depends on the game. In some titles, "medium" can be hard to distinguish from "ultra". In some titles, "medium" can look significantly worse. And then there's the matter of the game itself. Some give you time to admire the scenery, some do not.
Good points all!
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Not everyone.

Really? Turning off settings that don't mean much will not make a game look like a "potato". Further, turning off or down certain settings to maximize performance on a lower-tier card isn't going to have that result either.
I didn't say everyone. ;)

I didn't buy a PC to have it look worse than a console. Some need to... some choose to, others like ultra. It is what it is.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I only tend to change graphics settings when a game looks atrocious on startup (especially games defaulting to anything less than my monitor's native resolution). That was the case with Max Payne (it defaulted to 800x600). Naturally, the game didn't know what an RX 590 was, so it defaulted to medium/low settings. That's when I turned everything up to max (hehe), checked the framerate, which was noticeably terrible at around 35 fps, and adjusted MSAA and tessellation down. Obviously the game uses an NVIDIA-biased implementation of tessellation which hasn't been optimized for AMD in the last six years.

With newer games on newer cards, the defaults are usually good enough. It's older games that don't know what the graphics card is that need tweaking.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I'm betting GDDR5 and 6 support is baked in because availability couldn't be predicted. Irl we'll see one or the other taking the lion's share.
That's a fair point I hadn't considered. Personally, I just hope there really aren't that many versions, lol.
 
Joined
Apr 15, 2009
Messages
1,011 (0.18/day)
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Master
Cooling ARCTIC Liquid Freezer II 360 A-RGB
Memory 32 GB Ballistix Elite DDR4-3600 CL16
Video Card(s) XFX 6800 XT Speedster Merc 319 Black
Storage Sabrent Rocket NVMe 4.0 1TB
Display(s) LG 27GL850B x 2 / ASUS MG278Q
Case be quiet! Silent Base 802
Audio Device(s) Sound Blaster AE-7 / Sennheiser HD 660S
Power Supply Seasonic Prime 750W Titanium
Software Windows 11 Pro 64
Cue the threads asking to "unlock" the extra memory on the lower-tier models.
 
Joined
Feb 18, 2017
Messages
688 (0.26/day)
That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of its GTX 10-series/Titan counterparts, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait a bit longer to save up for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.
So for you, the fact that the 20 series can't even beat the 10 series the way the 900 series beat the 700 series is enough to say it's not the worst price-performance GPU lineup ever made.
Here's a set of facts:
1. Every new generation of GPUs gets a price increase.
You are simply LYING. Comparing MSRPs (which is what you have to do, not comparing previous-gen GPUs after their price drops to the new generation): there was a price decrease of $50-100 in the 700-to-900 transition (which also brought a slightly bigger jump in performance), and a price increase of $50-100 in the 900-to-1000 transition, which brought a HUGE performance leap. There was minimal to no price jump in the 600-to-700 transition, except for the $150 increase on the 780, and likewise minimal to no increase from the 500 to the 600 series. And now we are talking about a $100-300 (in reality more like $500) price jump. Don't you see how pathetic that is, or are you just an NV employee?

Plus, it wouldn't have been that bad if there had been only a minimal price increase for the RTX series, let's say $50 each for the 2070 and 2080. But whatever you say, just check TechPowerUp's poll from before the RTX release, check the stock market, check the general reception among potential customers, and you will know you are simply lying to yourself, too.
The 2080 cleanly beats out the 1080, and beats out the 1080 Ti if it doesn't just match it. Also, RTX offers advancements Pascal cannot. The 2080/2080 Ti and RTX Titan are the best on the market. NVIDIA knows this and demands a premium price for it. If you don't want to pay that price, ok, don't buy one. Settle for less.
As some have already reacted to this, I have to as well: wow, it cleanly beats out a 2.5-year-old card by nearly 30%. What a result! The 2080 is 1% faster than the 1080 Ti, which is essentially equal performance, so it doesn't beat it out. Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 merely equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be overclocked better than the 2080.
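To put rough numbers on that generational comparison, here is a back-of-the-envelope sketch. The performance deltas are the ones stated above; the prices are approximate US launch MSRPs from memory, so treat the whole thing as illustrative rather than exact.

# Rough perf-per-dollar comparison across two generational jumps.
# Performance deltas are the figures quoted above; prices are approximate
# US launch MSRPs from memory (illustrative, not authoritative).
def perf_per_dollar_gain(rel_perf, old_price, new_price):
    """Relative perf/$ of the new card vs. the old one (1.0 = no change)."""
    return (rel_perf / new_price) / (1.0 / old_price)

# GTX 1080 vs. GTX 980 Ti: ~38% faster, roughly $599 vs. $649 at launch
print(perf_per_dollar_gain(1.38, old_price=649, new_price=599))  # ~1.50 -> ~50% better perf/$

# RTX 2080 vs. GTX 1080 Ti: ~equal performance, roughly $699 vs. $699 at launch
print(perf_per_dollar_gain(1.00, old_price=699, new_price=699))  # 1.00 -> no perf/$ improvement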
Try this: turn the MSAA off, turn the shadows down to low, and turn the tessellation back on at normal. Granted, you're running a Radeon, but that shouldn't matter too much as far as memory load is concerned. Post a screenshot with those settings so we can see if the memory load drops. The reason I ask is this: now that prices on used 1060s have come down, I've been installing them in client PCs and tweaking driver settings to keep performance good. So I know that 1060 3GB cards are good gaming cards when the settings are configured well.

So you are advising people who want to buy a ~$350+ 3GB 2060 (which is near the price of a 1070 with 8GB) to lower their settings at FHD. LOL. No other words needed. I hope you recommend only Intel-NV rigs to your customers. :D

The fact is that, objectively, the only really good points about the RTX series are the Founders Edition's solid cooling solution (in terms of noise and cooling performance) and its neat look (subjective).
 
Last edited:
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
So for you, the fact that the 20 series can't even beat the 10 series the way the 900 series beat the 700 series is enough to say it's not the worst price-performance GPU lineup ever made.
Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.
You are simply LYING.
Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?
As some have already reacted to this, I have to as well
Of course you would. Sure.
Wow, it cleanly beats out a 2.5-year-old card by nearly 30%.
It would seem you know how to read like an expert..
What a result! The 2080 is 1% faster than the 1080 Ti
So 30% is equal to 1%? Is that what you're saying?
which is essentially equal performance
Your math skills are dizzying!
so it doesn't beat it out.
Ok, sure.
Just to remind you: the 1080 beat the 980 Ti by 38% while costing $50 less. The 2080 merely equals the 1080 Ti while costing the same. And as mentioned before, the 1080 Ti can be overclocked better than the 2080.
Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How are things going on that?
 
Last edited:
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Wow, what a conclusion. Clearly you have looked at the review performance graphs. Well done.

Are you sure? Maybe I'm postulating a set of suggestions based on pure hyperbole?

Of course you would. Sure.

It would seem you know how to read like an expert..

So 30% is equal to 1%? Is that what you're saying?

Your math skills are dizzying!

Ok, sure.

Gee, thanks for the reminders. You're very helpful.

@Vayra86
Earlier you said I was making a fool of myself... How we doing on that?

I'm not seeing much of a change. The topic went right back to shit the moment you started 'moderating' everything posted.

Suffice to say, I'm out. Enjoy yourselves.
 
Joined
May 13, 2010
Messages
5,688 (1.12/day)
System Name RemixedBeast-NX
Processor Intel Xeon E5-2690 @ 2.9Ghz (8C/16T)
Motherboard Dell Inc. 08HPGT (CPU 1)
Cooling Dell Standard
Memory 24GB ECC
Video Card(s) Gigabyte Nvidia RTX2060 6GB
Storage 2TB Samsung 860 EVO SSD//2TB WD Black HDD
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + Dell E2013H 20 in @1600x900
Case Dell Precision T3600 Chassis
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 630w Dell T3600 PSU
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Linux Mint 20
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite Router/Sw:Meraki MX64 MS220-8P
Nah, NVIDIA needs to have a card with over 30 variations, like the Galaxy S4, lmao.
 
Joined
Jan 15, 2015
Messages
362 (0.11/day)
Nah, NVIDIA needs to have a card with over 30 variations, like the Galaxy S4, lmao.
Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

(eyeroll count not gratuitous)
 
Joined
May 13, 2010
Messages
5,688 (1.12/day)
System Name RemixedBeast-NX
Processor Intel Xeon E5-2690 @ 2.9Ghz (8C/16T)
Motherboard Dell Inc. 08HPGT (CPU 1)
Cooling Dell Standard
Memory 24GB ECC
Video Card(s) Gigabyte Nvidia RTX2060 6GB
Storage 2TB Samsung 860 EVO SSD//2TB WD Black HDD
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + Dell E2013H 20 in @1600x900
Case Dell Precision T3600 Chassis
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 630w Dell T3600 PSU
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Linux Mint 20
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite Router/Sw:Meraki MX64 MS220-8P

bug

Joined
May 22, 2015
Messages
13,213 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Ask and ye shall receive...

Gigabyte will have at least 40 variants of the GeForce RTX 2060 graphics card according to an EEC (Eurasian Economic Commission) product listing. link

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

(eyeroll count not gratuitous)
Yeah, there aren't 40 cards in that listing. It's just an assumption based on what could vary between models.
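Just to illustrate how quickly that kind of assumption adds up: assuming the six memory configurations from the article (3/4/6 GB, each in GDDR5 or GDDR6), a few independent options multiply into dozens of SKUs. The cooler designs and clock bins below are made-up placeholders, not Gigabyte's actual lineup.

# Purely illustrative: a handful of independent options multiplies into dozens of SKUs.
# Memory configs follow the rumored 3/4/6 GB x GDDR5/GDDR6 split; the cooler and
# clock lists are hypothetical placeholders, not Gigabyte's real product stack.
from itertools import product

memory_configs = ["3GB GDDR5", "3GB GDDR6", "4GB GDDR5",
                  "4GB GDDR6", "6GB GDDR5", "6GB GDDR6"]
coolers = ["single-fan", "dual-fan", "triple-fan"]   # hypothetical cooler designs
clock_bins = ["reference", "factory OC"]             # hypothetical clock bins

skus = [" / ".join(combo) for combo in product(memory_configs, coolers, clock_bins)]
print(len(skus))  # 6 * 3 * 2 = 36 -- already in the neighborhood of 40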
 