
NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

Joined
Apr 1, 2013
Messages
127 (0.05/day)
You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where the problems come in. It's NOT double; at most I would say they charged 10-25% more than it's worth, and 10-25% is NOT double. That's how customer satisfaction gets skewed: by misleading the consumer market and saying it's absolutely not worth it, when in reality that's not the case.
No, that's not the comparison I wish to make. The 2080 Ti is way more powerful than a 2060, whatever memory setup you consider.

A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
If AMD were in the competition, I think Nvidia's prices would be much closer to that range.
 
Joined
Jul 5, 2013
Messages
9,431 (3.74/day)
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
Yeah, but you're not the one who has to recoup manufacturing and R&D costs. And for the record, 2080s are not far from the $600 you mentioned.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,626 (1.18/day)
System Name MightyX (MITX)
Processor Ryzen 7 3700X PBO
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 16GB DDR4 3600 CL15
Video Card(s) Gigabyte GTX1080 G1 OC/UV
Storage Samsung 970 Evo m.2 NVME
Display(s) AOC AGON AG352QCX
Case Raven RVZ-01
Power Supply Corsair SF600
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
People compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! If the leaks are to be believed, the closest competition to the RTX 2060 will be the RX Vega 56, which still costs upwards of $400-$450 outside the occasional promotion.

Unless AMD pulls something out of their hat in January, $350 to $400 for the RTX 2060 will be in tune with what AMD also offers! Nvidia, in their dominant position, has no interest in disrupting the market on price/performance.
Nice to see a bunch of couch GPU designers and financial analysts know better than a multi-million-dollar GPU company on both technology and pricing. It's called capitalism for a reason: no competition means Nvidia has free rein to price their cards however they want. You don't like it? Then don't buy; good for you. Someone else likes it, they buy, and it's entirely their own business. Nvidia is "greedy"? Sure, they'd better f*ucking be greedy. They are a for-profit company, not a f*ucking charity.
Good to see a few people out there are onto it; probably others too, I just haven't quoted you all. In the absence of competition at certain price points, competition AMD has previously been able to provide for some time in the low, mid, and upper-midrange (and often top-tier) segments, Nvidia simply has the ability to charge a premium for premium performance. Add the fact that the upper-end RTX chips are enormous and use newer, more expensive memory, and yeah, you find them charging top dollar for them, and so they should in that position.

As has been said, don't like it? Vote with your wallet! I sure have. I bought a GTX 1080 at launch, and ~2.5 years later I personally have no compelling reason to upgrade. That comes down to my rig, my screen, my available time to game, what I play, price/performance, etc.; add it all together, and that equation is different for every buyer.

Do I think the 20 series RTX is worth it? Not yet, but I'm glad someone's doing it. I've seen BFV played with it on, and I truly hope ray tracing is in the future of gaming.

My take is that prices will drop, perhaps by a negligible amount, perhaps significantly, when one or both of these two things happen:

1. Nvidia clears out all (or virtually all) 10 series stock, which the market still seems hungry for, partly because many of those cards offer more than adequate performance for a given consumer's needs.
2. AMD answers the 20 series' upper-level performance, or releases cards matching 1080/2070/Vega performance at lower prices (or, again, both).
 

bug

Joined
May 22, 2015
Messages
7,546 (4.11/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Yeah, but you're not the one making them having to recoup manufacturing and R&D costs. And for the record, 2080's are not far from the $600 you mentioned.
My guess would be at least some of the R&D has been covered by Volta. It's the sheer die size that makes Turing so damn expensive.
If Nvidia manages to tweak the hardware for their 7nm lineup, then we'll have a strong proposition for DXR. Otherwise, we'll have to wait for another generation.
 
My guess would be at least some of the R&D has been covered by Volta.
Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.
It's the sheer die size that makes Turing so damn expensive.
That is what I was referring to with manufacturing costs. Dies are pricey even if you manage a high good-die-per-wafer yield. That price goes way up if you can't manage at least an 88% yield, which will be challenging given the total size of a functional die.
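The back-of-the-envelope math here is easy to sketch. Below is a rough Python illustration of how die size drives cost per good die; the die areas, wafer size, and defect density are made-up illustrative numbers (not Nvidia's actual figures), and the dies-per-wafer formula and Poisson yield model are standard textbook approximations.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough candidate-dies-per-wafer estimate (ignores scribe lines
    and edge exclusion): wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defect_density_per_cm2):
    """Poisson yield model: fraction of dies expected to have zero defects."""
    area_cm2 = die_area_mm2 / 100
    return math.exp(-defect_density_per_cm2 * area_cm2)

# Illustrative only: ~754 mm2 stands in for a large Turing-class die,
# ~445 mm2 for a smaller one; 0.1 defects/cm2 is an assumed defect density.
for area in (754, 445):
    n = dies_per_wafer(area)
    y = poisson_yield(area, 0.1)
    print(f"{area} mm2: {n} candidates/wafer, {y:.0%} yield, "
          f"~{n * y:.0f} good dies")
```

The point the numbers make: the bigger die doesn't just fit fewer candidates on the wafer, each candidate is also less likely to be defect-free, so the cost per good die grows much faster than linearly with area.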
 

bug

Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.
Well, Volta is only for professional cards. The Quadro goes for $9,000, God knows how much they charge for a Tesla.
As for differences, I'm really not aware of many, save for the tweaks Nvidia made to make it more fit for DXR (and probably less fit for general compute). I'm sure AnandTech did a piece highlighting the differences (and I'm sure I read it), but nothing striking has stuck.

That said, yes, R&D does not usually pay off after just one iteration. I was just saying they've already made some of that back.
 
Joined
Mar 10, 2015
Messages
3,093 (1.62/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Nvidia clears out all (or virtually all) 10 series stock
They could certainly just restock the 10 series at current prices and leave the 20 series where it is, continuing this pricing model until AMD releases something. Lord help us all if it doesn't compete.
 
Joined
Dec 6, 2016
Messages
81 (0.06/day)
System Name The RIG MK III
Processor AMD Ryzen 2600 @ 4.15GHz / 1.41v
Motherboard Gigabyte B450 Aorus M
Cooling Thermaright "Le Grande Macho"
Memory 8GB Crucial Ballistix Tactical 3000MHz CL15-16-16-34 @ 3466MHz CL16-17-16-34
Video Card(s) Asus VEGA 64 Gamer Strix OC
Storage WD Green 240GB | Crucial MX500 512GB M.2 | 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Twin 27" Dell u2713h 2k monitors
Case Chieftec Cube CI-01B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Seasonic Focus+ Gold 750W
Mouse Logitech G600
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, having two GK104 chips.
Yes, the original Titan was released with the 700 series, but both use the Kepler architecture. In fact, the 680 and 770 are identical, save for a small clock speed increase in favor of the 770 (1045 vs 1006 MHz). You can even flash a 680 to a 770 and vice versa (if the cards use the same PCB, like a reference design).

I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
No. GK100 = GK110. The reason for the extra "1" is the name change from the 6 series to the 7 series, to make it look like a new product. Kepler also has GK2xx-branded chips, which do contain small architectural improvements, mostly to the scheduler and power efficiency. Again, and for the last time: the 680 is NOT GK100, it's GK104. Nvidia did not release any GPU under the GK100 codename, not even in the professional market. There were rumors, and the tech press did speculate that GK100 would be reserved for the (then) new Tesla, but that never happened. This is the complete list of Kepler GPUs, both 6 and 7 series, including the Titan, Quadro and Tesla cards:
  • Full GK104 - GTX 680, GTX 770, GTX 880M and several professional cards.
  • Cut down GK104 - GTX 660, 760, 670, 680M, 860M and several professional cards
  • GK106 - GTX 650ti boost, 650ti, 660, and several mobile and pro cards.
  • GK107 - GTX 640, 740, 820 and lots of mobile cards
  • GK110 - GTX 780 (cut down GK110), GTX 780ti and the original Titan, as well as the Titan Black, Titan Z and loads of Tesla / Quadro cards like the K6000
  • GK208 - entry level 7 series and 8 series cards, both Geforce and Quadro branded
  • GK208B - entry level 7 series and 8 series cards, both Geforce and Quadro branded
  • GK210 - Revised and slightly cut-down version of the GK100/GK110. Launched as the Tesla K80
  • GK20A - GPU built into the Tegra K1 SoC
I know from a trustworthy source (an Nvidia board partner employee) that Nvidia had no issues whatsoever with the GK100. In fact, internal testing showed what a huge leap in performance Kepler was over Fermi. This is THE reason Nvidia decided to launch the mid-range GK104 chip as the GTX 680: the GK104 is 30 to 50% faster than the GF100/GF110 used in the GTX 480 and 580. Some clever people in management came up with this marketing stunt: spread Kepler over two series of cards, the 600 and 700 series, release the GK104 first, and save the full GK100 (GK110) for the later 700 series, launching it as a premium product and creating a new market segment with the 780 Ti and the original Titan. GK110 is simply the name chosen at launch, replacing the GK100 moniker, mainly to throw off savvy consumers and the tech press. As Nvidia naming schemes go, GK110 should have denoted an entry-level chip: GK104 > GK106 > GK107 - the smaller the last number, the larger the chip. The only Kepler revision is named GK2xx (see above) and only covers entry-level cards.
 
Joined
Mar 10, 2014
Messages
1,753 (0.77/day)
While true, GK110 had two versions too: GK110 and GK110B; the latter could clock a bit higher. All GTX 780 "GHz Edition" AIB cards used the GK110B version of the chip. Going back to that ancient history, everybody knows the GTX 780 Ti has aged pretty badly, mostly because of its 3 GB of VRAM. But how has the 6 GB version of the GTX 780 aged?
 
Joined
Feb 26, 2016
Messages
298 (0.19/day)
Location
Texas
System Name Dell Precision 7540
Processor i7-9750H (turbo clocked)
Cooling Stock w/ ICD thermal compound
Memory G.Skill Ripjaws 16GB (2x8GB) @3200 MHz
Video Card(s) NVIDIA GeForce GTX 980 @1430 MHz/2025 MHz (core/memory)
Storage XPG SX8200 Pro 512GB SSD
Display(s) 1080p @60 Hz 100% sRGB laptop screen, Acer G247HYL
Power Supply Dell 240W charger
Mouse Logitech G403
Keyboard Logitech G910 Stickerbombed
Software Windows 10 Pro 64 bit
Benchmark Scores XTU - 2218 points CB15 - 1311 CB20 - 3203
I told y’all the power plug was on the side lol
 