
NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

Joined
Apr 1, 2013
Messages
81 (0.04/day)
Likes
20
You are saying a 2080 Ti should be $600? Because they "doubled" the price? I know you are going to say "it was a metaphor", and that's where the problem comes in. It's NOT double; at most I would say they charge 10-25% more than it's worth, and 10-25% is NOT double. That's how customer perception gets distorted: by misleading the consumer market and saying it's absolutely not worth it, when in reality it's not like what you said.
No, that's not the comparison I wish to make. The 2080 Ti is far more powerful than a 2060, whatever memory setup you consider.

A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
If AMD were in the competition, I think Nvidia would be much closer to that price range.
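For what it's worth, here's the arithmetic behind that, a quick Python sketch comparing the Founders Edition launch prices as I recall them against the "fair" prices above (both sets of numbers are my own assumptions, not official figures):

```python
# Quick markup check: FE launch prices (as I recall them) versus the "fair"
# prices suggested above. All figures here are illustrative assumptions.
launch = {"RTX 2080 Ti": 1199, "RTX 2080": 799}
fair   = {"RTX 2080 Ti": 850,  "RTX 2080": 600}   # midpoint of ~$800-900, and $600

for card, price in launch.items():
    markup = price / fair[card] - 1               # fraction above the "fair" price
    print(f"{card}: ${price} vs ${fair[card]} -> {markup:.0%} over")
```

That prints roughly 41% and 33% over, which is more than the 10-25% quoted above, but nowhere near double.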
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,597 (1.30/day)
Likes
882
System Name MightyX (MITX)
Processor Core i7 4770K @ 4.2GHz
Motherboard Asus Z87i-Pro
Cooling Noctua NH-L12
Memory 16GB DDR3 1600 LP
Video Card(s) Gigabyte GTX 1080 G1 OC
Storage Lots of SSDs
Display(s) AOC AGON AG352QCX
Case Raven RVZ-01
Power Supply Corsair SF600
Mouse Razer Mamba Tournament Chroma
Keyboard Razer Blackwidow X Chroma
People compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! Basically, the closest competition to the RTX 2060 will be the RX Vega 56 (if we believe the leaks), which still costs upwards of $400-$450, barring the occasional promotion.

Unless AMD pulls something out of their hat in January, $350 to $400 for the RTX 2060 will be in line with what AMD also offers! Nvidia, in their dominant position, is not interested in disrupting the market on price/performance.
Nice to see a bunch of couch GPU designers and financial analysts know better than a multi-million-dollar GPU company about both technology and pricing. It is called capitalism for a reason: no competition means Nvidia has free say on how they price their cards. You don't like it? Then don't buy; good for you. Someone else likes it? They buy, and it's entirely their own business. NVIDIA is "greedy"? Sure, yeah, they'd better f*cking be greedy. They are a for-profit company, not a f*cking charity.
Good to see a few people out there are onto it; probably others too, I just haven't quoted you all. In the absence of competition at certain price points, which AMD has generally been able to provide at least in the low/mid/upper-midrange (and often top-tier) segments for some time, Nvidia simply has the ability to charge a premium for premium performance. Add to that the fact that the upper-end RTX chips are enormous and use newer, more expensive memory, and yeah, you find them charging top dollar for them, and so they should in that position.

As has been said, don't like it? Vote with your wallet! I sure have. I bought a GTX 1080 at launch and ~2.5 years later I personally have no compelling reason to upgrade. That comes down to my rig, screen, available time to game, what I play, price/performance, etc. Add it all together - that equation is different for every buyer.

Do I think the 20 series RTX is worth it? Not yet, but I'm glad someone's doing it. I've seen BFV played with it on, and I truly hope ray tracing is in the future of gaming.

My take is that when one or both of these two things happen, prices will drop, perhaps by a negligible amount, perhaps significantly:

1. Nvidia clears out all (or virtually all) 10 series stock, which the market still seems hungry for, partly because many of those cards offer more than adequate performance for the particular consumer's needs.
2. AMD answers the 20 series' upper-level performance, or releases cards matching 1080/2070/Vega performance at lower prices (or, again, both).
 
Joined
May 22, 2015
Messages
5,203 (3.80/day)
Likes
2,224
Processor Intel i5-6600k
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x8GB DDR4 2400 G.Skill
Video Card(s) EVGA GTX 1060 SC
Storage 128 and 256GB OCZ Vertex4, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Chieftec BX01
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Yeah, but you're not the one who has to recoup the manufacturing and R&D costs. And for the record, 2080s are not far from the $600 you mentioned.
My guess would be at least some of the R&D has been covered by Volta. It's the sheer die size that makes Turing so damn expensive.
If Nvidia manages to tweak the hardware for their 7nm lineup, then we'll have a strong proposition for DXR. Otherwise, we'll have to wait for another generation.
 
Joined
Jul 5, 2013
Messages
4,618 (2.25/day)
Likes
2,885
Location
USA
My guess would be at least some of the R&D has been covered by Volta.
Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.
It's the sheer die size that makes Turing so damn expensive.
That is what I was referring to with manufacturing costs. The dies are pricey even if you manage a high yield of good dies per wafer, and the price goes way up if you can't manage at least an 88% yield, which will be challenging given the total size of a functional die.
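To put the die-size point in rough numbers, here's a back-of-the-envelope sketch. The $6,000 wafer price, the gross-die formula and the yield figures are purely illustrative assumptions; the ~445 mm² and ~754 mm² die areas are the published TU106 and TU102 sizes.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross-die estimate for a round wafer, accounting for edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, die_area_mm2, yield_fraction):
    """Wafer cost spread over the dies that actually work."""
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return wafer_cost / good_dies

# Assumed $6,000 wafer; TU106 (2060/2070) is ~445 mm², TU102 (2080 Ti) ~754 mm².
for area in (445, 754):
    for y in (0.88, 0.60):   # the 88% figure above versus a worse case
        print(f"{area} mm² at {y:.0%} yield: ~${cost_per_good_die(6000, area, y):,.0f} per good die")
```

The exact dollar amounts are meaningless, but the scaling isn't: going from ~445 mm² to ~754 mm² cuts the gross dies per wafer almost in half, and any yield hit multiplies on top of that, which is why these chips are so expensive to make.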
 
Joined
May 22, 2015
Messages
5,203 (3.80/day)
Likes
2,224
Processor Intel i5-6600k
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x8GB DDR4 2400 G.Skill
Video Card(s) EVGA GTX 1060 SC
Storage 128 and 256GB OCZ Vertex4, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Chieftec BX01
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.
Well, Volta is only for professional cards. The Quadro goes for $9,000; God knows how much they charge for a Tesla.
As for the differences, I'm really not aware of many, save for the tweaks Nvidia made to make it a better fit for DXR (and probably a worse fit for general compute). I'm sure Anand did a piece highlighting the differences (and I'm sure I read it), but nothing striking has stuck with me.

That said, yes, R&D does not usually pay off after just one iteration. I was just saying they've already made some of that back.
 
Joined
Mar 10, 2015
Messages
1,042 (0.72/day)
Likes
712
System Name Wut?
Processor 4770K @ Stock
Motherboard MSI Z97 Gaming 7
Cooling Water
Memory 16GB DDR3 2400
Video Card(s) Vega 56
Storage Samsung 840 Pro 256GB
Display(s) 3440 x 1440
Case Thermaltake T81
Power Supply Seasonic 750 Watt Gold
Nvidia clears out all (or virtually all) 10 series stock
They could certainly just restock the 10 series at its current prices and leave the 20 series where it is, continuing this pricing model until AMD releases something. Lord help us all if it doesn't compete.
 
Joined
Dec 6, 2016
Messages
66 (0.08/day)
Likes
20
System Name The_RIG
Processor Intel Core i7 3930K @ 4.6GHz / 1.353V
Motherboard Asus P9X79
Cooling Segotep Halo 240
Memory 16GB Corsair Dominator GT 2133MHz CL9-11-10-27 (4x4GB)
Video Card(s) MSI GTX 1080 Aero + Arctic Cooling Accelero Extreme IV
Storage OCZ Trion 240GB | Colorful SL500 240GB | 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba 5400rpm
Display(s) Twin 27" Dell U2713H 2K monitors
Case Segotep Halo 5
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Seasonic Focus+ Gold 750W
Mouse Logitech G600
Keyboard Razer Deathstalker Chroma
Software Win10 x64
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, with two GK104 chips.
Yes, the original Titan was released with the 7 series, but both use the Kepler architecture. In fact, the 680 and 770 are identical, save for a small clock speed increase in favor of the 770 (1045 vs 1006 MHz). You can even flash a 680 to a 770 and vice versa (if the cards use the same PCB, like, say, a reference design).

I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last-minute" rebrand of the "GTX 670 Ti" into the "GTX 680" (I remember my GTX 680 box had stickers over all the product names). The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780 and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
No. GK100 = GK110. The reason for the extra "1" is the name change from the 6 series to the 7 series - to try to make it look like a new product. Kepler also has GK2xx-branded chips, which do contain small architectural improvements, mostly to the scheduler and power efficiency. Again, and for the last time - the 680 is NOT GK100, it's GK104. Nvidia did not release any GPU under the GK100 codename, not even in the professional market. There were rumors, and the tech press did speculate that GK100 would be reserved for the (then) new Tesla, but that never happened. This is the complete list of Kepler GPUs, both 6 and 7 series, including the Titan, Quadro and Tesla cards:
  • Full GK104 - GTX 680, GTX 770, GTX 880M and several professional cards.
  • Cut-down GK104 - GTX 660, 760, 670, 680M, 860M and several professional cards.
  • GK106 - GTX 650 Ti Boost, 650 Ti, 660, and several mobile and pro cards.
  • GK107 - GT 640, GT 740, 820 and lots of mobile cards.
  • GK110 - GTX 780 (cut-down GK110), GTX 780 Ti and the original Titan, as well as the Titan Black, Titan Z and loads of Tesla / Quadro cards like the K6000.
  • GK208 - entry-level 7 series and 8 series cards, both GeForce and Quadro branded.
  • GK208B - entry-level 7 series and 8 series cards, both GeForce and Quadro branded.
  • GK210 - revised and slightly cut-down version of the GK100/GK110, launched as the Tesla K80.
  • GK20A - the GPU built into the Tegra K1 SoC.
I know from a trustworthy source (an Nvidia board partner employee) that Nvidia had no issues whatsoever with the GK100. In fact, internal testing showed what a huge leap in performance Kepler was over Fermi. This is THE REASON why Nvidia decided to launch the GK104 mid-range chip as the GTX 680 - the GK104 is 30 to 50% faster than the GF100/GF110 used in the GTX 480 and 580.

Some clever people in management came up with this marketing stunt: spread Kepler over two series of cards, the 600 and 700 series, release the GK104 first, and save the full GK100 (GK110) for the later 700 series, launching it as a premium product and creating a new market segment with the 780 Ti and the original Titan. GK110 is simply the name chosen for launch, replacing the GK100 moniker, mainly to confuse savvy consumers and the tech press. As Nvidia naming schemes go, the GK110 should have been an entry-level chip: GK104 > GK106 > GK107 -> the smaller the last number, the larger the chip. The only Kepler revision is named GK2xx (see above) and only includes entry-level cards.
 
Joined
Mar 10, 2014
Messages
1,307 (0.72/day)
Likes
368
While true, GK110 had two versions too: GK110 and GK110B; the latter could clock a bit higher. All the GTX 780 "AIB GHz editions" used the GK110B version of that chip. Going back to that ancient history, everybody knows the GTX 780 Ti has aged pretty badly, mostly because of its 3GB of VRAM. But how has the 6GB version of the GTX 780 aged?
 
Joined
Feb 26, 2016
Messages
71 (0.07/day)
Likes
6
System Name Berfs1
Processor i7-6700HQ (turbo clocked)
Motherboard Asus G752VL
Cooling Stock w/ ICD thermal compound
Memory Samsung 16GB (2x8GB) @2133 MHz
Video Card(s) NVIDIA GeForce GTX 965M @1218 MHz/1425 MHz (core/memory)
Storage HGST 1TB @7200 RPM, Kingston 2GB SD (cache card)
Display(s) 1080p G-SYNC IPS @75 Hz laptop screen
Power Supply Asus 180W charger
Mouse Logitech M100
Keyboard Logitech G105
Software Windows 10 Home 64 bit
Benchmark Scores XTU - 979 points
I told y’all the power plug was on the side lol
 