
NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
This isn't difficult. Let's even remove an exact refresh rate to make it easier. When two cards are vsynced to the same refresh rate, you can't tell them apart.
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
This isn't difficult. Let's even remove an exact refresh rate to make it easier. When two cards are vsynced to the same refresh rate, you can't tell them apart.
Was that the point of the blind test? You seem to have no idea what I'm referring to.
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
The point of the test was that you couldn't tell which card was in which machine. That is what a blind test is used for. I still don't see why this is so difficult.
 
Joined
May 30, 2015
Messages
1,873 (0.58/day)
Location
Seattle, WA
Wrong, you missed the point: prices must be adjusted for inflation to give a fair comparison.

That does not mean a price increase is pure inflation; there are many factors here, including varying production costs, competition, etc. But these can only be compared after the price is corrected for inflation; otherwise any comparison is pointless.

You missed the entire point of the comparison. It was a chart that simply showed the launch price of each flagship. It was not a comparison across multiple generations (i.e., not meant to compare the adjusted value of non-subsequent products).

The entire point of the chart was to show that prices did not increase with each new generation. No adjusted value required.
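For what it's worth, the adjustment being argued about here is a one-liner. A minimal sketch of inflation-corrected launch prices; the CPI index values below are illustrative placeholders, not real data (the two $699 MSRPs are the actual launch prices):

```python
# Sketch: convert launch MSRPs into common-year dollars via a CPI ratio.
# The CPI index values here are PLACEHOLDERS for illustration, not real data.
cpi = {2013: 233.0, 2017: 245.1, 2019: 255.7}  # hypothetical index values

def adjust_price(price, launch_year, target_year=2019):
    """Launch price expressed in target-year dollars."""
    return price * cpi[target_year] / cpi[launch_year]

for name, year, msrp in [("GTX 780 Ti", 2013, 699), ("GTX 1080 Ti", 2017, 699)]:
    print(f"{name}: ${msrp} in {year} ~ ${adjust_price(msrp, year):.0f} in 2019 dollars")
```

Whether the chart needed the correction is the disagreement above; the arithmetic itself is not in dispute.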
 
Joined
Oct 2, 2015
Messages
2,991 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) AOC Q27G3XMN + Samsung S22F350
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse EVGA X15
Keyboard VSG Alnilam
Software Windows 11
We don't know that for sure. Waiting to see the benchmarks and reviews.
Sure, here are some wild predictions for when that happens: the 4 GB 128-bit variant will be crap, and so will Navi.
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
The point of the test was that you couldn't tell which card was in which machine. That is what a blind test is used for. I still don't see why this is so difficult.
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the FPS when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse: if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 FPS? You'd both be the first people to heckle it, and rightly so.

Sure, here are some wild predictions for when that happens: the 4 GB 128-bit variant will be crap, and so will Navi.
:laugh:
The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they just took Polaris to 7 nm and GDDR6, that would make a good card.
 
Last edited:

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
7,966 (1.12/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX-850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 2
Software Win 10 Pro x64
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the FPS when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse: if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 FPS? You'd both be the first people to heckle it, and rightly so.


:laugh:
The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they just took Polaris to 7 nm and GDDR6, that would make a good card.
The point of it was that 100 FPS of smooth gaming is possible. How much more is needed? I don't worry that there are cards that can get 10-20 FPS more than I can, because I get my "100 FPS" of smooth FreeSync gaming. Diminishing returns for e-peen after that.
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs capped at 100 Hz? What was the point of it, really?

I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for FPS chasing. If anyone is concerned about extra power draw but then chases FPS, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
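The point being argued reduces to a single min(): with vsync on, the display shows at most its refresh rate, so render headroom above the cap is invisible. A minimal sketch (the render rates are made-up numbers; real vsync also quantizes frame delivery below the cap, which min() ignores):

```python
# With vsync, frames shown per second can never exceed the refresh rate,
# so two cards that both render above the cap look identical on screen.
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    return min(render_fps, refresh_hz)

refresh = 100  # the 100 Hz panels from the blind test
for card, fps in [("Vega 64", 110), ("GTX 1080 Ti", 143)]:  # illustrative rates
    print(f"{card}: renders {fps} fps, displays {displayed_fps(fps, refresh)} fps")
```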
 

bug

Joined
May 22, 2015
Messages
13,224 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for FPS chasing. If anyone is concerned about extra power draw but then chases FPS, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!

I think we can all agree on that, if you meant "once your min fps* hits your monitor's refresh rate". But even that is contingent on one title.

*or at least the 99th or 90th percentile, because the absolute minimum FPS is usually just a freak occurrence
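That footnote is the standard practice: take a percentile of the frame times rather than the single worst frame, which is usually a one-off spike. A sketch, with made-up frame times:

```python
import math

# Robust "minimum FPS": FPS at a high percentile of frame times, so one
# freak frame doesn't define the minimum. Frame times below are made up.
def percentile_fps(frame_times_ms, pct=99):
    times = sorted(frame_times_ms)
    idx = max(0, math.ceil(len(times) * pct / 100) - 1)
    return 1000.0 / times[idx]

frame_times = [10.0] * 95 + [12.0, 13.0, 14.0, 16.0, 40.0]  # one 40 ms spike
print(f"absolute min: {1000.0 / max(frame_times):.1f} fps")          # 25.0, the freak frame
print(f"99th-percentile min: {percentile_fps(frame_times):.1f} fps") # 62.5, spike ignored
```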
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
I think we can all agree on that, if you meant "once your min fps* hits your monitor's refresh rate". But even that is contingent on one title.

*or at least the 99th or 90th percentile, because the absolute minimum FPS is usually just a freak occurrence

Considering we couldn't get the basics down, I didn't want to go advanced yet. But yes, the extra processing power is much better for your minimums.
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for FPS chasing. If anyone is concerned about extra power draw but then chases FPS, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a vsync cap? Who ever heard of something that stupid?

What is the exact point at which we should stop "FPS chasing"? 90 FPS? 100 FPS? 85 FPS? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs. 130 FPS instantly. The blind test was pointless, just pulling wool over the public's eyes.
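For what it's worth, the gap being argued over is easy to put in frame-time terms; whether a roughly 2.3 ms difference per frame is perceptible is exactly the subjective part:

```python
# Frame time at each of the rates under discussion.
for fps in (100, 130):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")  # 10.0 ms vs ~7.7 ms
```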
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
ermahgerd.....

You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a vsync cap? Who ever heard of something that stupid?

This really is black and white. They weren't comparing GPUs; they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz, and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor, any GPU will likely work? Yes.

What is the exact point at which we should stop "FPS chasing"? 90 FPS? 100 FPS? 85 FPS? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs. 130 FPS instantly. The blind test was pointless, just pulling wool over the public's eyes.

This is perhaps the most black-and-white and easiest one. The correct answer is, drum roll please: your monitor's refresh rate!
 
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
ermahgerd.....



This really is black and white. They weren't comparing GPUs; they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz, and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor, any GPU will likely work? Yes.



This is perhaps the most black-and-white and easiest one. The correct answer is, drum roll please: your monitor's refresh rate!
Why didn't they go with a 144/165 Hz one for the tests then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the FPS difference immediately. Imagine you've played hundreds of hours of BF1 at 100 FPS and the same amount at 130 FPS. You'll see the difference right away.
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
your monitor's refresh rate!
Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort, and energy.

EDIT: Realistically, 120 Hz is about the physical limit of what the human eye can distinguish in real time. While we can still see a difference in "smoothness" above 120 Hz, it's only a slight perceptual difference and can be very subjective from person to person.
 
Last edited:
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Why didn't they go with a 144/165 Hz one for the tests then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the FPS difference immediately. Imagine you've played hundreds of hours of BF1 at 100 FPS and the same amount at 130 FPS. You'll see the difference right away.

We both know the answer to that question: the Vega likely can't do it. "Wow, that is disingenuous," you are probably telling yourself. The reason they did it that way is that it is very likely many more people play at 100 Hz and below. So that is who they targeted, where it would benefit them the most.

And I don't even want to imagine playing any BF title for hundreds of hours. It only brings up images of my own suicide.

Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort, and energy.

I don't play for sides. There is really nothing Cucker has said that is wrong; it just doesn't apply to what started this debate about the blind test.
 
Joined
Dec 6, 2016
Messages
152 (0.06/day)
System Name The cube
Processor AMD Ryzen 5700g
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800mhz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?

Seems to be at the end of the card.

You don't seem to get it, do you? NVIDIA has been steadily increasing the cost of its graphics cards for the last 4-5 generations, and swapping products around. With the 600 series, the GTX 680 was NOT actually their high-end product; no, that was the original Titan. So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 104 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 (GF104), and the high-end model was the GF100. Currently, the GP106 card is the GTX 1060, but what bore the 106 moniker a few generations ago? The GTS 450! Yeah. The GF106 is the GTS 450, which means the GP106 SHOULD have been the GTX 1050, NOT the 1060. NVIDIA keeps swapping names, making even more money off the backs of people like you, who are willing to spend the ludicrous amounts of money this bottomless pit of a company is asking for.

The 680 should have been the GK100, which NVIDIA later reworked into the GK110 for the 700 series (the Kepler refresh) and sold as the original Titan, a $1,000 card! No other high-end video card had sold for that much before: the GTX 480 had an MSRP of $499, and the 580 was $450. So NVIDIA doubled its profits with Kepler simply by moving the high-end and mid-range cards around its lineup and "inventing" two new models: the Titan and, later, the GK110B, the 780 Ti. But not before milking consumers with their initial run of defective GK110 chips, the GTX 780, or GK110-300.

Now NVIDIA is asking $400 for a mainstream card, almost as much as the 200, 400, and 500 series high-end models cost. But wait: the current flagship, the Titan RTX, is now 2,500 bloody dollars, and the 2080 Ti was $1,299 at launch, with prices reaching $1,500 in some places. In fact, it's still $1,500 at most online retailers in my country. F#(k that! $1,500 can buy you a lot of nice stuff: a decent car, a good bike, a boat, loads of clothes, a nice vacation. I'm not forking over that kind of money to NVIDIA for a class of product which has cost around the $400 mark for the better part of 18 freakin' years.

Don't you realize we're being taken for fools? Companies are treating us like idiots, and we're happy to oblige by forking out more cash for shittier products...
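The naming argument is easier to check laid out as data. A sketch restating the chips named above; the tier column follows the poster's reading of NVIDIA's x00/x04/x06 die scheme:

```python
# Chip codename -> (product it shipped in, die tier per the post's argument).
codename_to_product = {
    "GF100": ("GTX 480", "big die"),
    "GF104": ("GTX 460", "mid die"),
    "GF106": ("GTS 450", "small die"),
    "GK104": ("GTX 680", "mid die, sold at a flagship price point"),
    "GK110": ("Titan / GTX 780 / 780 Ti", "big die"),
    "GP106": ("GTX 1060", "small die"),
}
for chip, (product, tier) in codename_to_product.items():
    print(f"{chip}: {product} ({tier})")
```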
 
Last edited:
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
So... you'd rather have a 'high-end' card from a 'couple' generations ago than a modern card with the latest features that performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos and lower power use, given the same cost and performance.
 
Joined
Oct 2, 2015
Messages
2,991 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) AOC Q27G3XMN + Samsung S22F350
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse EVGA X15
Keyboard VSG Alnilam
Software Windows 11
So... you'd rather have a 'high-end' card from a 'couple' generations ago than a modern card with the latest features that performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos and lower power use, given the same cost and performance.
Amen to that. Plus warranty.
 
Joined
Jun 10, 2014
Messages
2,901 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
With the 600 series, the GTX 680 was NOT actually their high-end product; no, that was the original Titan.
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, which has two GK104 chips.

So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 104 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 (GF104), and the high-end model was the GF100. <snip>

The 680 should have been the GK100, which NVIDIA later reworked into the GK110 for the 700 series (the Kepler refresh) and sold as the original Titan, a $1,000 card!
I have to correct you there.
GK100 was bad, and NVIDIA had to do a fairly "last-minute" rebrand of the "GTX 670 Ti" into the "GTX 680" (I remember my GTX 680 box had stickers over all the product names). The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780 and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
 
Joined
Feb 26, 2016
Messages
548 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Logitech G910 Stickerbombed
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
$400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.
$400 for a mid-range card (see what a 2080 Ti can do) is not good.

NVIDIA doubled their prices because AMD is waiting until next year to propose something, because they are focusing on CPUs and don't have NVIDIA's or Intel's firepower. They can't do both GPUs and CPUs.
So you are bending over, waiting for the theft, for "a month or two".

Here is your invalidation.
You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor," and that's where the problems come in. It's NOT double; at most, I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's how customer satisfaction gets distorted: by misleading the consumer market into thinking it's absolutely not worth it, when in reality it's not like what you said.

No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, which has two GK104 chips.


I have to correct you there.
GK100 was bad, and NVIDIA had to do a fairly "last-minute" rebrand of the "GTX 670 Ti" into the "GTX 680" (I remember my GTX 680 box had stickers over all the product names). The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780 and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
The GTX 670 Ti doesn't exist; it's not a rebrand. Also, if I recall correctly, the 680 has the fully enabled GK104 chip (with the 690 being the same, but with two GK104 chips). First-gen Kepler capped out at 1,536 cores, second-gen Kepler at 2,880 cores. Just by architecture, not with the intent to rip people off.
 
Last edited:
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
It's a 750 mm² die with 11 GB of GDDR6 and hardware RT acceleration. You can have one at $1,000 or you can have none; they won't sell it to you at $600 when a Vega 64 is $400 and the 2080 Ti is 1.85x Vega's performance.
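The pricing logic in that claim can be made explicit: price scales faster than performance, which is the whole complaint in this thread. A quick check using the post's own figures:

```python
# Using the figures from the post above: Vega 64 at $400 as the baseline,
# 2080 Ti at $1,000 with 1.85x the performance.
v64_price, ti_price = 400, 1000
perf_ratio = 1.85

price_ratio = ti_price / v64_price      # 2.5x the price
value_ratio = perf_ratio / price_ratio  # 0.74x the performance per dollar
print(f"{price_ratio:.2f}x price for {perf_ratio}x perf -> {value_ratio:.2f}x perf/$")
```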




2080Ti RTX = V64 rasterization

 
Last edited:
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
They'll sell it for what people will pay. Which is apparently what it is listed at.
 