Sunday, December 30th 2018

NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

Here are some of the first pictures of NVIDIA's upcoming GeForce RTX 2060 Founders Edition graphics card. You'll know from our older report that there could be as many as six variants of the RTX 2060 based on memory size and type. The Founders Edition is based on the top-spec one with 6 GB of GDDR6 memory. The card looks similar in design to the RTX 2070 Founders Edition, which is probably because NVIDIA is reusing the reference-design PCB and cooling solution, minus two of the eight memory chips. The card continues to pull power from a single 8-pin PCIe power connector.

According to VideoCardz, NVIDIA could launch the RTX 2060 on the 15th of January, 2019. It could get an earlier unveiling by CEO Jen-Hsun Huang at NVIDIA's CES 2019 event, slated for January 7th. The top-spec RTX 2060 trim is based on the TU106-300 ASIC, configured with 1,920 CUDA cores, 120 TMUs, 48 ROPs, 240 tensor cores, and 30 RT cores. With an estimated FP32 compute performance of 6.5 TFLOP/s, the card is expected to perform on par with the GTX 1070 Ti from the previous generation in workloads that lack DXR. VideoCardz also posted performance numbers obtained from NVIDIA's Reviewer's Guide, which point to the same possibility.
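For context, the 6.5 TFLOP/s figure follows from the usual peak-throughput formula: CUDA cores × 2 FMA operations per clock × clock speed. A minimal sketch, assuming a boost clock of roughly 1.68 GHz (the clock is an assumption, not a confirmed specification):

```python
# Rough peak FP32 estimate for the rumored RTX 2060 configuration.
cuda_cores = 1920
boost_clock_ghz = 1.68  # assumed boost clock; not confirmed by NVIDIA

# Each CUDA core can retire one FMA (2 FLOPs) per clock cycle.
peak_tflops = cuda_cores * 2 * boost_clock_ghz / 1000
print(f"Estimated peak FP32: {peak_tflops:.2f} TFLOP/s")  # ~6.45 TFLOP/s
```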
In its Reviewer's Guide document, NVIDIA tested the RTX 2060 Founders Edition on a machine powered by a Core i9-7900X processor and 16 GB of memory. The card was tested at 1920 x 1080 and 2560 x 1440, the resolutions of its target consumer segment. Performance numbers obtained at both resolutions point to the card performing within ±5% of the GTX 1070 Ti (and possibly the RX Vega 56 from the AMD camp). The guide also mentions a suggested e-tail price (SEP) of USD 349.99 for the RTX 2060 6 GB.
Source: VideoCardz

234 Comments on NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

#201
INSTG8R
Vanguard Beta Tester
cucker tarlsonSo, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it really? The card could not deliver with pure performance numbers, so they capped the FPS when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse, if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 FPS? You'd both be the first people to heckle it, and rightly so.


:laugh:
the 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi gonna be crap? Even if they took Polaris at 7 nm and GDDR6, that'd make a good card.
The point of it was that 100 FPS of smooth gaming is possible. How much more is needed? I don't worry that there are cards that can get 10-20 FPS more than I can, because I get my "100 FPS" of smooth FreeSync gaming. Diminishing returns for e-peen after that.
Posted on Reply
#202
lexluthermiester
GoldenXSure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, just like Navi will be too.
That's definitely a set of wild predictions. Can't see either one happening.
Posted on Reply
#203
EarthDog
cucker tarlson:laugh:
the 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi gonna be crap? Even if they took Polaris at 7 nm and GDDR6, that'd make a good card.
Haters gonna hate... it's just that simple. :)
Posted on Reply
#204
lexluthermiester
EarthDogHaters gonna hate... it's just that simple. :)
Right. You'd think they'd give up or have something better to do...
Posted on Reply
#205
moproblems99
cucker tarlsonSo, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it really?
I mean, I really don't know what else to say because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for FPS chasing. If anyone is concerned about extra power draw but then chases FPS, they are silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
Posted on Reply
#206
bug
moproblems99I mean, I really don't know what else to say because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for FPS chasing. If anyone is concerned about extra power draw but then chases FPS, they are silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
I think we can all agree on that, if you meant "once your min FPS* hits your monitor's refresh rate". But even that is contingent on one title.

*or at least the 99th or 90th percentile, because the absolute min FPS is usually just a freak occurrence
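As an illustration, a percentile-based "minimum" can be computed from logged frame times like this (the frame times below are invented sample data, not benchmark results):

```python
# Invented frame times in milliseconds, standing in for a benchmark log.
frame_times_ms = [9.8, 10.1, 10.3, 9.9, 14.7, 10.0, 10.2, 25.4, 10.1, 9.7]

def percentile_low_fps(times_ms, pct=99):
    """FPS at the given percentile of frame time (99th percentile ~ '1% low')."""
    ordered = sorted(times_ms)
    idx = min(len(ordered) - 1, round(pct / 100 * (len(ordered) - 1)))
    return 1000.0 / ordered[idx]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
print(f"Average FPS:         {avg_fps:.1f}")                                  # ~83
print(f"99th-percentile low: {percentile_low_fps(frame_times_ms):.1f} FPS")     # ~39
print(f"90th-percentile low: {percentile_low_fps(frame_times_ms, 90):.1f} FPS") # ~68
```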
Posted on Reply
#207
moproblems99
bugI think we can all agree on that, if you meant "once your min FPS* hits your monitor's refresh rate". But even that is contingent on one title.

*or at least the 99th or 90th percentile, because the absolute min FPS is usually just a freak occurrence
Considering we couldn't get the basics down, I didn't want to go advanced yet. But yes, the extra processing power is much better for your minimums.
Posted on Reply
#208
cucker tarlson
moproblems99I mean, I really don't know what else to say because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for FPS chasing. If anyone is concerned about extra power draw but then chases FPS, they are silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and are not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a V-Sync cap? Who ever heard of something that stupid?

What is the exact point at which we should stop "FPS chasing"? 90 FPS? 100 FPS? 85 FPS? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs. 130 FPS instantly. The blind test was pointless and just pulled the wool over the public's eyes.
Posted on Reply
#209
moproblems99
ermahgerd.....
cucker tarlsonYou lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and are not a good way to test performance. I don't know how to express it in any simpler terms either. I mean, a GPU comparison with a V-Sync cap? Who ever heard of something that stupid?
This really is black and white. They weren't comparing GPUs, they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor, any GPU will likely work? Yes.
cucker tarlsonWhat is the exact point at which we should stop "FPS chasing"? 90 FPS? 100 FPS? 85 FPS? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs. 130 FPS instantly. The blind test was pointless and just pulled the wool over the public's eyes.
This is perhaps the most black and white and easiest. The correct answer is, drum roll please: your monitor's refresh rate!
Posted on Reply
#210
cucker tarlson
moproblems99ermahgerd.....



This really is black and white. They weren't comparing GPUs, they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor, any GPU will likely work? Yes.



This is perhaps the most black and white and easiest. The correct answer is, drum roll please: your monitor's refresh rate!
Why didn't they go with a 144/165 Hz one for the tests then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the FPS difference immediately. Imagine you've played hundreds of hours of BF1 at 100 FPS and the same amount at 130 FPS. You'll see the difference right away.
Posted on Reply
#211
lexluthermiester
moproblems99your monitor's refresh rate!
Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort and energy.

EDIT: Realistically, 120 Hz is the practical limit of what the human eye can distinguish in real time. While we can still see a difference in "smoothness" above 120 Hz, it's only a slight perceptual difference and can be very subjective from person to person.
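To put rough numbers on that, the frame-time budget at a given refresh rate is simply 1000 ms divided by the refresh rate; a quick illustrative calculation:

```python
# Frame-time budget per refresh rate: frames rendered faster than this
# cannot be shown as additional frames on a fixed-refresh display.
for refresh_hz in (60, 100, 120, 144, 165):
    budget_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:3d} Hz -> {budget_ms:5.2f} ms per frame")
```

At 100 Hz the budget is 10 ms per frame; at 144 Hz it drops to roughly 6.9 ms.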
Posted on Reply
#212
moproblems99
cucker tarlsonWhy didn't they go with a 144/165 Hz one for the tests then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the FPS difference immediately. Imagine you've played hundreds of hours of BF1 at 100 FPS and the same amount at 130 FPS. You'll see the difference right away.
We both know the answer to that question: the Vega likely can't do it. "Wow, that is disingenuous," you are probably telling yourself. The reason they did so is that it is very likely many more people play at 100 Hz and below. So that is who they targeted, where it would benefit them the most.

And I don't even want to imagine playing any BF title for hundreds of hours. It only brings up images of my own suicide.
lexluthermiesterNot to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort and energy.
I don't play for sides. There is really nothing Cucker has said that is wrong, just that it doesn't apply to what started this debate about the blind test.
Posted on Reply
#213
kanecvr
lexluthermiesterEveryone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?

Seems to be at the end of the card.
You don't seem to get it, do you? NVIDIA has been steadily increasing the cost of its graphics cards for the last 4-5 generations, and swapping products around. With the 6xx series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan. So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 4 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104, and the high-end model was the GF100. Currently, the GP106 chip is the GTX 1060 - but what bore the 106 moniker a few generations ago? The GTX 450! Yeah. The GF106 is the GTX 450, which means the GP106 SHOULD have been the GTX 1050, NOT the 1060. NVIDIA keeps swapping names, making even more money off the backs of people like you, who are willing to spend the ludicrous amounts of money this bottomless pit of a company is asking for.

The 680 should have been the GK100, which NVIDIA later renamed to GK110 because of the 700-series Kepler refresh - and sold as the original Titan - a $1,000 card! No other high-end video card sold for that much before - the GTX 480 had an MSRP of $499, and the 580 was $450 - so NVIDIA doubled their profits with Kepler by simply moving the high-end and mid-range cards around their lineup and "inventing" two new models - the Titan, and later, the GK110b, the 780 Ti - but not before milking consumers with their initial run of defective GK110 chips, the GTX 780, or GK110-300.

Now NVIDIA is asking $400 for a mainstream card, almost as much as the 200, 400 and 500 series high-end models cost. But wait - the current flagship, the Titan RTX, is now 2,500 bloody dollars, and the 2080 Ti was $1,299 at launch, with prices reaching $1,500 in some places. In fact, it's still $1,500 at most online retailers in my country. F#(k that! $1,500 can buy you a lot of nice stuff - a decent car, a good bike, a boat, loads of clothes, a nice vacation - I'm not forking over to NVIDIA to pay for a product which has cost around the $400 mark for the better part of 18 freakin' years.

Don't you realize we're being taken for fools? Companies are treating us like idiots, and we're happy to oblige by forking out more cash for shittier products...
Posted on Reply
#214
EarthDog
So... you'd rather have a 'high-end' card from a 'couple' generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos, and less power use given the same cost and performance.
Posted on Reply
#215
GoldenX
EarthDogSo... you'd rather have a 'high-end' card from a 'couple' generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos, and less power use given the same cost and performance.
Amen to that. Plus warranty.
Posted on Reply
#216
efikkan
kanecvrWith the 6xx series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan.
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, having two GK104 chips.
kanecvrSo they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G = GeForce, K = Kepler, 4 = model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104, and the high-end model was the GF100. <snip>

The 680 should have been the GK100, which NVIDIA later renamed to GK110 because of the 700-series Kepler refresh - and sold as the original Titan - a $1,000 card!
I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
Posted on Reply
#217
Berfs1
Renald$400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.
$400 for a mid-range card (see what a 2080 Ti can do) is not good.

Nvidia doubled their prices because AMD is waiting for next year to propose something, because they are focusing on CPUs and don't have Nvidia's or Intel's firepower. They can't do both GPUs and CPUs.
So you are bending over, waiting for the theft for "a month or two".

Here is your invalidation.
You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where problems come in. It's NOT double; at most I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's where customer satisfaction can be altered: by misleading a consumer market and saying it's absolutely not worth it, when in reality it's not like what you said.
efikkanNo, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, having two GK104 chips.


I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
The GTX 670 Ti doesn’t exist. It’s not a rebrand. Also, if I recall correctly, the 680 has the fully enabled GK104 chip (with the 690 being the same but two GK104 chips). Kepler 1st gen capped at 1536 cores, Kepler 2nd gen capped at 2880 cores. Just by architecture. Not for the intent to rip people off.
Posted on Reply
#218
cucker tarlson
It's a 750 mm² die with 11 GB of GDDR6 and hardware RT acceleration. You can have one at $1,000 or you can have none; they'll not sell it to you at $600 if a Vega 64 is $400 and the 2080 Ti is 1.85x Vega's performance.

2080 Ti RTX = V64 rasterization
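Taking those quoted figures at face value (they are the poster's numbers, not measured data), the cost per unit of relative performance works out roughly as follows:

```python
# Back-of-the-envelope price vs. relative performance, using the numbers
# quoted above; purely illustrative, not benchmark data.
cards = {
    "Vega 64":     {"price_usd": 400,  "relative_perf": 1.00},
    "RTX 2080 Ti": {"price_usd": 1000, "relative_perf": 1.85},
}

for name, card in cards.items():
    cost_per_perf = card["price_usd"] / card["relative_perf"]
    print(f"{name:11s}: ${cost_per_perf:.0f} per Vega 64-equivalent unit of performance")
```

On those numbers the 2080 Ti costs roughly 35% more per unit of performance, which is the premium being argued over.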

Posted on Reply
#219
moproblems99
They'll sell it for what people will pay. Which is apparently what it is listed at.
Posted on Reply
#220
Renald
Berfs1You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where problems come in. It's NOT double; at most I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's where customer satisfaction can be altered: by misleading a consumer market and saying it's absolutely not worth it, when in reality it's not like what you said.
No, it's not the comparison I wish to make. The 2080 Ti is way more powerful than a 2060, whatever memory setup is considered.

A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
If AMD were in the competition, I think NVIDIA would be much closer to that range of prices.
Posted on Reply
#221
lexluthermiester
RenaldA 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
Yeah, but you're not the one making them and having to recoup manufacturing and R&D costs. And for the record, 2080s are not far from the $600 you mentioned.
Posted on Reply
#222
wolf
Performance Enthusiast
kingsPeople compare this to the price of the GTX 1060, but forget the RX 480 was on the market at the same time to keep prices a bit lower! Basically, the closest competition to the RTX 2060 will be the RX Vega 56 (if we believe the leaks), which still costs upwards of $400~$450, except for the occasional promotion.

Unless AMD pulls something out of their hat in January, $350 to $400 for the RTX 2060 will be in tune with what AMD also offers! Nvidia, with their dominant position, is not interested in disrupting the market with price/performance.
xkm1948Nice to see a bunch of couch GPU designers and financial analysts know better than a multi-million-dollar GPU company regarding both technology and pricing. It is called capitalism for a reason; no competition means Nvidia can have free say on how much they price their cards. You don't like it? Then don't buy, good for you. Someone else likes it, they buy, and it is entirely their own business. NVIDIA is "greedy"? Sure, yeah, they better f*cking be greedy. They are a for-profit company, not a f*cking charity.
Good to see a few people out there are onto it, and probably others too, I just haven't quoted you all. In the absence of competition at certain price points, which AMD has generally been able to provide at least in the low/mid/upper-midrange (and often top-tier) segments for some time, Nvidia just has the ability to charge a premium for premium performance. Add to that the fact that the upper-end RTX chips are enormous and use newer, more expensive memory, and yeah, you find them charging top dollar for them, and so they should in that position.

As has been said, don't like it? Vote with your wallet! I sure have. I bought a GTX 1080 at launch and ~2.5 years later I personally have no compelling reason to upgrade. That comes down to my rig, screen, available time to game, what I play, price/performance, etc.; add it all together and that equation is different for every buyer.

Do I think 20 series RTX is worth it? Not yet, but I'm glad someone's doing it. I've seen BFV played with it on and I truly hope ray tracing is in the future of gaming.

My take is that when one or both of these two things happen, prices will drop, perhaps by a negligible amount, perhaps significantly:

1. Nvidia clears out all (or virtually all) 10 series stock, which the market still seems hungry for, partly because many of those cards offer more than adequate performance for the particular consumer's needs.
2. AMD answers the 20 series' upper-level performance, or releases cards matching 1080/2070/Vega performance at lower prices (or, again, both).
Posted on Reply
#223
bug
lexluthermiesterYeah, but you're not the one making them and having to recoup manufacturing and R&D costs. And for the record, 2080s are not far from the $600 you mentioned.
My guess would be at least some of the R&D has been covered by Volta. It's the sheer die size that makes Turing so damn expensive.
If Nvidia manages to tweak the hardware for their 7nm lineup, then we'll have a strong proposition for DXR. Otherwise, we'll have to wait for another generation.
Posted on Reply
#224
lexluthermiester
bugMy guess would be at least some of the R&D has been covered by Volta.
Maybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.
bugIt's the sheer die size that makes Turing so damn expensive.
That is what I was referring to with manufacturing costs. Dies that size are pricey, even if you manage a high yield of good dies per wafer. That price goes way up if you can't manage at least an 88% wafer yield, which will be challenging given the total size of a functional die.
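As a rough illustration of that point, here is a sketch using the classic gross-dies-per-wafer approximation and a simple Poisson yield model; the die area is approximate and the defect density is an assumed value, not a known figure for the actual process:

```python
import math

# Assumptions for illustration only.
wafer_diameter_mm = 300.0
die_area_mm2 = 754.0          # approx. TU102-class die size
defect_density_per_cm2 = 0.1  # assumed defect density

# Classic gross dies-per-wafer approximation (accounts for edge loss).
radius_mm = wafer_diameter_mm / 2
gross_dies = (math.pi * radius_mm**2 / die_area_mm2
              - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Simple Poisson yield model: yield = exp(-defect_density * die_area).
yield_fraction = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

print(f"Gross dies per wafer: {gross_dies:.0f}")                   # ~69
print(f"Estimated yield:      {yield_fraction:.0%}")               # ~47%
print(f"Good dies per wafer:  {gross_dies * yield_fraction:.0f}")  # ~33
```

A die that large both reduces the number of candidates per wafer and raises the chance that any given die catches a defect, which is why the cost climbs so quickly.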
Posted on Reply
#225
bug
lexluthermiesterMaybe, but how many Volta cards have they actually sold? Even so, Volta and Turing are not the same. They have similarities, but are different enough that a lot of R&D is unrelated and doesn't cross over.
Well, Volta is only for professional cards. The Quadro goes for $9,000, God knows how much they charge for a Tesla.
And about differences, I'm really not aware of many, save for the tweaks Nvidia did to make it more fit for DXR (and probably less fit for general computing). I'm sure Anand did a piece highlighting the differences (and I'm sure I read it), but nothing striking has stuck.

That said, yes, R&D does not usually pay off after just one iteration. I was just saying they've already made some of that back.
Posted on Reply