Sunday, December 30th 2018

NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

Here are some of the first pictures of NVIDIA's upcoming GeForce RTX 2060 Founders Edition graphics card. You'll know from our older report that there could be as many as six variants of the RTX 2060 based on memory size and type. The Founders Edition is based on the top-spec one with 6 GB of GDDR6 memory. The card looks similar in design to the RTX 2070 Founders Edition, which is probably because NVIDIA is reusing the reference-design PCB and cooling solution, minus two of the eight memory chips. The card continues to pull power from a single 8-pin PCIe power connector.

According to VideoCardz, NVIDIA could launch the RTX 2060 on the 15th of January, 2019. It could get an earlier unveiling by CEO Jen-Hsun Huang at NVIDIA's CES 2019 event, slated for January 7th. The top-spec RTX 2060 trim is based on the TU106-300 ASIC, configured with 1,920 CUDA cores, 120 TMUs, 48 ROPs, 240 tensor cores, and 30 RT cores. With an estimated FP32 compute performance of 6.5 TFLOP/s, the card is expected to perform on par with the previous-generation GTX 1070 Ti in workloads that lack DXR. VideoCardz also posted performance numbers obtained from NVIDIA's Reviewer's Guide, which point to the same possibility.
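As an aside, that 6.5 TFLOP/s figure is easy to sanity-check: each CUDA core retires one FMA (two FP32 operations) per clock, so peak throughput is cores × 2 × clock. A quick sketch, with the ~1.7 GHz boost clock being our assumption rather than a leaked figure:

```python
def fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: cores x 2 FLOPs/cycle x clock (GHz) / 1000."""
    return cuda_cores * 2 * boost_clock_ghz / 1000.0

# 1,920 CUDA cores at an assumed ~1.7 GHz boost clock
print(round(fp32_tflops(1920, 1.7), 2))  # -> 6.53
```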
In its Reviewer's Guide document, NVIDIA tested the RTX 2060 Founders Edition on a machine powered by a Core i9-7900X processor and 16 GB of memory. The card was tested at 1920 x 1080 and 2560 x 1440, the resolutions of its target consumer segment. Performance numbers obtained at both resolutions point to the card performing within ±5% of the GTX 1070 Ti (and possibly the RX Vega 56 from the AMD camp). The guide also mentions SEP pricing of $349.99 for the RTX 2060 6 GB.
Source: VideoCardz

234 Comments on NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

#201
cucker tarlson
moproblems99 said:
This isn't difficult. Let's even remove an exact refresh rate to make it easier. When two cards are vsynced to refresh rates, you can't tell them apart.
Was that the point of the blind test? You seem to have no idea what I'm referring to.
Posted on Reply
#202
moproblems99
The point of the test was that you couldn't tell which card was in which machine. That is what a blind test is used for. I still don't see why this is so difficult.
Posted on Reply
#203
Fouquin
efikkan said:
Wrong, you missed the point: the prices must be adjusted for inflation to give a fair comparison.

That does not mean that a price increase is pure inflation; there are many factors here, including varying production costs, competition, etc. But these can only be compared after we have the price corrected for inflation; otherwise, any comparison is pointless.
You missed the entire point of the comparison. It was a chart simply showing the launch price of each flagship. It was not a comparison across multiple generations (i.e., not meant to compare the adjusted value of non-consecutive products).

The entire point of the chart was to show that prices did not increase with each new generation. No adjusted value required.
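For what it's worth, the inflation correction being argued over here is a single CPI ratio. A quick sketch; the CPI figures are rough US annual averages plugged in purely for illustration, not numbers from the thread:

```python
def adjust_for_inflation(price: float, cpi_then: float, cpi_now: float) -> float:
    """Convert a historical price into current dollars via the CPI ratio."""
    return price * cpi_now / cpi_then

# GTX 480's $499 launch price (2010) in ~2018 dollars,
# using approximate US CPI annual averages (218.1 for 2010, 251.1 for 2018)
print(round(adjust_for_inflation(499, 218.1, 251.1)))  # -> 575
```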
Posted on Reply
#204
GoldenX
lexluthermiester said:
Not for sure we don't. Waiting to see the benchmarks and reviews.
Sure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, just like Navi will be.
Posted on Reply
#205
cucker tarlson
moproblems99 said:
The point of the test was that you couldn't tell which card was in which machine. That is what a blind test is used for. I still don't see why this is so difficult.
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the fps when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse, if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 fps? You'd both be the first people to heckle it, and rightly so.

GoldenX said:
Sure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, just like Navi will be.
:laugh:
The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they took Polaris to 7 nm and GDDR6, that'd make a good card.
Posted on Reply
#206
INSTG8R
My Custom Title
cucker tarlson said:
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really? The card could not deliver on pure performance numbers, so they capped the fps when comparing it against the 1080 Ti in the most laughable AMD-sponsored test I've seen :laugh: Can you imagine the reverse, if Navi beat the 2060 by 30% and someone did an NVIDIA-sponsored blind test where they're both delivering 100 fps? You'd both be the first people to heckle it, and rightly so.

:laugh:
The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they took Polaris to 7 nm and GDDR6, that'd make a good card.
The point of it was that 100 FPS of smooth gaming is possible. How much more is needed? I don't worry that there are cards that can get 10-20 FPS more than I can, because I get my "100 FPS" of smooth FreeSync gaming. Diminishing returns for e-peen after that.
Posted on Reply
#207
lexluthermiester
GoldenX said:
Sure, here are some wild predictions for when it happens: the 4 GB 128-bit variant will be crap, just like Navi will be.
That's definitely a set of wild predictions. Can't see either one happening.
Posted on Reply
#208
EarthDog
cucker tarlson said:
:laugh:
The 4 GB 128-bit variant will easily match the 1070 at 1080p/1440p if the 2060 6 GB leak is true. And how is Navi going to be crap? Even if they took Polaris to 7 nm and GDDR6, that'd make a good card.
Haters gonna hate... it's just that simple. :)
Posted on Reply
#209
lexluthermiester
EarthDog said:
Haters gonna hate... it's just that simple. :)
Right. You'd think they'd give up or have something better to do..
Posted on Reply
#210
moproblems99
cucker tarlson said:
So, what was the point of having two cards, a V64 and a 1080 Ti, which are 30% apart in performance, in two rigs that are capped at 100 Hz? What was the point of it, really?
I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing. If anyone is concerned about extra power draw but then chases fps, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
Posted on Reply
#211
bug
moproblems99 said:
I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing. If anyone is concerned about extra power draw but then chases fps, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
I think we can all agree on that, if you meant "once your min fps* hits your monitor's refresh rate". But even that is contingent on a single title.

*or at least the 99th or 90th percentile, because the absolute min fps is usually just a freak occurrence
Posted on Reply
#212
moproblems99
bug said:
I think we can all agree on that, if you meant "once your min fps* hits your monitor's refresh rate". But even that is contingent on a single title.

*or at least the 99th or 90th percentile, because the absolute min fps is usually just a freak occurrence
Considering we couldn't get the basics down, I didn't want to go advanced yet. But yes, the extra processing power is much better for your minimums.
Posted on Reply
#213
cucker tarlson
moproblems99 said:
I mean, I really don't know what else to say, because we just covered it over the last two pages of posts and have come full circle here. Once you hit your monitor's refresh rate, any 'extra' performance is going down the toilet for fps chasing. If anyone is concerned about extra power draw but then chases fps, they are being silly. There is simply no point in worrying about frames over your monitor's refresh rate. That was the whole purpose of the test.

If you can't understand that, good day!
You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and not a good way to test performance. I don't know how to express it in simpler terms. I mean, a GPU comparison with a V-sync cap? Who ever heard of something that stupid.

What is the exact point at which we should stop "fps chasing"? 90 fps? 100 fps? 85 fps? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs 130 fps instantly. The blind test was pointless, just pulling wool over the public's eyes.
Posted on Reply
#214
moproblems99
ermahgerd.....

cucker tarlson said:
You lay it down in black and white, yet you can't understand. Tests that are limited in any way, in this case by the resolution/refresh rate (100 Hz) and the choice of games (Doom, Shadow Warrior), are just pointless and not a good way to test performance. I don't know how to express it in simpler terms. I mean, a GPU comparison with a V-sync cap? Who ever heard of something that stupid.
This really is black and white. They weren't comparing GPUs, they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor's refresh rate, any GPU will likely work? Yes.

cucker tarlson said:
What is the exact point at which we should stop "fps chasing"? 90 fps? 100 fps? 85 fps? Can you please tell us? Because I was under the impression that it's subjective in every case. I can absolutely feel 100 vs 130 fps instantly. The blind test was pointless, just pulling wool over the public's eyes.
This one is perhaps the most black and white, and the easiest. The correct answer is, drum roll please: your monitor's refresh rate!
Posted on Reply
#215
cucker tarlson
moproblems99 said:
ermahgerd.....

This really is black and white. They weren't comparing GPUs, they were comparing scenarios. It went like this: hey, you can play X game at Y resolution at Z Hz and you can't tell the difference. Which is absolutely correct. Was it the most eye-opening test ever? No. Did it show people that if you target your monitor's refresh rate, any GPU will likely work? Yes.

This one is perhaps the most black and white, and the easiest. The correct answer is, drum roll please: your monitor's refresh rate!
Why didn't they go with a 144/165 Hz one for the tests, then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the fps difference immediately. Imagine you've played hundreds of hours of BF1 at 100 fps and the same amount at 130 fps. You'd see the difference right away.
Posted on Reply
#216
lexluthermiester
moproblems99 said:
your monitor's refresh rate!
Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort, and energy.

EDIT: Realistically, 120 Hz is the physical limit of what the human eye can distinguish in real time. While we can still see a difference in "smoothness" above 120 Hz, it's only a slight perceptual difference and can be very subjective from person to person.
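The "stop at the refresh rate" idea being debated is what a frame limiter implements. A minimal sketch of the concept (the render callback is a placeholder; real games would rely on V-sync or a driver-level limiter):

```python
import time

def run_capped(render_frame, target_fps: float, frames: int) -> None:
    """Run a render loop capped at target_fps by sleeping off leftover frame time."""
    frame_time = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # placeholder for the actual GPU work
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:          # finished early: idle instead of racing ahead
            time.sleep(frame_time - elapsed)
```

A faster GPU simply spends more of each frame slice sleeping, which is the crux of the argument above.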
Posted on Reply
#217
moproblems99
cucker tarlson said:
Why didn't they go with a 144/165 Hz one for the tests, then? And it's obvious you're unlikely to see the difference when they just tell you to sit down and play a couple of games for a couple of minutes each. But in the games you play extensively at home, you'll easily see the fps difference immediately. Imagine you've played hundreds of hours of BF1 at 100 fps and the same amount at 130 fps. You'd see the difference right away.
We both know the answer to that question: the Vega likely can't do it. "Wow, that is disingenuous," you are probably telling yourself. The reason they did it this way is that it is very likely many more people play at 100 Hz and below. So that is who they targeted - where it would benefit them the most.

And I don't even want to imagine playing any BF title for hundreds of hours. It only brings up images of my own suicide.

lexluthermiester said:
Not to take sides here, because both of you have made fair points, but the above is correct. Once a system's minimum FPS matches or exceeds the maximum refresh rate of the display, FPS chasing becomes a waste of time, effort, and energy.
I don't play for sides. There is really nothing Cucker has said that is wrong, just that it doesn't apply to what started this debate about the blind test.
Posted on Reply
#218
kanecvr
lexluthermiester said:
Everyone is complaining about price. Is it really that expensive? And if so, is it really that difficult to save money for an extra month or two?

Seems to be at the end of the card.
You don't seem to get it, do you? NVIDIA has been steadily increasing the cost of its graphics cards for the last 4-5 generations, and swapping products around. With the 6xx series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan. So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G=GeForce, K=Kepler, 4=model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104 - and the high-end model was the GF100. Currently, the GP106 card is the GTX 1060 - but what bore the 106 moniker a few generations ago? The GTS 450! Yeah. The GF106 is the GTS 450, which means the GP106 SHOULD have been the GTX 1050, NOT the 1060. NVIDIA keeps swapping names, making even more money off the backs of people like you, who are willing to spend the ludicrous amounts of money this bottomless pit of a company is asking for.

The 680 should have been the GK100, which NVIDIA later reworked into the GK110 for the 7 series Kepler refresh - and sold as the original Titan - a $1,000 card! No other high-end video card had sold for that much before - the GTX 480 had an MSRP of $499, and the 580 was $499 - so NVIDIA doubled its profits with Kepler by simply moving the high-end and mid-range cards around its lineup and "inventing" two new models - the Titan, and later the GK110B, the 780 Ti - but not before milking consumers with their initial run of defective GK110 chips, the GTX 780, or GK110-300.

Now NVIDIA is asking $400 for a mainstream card, almost as much as the 2, 4, and 5 series high-end models cost. But wait - the current flagship, the Titan RTX, is now 2,500 bloody dollars, and the 2080 Ti was $1,199 at launch, with prices reaching $1,500 in some places. In fact, it's still $1,500 at most online retailers in my country. F#(k that! $1,500 can buy you a lot of nice stuff - a decent car, a good bike, a boat, loads of clothes, a nice vacation - I'm not forking over to NVIDIA to pay for a product which has cost around the $400 mark for the better part of 18 freakin' years.

Don't you realize we're being taken for fools? Companies are treating us like idiots, and we're happy to oblige by forking out more cash for shittier products...
Posted on Reply
#219
EarthDog
So... you'd rather have a 'high-end' card from a 'couple' generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos and less power use, given the same cost and performance.
Posted on Reply
#220
GoldenX
EarthDog said:
So... you'd rather have a 'high-end' card from a 'couple' generations ago than a modern card with the latest features which performs the same (or better)... just because it's labeled as a midrange card?

I'd rather go with the modern card with all the latest gizmos and less power use, given the same cost and performance.
Amen to that. Plus warranty.
Posted on Reply
#221
efikkan
kanecvr said:
With the 6xx series, the GTX 680 was NOT actually their high-end product - no - that was the original Titan.
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, having two GK104 chips.

kanecvr said:

So they released the 680, which is a mid-range card branded as a high-end one. The codename for the GPU is GK104 (G=GeForce, K=Kepler, 4=model number). In the previous generation, Fermi, the 104 numbering was assigned to the GTX 460 - GF104 - and the high-end model was the GF100. <snip>

The 680 should have been the GK100, which NVIDIA later reworked into the GK110 for the 7 series Kepler refresh - and sold as the original Titan - a $1,000 card!
I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
Posted on Reply
#222
Berfs1
Renald said:
$400 for a high-end card (meaning it's the best a company can do), like the 290 for the record, is understandable.
$400 for a mid-range card (see what a 2080 Ti can do) is not good.

NVIDIA doubled their prices because AMD is waiting for next year to propose something, because they are focusing on CPUs and don't have NVIDIA's or Intel's firepower. They can't do both GPUs and CPUs.
So you are bending over, waiting for the theft, "a month or two".

Here is your invalidation.
You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where the problems come in. It's NOT double; at most, I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's where customer satisfaction can be altered: by misleading a consumer market and saying it's absolutely not worth it, when in reality it's not like what you said.

efikkan said:
No, the first Titan was released along with the 700 series.
The top model of the 600 series was the GTX 690, having two GK104 chips.


I have to correct you there.
GK100 was bad and Nvidia had to do a fairly "last minute" rebrand of "GTX 670 Ti" into "GTX 680". (I remember my GTX 680 box had stickers over all the product names) The GK100 was only used for some compute cards, but the GK110 was a revised version, which ended up in the GTX 780, and was pretty much what the GTX 680 should have been.

You have to remember that Kepler was a major architectural redesign for Nvidia.
The GTX 670 Ti doesn't exist; it's not a rebrand. Also, if I recall correctly, the 680 has the fully enabled GK104 chip (with the 690 being the same but with two GK104 chips). First-gen Kepler capped at 1,536 cores; second-gen Kepler capped at 2,880 cores. That's just the architecture, not an intent to rip people off.
Posted on Reply
#223
cucker tarlson
It's a 750 mm² die with 11 GB of GDDR6 and hardware RT acceleration. You can have one at $1,000 or you can have none; they'll not sell it to you at $600 if Vega 64 is $400 and the 2080 Ti is 1.85x Vega's performance.

2080 Ti RTX = V64 rasterization
Posted on Reply
#224
moproblems99
They'll sell it for what people will pay. Which is apparently what it is listed at.
Posted on Reply
#225
Renald
Berfs1 said:
You are saying a 2080 Ti should be $600? Because they "doubled" it? I know you are going to say "it was a metaphor", and that's where the problems come in. It's NOT double; at most, I would say they charged 10-25% more than it's worth. 10-25% is NOT double. That's where customer satisfaction can be altered: by misleading a consumer market and saying it's absolutely not worth it, when in reality it's not like what you said.
No, that's not the comparison I wish to make. The 2080 Ti is way more powerful than a 2060, whatever memory setup is considered.

A 2080 Ti should cost ~$800-900 IMO, and a 2080 $600.
If AMD were in the competition, I think NVIDIA would be much closer to that range of prices.
Posted on Reply