Sunday, December 30th 2018

NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

Here are some of the first pictures of NVIDIA's upcoming GeForce RTX 2060 Founders Edition graphics card. You'll know from our older report that there could be as many as six variants of the RTX 2060 based on memory size and type. The Founders Edition is based on the top-spec one with 6 GB of GDDR6 memory. The card looks similar in design to the RTX 2070 Founders Edition, which is probably because NVIDIA is reusing the reference-design PCB and cooling solution, minus two of the eight memory chips. The card continues to pull power from a single 8-pin PCIe power connector.

According to VideoCardz, NVIDIA could launch the RTX 2060 on the 15th of January, 2019. It could get an earlier unveiling by CEO Jen-Hsun Huang at NVIDIA's CES 2019 event, slated for January 7th. The top-spec RTX 2060 trim is based on the TU106-300 ASIC, configured with 1,920 CUDA cores, 120 TMUs, 48 ROPs, 240 tensor cores, and 30 RT cores. With an estimated FP32 compute performance of 6.5 TFLOP/s, the card is expected to perform on par with the previous-generation GTX 1070 Ti in workloads that lack DXR. VideoCardz also posted performance numbers, obtained from NVIDIA's Reviewer's Guide, that point to the same possibility.
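The 6.5 TFLOP/s estimate is consistent with the leaked core count: FP32 throughput is roughly 2 FLOPs (one fused multiply-add) per CUDA core per clock. A quick back-of-the-envelope check, where the ~1.68 GHz boost clock is an assumption on our part rather than a confirmed spec:

```python
# Back-of-the-envelope FP32 throughput for the rumored RTX 2060 spec.
cuda_cores = 1920
boost_clock_ghz = 1.68   # assumed boost clock; not a confirmed figure
flops_per_core = 2       # one fused multiply-add per core per clock

tflops = cuda_cores * boost_clock_ghz * flops_per_core / 1000
print(f"{tflops:.2f} TFLOP/s")  # prints 6.45 TFLOP/s
```

Any boost clock in the high-1.6 GHz range lands near the quoted 6.5 TFLOP/s figure.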
In its Reviewer's Guide document, NVIDIA tested the RTX 2060 Founders Edition on a machine powered by a Core i9-7900X processor and 16 GB of memory. The card was tested at 1920 x 1080 and 2560 x 1440, the resolutions of its target consumer segment. Performance numbers obtained at both resolutions point to the card performing within ±5% of the GTX 1070 Ti (and possibly the RX Vega 56 from the AMD camp). The guide also mentions an SEP of USD 349.99 for the RTX 2060 6 GB.
Source: VideoCardz

234 Comments on NVIDIA GeForce RTX 2060 Founders Edition Pictured, Tested

#176
cucker tarlson
eidairaman1, post: 3968051, member: 40556"
PS: I vote with my wallet because I am still on my 290 VaporX. I also don't advocate for a company with the color of greed.
How many times are you going to mention that you bought an AMD card 5 years ago, as if that means you're supporting the right company?

Fact is, they're using 12 nm to launch a $350 card that, if the leak is to be believed, outperforms AMD's $400 V56 and nearly matches the $500 V64. It has RT and DLSS support too, while all Vega has is the gimmicky HBCC that they hyped, but which turned out to have zero impact. Meanwhile, all AMD did this year was launch a 12 nm flop at $280 that is barely faster than an RX 580, OC vs. OC.

NVIDIA has had the better-value cards at the ~$350-400 price point for quite some time, starting with the GTX 970, then the 1070 and 1070 Ti, and the RTX 2060 is going to be no exception. They didn't make the jump to 7 nm yet, but they are still bringing 1080/V64 performance down to the $350 level at 12 nm.
Posted on Reply
#177
Fouquin
efikkan, post: 3968509, member: 150226"
You have to calculate the inflated price for a fair comparison.
The 8800 Ultra becomes ~$1007
The 8800 GTX (which launched at $649) becomes ~$811
Prices for Turing don't look so bad then…
You don't need to adjust MSRP for inflation between generations, because the inflation of the US Dollar between two product launches is not enough to significantly impact the MSRP. For example, the value of the US Dollar between the GTX 680 and GTX 780 launches would not have changed enough to justify the $150 price increase as compensation. This was not a value-adjusted comparison; it was a generational comparison.
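For reference, the adjustment being argued about here is just scaling an old MSRP by a ratio of consumer price index values. A minimal sketch in Python (the function name is ours, and the CPI-U figures are approximate annual averages used for illustration, which is why the result lands near, but not exactly on, the ~$811 quoted above; the exact figure depends on which months' CPI you pick):

```python
# Convert a historical launch price into today's dollars via a CPI ratio.
# The CPI-U values below are approximate annual averages (illustrative).
def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a past price by the ratio of the two CPI index values."""
    return price * cpi_now / cpi_then

cpi_2007, cpi_2018 = 207.3, 251.1  # approximate US CPI-U averages

# 8800 GTX, launched at $649 per the quote above:
print(round(adjust_for_inflation(649, cpi_2007, cpi_2018)))  # prints 786
```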
Posted on Reply
#178
jabbadap
dj-electric, post: 3968525, member: 87186"
My 2 cents on pricing:

The 8800 GTX was $600, or $725 today.
The 8800 Ultra was $800, or $999 today.
The 8800 GT was only $300, or $360 today.
All with inflation.

Here's the thing. The 8800 GT gave you about 75%-80% of the performance the 8800 Ultra did for almost a third of its price. It was fairly close in performance, yet much cheaper.
That's how things were sold to us. The 8800 Ultra was the cream of the crop for fine diners, while offering slightly higher performance.

Today, you have:
RTX 2080 Ti for $999
RTX 2080 for $799
RTX 2070 for $499

The performance difference between the RTX 2070 and RTX 2080 Ti is quite substantial. You don't get 75% of flagship performance for half of its price. You get just a little over half.

The flagship-to-mainstream ratio is very different today than it was 10-11 years ago, even if "cards were still sold at over $750". There was compensation in the mid-range; today, the mid-range is absolutely cruel.
The only things $300 will get you are an RX 590 or a GTX 1060 GDDR5X, both with 2+ year-old technologies in them.
Uhm, the 8800 GT MSRP was $199, and the same-chip 8800 GTS was $349. And not to mention the 8800 GTX was manufactured on the 90 nm node, while the 8800 GT, half a year later, was on the smaller 65 nm node.
Posted on Reply
#179
moproblems99
Renald, post: 3968497, member: 130756"
People are complaining because NVIDIA is putting crazy prices on something that is easy for them to produce. End of story.
And stop with "vote with your wallet" or "learn how to save money". I could build myself a rig right now with the latest Threadripper or i9 + 2 TB SSD + 2080 Ti and not even feel it in my accounts. It's a question of taking people for idiots, and NVIDIA has been doing it for many years now.
Honestly, there wasn't much they could do as a publicly traded company. The 10 series is already better than anything AMD has, and everyone had a serious surplus. Then they released cards that are faster than their last gen. Why lower the prices on something that is already better than the competition? Unfortunately for us, that is just not good business. People need to figure out what they value and stick to it until Navi comes out. Then they had better hope that Navi is mostly competitive. I surmise it will be similar to Vega vs. Pascal.

cucker tarlson, post: 3968529, member: 173472"
NVIDIA has had the better-value cards at the ~$350-400 price point for quite some time, starting with the GTX 970, then the 1070 and 1070 Ti, and the RTX 2060 is going to be no exception. They didn't make the jump to 7 nm yet, but they are still bringing 1080/V64 performance down to the $350 level at 12 nm.
That really depends on the situation. Those of us enjoying FreeSync have a much better value metric when combining the cost of a monitor with that of a GPU. Do we have the bestest and fastest? No. Do we have a solution that works? Yes.

EDIT: For what it's worth, I have $1,000 into my monitor and GPU. Going by these leaks, a 2060 and a G-Sync monitor are going to cost a little bit more than that. How bad is the value really?
Posted on Reply
#180
efikkan
Fouquin, post: 3968530, member: 157604"
You don't need to adjust MSRP for inflation between generations, because the inflation of the US Dollar between two product launches is not enough to significantly impact the MSRP. For example, the value of the US Dollar between the GTX 680 and GTX 780 launches would not have changed enough to justify the $150 price increase as compensation. This was not a value-adjusted comparison; it was a generational comparison.
Wrong; you missed the point. Prices must be adjusted for inflation to give a fair comparison.

That does not mean a price increase is pure inflation; there are many factors here, including varying production costs, competition, etc. But these can only be compared after the price has been corrected for inflation, otherwise any comparison is pointless.
Posted on Reply
#181
lexluthermiester
Renald, post: 3968497, member: 130756"
I didn't read all 7 pages but I have to respond to that because you are clearly misunderstanding the situation and manipulating information.
And that is why you are way behind the conversation. But hey, that's ok. Context means very little anyway.. :rolleyes:
B-Real, post: 3968522, member: 170068"
Haha. 2 pathetic tries to defend this gen.
Key word there. Roll credits!
Posted on Reply
#182
cucker tarlson
moproblems99, post: 3968543, member: 155919"
Honestly, there wasn't much they could do as a publicly traded company. The 10 series is already better than anything AMD has, and everyone had a serious surplus. Then they released cards that are faster than their last gen. Why lower the prices on something that is already better than the competition? Unfortunately for us, that is just not good business. People need to figure out what they value and stick to it until Navi comes out. Then they had better hope that Navi is mostly competitive. I surmise it will be similar to Vega vs. Pascal.



That really depends on the situation. Those of us enjoying FreeSync have a much better value metric when combining the cost of a monitor with that of a GPU. Do we have the bestest and fastest? No. Do we have a solution that works? Yes.

EDIT: For what it's worth, I have $1,000 into my monitor and GPU. Going by these leaks, a 2060 and a G-Sync monitor are going to cost a little bit more than that. How bad is the value really?
Very true. I myself can't imagine playing without variable refresh rate anymore. I've had it since 2015, and frame synchronisation with low input lag is an absolute must for me; with G-Sync/FreeSync you're able to achieve both, and effortlessly. An example from a game that I play a lot, Shadow Warrior 2: you're playing at 150 fps, when suddenly there are all sorts of explosions going on the screen and the framerate drops to 80 for a couple of seconds. With VRR you get no tearing and no feeling of stutter while this is happening. The framerate drop is still noticeable, but the transition is very smooth.

However, that is not the point. AMD themselves tried to deflect from the fact that the Vega series is simply worse than NVIDIA's offerings with blind tests (the dumbest thing a GPU manufacturer has done in the last decade, or ever IMO: "our card is as good if you don't look at the fps counter" :rolleyes: ) or by including a FreeSync monitor in the price. What followed was the fake-MSRP scam, where they tried to sway the public and reviewers into thinking the V64 was a $400 card, while in reality it cost somewhere between a 1080 and a 1080 Ti.

Those blind tests and the fake MSRP were the reason I started to perceive RTG in a completely different light, and I laugh when I see people blindly defending it while bashing NVIDIA for being shady.
Posted on Reply
#183
Renald
moproblems99, post: 3968543, member: 155919"
Honestly, there wasn't much they could do as a publicly traded company. The 10 series is already better than anything AMD has, and everyone had a serious surplus. Then they released cards that are faster than their last gen. Why lower the prices on something that is already better than the competition? Unfortunately for us, that is just not good business. People need to figure out what they value and stick to it until Navi comes out. Then they had better hope that Navi is mostly competitive. I surmise it will be similar to Vega vs. Pascal.
And I agree; sadly, there's not an inch of competition right now. But AMD is fueling consoles and Zen, and that's quite good, no?
In the end, by lowering the need for high-end cards, AMD is saving their asses. That's why those cards aren't relevant in the current market; few people need that much power outside of 4K.
Let's wait for Navi.

Edit :
@lexluthermiester : got your attention, no need for insults
Posted on Reply
#184
lexluthermiester
cucker tarlson, post: 3968562, member: 173472"
AMD themselves tried to deflect from the fact that the Vega series is simply worse than NVIDIA's offerings with blind tests (the dumbest thing a GPU manufacturer has done in the last decade, or ever IMO: "our card is as good if you don't look at the fps counter" :rolleyes: )
While that's not wrong, let's be fair: Radeon cards are a good value for the price/performance ratio they offer. This has been demonstrated in many a review and benchmark.

Renald, post: 3968568, member: 130756"
@lexluthermiester : stop acting like a douche.
Take your own advice sir.
Posted on Reply
#185
cucker tarlson
lexluthermiester, post: 3968569, member: 134537"
While that's not wrong, let's be fair: Radeon cards are a good value for the price/performance ratio they offer. This has been demonstrated in many a review and benchmark.
The Fury X was a flop at $650, and while Vega is much better value, it's its price that dictates that. That's why I'm picking on the blind tests (which were not anti-consumer, just horribly dumb), the MSRP confusion, and the low availability/high initial price. For those who pick a FreeSync monitor and can be patient, AMD is a good value proposition, true.
Posted on Reply
#186
Renald
lexluthermiester, post: 3968569, member: 134537"
While that's not wrong, let's be fair: Radeon cards are a good value for the price/performance ratio they offer. This has been demonstrated in many a review and benchmark.
Vega had price and consumption problems.
RTX has the same price problem (without the consumption problems).
Posted on Reply
#187
kings
Renald, post: 3968568, member: 130756"
In the end, by lowering the need for high-end cards, AMD is saving their asses. That's why those cards aren't relevant in the current market; few people need that much power outside of 4K.
Let's wait for Navi.
I don't know why people keep saying that the high-end is not relevant! When AMD/ATI competed shoulder to shoulder with NVIDIA at the top in past years, I saw no one saying that the high-end was not important!

It's important to compete in the high-end, as a matter of market perception. The brand that has the top-performing products is more likely to sell the rest of its range better! This happens in everything, not only in PC hardware!

That's why AMD is stuck at 20-25% market share, despite having had good cards in the mid-range for a few years now! Competing only in the mid-range is not enough to gain meaningful market share, as recent years have proved to us!
Posted on Reply
#188
lexluthermiester
Renald, post: 3968568, member: 130756"
Edit :
@lexluthermiester : got your attention, no need for insults
Nice edit, now again, take your own advice.
Renald, post: 3968573, member: 130756"
Vega had price and consumption problems.
RTX has the same price problem (without the consumption problems).
So what you seem to be implying is that neither AMD nor NVIDIA can satisfy your requirements/needs? Good thing Intel has IGP options, then…

kings, post: 3968578, member: 180022"
When AMD competed shoulder to shoulder with NVIDIA at the top in past years, I saw no one saying that the high-end was not important!
This, Yes.
kings, post: 3968578, member: 180022"
It's important to compete in the high-end, as a matter of market perception. The brand that has the top-performing products is more likely to sell the rest of its range better! This happens in everything, not only in PC hardware! That's why AMD is stuck at 20-25% market share, despite having had good cards in the mid-range for a few years now! Competing only in the mid-range is not enough to gain meaningful market share, as recent years have proved to us!
AMD offering a high-end product to compete in the market and challenge NVIDIA like they used to would be excellent!
Posted on Reply
#189
moproblems99
cucker tarlson, post: 3968562, member: 173472"
However, that is not the point. AMD themselves tried to deflect from the fact that the Vega series is simply worse than NVIDIA's offerings with blind tests (the dumbest thing a GPU manufacturer has done in the last decade, or ever IMO: "our card is as good if you don't look at the fps counter" :rolleyes: ) or by including a FreeSync monitor in the price. What followed was the fake-MSRP scam, where they tried to sway the public and reviewers into thinking the V64 was a $400 card, while in reality it cost somewhere between a 1080 and a 1080 Ti.
That is exactly their point. What difference does it make if the games look and play well? My current monitor is 75 Hz. I don't care if I get 80 fps or 180 fps; past 75, it doesn't matter anymore. Obviously, for people with high refresh rates it does matter. Bottom line: I can get generally the same performance at the same price point no matter which brand I choose. Currently, I have a Vega. Last time, I had a GTX 980. Next time? Since I have FreeSync, I will likely have Navi. However, I may have a 2080 Ti when they hit the price point at which I find value. Currently, there is none for me, because I think BF V is trash and I don't care about benchmarks.

My Afterburner overlay stays the same color no matter what color my gpu bleeds.
Posted on Reply
#190
cucker tarlson
moproblems99, post: 3968588, member: 155919"
That is exactly their point. What difference does it make if the games look and play well? My current monitor is 75 Hz. I don't care if I get 80 fps or 180 fps; past 75, it doesn't matter anymore. Obviously, for people with high refresh rates it does matter. Bottom line: I can get generally the same performance at the same price point no matter which brand I choose. Currently, I have a Vega. Last time, I had a GTX 980. Next time? Since I have FreeSync, I will likely have Navi. However, I may have a 2080 Ti when they hit the price point at which I find value. Currently, there is none for me, because I think BF V is trash and I don't care about benchmarks.

My Afterburner overlay stays the same color no matter what color my gpu bleeds.
Comparing a V64 with a 1080 Ti in a blind test was stupid. You can't tell me you can't pick a winner, unless you're AMD and you think I'm a gullible child.

I get that in your case your stance is reasonable, but some people choose to go by whatever standard fits their narrative. Blind testing? Sure, why not. NVIDIA's Kepler cards lose a couple of frames in a few games after a driver update? Shady NVIDIA, clearly gimping performance; AMD is the only choice. Where were they in those circumstances to tell us that 1-2 fps in a couple of games is not something anyone will perceive? Wait till they fix the drivers in the next update? Hell no. Wait for drivers/cards from AMD? No problem, apparently.
Posted on Reply
#191
lexluthermiester
cucker tarlson, post: 3968593, member: 173472"
Comparing a V64 with a 1080 Ti in a blind test was stupid. You can't tell me you can't pick a winner, unless you're AMD and you think I'm a gullible child.
Ah, but when you factor the performance/price ratio into the equation, suddenly the dynamic changes.
Posted on Reply
#192
cucker tarlson
lexluthermiester, post: 3968594, member: 134537"
Ah, but when you factor the performance/price ratio into the equation, suddenly the dynamic changes.
What if NVIDIA had compared a 1070 Ti with Fast Sync enabled on a 144 Hz non-VRR monitor vs. a V64 using FreeSync?
I'm not arguing the value of FreeSync, just the stuff they did try to sway us with when Vega launched. Pure performance-wise, Vega was a match for a year-old GTX 1080, at a higher price and with way higher power draw. That's the whole reason for doing the blind tests and counting FreeSync into the price. Good points for some, moot points for me. End of story.
Posted on Reply
#193
GoldenX
lexluthermiester, post: 3968458, member: 134537"
Speculation is a ton of fun, sure, but most of what has been going on in this thread has been NVIDIA bashing about price and supposed performance. If it were more about interesting speculation, then it would be more fun.
We already know the 4GB variant is gimped to hell and back. So yeah, not much on that front either.
Posted on Reply
#194
Renald
kings, post: 3968578, member: 180022"
I don't know why people keep saying that the high-end is not relevant! When AMD/ATI competed shoulder to shoulder with NVIDIA at the top in past years, I saw no one saying that the high-end was not important!

It's important to compete in the high-end, as a matter of market perception. The brand that has the top-performing products is more likely to sell the rest of its range better! This happens in everything, not only in PC hardware!
I didn't say it's not important, just not relevant at this time and at this price.
They can always upgrade their lineup, and that's good, but AMD has nearly a full year ahead of it before it catches up with NVIDIA, which already outperforms AMD by far.

So instead of putting out higher and higher prices, they could go with the same price and a small plus in performance. AMD can't compete if NVIDIA offers better performance at the same price point.
And going too far doesn't make you sell more. Ferrari can have the fastest car; it won't get everyone to buy a Ferrari. It's good publicity, sure, but they have had that for 10 years now. They have nothing to prove (talking about NVIDIA here).
Posted on Reply
#195
lexluthermiester
GoldenX, post: 3968612, member: 160319"
We already know the 4GB variant is gimped to hell and back.
Not for sure, we don't. I'm waiting to see the benchmarks and reviews.
Posted on Reply
#196
moproblems99
cucker tarlson, post: 3968593, member: 173472"
Comparing a V64 with a 1080 Ti in a blind test was stupid. You can't tell me you can't pick a winner, unless you're AMD and you think I'm a gullible child.
You aren't seeing the point. Nobody can tell the difference between a Vega 64 at 75 Hz/75 fps and a 1080 Ti at 75 Hz/75 fps. Chasing fps is totally useless once you hit your monitor's refresh rate.
Posted on Reply
#197
cucker tarlson
moproblems99, post: 3968686, member: 155919"
You aren't seeing the point. Nobody can tell the difference between a Vega 64 at 75 Hz/75 fps and a 1080 Ti at 75 Hz/75 fps. Chasing fps is totally useless once you hit your monitor's refresh rate.
Were the tests done on a 75 Hz monitor?
Posted on Reply
#198
moproblems99
cucker tarlson, post: 3968689, member: 173472"
Were the tests done on a 75 Hz monitor?
I really don't care, what's your point?
Posted on Reply
#199
cucker tarlson
moproblems99, post: 3968691, member: 155919"
I really don't care, what's your point?
What is yours? That there is no monitor over 75 Hz?
Posted on Reply
#200
moproblems99
This isn't difficult. Let's even remove the exact refresh rate to make it easier: when two cards are vsynced to the refresh rate, you can't tell them apart.
Posted on Reply