Friday, February 22nd 2019

NVIDIA Unveils the GeForce GTX 1660 Ti 6GB Graphics Card

NVIDIA today unveiled the GeForce GTX 1660 Ti graphics card, part of its new GeForce GTX 16-series product lineup based on the "Turing" architecture. These cards feature "Turing"-generation CUDA cores, but lack RTX real-time raytracing features because they physically lack RT cores; they also lack tensor cores, losing out on DLSS. What you get instead with the GTX 1660 Ti is an upper-mainstream product that can play most eSports titles at resolutions of up to 1440p, and AAA titles at 1080p with details maxed out.

The GTX 1660 Ti is based on the new 12 nm "TU116" silicon, and packs 1,536 "Turing" CUDA cores, 96 TMUs, 48 ROPs, and a 192-bit wide memory interface holding 6 GB of GDDR6 memory. The memory is clocked at 12 Gbps, yielding 288 GB/s of memory bandwidth. The launch is exclusively partner-driven; NVIDIA is not offering a Founders Edition product based on this chip. You will find custom-design cards priced anywhere from $279 to $340.
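The quoted 288 GB/s figure follows directly from the bus width and per-pin data rate; as a quick sanity check, here is the arithmetic as a minimal Python sketch:

```python
# Memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate
bus_width_bits = 192   # 192-bit memory interface
data_rate_gbps = 12    # GDDR6 effective data rate, 12 Gbps per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gb_s)  # 288.0 GB/s, matching the quoted spec
```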

We thoroughly reviewed four GTX 1660 Ti variants today: the MSI GTX 1660 Ti Gaming X, EVGA GTX 1660 Ti XC Black, Zotac GTX 1660 Ti, and MSI GTX 1660 Ti Ventus XS.

22 Comments on NVIDIA Unveils the GeForce GTX 1660 Ti 6GB Graphics Card

#1
Frutika007
The reviews are out. 2% faster than the GTX 1070 and 25% faster than the RX 590. Costs about $279. Good job Nvidia. Your move AMD.....
#2
cucker tarlson
The RX 590 was a stupid card to begin with. As if AMD just said, hey, let's release the RX 580 again but more expensive this time.
I guess if AMD follows with price cuts for V56 then every team will be happy.
#3
Frutika007
cucker tarlson
I guess if AMD follows with price cuts for V56 then every team will be happy.
That misinformation seems to have spread more rapidly and aggressively than wildfire. There is no price cut for Vega 56. Just one retailer clearing out his stock (he had only 2-3 GPUs, which have already been sold; that's why it's now out of stock), and AMD turned that into a hot news story to mislead people.
#4
birdie


That power efficiency - NVIDIA has outdone themselves.

I shudder to think what NVIDIA will do to AMD once they release Turing on 7nm - those cards will literally fly.

And I wouldn't call the lack of RTX/DLSS a disadvantage. There are just two games utilizing RTX (where it's hardly visible), and DLSS is still a blurry mess aside from 3DMark, where it works only because it's a benchmark with a picture set in stone.

This card will be wildly successful, though I doubt it will outsell the GTX 1060 (I, for one, don't upgrade unless the performance uplift is at least 150%).
#5
Frutika007
birdie
That power efficiency - NVIDIA has outdone themselves.
So true, the power efficiency of the Turing architecture seems to be incredible. I knew those RT cores and tensor cores were the power hogs; excluding those cores exposes the real power efficiency of the Turing architecture.
#6
cucker tarlson
birdie
I shudder to think what NVIDIA will do to AMD once they release Turing on 7nm - those cards will literally fly.
Nah, we'll not see a bigger perf jump than Pascal to Turing, really. They can, but they won't. They'll probably pack the die with more RT/tensor cores and get RTX/DLSS actually running how they're supposed to.
What I mean is they'll want to bridge the gap between RTX on/off performance first.
I think we'll get high prices and big dies like the RTX 20-series too, but this time with much better RTRT performance.
#7
efikkan
birdie
I shudder to think what NVIDIA will do to AMD once they release Turing on 7nm - those cards will literally fly.
As I've been saying for a while: 7 nm will not save AMD. A more efficient architecture will get larger gains from a node shrink than a less efficient one, so the gap will only increase, as we saw with the last node shrink.

AMD got very little gain from their node shrink of Vega. We can hope Navi will be a little better, but ultimately they need a brand-new architecture.

cucker tarlson
Nah, we'll not see a bigger perf jump than Pascal to Turing, really. They can, but they won't. They'll probably pack the die with more RT/tensor cores and get RTX/DLSS actually running how they're supposed to.

What I mean is they'll want to bridge the gap between RTX on/off performance first.

I think we'll get high prices and big dies like the RTX 20-series too, but this time with much better RTRT performance.
The performance gains of Turing are nothing to complain about. E.g., the RTX 2080 Ti is 39% faster than the GTX 1080 Ti, and that's without any node shrink.

Part of the cost of Turing is the huge dies, which have pushed the node a little too far for cost efficiency. Hopefully 7 nm+ (EUV) from Samsung or TSMC will be so good they don't have to push die sizes to this extreme again, and can keep costs a little lower.
#8
cucker tarlson
efikkan
(...) and keep the costs a little lower.
Production costs, yes; customer costs, no.
I'm pretty sure we won't get to see the luxury of lower production costs translating into lower prices with both GPU makers' current "this is the new normal" approach.
#9
efikkan
cucker tarlson
Production costs, yes; customer costs, no.
I'm pretty sure we won't get that with NVIDIA's "this is the new normal" approach.
I would kindly remind you that prices have varied a lot in the past.
#10
TesterAnon
Too bad they all come with a 120 W power limit that can barely be increased.
#11
wolf
Performance Enthusiast
Decent successor to the 1060! With this level of power efficiency, this chip will be a perfect fit for gaming laptops.
#12
Darmok N Jalad
R4WN4K
That misinformation seems to have spread more rapidly and aggressively than wildfire. There is no price cut for Vega 56. Just one retailer clearing out his stock (he had only 2-3 GPUs, which have already been sold; that's why it's now out of stock), and AMD turned that into a hot news story to mislead people.
From what I’ve heard, AMD was likely waiting to make a price adjustment on Vega 56 until they see what the 1660 Ti actually ends up retailing at.
#13
GoldenX
Great card, high price. A review of one of the cheaper variants in the future would be great.
The lack of competition is getting worse and worse. Is AMD even trying?
#14
Xzibit
1070 with 2GB less memory
#15
Frutika007
Xzibit
1070 with 2GB less memory
The 1070 uses GDDR5, while the 1660 Ti uses GDDR6, so bluntly saying "2 GB less memory" is kind of misleading. You are right on the surface, but that framing hides a deeper difference.
#16
efikkan
It's strange how memory only matters when it favors "your team". Back when the Fury X had 4 GB vs. the GTX 980 Ti's 6 GB, many claimed it didn't matter because HBM was so much better.

What matters is how it performs in the real world. There are differences in compression and memory management, which makes it pointless to focus solely on technical specs. We should rely on good benchmarks for this. The ones I've seen have shown no issues for the RTX 2060's 6 GB at 1440p, so I wouldn't worry.
#17
robot zombie
Neat little GPU. Definitely seems like a solid answer to the GTX 1060. The performance/efficiency is great! Though honestly, in some cases you'd have to be buying it for that, when you consider that right now there are some decent 2060s going for $350. RTX be damned, it's hard to justify a souped-up $340 1660 Ti when you can get a 2060 and have more oomph for just $10 more. Even the leap from $280 for a 1660 Ti to $350 for a 2060 is hard for me to get my head around. Reminds me of the 1050 Ti vs. the 1060 6 GB. Just doesn't seem like a lot of money to have the better card, you know? Or is it undercutting the higher-end model? Hard call to make. From benchmarks it looks like it keeps up very well with the 2060. And with RTX really not counting for much on that card, the 1660 Ti starts to look a lot more compelling.

This will be a killer budget (whatever that means now) 1080p card. The RX 590 has nothing on a card like this. Even those used 480s and 580s can't truly compete, being as hot/loud as they are. This will be great for laptops, mATX/ITX builds, or any budget gaming rig. Say what you want about Nvidia - I'm not in love with everything they do, but sometimes they do okay. I'm betting these will be hugely popular. It'd be nice if they'd come down a *little* bit on the price, but I suppose it's fair relative to what everything else goes for. I dunno... not a lot to complain about, afaic.
#18
iO
efikkan
What matters is how it performs in the real world. There are differences in compression and memory management, which makes it pointless to focus solely on technical specs. We should rely on good benchmarks for this. The ones I've seen have shown no issues for the RTX 2060's 6 GB at 1440p, so I wouldn't worry.
It mostly isn't an issue yet, but there are already some games with bad frame timing when they run out of memory.
#19
Crackong
Okay, the 1660 Ti is great.
Now give us the 1880 Ti:
$500, 30% more performance over the GTX 1080.

I would be happy to sell my 2080 at $550 and get the 1880 Ti, and let someone else enjoy the Holy Grail BETA experience.
#20
hat
Enthusiast
So it sucks a little less power than a 1070 while being a little faster. Not bad, but my 1070s aren't getting replaced.
#21
efikkan
Crackong
Okay, the 1660 Ti is great.

Now give us the 1880 Ti
Not likely, putting aside the fact that the "1660" naming is strange.

There are once again rumors about a "GTX 1180", this time from HP. That is most likely just a typo, or an old document using an assumed product name for what became the RTX products.
#22
Darmok N Jalad
Wouldn’t it be a 1680? I think AnandTech said they went with 16 to put the card closer to the 20 series than to the 10 series. Arbitrary, I’m sure, but they’d make a mess if they started calling cards all sorts of names.