Monday, April 22nd 2019

NVIDIA to Flesh out Lower Graphics Card Segment with GeForce GTX 1650 Ti

It seems NVIDIA's partners are gearing up for yet another launch sometime after the GTX 1650 finally becomes available. ECC listings have made it clear that partners are working on another TU117 variant with improved performance, sitting between the GTX 1650 and the GTX 1660, which should bring the fight to AMD's Radeon RX 580. Of course, with the GTX 1660 sitting pretty at $219, this leaves the span between the GTX 1650's $149 and the GTX 1660's $219 for the GTX 1650 Ti to fill. With the GTX 1660 being an average of 13% faster than the RX 580, it makes sense for NVIDIA to introduce another SKU to cover that large pricing gap between the 1650 and the 1660.

It's speculated that the GeForce GTX 1650 Ti could feature 1024 CUDA cores, 32 ROPs and 64 TMUs. These should be paired with the same 4 GB of GDDR5 VRAM as the GTX 1650, running across a 128-bit bus at the same 8000 MHz effective clock speed, delivering a bandwidth of 128 GB/s. Should NVIDIA pull off the feat of keeping the same 75 W TDP between its Ti and non-Ti GTX 1650 (as it did with the GTX 1660), that could mean a 75 W graphics card would be contending with AMD's 185 W RX 580 - a mean, green feat in the power efficiency arena. A number of SKUs for the GTX 1650 Ti have been leaked on ASUS' side of the field, which you can find after the break.
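As a sanity check on the quoted figure, memory bandwidth follows directly from bus width times effective data rate; a quick, purely illustrative calculation using only the numbers above:

```python
# GDDR5 bandwidth = (bus width in bytes) x (effective data rate in GT/s)
bus_width_bits = 128        # 128-bit memory bus
effective_mhz = 8000        # 8000 MHz effective (8 GT/s)

bandwidth_gb_s = (bus_width_bits / 8) * (effective_mhz / 1000)
print(f"{bandwidth_gb_s:.0f} GB/s")  # prints "128 GB/s"
```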
Leaked ASUS GTX 1650 Ti cards
  • ASUS GeForce GTX 1650 Ti Dual
  • ASUS GeForce GTX 1650 Ti Dual Advanced
  • ASUS GeForce GTX 1650 Ti Dual Overclock
  • ASUS ROG STRIX GTX 1650 Ti Gaming
  • ASUS ROG STRIX GTX 1650 Ti Gaming Advanced
  • ASUS ROG STRIX GTX 1650 Ti Gaming Overclock
  • ASUS TUF GTX 1650 Ti Gaming
  • ASUS TUF GTX 1650 Ti Gaming Advanced
  • ASUS TUF GTX 1650 Ti Gaming Overclock
  • ASUS Phoenix GTX 1650 Ti Gaming
  • ASUS Phoenix GTX 1650 Ti Gaming Overclock
  • ASUS GeForce GTX 1650 Ti Low Profile
  • ASUS GeForce GTX 1650 Ti Low Profile Overclock
Sources: via WCCFTech, ECC, Komachi @ Twitter
Add your own comment

21 Comments on NVIDIA to Flesh out Lower Graphics Card Segment with GeForce GTX 1650 Ti

#1
jabbadap
The 1660 MSRP is $219, not $229. So the full TU117 is more likely 1024 CC with two 512-CC GPCs. That will probably be more or less GTX 1060 6 GB territory in performance.
Posted on Reply
#2
Nihilus
If this does use 128 bit GDDR5 and not GDDR6, this will be a huge fail.
Posted on Reply
#3
IceScreamer
Hopefully we see RX570-RX580 performance without a 6pin. Just need to wait a little bit longer.
Posted on Reply
#4
notb
IceScreamer said:
Hopefully we see RX570-RX580 performance without a 6pin. Just need to wait a little bit longer.
To be honest, I'd rather see 1650 as the top 75W variant. It's a 1050Ti successor price-wise as well.

Also, 1660 is using more power than 1060, so the gap between 1650 and 1660 will be quite large. I'd rather have a GPU in the middle than one artificially limited.
1050Ti was limited as well. It had more cores, but lower clocks.
1050 and 1050Ti had almost identical power draw. 1050Ti was faster because it had more RAM - something Nvidia won't be able to repeat this time.
Posted on Reply
#5
Nihilus
notb said:
1050Ti was faster because it had more RAM - something Nvidia won't be able to repeat this time.
At this performance level, the difference between 2 GB and 4 GB matters. The difference between 4 GB and 6 GB - not so much. Also, the GTX 1050 Ti has more shader cores than the GTX 1050, so let's not kid ourselves.

Furthermore, the 3 GB version of the GTX 1050 was barely faster than the 2 GB version. Most of the gains were diminished by the reduced bandwidth.
Posted on Reply
#6
Darmok N Jalad
Power efficiency is what would give the 1650 the win. If it can compete with the 570/580 without a power connector, that makes it the easy “drop-in” upgrade for the old OEM tower. By default, Nvidia could charge more, just like with the 1050 Ti. AMD needs better efficiency in the mid-tier. Even if they just shrunk Polaris down to 7nm to compete, it would be something. I hope they have something ready to answer here, because Nvidia has answers that AMD doesn’t.
Posted on Reply
#7
notb
Nihilus said:
At this performance level, the difference between 2 GB and 4 GB matters. The difference between 4 GB and 6 GB - not so much. Also, the GTX 1050 Ti has more shader cores than the GTX 1050, so let's not kid ourselves.

Furthermore, the 3 GB version of the GTX 1050 was barely faster than the 2 GB version. Most of the gains were diminished by the reduced bandwidth.
Based on raw compute power, 1050Ti should be ~14% faster than 1050 (20% more cores, 5% lower base clocks). They had similar boost potential as well.
In games 1050Ti was 30% faster.
TPU compared 1050 and 1050Ti with the same cooler (MSI Gaming):
https://www.techpowerup.com/reviews/MSI/GTX_1050_Ti_Gaming_X/27.html
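The ~14% figure above is just core count multiplied by clock speed; spelled out with the public base specs (640 cores @ 1354 MHz for the 1050, 768 cores @ 1290 MHz for the 1050 Ti):

```python
# Raw FP32 throughput scales roughly with CUDA cores x clock speed.
gtx1050 = 640 * 1354      # GTX 1050: cores x base MHz
gtx1050ti = 768 * 1290    # GTX 1050 Ti: cores x base MHz

advantage = gtx1050ti / gtx1050 - 1
print(f"~{advantage * 100:.0f}% faster on paper")  # prints "~14% faster on paper"
```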
Posted on Reply
#8
Nihilus
notb said:
Based on raw compute power, 1050Ti should be ~14% faster than 1050 (20% more cores, 5% lower base clocks). They had similar boost potential as well.
In games 1050Ti was 30% faster.
TPU compared 1050 and 1050Ti with the same cooler (MSI Gaming):
https://www.techpowerup.com/reviews/MSI/GTX_1050_Ti_Gaming_X/27.html
18/82 = 22% faster (82% vs 100%)

The 2 cards were getting virtually the same clocks in game:

https://www.techpowerup.com/reviews/MSI/GTX_1050_Gaming_X/31.html
https://www.techpowerup.com/reviews/MSI/GTX_1050_Ti_Gaming_X/31.html

So yeah, like a 2% advantage.

At this performance level: 4 GB GDDR6 >>> 6 GB GDDR5, given all else equal.
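For reference, the 22% above comes straight from TPU's relative-performance chart (1050 = 82%, 1050 Ti = 100% as baseline):

```python
# TPU relative performance: GTX 1050 = 82%, GTX 1050 Ti = 100% (baseline)
rel_1050, rel_1050ti = 82, 100

advantage = (rel_1050ti - rel_1050) / rel_1050   # i.e. 18 / 82
print(f"{advantage * 100:.0f}% faster")  # prints "22% faster"
```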
Posted on Reply
#9
Chloe Price
Nihilus said:
If this does use 128 bit GDDR5 and not GDDR6, this will be a huge fail.
GDDR6 in a lower mid-end card would be insane.
Posted on Reply
#10
Nihilus
Chloe Price said:
GDDR6 in a lower mid-end card would be insane.
If they put 6 GB of GDDR6 in a mid range card, I don't see why they couldn't put 4 GB in this card.
Posted on Reply
#11
Caring1
"It's speculated that the GeForce GTX 1650 could feature 1024 CUDA Cores, 32 ROPs and 64 TMUs. These should be paired with the same 4 GB GDDR5 VRAM running across a 128-bit bus at the same 8000 MHz effective clock speeds as the GTX 1650"
So the 1650 is going to be the same as the 1650? :p
Posted on Reply
#12
IceScreamer
notb said:
To be honest, I'd rather see 1650 as the top 75W variant. It's a 1050Ti successor price-wise as well.

Also, 1660 is using more power than 1060, so the gap between 1650 and 1660 will be quite large. I'd rather have a GPU in the middle than one artificially limited.
1050Ti was limited as well. It had more cores, but lower clocks.
1050 and 1050Ti had almost identical power draw. 1050Ti was faster because it had more RAM - something Nvidia won't be able to repeat this time.
I get what you're trying to say, and partially agree. Have the x50 models (including Ti) ever been released with a TDP higher than 75 W before?
Posted on Reply
#13
Readlight
Only took 10 years for them to make a 75 W card for 60 fps.
Too late; it's worthless because the CPU can't keep up in most games.
Posted on Reply
#14
Caring1
Only 13 cards from ASUS based on this chip, they must be slipping :shadedshu:
Posted on Reply
#15
notb
Caring1 said:
Only 13 cards from ASUS based on this chip, they must be slipping :shadedshu:
Let's not forget ASUS was one of the few companies that designed a custom Vega 64 card (and actually sold it!). Clearly they must be rich. ;-)

Jokes aside, 5 different coolers (one LP) seem pretty normal for such a mainstream product.
Some versions, on the other hand, look weird at best, e.g. the overclocked LP. Or selling both Advanced and Overclock variants...

BTW: ASUS is currently selling 9 different 2080 Ti cards...

IceScreamer said:
I get what you're trying to say, and partially agree. Have the x50 models (including Ti) ever been released with a higher than 75w TDP before?
Of course. :-)

Pascal was the first generation where the *50 Ti stayed below 75 W, which didn't stop OEMs from putting a 6-pin on just in case. Even my 1050 has one.

GTX 950 was limited to 75 W and there was no Ti, but factory-overclocked versions pulled over 120 W. :-)
GTX 650 Ti had a 110 W TDP, but there was also the Ti Boost rated at 134 W.
550 Ti (Fermi ;-)) had a 116 W TDP, but most cards I've read about peaked over 140 W :-)
Posted on Reply
#16
jabbadap
Chloe Price said:
GDDR6 in a lower mid-end card would be insane.
Well, there are 8 Gb 10 Gbps GDDR6 chips listed by Hynix and Micron. And while Samsung doesn't list them, I have no doubt they'll make them if requested.
Posted on Reply
#17
Chloe Price
Nihilus said:
If they put 6 GB of GDDR6 in a mid range card, I don't see why they couldn't put 4 GB in this card.
To be honest, I don't know the price difference between GDDR5 and GDDR6, but my point was that GDDR6 would probably ramp up the price of the card, and I doubt that faster memory would do miracles in a lower-class mid-end card.

jabbadap said:
Well there's 8Gb 10Gbps gddr6s listed on Hynix and Micron. And while Samsung does not list them I have no doubt they will make them if requested.
Good point there; it still feels weird that the newest memory type would be used in a card with a 128-bit memory bus.

PS: hey dude! :D
Posted on Reply
#18
jobroook911
The GTX 950 was sadly 90 W, not 75 W; it would have been a contender for low-profile gaming otherwise.
I believe the earliest mid-range card to come in under 75 W before the 1050 was the 750 Ti at 60 W.
We're really seeing this style emerge in the most recent two generations.

About GDDR6
It's not just the increased speed with GDDR6; it also consumes less power.
When trying to squeeze the maximum possible performance within the 75 W limit, GDDR6 is a worthy option, even with the higher price tag.

I would pay 50% more per teraflop for a fully pimped-out 75 W card if it fits inside a petite business machine, like a Dell or HP.
My days of owning big bulky towers are mostly over. I've still got one with a 1080 Ti, but it's simply not as elegant as the low-profile machines.
Posted on Reply
#19
jabbadap
jobroook911 said:
The GTX 950 was sadly 90 W, not 75 W; it would have been a contender for low-profile gaming otherwise.
I believe the earliest mid-range card to come in under 75 W before the 1050 was the 750 Ti at 60 W.
We're really seeing this style emerge in the most recent two generations.

About GDDR6
It's not just the increased speed with GDDR6; it also consumes less power.
When trying to squeeze the maximum possible performance within the 75 W limit, GDDR6 is a worthy option, even with the higher price tag.

I would pay 50% more per teraflop for a fully pimped-out 75 W card if it fits inside a petite business machine, like a Dell or HP.
My days of owning big bulky towers are mostly over. I've still got one with a 1080 Ti, but it's simply not as elegant as the low-profile machines.
There were 75 W GTX 950 models, at least from ASUS and EVGA. The ASUS one was actually reviewed here on TPU. But yeah, you're right, the GTX 750 Ti was the first really competitive 60 W card; its predecessor, the GTX 650, was quite lackluster (on a side note, the GTX 650 Ti was an xx106 chip, as was the GTX 950).
Posted on Reply
#20
AegisDaniel
Is it just me, or did anyone else NOT catch that there was also a 1660 LOW PROFILE card listed there, with 6 GB of memory??? With the same naming scheme as ASUS' current low-profile 1050 Ti card??

I am attaching pics. This WOULD make a decent card if it were to become reality!! 6 GB low-pro and decent performance? Color me excited!!!

:)
Posted on Reply
#21
jabbadap
AegisDaniel said:
Is it just me or did anyone NOT catch that there was also a 1660 LOW PROFILE Card Listed on that with 6GB of Memory??? With the same naming schema as ASUS's current Low profile 1050Ti card??

I am attaching Pics. This WOULD make a decent card if it was to become reality!! 6GB LOW pro and decent performance? Color me excited!!!

:)
The 1660 needs extra power, so I'm not sure there's a real use case for one.
Posted on Reply