
NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is based on the GTX 1650, featuring a 12 nm Turing TU117-150 GPU with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus found in the GTX 1650; however, clock speeds increase, with a boost clock of 1800 MHz at a TDP of 75 W. This memory configuration results in a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 GTX 1650. The NVIDIA GeForce GTX 1630 may be announced during NVIDIA's Computex keynote next week.
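The 96 GB/s figure is easy to sanity-check; a quick sketch, assuming a 12 Gbps effective GDDR6 data rate (the per-pin rate the quoted numbers imply):

```python
# Theoretical memory bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8
DATA_RATE_GBPS = 12  # assumed: the GDDR6 per-pin rate implied by the quoted 96 GB/s

gtx_1630_bw = 64 * DATA_RATE_GBPS / 8    # 64-bit bus  ->  96.0 GB/s
gtx_1650_bw = 128 * DATA_RATE_GBPS / 8   # 128-bit bus -> 192.0 GB/s

print(gtx_1630_bw, gtx_1650_bw)  # 96.0 192.0 -- the 1630 gets exactly half
```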



View at TechPowerUp Main Site | Source
 
Shame about the prices, but hopefully those 730 and 1030 cards will be put to rest for good.
Edit: Also, unlike AMD's 6400/6500, there won't be a stupid PCIe x4 limitation across the entire stack; rather, it's left as an option for AIB makers building SFF/HTPC cards.
 
LOL.

What's next, a re-release of the 550 Ti? After all, it was super popular, why not? Apparently people buy anything.
 
Nvidia with its "re-releases"... IDK how to approach the GPU market at this point lmao
 
LOL.

What's next, a re-release of the 550 Ti? After all, it was super popular, why not? Apparently people buy anything.
The fake card sellers have got that already. :laugh:
 
Yeah, I wouldn't even be surprised if Nvidia bought those, gave them a refurb treatment, and sold them as new again.

At this point, apparently anything goes. I mean, what is this, honestly? Is this Nvidia releasing a single Intel Arc killer here? :D Spec-wise it seems about right...

 
They couldn't even release the GT 1010 even though it was announced :laugh:
 
Nvidia with its "re-releases"... IDK how to approach the GPU market at this point lmao

It's just leftover chips that can't handle more than that. Certainly better than not doing anything with them.
 
I guess it's good to see the 1030 and 730 and all that crap get replaced? Though I can't imagine this being a particularly attractive proposition, considering that an RX 6400 matches the 1650 and the 6500 XT beats it soundly. Even accounting for both of those losing ~10% performance on PCIe 3.0, this will be slower than the 6400. And at least here in Sweden, you can get an RX 6500 for less money than any 1650.
 
Might be interested in one of these for an SFF arcade emulator build using MAME, with an old 9700K CPU I have lying around doing nothing.

The integrated graphics on the 9700K tend to choke when using the HLSL CRT shaders, which can be too bandwidth-heavy when running at 4K native resolution, but one of these would be able to do the job quite nicely.
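The relevant switches live in mame.ini; a minimal sketch of what I'd be enabling, assuming the Windows Direct3D renderer (which is where MAME's HLSL path lives):

```
# mame.ini excerpt -- illustrative values, not a tested config
# HLSL requires the Direct3D renderer on Windows
video        d3d
# enable the HLSL CRT post-processing chain
hlsl_enable  1
```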

All depends on the price.
 
I guess it's good to see the 1030 and 730 and all that crap get replaced? Though I can't imagine this being a particularly attractive proposition, considering that an RX 6400 matches the 1650 and the 6500 XT beats it soundly. Even accounting for both of those losing ~10% performance on PCIe 3.0, this will be slower than the 6400. And at least here in Sweden, you can get an RX 6500 for less money than any 1650.
My guess is that they have leftover TU117s, so better to get rid of them.
 
Sigh.

A budget card without RTX, so you can't play Q2 RTX on it. You could on an RX 6400, but that one has limited features in its own right.
Is it that hard to make a fully featured entry-level card?
 
Would you use RTX with a low-end card anyway..?
 
Yes, for the specific game I mentioned in that post.
EDIT: Just rechecked the RGHD video with it. Mea culpa, I was in way over my head lol; he got awful fps on the RX 6400 on the RTX path.
 
Will this feature an NVENC chip?
NVENC is a feature of the GPU, not a discrete chip. And as this will in 99.9% likelihood use the TU117 chip, it will most likely get the same Volta-era NVENC support.
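Once the card is out, an easy sanity check of what the driver actually exposes (assuming an FFmpeg build compiled with NVENC support) would be:

```
ffmpeg -hide_banner -encoders | grep nvenc    # should list h264_nvenc / hevc_nvenc
ffmpeg -hide_banner -h encoder=h264_nvenc     # details for the H.264 NVENC encoder
```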
 
Sigh.

A budget card without RTX, so can't play Q2 RTX on that. Could on RX6400, but that one has limited features in its own right.
Is it that hard to make a completely featured entry level card?

Q2 RTX is not an easy game on any card. A card like this would not handle it.
 
Q2 RTX is a fine display of everything that is wrong with real-time RT. The magic wears off fast, and the processing power required is ridiculous for the end result you get. The cherry on top? Nine out of ten times, the so-called accurate lighting of RT is plain wrong, out of place, and reveals itself as a calculation not unlike any other, albeit a very expensive one.

The most prominent 'feature' of it all is that it shows your old card lacks the hardware to run something you probably didn't care about anyway, when a proper rasterized lighting implementation has every potential to look just as good and run faster.

AMD might have a much stronger long-term strategy here by limiting their dedicated RT hardware. Card TDPs are going through the roof at its competitor, which is adamant about pushing RT, and there is prolonged and increasing pressure on resources and production capacity... It's really going to be interesting to see how this develops. RT only for the happy few will die a certain death, that much is certain.
 
RTX on older games will never run like newer games unless they redesign the game. I still think it is awesome, and most people I know loved it too, and I would hope they used it on more games from the past. I happily live with the trade-offs.
Let's be honest, most great games of the past will never see any work done on them; it's too time-consuming, and it can go horribly wrong in the wrong hands. XIII, for example: it would have been better if they had just added RTX to it (not that it's the best game for RTX, but it serves as an example of a remake gone wrong).
 
This is a mainstream part, and at this price level the volume (demand) isn't small, but a TU117 cut down in half seems a bit excessive as the main release.
Are we going to get a new chip based on Turing down the line (132-123 mm²?), or is this just a stopgap product with very limited stock and lifespan? Because I don't see Nvidia throwing away potential 1650 profits for an extended amount of time.
The performance will be much slower than the 6400, and it will be slower than the 2016 $139 1050 Ti. The RX 560 4GB was $119 back in the day, but it will be slower than the 1630, so if it is based on TU117, Nvidia probably won't go below $119?
I was hoping for a new, smaller chip eventually, in order to hit the magic $99 price point.
 
Q2 RTX is a fine display of everything that is wrong with real-time RT. The magic wears off fast, and the processing power required is ridiculous for the end result you get. The cherry on top? Nine out of ten times, the so-called accurate lighting of RT is plain wrong, out of place, and reveals itself as a calculation not unlike any other, albeit a very expensive one.

The most prominent 'feature' of it all is that it shows your old card lacks the hardware to run something you probably didn't care about anyway, when a proper rasterized lighting implementation has every potential to look just as good and run faster.

AMD might have a much stronger long-term strategy here by limiting their dedicated RT hardware. Card TDPs are going through the roof at its competitor, which is adamant about pushing RT, and there is prolonged and increasing pressure on resources and production capacity... It's really going to be interesting to see how this develops. RT only for the happy few will die a certain death, that much is certain.
As with all tools, it takes training, practice, and familiarity to use them correctly. It's hardly a wonder that the tool that's been used for decades is handled more competently than the one that's barely been accessible for a few years, and in a very limited state at that. We'll see how this plays out in the future, but I don't foresee any type of RT takeover any time soon. Selective, limited, but well planned and executed use where it has the greatest effect seems far more sensible and likely to me.

This is a mainstream part, and at this price level the volume (demand) isn't small, but a TU117 cut down in half seems a bit excessive as the main release.
Are we going to get a new chip based on Turing down the line (132-123 mm²?), or is this just a stopgap product with very limited stock and lifespan? Because I don't see Nvidia throwing away potential 1650 profits for an extended amount of time.
The performance will be much slower than the 6400, and it will be slower than the 2016 $139 1050 Ti. The RX 560 4GB was $119 back in the day, but it will be slower than the 1630, so if it is based on TU117, Nvidia probably won't go below $119?
I was hoping for a new, smaller chip eventually, in order to hit the magic $99 price point.
I guess we could hope that it replaces the 1030 at its original $80 MSRP? That seems doubtful though.
 
Doubtful indeed; GP108 was 74 mm² on a cheaper 14 nm Samsung process (it started at $70). I don't see an above-100 mm² 12 nm TSMC design being able to match it in price, let alone the 200 mm² TU117-based one.
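To put the die-size argument in numbers, a rough sketch using the standard dies-per-wafer approximation (my own back-of-the-envelope, not an official figure):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(74))   # GP108: ~877 candidate dies per 300 mm wafer
print(dies_per_wafer(200))  # TU117: ~306 -- roughly 2.9x fewer, before yield
```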
 
LOL.

What's next, a re-release of the 550 Ti? After all, it was super popular, why not? Apparently people buy anything.
Pretty sure it's for HTPC users who usually use a 1030 or 710.

Sigh.

A budget card without RTX, so you can't play Q2 RTX on it. You could on an RX 6400, but that one has limited features in its own right.
Is it that hard to make a fully featured entry-level card?
Well, it is really a 16-series card. Hence GTX, not RTX. And it's obviously pretty low-end, meant for HTPC use, running entirely from board power. I wouldn't expect it to have RT features. And even if it did, it would run Q2 RTX like crap.
 