
Gainward GeForce GTX 1630 Ghost

W1zzard

Administrator
Staff member
The NVIDIA GeForce GTX 1630 launches today. The new card is targeted at the sub-$200 segment and goes head-to-head with AMD's Radeon RX 6400 and Intel's ARC A380. Unlike the Radeon, the GTX 1630 has support for the full PCIe x16 interface.

 
They could have at least given it a 128-bit interface. Paying more than $60 is too much for this.
 
Flop at anything above $79. It must match the street price of the aging RX 550, as it is better than that card in practically every regard. It will find its market, if the price is right.
 
The performance is not unexpected, since we can take a cue from how the GTX 1650 performs. The price is bad, and this basically makes the RX 6400 look better instead of challenging it. To be honest, this card will have its place in systems whose CPUs have no iGPU and where graphics performance is completely unimportant. As to whether the AV1 decoder is important or not, it really depends on the use case. For people looking to build a multimedia PC that can last for a number of years, I think it may be good to have this feature.
 
Absolutely bad. Even at $75, performance per dollar is OK-ish at best. This is a sign of the kind of prices to expect for the next-gen cards.

Not really, because the GT 1030 already covers that niche. This thing costs $150, only $20 below the GTX 1650's MSRP: a 12% reduction in price for a 40% reduction in performance. Yaaaaaay. NOT.
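To make that arithmetic explicit, here is a minimal sketch using the poster's figures (a $150 asking price, the implied ~$170 GTX 1650 price, and a ~40% performance deficit; none of these numbers come from the review itself):

```typescript
// Sketch only: relative price and performance, using the poster's figures.
const price1630 = 150;    // claimed GTX 1630 street price
const price1650 = 170;    // implied GTX 1650 price ($20 higher)
const relPerf1630 = 0.60; // "40% reduction in performance" vs. the GTX 1650

const priceCut = 1 - price1630 / price1650;                  // ~0.12 -> ~12% cheaper
const perfPerDollar = relPerf1630 / (price1630 / price1650); // ~0.68 of the 1650's perf/$

console.log(`price cut: ${(priceCut * 100).toFixed(0)}%`);
console.log(`perf per dollar vs. GTX 1650: ${perfPerDollar.toFixed(2)}`);
```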
 
Made to replace the GT 1030; NVIDIA can't make the GT 1030 forever. As for the price, partners are most likely going to discount it heavily soon.
 
We were expecting bad, but THIS bad? Oh dear...
 


Yeah. If you thought the 6500 XT's pathetically small Infinity Cache was bad, just imagine pulling off the same narrow 64-bit bus trick without said cache.

They could have added a single memory chip, bumped things to 96-bit/6 GB, and maybe added $10 to the price?
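For what it's worth, a quick back-of-the-envelope sketch of what that would mean for memory bandwidth, assuming the 12 Gbps GDDR6 the card ships with; the 96-bit configuration is the poster's hypothetical, not a real SKU:

```typescript
// Back-of-the-envelope GDDR6 bandwidth:
//   bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
const dataRateGbps = 12; // the GTX 1630 uses 12 Gbps GDDR6

const bandwidthGBs = (busWidthBits: number): number =>
  (busWidthBits / 8) * dataRateGbps;

console.log(bandwidthGBs(64)); //  96 GB/s -> the card as shipped (64-bit, 4 GB)
console.log(bandwidthGBs(96)); // 144 GB/s -> the hypothetical 96-bit / 6 GB variant
```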
 
The only sensible explanation I have is that NVIDIA is forced (by a contract? for a set number of wafers?) to keep producing something on the aging TSMC process, hence they don't stop production and move to 7 nm and 4 nm. It is ridiculous, and the bad thing is that it is always the user who loses.
 
*waits for the wailing and gnashing of teeth over things like the gimped PCIe interface that we saw with the low-end Radeons...*
 
Ouch, that's indeed worse than the 1050 Ti. Now, given the low power consumption and the 16 PCIe lanes, it can have a place in the market IF the retail price falls to 1030 levels...
 
AMD: "You know, anything will sell in this market. Let's take these harvested chips that our partners don't seem to want, and configure them as a discrete product in a way that doesn't even make the best use of its own resources, and sell it for too much money." (releases RX 6400/6500)

Nvidia: "I can't believe we haven't done that already. Hey, intern; get on that." (releases GTX 1630)

Everybody (mostly) hated the 1030, but at least it initially landed at a reasonable price point, particularly with any kind of discount, which wasn't usually too hard to find. Contrary to popular opinion, it could game if one stuck to less-recent titles at low-to-medium detail and resolutions. The 1630 looks like it could do the same, but needs a $30 price cut at minimum. To those asking for sub-$100: it ain't gonna happen outside of rebates or clearances, since less than that won't even cover production/distribution costs (speculating there, of course).
 
Imagine releasing a card that makes the RX 6400 look amazing.
 
Well, I suspect they wanted a super-high-margin card for the low end to benefit from the GPU price hikes due to crypto. The bubble just burst a little bit too early.

But does it matter? Nah. Many people will buy this instead of a 6400/6500 XT just for the NVIDIA name. In the end, it's just a refresh of the 1030.
 
Realistically, the price should be $99-120. People tend to name much lower prices, but that's probably impossible; the memory chips alone must cost about $50.
 
NVIDIA must have worked really hard to make a turd like this on purpose, just to beat AMD for the title of worst GPU on the market.
 
There is no automagic fallback for 8K YouTube videos, dear Mr. W1zzard. If you try playing an 8K YouTube video, at least on my system, it uses AV1 and doesn't fall back to VP9. Since I have a GTX 1050, it uses my rather ancient CPU to do the decoding, which more or less brings my poor PC to a halt.
So please don't use the word "magic" without doing proper research. AV1 decode is a necessity, not in the future, but right now.

I'm assuming that since this has DP 1.4a, it supports 8K TVs. There will be people with 8K TVs (please don't talk about practicality; people will buy them regardless of whether they can tell the difference or not), and using this card in an HTPC is a big no-no based on the above.
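For anyone who wants to check the behaviour described above on their own system, here is a minimal sketch using the browser's Media Capabilities API; the AV1 codec string, resolution, and bitrate are illustrative guesses rather than values from the review:

```typescript
// Minimal sketch: ask the browser whether 8K AV1 decode would be supported and
// power-efficient (hardware-accelerated) rather than falling back to CPU decode.
// The codec string, bitrate, and framerate below are illustrative guesses.
const config: MediaDecodingConfiguration = {
  type: "file",
  video: {
    contentType: 'video/mp4; codecs="av01.0.16M.08"', // AV1, profile 0, level 6.0, 8-bit
    width: 7680,
    height: 4320,
    bitrate: 50_000_000, // ~50 Mbps, a rough guess for 8K streaming
    framerate: 30,
  },
};

navigator.mediaCapabilities.decodingInfo(config).then((result) => {
  // powerEfficient === false usually means the decode lands on the CPU,
  // which is the "brings my PC to a halt" scenario described above.
  console.log(
    `supported: ${result.supported}, smooth: ${result.smooth}, powerEfficient: ${result.powerEfficient}`
  );
});
```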
 
AMD came out with the RX 6500 and everyone was cursing that card. AMD's idea was great in a period of scalpers and miners: offer a card that no scalper or miner would want. But the price and the cut-down performance and feature set were definitely good reasons for the negative reaction.

What follows is ridiculous and ANTI-CONSUMER, because we now have Intel and NVIDIA producing cards based on DESKTOP GPUs (so no limitations there that couldn't have been avoided) that are even slower than the RX 6400. Today the RX 6500 XT looks like a good option, and IT IS NOT. It is an F-ing regression. Just look at where the RX 570 4GB sits. I mean, WHAT THE H?
 
Thank you for the review; very in-depth and nice.

This card should have a new price: $49.
.... and then buy it when it's on sale for $29.99 :D
 
EVGA is listing this card for $199, and the 1650 is selling for $179 with a rebate, lol.
 