
NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

If this is going to be cheaper than the 6500 XT and on par with it, then it's worth it, I guess.

No, the GTX 1630 is a direct competitor to the RX 6400. Maybe Nvidia fears that AMD will sell a lot of RX 6400s and feels that the launch should get a response.

The red circle marks the performance estimate for a GTX 1630.


AMD Radeon RX 6500 XT Specs | TechPowerUp GPU Database
 
Nope, the RX 6400 trades blows with the GTX 1650 non-Super.

As it stands, there will be no direct competitor to the GTX 1630, as nothing falls into that segment; its performance can't be too close to the GTX 1650's.

The reasons are, for most intents and purposes, non-issues; that is the point people are trying to make.

It matters if and when you are looking for those specific features; if you were, you wouldn't buy a card that's missing them. If you aren't, gaming performance is the best basis for comparison. It's like expecting this card to do RT: obviously it won't, and even if it did, why would anyone care? The same applies if the 6500 XT were the subject.

See, comparisons matter because things are available at similar price points; that is the primary concern for a customer: what can I get for X money? Or, if you value a feature over the expense: what does it cost to get feature X? Everything at any price point you pick is, by definition, comparable, and you compare on the features you need and/or the performance you'd use.

No need to keep going on about it, but I hope that explains better why you guys have been talking past each other for the last page :D



Yeah man, I bought a 1080 for €420 in 2017... it feels close to winning the hindsight lottery. The card is still relevant: if you don't need RT, everything you want to run, runs on it. I'm even at 3440x1440 and haven't been backing down below High. That fact keeps me solidly in the "RT = stick it somewhere I don't see it" camp. I don't miss it for a second, and any tech that inflates GPU pricing across the board should be a firm no-no right now; and that's not even considering the climate impact of making processing jobs like this more expensive for very little reason other than shareholder gains. All of this was clear when mining surged and took GPUs away from people: not the last surge, but the previous one(s). I honestly don't understand how blind people can be; or perhaps ignorance is bliss.

I could probably still sell it today for the initial price. A clear sign that you don't even want to think of buying a GPU at this point...

All I can say is, I hope the upcoming gen is a good one. Otherwise it's definitely going to be a retro PC for me, full of emulators and legacy stuff that'll keep me and others going for, I dunno, a few dozen years lol... that bucket list of pre-2020 content is still pretty long :)

But did you sell all three for 450 together?! Whuuu...
Not as a bundle, but that's the combined total.
 
Nope, the RX 6400 trades blows with the GTX 1650 non-Super.

As it stands, there will be no direct competitor to the GTX 1630, as nothing falls into that segment; its performance can't be too close to the GTX 1650's.

RX 6400 in PCIe 3.0 boards.
 
No, the GTX 1630 is a direct competitor to the RX 6400. Maybe Nvidia fears that AMD will sell a lot of RX 6400s and feels that the launch should get a response.

The red circle marks the performance estimate for a GTX 1630.

AMD Radeon RX 6500 XT Specs | TechPowerUp GPU Database

Sadly, according to the GPU database, the card's performance seems much worse than the RX 6400's; apparently it runs like my current GTX 1050 (non-Ti) 2 GB, but the GTX 1630 has 4 GB of VRAM.




This card seems more like a response to the Intel Arc A310 than to the RX 6400.

:)
 
I remember years ago when entry-level cards were under $100. And they usually had a 128-bit bus.

I almost feel like things are regressing instead of progressing.
 
well you know, gamers will consoom anything, and pay royally to do so. I'm surprised it took this long for card makers to catch on.
 
I wouldn't be suggesting this or the 6400 to gamers, I'll leave it at that.
 
The problem is that those of us with SFF builds that don't have PCIe power connectors have no real upgrade path available, especially since those of us on PCIe 3.0 systems don't want to reward AMD for the disaster that is the 6400.
 
Hi,
Sure you do: it's called a console, and they too have dropped in price.
 
A console is never an upgrade path for a PC user; it's not even a path at all, imo. It's just a box that you game on and then sell, or throw in the bin after a couple of years.

well you know, gamers will consoom anything, and pay royally to do so. I'm surprised it took this long for card makers to catch on.
It's not that simple. Game technologies haven't evolved so much that you couldn't game on a 4-5 year old graphics card. Like it's been said here, the GTX 10-series is still fine. If AMD and Nvidia can make the same 4-5 year old card at lower cost (the same performance from a smaller GPU, the same bandwidth from 64-bit GDDR6 as from 128-bit GDDR5), then they will, and I can't blame them. We can drool over reviews of 6800s and 3080s, but the thing is: the average gamer doesn't need them. Those cards are the product of the death of CrossFire and SLI, which is reflected in chip design as well: a Navi 23 is two Navi 24s, and a Navi 21 is two Navi 22s.
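The "64-bit GDDR6 matches 128-bit GDDR5" point is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming typical per-pin speeds of 8 Gbps for GDDR5 and 16 Gbps for GDDR6 (those rates are my assumption, not figures from the post):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# Assumed per-pin rates: 8 Gbps GDDR5, 16 Gbps GDDR6 (typical, not from the thread).

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

gddr5_128bit = bandwidth_gb_s(128, 8)   # older card, e.g. GTX 1050 class
gddr6_64bit = bandwidth_gb_s(64, 16)    # newer card, e.g. RX 6400 class

print(gddr5_128bit, gddr6_64bit)  # 128.0 128.0
```

Doubling the per-pin data rate exactly offsets halving the bus width, which is why the narrow new buses aren't automatically a regression in raw bandwidth.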
 
Can't you use integrated graphics as in Ryzen 7 5700G?
AMD Ryzen™ 7 5700G | AMD
The iGPU in that chip is around GT 1030 level in performance. If you want something better, you have the used 1050 Ti LP, the rare 1650 LP, or the 6400.
 
They would be better off with GA (Ampere) based "3030" versions, as a TU (Turing) based 1630 is about a year or two late! :cool:
 
Different graphical settings exist to tweak performance in such cases.

I know..

No one said that you need the highest of the highest even with a relatively cheap graphics card.

How much does lowering the settings help, though? You see a miserable 13 FPS on an RX 6400 on PCIe 3.0. I wouldn't have too high expectations...
 
That's at 1080p Ultra. Ultra gaming is dumb even on high-end GPUs, as it is extremely wasteful for the small visual quality gains over the next step down, and there are plenty of performance gains to be had going even lower. 1080p Medium would most likely be perfectly playable for AC:V on a 6400 on PCIe 3.0.

If you want to seriously game, you don't use a SFF.
Lol, tell that to my 5800X + 6900 XT in 13 litres. And you can quite easily fit something like an RX 6600 in a ~5-6 litre case if you know what you're doing, but it's not necessarily cheap or easy. The challenge is when your case only supports LP GPUs, or you have a low-power PSU without PCIe power connectors, which is what @AusWolf brought up. There's significant value in improving the performance of <75 W GPUs for those cases.
 
When I see this kind of framerate, I lose all motivation to look for that awful graphics card:

AMD Radeon RX 6400 Tested on PCI-Express 3.0 - Assassin's Creed Valhalla | TechPowerUp

Sadly, this card's performance isn't enough for current games, and that's more noticeable at 1080p (maybe it fares better at 720p). The same applies to other cards with similar performance, like the RX 6500 XT.

For current games, as you said, the RX 6600 is where 60 FPS begins.

In short, the GTX 1630 and similar cards are better suited to media PCs (for their decode capabilities) and light or older games.

:)
 
Sadly, according to the GPU database, the card's performance seems much worse than the RX 6400's; apparently it runs like my current GTX 1050 (non-Ti) 2 GB, but the GTX 1630 has 4 GB of VRAM.

This card seems more like a response to the Intel Arc A310 than to the RX 6400.

:)
TPU's estimate is wrong. It should be at least 15% higher than where they place it: around 89% of a 1050 Ti, and that's the worst-case scenario imo. (Not to mention that the 1050 isn't going to trail the 1050 Ti by only 20% in today's TPU test suite, due to its 2 GB of RAM.)


Isn't the Intel Arc A310 supposedly a 512-shader-core design? It should be a little slower than the 1630 despite its frequency advantage.
But if someone isn't interested in older games, and if the SRP is competitive, the Arc A310 is an interesting alternative thanks to its DirectX 12 Ultimate (feature level 12_2) support and better media engine. Plus, the performance gap will shrink each year as Intel's drivers mature and future games gain better support for the Arc architecture.
 
Nope, though I'd love to be wrong. Sub-$100 isn't viable. The 1030 launched at $80 because that's how much it needed to cost for anyone to make any money. There's no reason to expect its successor to cost the same or less five years later.
They must have got costs down a lot then; I paid £30 for mine brand new. It was also the GDDR5, not the DDR4, version.
 
Either that was a pricing error or some kind of clearance, as it was most definitely sold at a loss by someone. £30 isn't a viable retail price for any GPU; most likely the BOM plus production costs alone are higher than that, never mind shipping, amortization of design costs, etc.
 
Should've released this 2 years back during Covid, but Nvidia's gonna do Nvidia :shadedshu:
I don't see an above 100mm² 12nm TSMC design being able to match it in price, let alone the 200mm² TU117 based one.
It depends on the binning: if they have lots of defective TU117 dies to throw away, then they'd better sell them as this instead. Not to mention that (manufacturing) prices must have come down a lot for this old node.
 
I explained why, imo, it won't match the GT 1030's price; let's agree to disagree.

Your argument is that if they have lots of TU117 dies to throw away, they'd better sell them. What, have they secretly been stockpiling that much excessively defective silicon for the last three years? TSMC 12nm is one of the highest-yield processes of the post-FinFET era, especially Nvidia's custom 12FFN, which emphasizes density and yield over frequency; it allowed Nvidia to attempt a 754 mm² consumer monolith, while TU117 is only 200 mm². Whatever excessively defective stock exists is so small that the workstation market would have easily absorbed it. But let's set that aside; I don't want a meaningless argument over nothing.

OK, let's suppose that is the case: they have lots of defective dies (cut in half) and must sell them. What would force them to match the GT 1030's SRP? Is there competition? Back in Q2 2017, when the GT 1030 launched, the RX 560 2GB was $99, and the RX 560 2GB is around 1.8x faster than the GT 1030 at 1080p High. Now the RX 6400 is $160 and will be, at best, 1.5x faster than the GTX 1630 4GB in today's 1080p TPU test setup; and let's not even talk about its -14% PCIe 3.0 deficit. So is there any reason for Nvidia to sell it for less than $119? Unless it takes into account the price compression the current ≤$499 lineup will face after the Ada Lovelace and RDNA3 launches; and let's not forget Intel's Arc series: with the delays they are facing, Intel will be forced to price their entire lineup against next-gen competition. Still, $99 is the floor imo, due to the die size and process cost differences. The die size difference alone is enough to support my case; I would also argue the node pricing situation isn't what you have in mind, but again, let's agree to disagree.
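The competitive-pressure argument can be made concrete with a quick value calculation. The 1.8x/1.5x relative-performance figures and the $99/$80/$160/$119 prices are taken from the posts in this thread; the "performance per dollar" framing is just my illustration:

```python
# Back-of-the-envelope value comparison: how much better value the AMD
# competitor offered vs. the Nvidia entry card, then and now.
# Figures ($99, $80, $160, $119, 1.8x, 1.5x) come from the thread posts.

def value(relative_perf: float, price_usd: float) -> float:
    """Relative performance per dollar."""
    return relative_perf / price_usd

# 2017: RX 560 2GB at $99 vs GT 1030 (baseline 1.0x) at $80
advantage_2017 = value(1.8, 99) / value(1.0, 80)

# 2022: RX 6400 at $160 vs a GTX 1630 (baseline 1.0x) at a hypothetical $119
advantage_2022 = value(1.5, 160) / value(1.0, 119)

print(round(advantage_2017, 2), round(advantage_2022, 2))  # 1.45 1.12
```

So AMD's value lead over the entry Nvidia card would be roughly 45% in 2017 but only about 12% now, which is the point: there is far less pricing pressure forcing the GTX 1630 below $119.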
 