
NVIDIA GeForce RTX 4050 "Ada" Launches This June

Yeah, but it's slower than the 6400, and the 1050 Ti doesn't support VRR.


Yes, but still no VRR or 10-bit on smart TVs.
Do smart TVs even do 10-bit? I thought HDMI TVs were 8-bit, or does it depend on the version/year?
 
Do smart TVs even do 10-bit? I thought HDMI TVs were 8-bit, or does it depend on the version/year?
Yes. With video cards you need either HDMI 2.1 (only newer AMD cards have it) or a DisplayPort-to-HDMI adapter.

The smart TV needs to be HDMI 2.1 as well, so only modern smart TVs have 10-bit and even 12-bit.
 
Yes. With video cards you need either HDMI 2.1 (only newer AMD cards have it) or a DisplayPort-to-HDMI adapter.

The smart TV needs to be HDMI 2.1 as well, so only modern smart TVs have 10-bit and even 12-bit.
Then you'll need to wait for lower-end RX 7000-series cards, unless you want a 7900 XT in your HTPC. Nothing else has HDMI 2.1 at the moment.
 
Then you'll need to wait for lower-end RX 7000-series cards, unless you want a 7900 XT in your HTPC. Nothing else has HDMI 2.1 at the moment.

Never ever. I'm done with AMD: their drivers suck, there's no 10-bit and/or VRR on smart TVs, and now they broke hardware playback in Kodi.
 
Never ever. I'm done with AMD: their drivers suck, there's no 10-bit and/or VRR on smart TVs, and now they broke hardware playback in Kodi.
You're done with AMD, the only company that has a card with HDMI 2.1 that you admitted you need. I'm not sure I understand, but good luck, I guess?
 
You're done with AMD, the only company that has a card with HDMI 2.1 that you admitted you need. I'm not sure I understand, but good luck, I guess?
You don't get it, do you? You can get HDMI 2.1 through a DisplayPort 1.4a adapter; all video cards have that port.
 
You don't get it, do you? You can get HDMI 2.1 through a DisplayPort 1.4a adapter; all video cards have that port.
DP to HDMI conversion isn't flawless, and I personally wouldn't rely on it. If you manage to make it work, good on you. I just wouldn't say it's the card's fault when it doesn't work, or when you don't get the signal that you wanted, as it wasn't the card's intended purpose in the first place.
 
DP to HDMI conversion isn't flawless, and I personally wouldn't rely on it. If you manage to make it work, good on you. I just wouldn't say it's the card's fault when it doesn't work, or when you don't get the signal that you wanted, as it wasn't the card's intended purpose in the first place.
Yeah, that's why my old 1050 Ti worked way better than the new 6400: with that card, DisplayPort to HDMI got me 10-bit, and hardware decoding in Kodi was working. No VRR, because the 10-series didn't have it.

Keep your AMD; I'll be back to NVIDIA as soon as they release their 4050.
 
AMD thought it was wise to release the RX 6500 XT with only 4 GB. I don't think it worked out sales-wise.
It's not the 4 GB of memory that hurts the 6500 XT, it's the limited PCIe bandwidth: x4 just doesn't cut it on PCIe 3.0 or below.
 
Both, actually.
If it had more VRAM, it wouldn't hit the PCIe bus so much.
It's also horribly bandwidth-starved even when utilization is below 4 GB.

That card really should have had 6 GB on a 96-bit bus or 8 GB on a 128-bit bus, with an x8 connection.
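For a rough sense of the numbers behind the x4 complaint, here's a back-of-the-envelope Python sketch. The per-lane rates are the standard PCIe spec figures; only line-coding overhead is accounted for, so real-world throughput is a bit lower still:

```python
# Approximate PCIe link bandwidth after 128b/130b line coding.
def link_bandwidth_gb_s(gen: int, lanes: int) -> float:
    gt_per_lane = {3: 8.0, 4: 16.0}[gen]          # GT/s per lane for PCIe 3.0 / 4.0
    return gt_per_lane * lanes * (128 / 130) / 8  # convert to GB/s

for gen, lanes in [(3, 4), (4, 4), (3, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{link_bandwidth_gb_s(gen, lanes):.1f} GB/s")

# PCIe 3.0 x4:  ~3.9 GB/s  -> what the 6500 XT gets in an older board
# PCIe 4.0 x4:  ~7.9 GB/s  -> what it gets in a PCIe 4.0 board
# PCIe 3.0 x16: ~15.8 GB/s -> what a typical x16 card gets even on PCIe 3.0
```

An x4 link on a PCIe 3.0 board is roughly a quarter of what an x16 card gets, which is why any spillover past the 4 GB of VRAM hurts so much there.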
 
It's also horribly bandwidth-starved even when utilization is below 4 GB.
Is there a way to measure that? It has about half the GPU horsepower of the 6650 XT (which is a generally well-received card), and also half the memory bandwidth, which sounds about right to me.

I agree on the x8 connection, though.
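On putting a number on that memory-bandwidth gap, the spec-sheet arithmetic is simple; the bus widths and GDDR6 speeds below are the published figures as I recall them, so treat them as approximate:

```python
# Memory bandwidth = (bus width in bytes) x (effective data rate per pin in Gbps).
def mem_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

print(mem_bandwidth_gb_s(64, 18.0))   # RX 6500 XT:  64-bit @ 18 Gbps   -> 144 GB/s
print(mem_bandwidth_gb_s(128, 17.5))  # RX 6650 XT: 128-bit @ 17.5 Gbps -> 280 GB/s
print(mem_bandwidth_gb_s(96, 18.0))   # hypothetical 96-bit card        -> 216 GB/s
```

144 GB/s versus 280 GB/s is indeed roughly half, and the hypothetical 96-bit configuration mentioned above would have landed in between.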
 
Never ever. I'm done with AMD: their drivers suck, there's no 10-bit and/or VRR on smart TVs, and now they broke hardware playback in Kodi.
Really? My old RX 480 did 10-bit colour, at least I'm pretty sure it did. Although it didn't matter, because I only had/have an 8-bit monitor, and the Windows desktop is 8-bit anyway.
 
When I had my 2080 Ti (HDMI 2.0), it was capable of doing 10-bit color on the TV (HDMI 2.1). It just couldn't do it at 2160p 120 Hz, but 1080p 120 Hz with 4:4:4 chroma and 10-bit color worked fine. This was on my 2021 Samsung 9-series QLED TV. Their higher-end QLED TVs are all G-Sync compatible too, but you can't find Samsung documentation about it anywhere except for some random articles online, which is ridiculous IMO. It would be a great selling point for their TVs. But if you call Samsung and ask, they'll tell you it is G-Sync compatible. And it is, because I've been using it with 30-series GPUs lol.
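For context on why 2160p 120 Hz didn't work over HDMI 2.0 but 1080p 120 Hz did, here's a rough pixel-rate calculation. It ignores blanking intervals, so actual requirements are somewhat higher, but the comparison holds:

```python
# Raw pixel data rate for 4:4:4 video, ignoring blanking intervals.
HDMI_2_0_EFFECTIVE_GBPS = 14.4   # 18 Gbps TMDS minus 8b/10b encoding overhead

def pixel_rate_gbps(width, height, fps, bits_per_channel):
    return width * height * fps * bits_per_channel * 3 / 1e9   # 3 channels at 4:4:4

for name, w, h, fps, bpc in [
    ("2160p 120 Hz 10-bit", 3840, 2160, 120, 10),
    ("2160p  60 Hz  8-bit", 3840, 2160, 60, 8),
    ("1080p 120 Hz 10-bit", 1920, 1080, 120, 10),
]:
    rate = pixel_rate_gbps(w, h, fps, bpc)
    verdict = "fits" if rate <= HDMI_2_0_EFFECTIVE_GBPS else "exceeds"
    print(f"{name}: ~{rate:.1f} Gbps -> {verdict} HDMI 2.0")

# 2160p 120 Hz 10-bit: ~29.9 Gbps -> exceeds HDMI 2.0 (needs HDMI 2.1 or chroma subsampling)
# 2160p  60 Hz  8-bit: ~11.9 Gbps -> fits HDMI 2.0
# 1080p 120 Hz 10-bit:  ~7.5 Gbps -> fits HDMI 2.0
```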
 
Really? My old RX 480 did 10-bit colour, at least I'm pretty sure it did. Although it didn't matter, because I only had/have an 8-bit monitor, and the Windows desktop is 8-bit anyway.

I said with smart TVs; with monitors it doesn't matter (if you have drivers for your monitor). The problem with smart TVs is that they don't have Windows drivers, but NVIDIA cards manage to make it work.
 
It had better be faster and better performance/$ than the 3050, BECAUSE THE 3050 IS TERRIBLE.

Anyone with a modicum of sense bought a Radeon 6600, which is a hands-down winner in every possible way - it's like 60% faster at the same price point and gives the more expensive 3060 12G reason to sweat.

If you hate team red for whatever reason, you could have thrown your 3050 money at the used market and picked up a 2060S for less. Not only does it have waaaaay higher performance, but you'd still likely save enough cash to buy one of those giant foam middle fingers used at stadium events and mail it to Jensen's home address. The 4GB variant of the 3050 isn't even worth mocking; it's not PC to make fun of stillborn infants.
 
Yeah, but it's slower than the 6400, and the 1050 Ti doesn't support VRR.
The RX 6400 came out over five years later. VRR is stupid in my opinion; it always causes capture cards to run out of sync and just causes so many unneeded problems. Just use a fixed refresh rate.

That's a full height card with a power connector.


The RX 6400 does it too, without a power connector and also in low-profile flavours, while being a lot faster for essentially the same price.
Yeah, I unfortunately realized that as soon as I got the card, which baffles me: an 8-pin means the card can draw up to 225 W, but this GPU's default power limit of 55 W is about a quarter of that, so it did not need that 8-pin. It's okay tho, I found a way to hide the cable on such a short card lol
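The power-budget arithmetic being referred to, using the standard spec limits (75 W from the slot, 150 W from an 8-pin connector):

```python
# Standard PCIe power budget for a card with a single 8-pin connector.
SLOT_W = 75          # PCIe slot can supply up to 75 W
EIGHT_PIN_W = 150    # an 8-pin connector is rated for 150 W

budget = SLOT_W + EIGHT_PIN_W      # 225 W total
default_limit = 55                 # W, the card's default power limit quoted above

print(budget)                      # 225
print(default_limit / budget)      # ~0.24 -> roughly a quarter of the budget
print(default_limit <= SLOT_W)     # True -> the slot alone could have powered it
```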

As for the RX 6400: it came out in 2022, while the GTX 1050 Ti came out in late 2016. Congrats, AMD made a card over five years later that can do everything the 1050 Ti can do, and it still can't encode (because it's derived from an ultrabook dGPU). So it's a pretty useless graphics card if you ask me.
 
The RX 6400 came out over five years later. VRR is stupid in my opinion; it always causes capture cards to run out of sync and just causes so many unneeded problems. Just use a fixed refresh rate.

Stupid or not, it's useful for video playback: some videos are at 24 fps, others at 50 fps, and in those cases VRR helps.

Even with a fixed refresh rate, those videos judder.
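A quick sketch of why 24 fps and 50 fps content looks uneven on a fixed 60 Hz display: the frames can't all be shown for the same number of refreshes, so the repeat pattern is irregular. This is just a simple timing simulation, nothing vendor-specific:

```python
# How many refreshes each video frame stays on screen at a fixed refresh rate.
def repeat_pattern(video_fps, refresh_hz, frames=8):
    shown, pattern = 0, []
    for i in range(1, frames + 1):
        total = int(i * refresh_hz / video_fps)   # refreshes elapsed by frame i's end
        pattern.append(total - shown)
        shown = total
    return pattern

print(repeat_pattern(24, 60))    # [2, 3, 2, 3, ...]    -> classic 3:2 judder
print(repeat_pattern(50, 60))    # [1, 1, 1, 1, 2, ...] -> every fifth frame doubled
print(repeat_pattern(24, 120))   # [5, 5, 5, 5, ...]    -> even, so it looks smooth
```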
 
some videos are at 24 fps, others at 50 fps, and in those cases VRR helps.

Even with a fixed refresh rate, those videos judder.
VRR has a minimum refresh rate; I don't think a lot of monitors can go below 40 Hz. Though if you set your monitor to 120 or 240 Hz, it will sync properly with 24, 30, and 60 fps video! I still don't understand why YouTube allows such different FPS numbers though...
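To put numbers on the "set it to 120 or 240 Hz" suggestion, here's a trivial check of which common video frame rates divide a given refresh rate evenly; an even division means every frame is shown for the same number of refreshes, so playback looks smooth even without VRR:

```python
# Which common video frame rates divide a refresh rate evenly?
video_fps = [24, 25, 30, 50, 60]

for refresh in (60, 120, 144, 240):
    even = [fps for fps in video_fps if refresh % fps == 0]
    print(f"{refresh} Hz evenly fits: {even}")

# 60 Hz  evenly fits: [30, 60]
# 120 Hz evenly fits: [24, 30, 60]
# 144 Hz evenly fits: [24]
# 240 Hz evenly fits: [24, 30, 60]
# 25/50 fps (PAL) content fits none of these; it needs 50/100/200 Hz instead.
```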
 
VRR has a minimum refresh rate; I don't think a lot of monitors can go below 40 Hz. Though if you set your monitor to 120 or 240 Hz, it will sync properly with 24, 30, and 60 fps video! I still don't understand why YouTube allows such different FPS numbers though...
Who's talking about monitors? I'm talking about smart TV VRR, not FreeSync or G-Sync, which some smart TVs have.

My smart TV works at 30 and 24 Hz; at least in Kodi, when I choose to use the real framerate, it lets me set those refresh rates.
 
As for the RX 6400: it came out in 2022, while the GTX 1050 Ti came out in late 2016. Congrats, AMD made a card over five years later that can do everything the 1050 Ti can do, and it still can't encode (because it's derived from an ultrabook dGPU). So it's a pretty useless graphics card if you ask me.
Except that it's a hell of a lot faster, it needs less power (50-something W instead of 75), and it has HDMI 2.0, which can drive 4K 60 Hz displays. I'm using mine as an HTPC card, so for me it's quite useful (well, it's actually a 6500 XT, but the point remains).
 