newtekie1
Semi-Retired Folder
- Joined: Nov 22, 2005
- Messages: 28,473 (4.00/day)
- Location: Indiana, USA
Processor | Intel Core i7 10850K @ 5.2GHz
---|---
Motherboard | ASRock Z470 Taichi
Cooling | Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory | 32GB DDR4-3600
Video Card(s) | RTX 2070 Super
Storage | 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) | Acer Nitro VG280K 4K 28"
Case | Fractal Design Define S
Audio Device(s) | Onboard is good enough for me
Power Supply | eVGA SuperNOVA 1000w G3
Software | Windows 10 Pro x64
[rant]
I wonder why exactly people think that ATi has the better HDMI. I see this said a lot, and it is just not true. The claim is always that because ATi has a built-in audio device, setting up HDMI on their cards is easier, but the fact of the matter is that it is not.
What finally got me to realize this is that my main rig is connected to a 1080p HDTV. I rarely use this connection except for playing the occasional movie file that my PS3 can't handle via TVersity. Ever since I got the HDTV I've had nVidia cards in my main rig (8800GTS/9800GTX/9600GT/GTX260/GTX285), so I've always used them to output via HDMI. It wasn't until I bought the HD4890 that I used an ATi card connected to my HDTV, and it wasn't until today that I realized what a pain in the ass setting up the "easier" ATi card's HDMI really was.
First, let's look at getting the image on the TV:
With nVidia:
- Open nVidia control panel.
- Set the HDTV as output device.
With ATi:
- Open CCC.
- Set the HDTV as the output device.
But wait... the image has huge black borders around it, despite being set to the proper resolution. Where is that stupid option that I only happen to know about because I've seen the issue here before...
- Hunt through CCC and finally find the underscan/overscan option under Scaling Options.
- Set it to 0%.
Why the hell was it not set to 0% by default? If that isn't the stupidest thing...
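Just to put numbers on what a nonzero default underscan does to the picture, here's a rough sketch. The 10% figure is purely an assumed example value; I only know the default wasn't 0%, not what it actually was.

```python
# Rough sketch: effective picture size at a given underscan setting.
# The 10% underscan used below is an assumed example, not CCC's
# confirmed default.
def underscanned(width: int, height: int, underscan_pct: float) -> tuple[int, int]:
    scale = 1 - underscan_pct / 100
    return round(width * scale), round(height * scale)

# A 1920x1080 signal at 10% underscan only fills about 1728x972 of the
# panel, which is exactly the kind of black border I was seeing.
print(underscanned(1920, 1080, 10))  # (1728, 972)
```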
Now, let's get the sound working:
With nVidia:
- Connect the S/PDIF cable from the sound card to the graphics card.
- Select S/PDIF as the default sound output device.
With ATi:
- Install another driver for the audio device on the graphics card.
- Select Digital Audio (HDMI) as the default sound output device.
Wait... I'm still not getting sound... Well, there is another playback device called ATi DP Output, but it says it is disconnected... hmmm...
- Swap out the HDMI adaptor.
- Swap out the HDMI adaptor again.
Hey... that ATi DP Output says it is connected now...
- Select ATi DP Output as the default sound output device.
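For anyone else hunting through oddly named playback devices like I was, here's a quick sketch that lists everything Windows exposes, mislabeled HDMI/DP outputs included. This uses the third-party sounddevice package, which is just one convenient way to enumerate them; it has nothing to do with how ATi's driver names things.

```python
# List all playback-capable devices so the oddly named HDMI/DP outputs
# are visible. Requires the third-party "sounddevice" package
# (pip install sounddevice).
import sounddevice as sd

for idx, dev in enumerate(sd.query_devices()):
    if dev["max_output_channels"] > 0:  # playback devices only
        print(f"{idx}: {dev['name']}")
```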
Sorry, but I just don't see it being a better method.
Besides the need for an extra driver, which annoys me enough already, the fact that it is labelled totally wrong in the playback devices control panel AND the fact that I apparently need a special HDMI adaptor direct from ATi to use it really outweighs the nice factor of having a built-in sound card. I really hope I never lose that little adaptor, and I would have been really screwed if I had thrown it in the box I keep all the other adaptors in instead of keeping it in the box the card came in, which is actually rare for me.
Then add to that the video display issue, and needing to hunt through CCC to find the option, which is hidden pretty damn well, and I just can't say that ATi has the better/easier HDMI setup.
Sadly, it seems nVidia has gone the same way ATi has, with a built-in audio device directly on the card. That kind of sucks, because I'd rather use the real sound card in my computer. I'd especially be pissed if I had actually paid for an add-on sound card only to be forced to use what is essentially no better than the onboard audio.
Now, I'm sure someone out there is thinking, "well hey, why is using a special HDMI adaptor worse than using a special cable?" Well, I'll tell you: it is because I can go out and buy the cable and know before I even buy it that it will work. However, I can't buy an HDMI adaptor and know before I buy it that it will give me sound with my ATi card, unless it is stated somewhere on the packaging or on the site I'm buying it from.
[/rant]
Now don't get me wrong, I'm not saying nVidia's solution was better overall, especially for the normal consumer. The nVidia method has its faults too. Connecting the S/PDIF cable can be tricky for a normal consumer, or even someone with some computer knowledge. With most manufacturers including little or no documentation on it, you're left guessing about where and how to connect it. There are also motherboards and sound cards that do not have an S/PDIF header at all, cheaper motherboards especially.