
Why exactly is ATi considered to be better at HDMI?


newtekie1

[rant]

I wonder why exactly people think that ATi has the better HDMI. I see this said a lot, and it is just not true. It is always said that because ATi has a built-in audio device, it makes setting up HDMI on their cards easier, but the fact of the matter is that it is not easier.

What finally got me to realize this is that my main rig is connected to a 1080p HDTV. I rarely use this connection except for playing the occasional movie file that my PS3 can't handle via TVersity. Ever since I got the HDTV I've had nVidia cards in my main rig, 8800GTS/9800GTX/9600GT/GTX260/GTX285, so I've always used them to output via HDMI. It wasn't until I bought the HD4890 that I used an ATi card connected to my HDTV, and it wasn't until today that I realized what a pain in the ass setting up the "easier" ATi card's HDMI was.

First, let's look at getting the image on the TV:

With nVidia:
  • Open nVidia control panel.
  • Set the HDTV as output device.

With ATi:
  • Open CCC.
  • Set the HDTV as the output device.
    But wait...the image has huge black borders around it, despite having the proper resolution. Where is that stupid option that I only happen to know about because I've seen the issue here before...
  • Hunt through CCC and finally find the under/overscan option under Scaling Options.
  • Set it to 0%.

Why the hell was it not set to 0% by default? If that isn't the stupidest thing...
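
Just to put numbers on what that underscan default actually does (the 10% below is my own figure for illustration; the actual default percentage varies by driver version):

[code]
# What underscan does to a 1080p desktop: the image is drawn smaller
# inside the same 1920x1080 signal, and black borders fill the rest.
def underscanned(width, height, percent):
    scale = 1 - percent / 100.0
    return int(width * scale), int(height * scale)

print(underscanned(1920, 1080, 10))  # (1728, 972) -- borders, plus blurry scaling
print(underscanned(1920, 1080, 0))   # (1920, 1080) -- what it should default to
[/code]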

Now, let's get the sound working:

With nVidia:
  • Connect the S/PDIF cable from the sound card to the graphics card.
  • Select S/PDIF as the default sound output device.

With ATi:
  • Install another driver for the sound card on the graphics card.
  • Select Digital Audio (HDMI) as the default sound output device.
    Wait...I'm still not getting sound...Well, there is another playback device called ATi DP Output, but it says it is disconnected...hmmm...
  • Swap out HDMI adaptor.
  • Swap out HDMI adaptor again.
    Hey...that ATi DP Output says it is connected now...
  • Select ATi DP Output as the default sound output device.

Sorry, but I just don't see it being a better method.
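
If you're curious what Windows itself calls all of these outputs, Vista and 7 keep the playback endpoints in the registry under MMDevices. Here's a rough Python sketch that just walks that tree and dumps every readable string value it finds; I'm assuming the standard layout and deliberately not hard-coding any property GUIDs, so treat it as a sketch rather than gospel:

[code]
# List the playback endpoints Windows knows about, straight from the
# registry (Vista/7 MMDevices layout assumed).
import winreg

RENDER = r"SOFTWARE\Microsoft\Windows\CurrentVersion\MMDevices\Audio\Render"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, RENDER) as render:
    i = 0
    while True:
        try:
            endpoint = winreg.EnumKey(render, i)  # one GUID subkey per device
        except OSError:
            break  # no more endpoints
        i += 1
        path = RENDER + "\\" + endpoint + r"\Properties"
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as props:
            j = 0
            while True:
                try:
                    name, value, vtype = winreg.EnumValue(props, j)
                except OSError:
                    break
                j += 1
                if vtype == winreg.REG_SZ:  # readable names are plain strings
                    print(endpoint[-8:], "->", value)
[/code]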

Besides the need for an extra driver, which annoys me enough already, the fact that it is labelled totally wrong in the playback devices control panel AND the fact that I apparently need a special HDMI adaptor direct from ATi to use it really outweighs the nice factor of having a built-in sound card. I really hope I never lose that little adaptor, and I would have been really screwed if I had thrown it in the box I keep with all the other adaptors instead of keeping it in the box the card came in, which is actually rare for me.

Then add to that the video display issue, and needing to hunt through CCC to find the option, which is hidden pretty damn well, and I just can't say that ATi has the better/easier HDMI setup.

Sadly, it seems nVidia has gone the same way ATi has, with a built-in audio device directly on the card. That kind of sucks, because I'd rather use the real sound card in my computer. I'd especially be pissed if I had actually paid for an add-on sound card only to be forced to use what is essentially no better than the onboard card.

Now, I'm sure someone out there is thinking "well hey, why is using a special HDMI adaptor worse than using a special cable?" Well, I'll tell you: it is because I can go out and buy the cable and know before I even buy it that it will work. However, I can't buy an HDMI adaptor and know before I buy it that it will give me sound with my ATi card, unless it is stated somewhere on the packaging or on the site I'm buying it from.

[/rant]

Now don't get me wrong, I'm not saying nVidia's solution was better overall, especially for the normal consumer. The nVidia method has its faults also. Connecting the S/PDIF connector can be tricky for a normal consumer, or even someone with some computer knowledge. Especially with most manufacturers including little or no documentation on it, it kind of leaves you guessing about where and how to connect it. There are also motherboards and sound cards that do not have an S/PDIF header, cheaper motherboards especially.
 
NVidia cards since the GT200 series now have built-in audio as well, so drivers are needed for that too. Either way, I'd take that over needing to install the S/PDIF cable to the video card. I've seen some motherboards that didn't have an S/PDIF header from the onboard sound, so you'd be screwed there.
 
I don't have that issue with the black borders, and I didn't have to do anything for the audio to come through the HDMI either...My TV asks whenever something new is hooked up through the HDMI to do an initial setup; I just hit recommended and it's fine..

Possibly hit Cinema zoom or something on your TV..quite a few TVs have a max setting of 1680x1050 for PCs. ATI has an HDTV setting at 1920x1080 that makes that easier..
 
NVidia cards since the GT200 series now have built-in audio as well, so drivers are needed for that too. Either way, I'd take that over needing to install the S/PDIF cable to the video card. I've seen some motherboards that didn't have an S/PDIF header from the onboard sound, so you'd be screwed there.

Yeah, you caught me while I was editing. It really is preference; to me an extra driver is just another driver that can screw up, and the less than intuitive naming scheme that ATi uses doesn't help matters.

I don't have that issue with the black borders, and I didn't have to do anything for the audio to come through the HDMI either...My TV asks whenever something new is hooked up through the HDMI to do an initial setup; I just hit recommended and it's fine..

Possibly hit Cinema zoom or something on your TV..quite a few TVs have a max setting of 1680x1050 for PCs. ATI has an HDTV setting at 1920x1080 that makes that easier..

The TV isn't the issue, it is the ATi card and the drivers. The black borders are a common thing on ATi cards; we see topics about them quite often here, which is the only reason I knew about it.

And the audio issue was entirely due to the improper naming of the audio devices, and the requirement of a special HDMI adaptor. You probably never had a sound issue because you used the HDMI adaptor that came with the card.

The nVidia card automatically set the resolution to 1920x1080, just as the ATi card did, and the TV does support this resolution via HDMI. The problem was entirely due to ATi's drivers defaulting to underscan. The zoom feature on my TV zoomed in too far, cutting off half of the taskbar, which was worse than the black bars.
 
Well, I've been thinking of going NV myself, as I'm sick of HDMI and bad scaling in DX10. Although it depends from game to game: in some, if you turn AA on you get fullscreen, and off you get the incorrect scaling, but sometimes it's the other way around too lol. And some just will not scale right.

Which is another reason I am waiting till next year to see what that brings. Otherwise I would love to have another ATI card.
 
And the audio issue was entirely due to the improper naming of the audio devices, and the requirement of a special HDMI adaptor. You probably never had a sound issue because you used the HDMI adaptor that came with the card.


Wait..they give you the stuff, you don't use it, and complain?:wtf:


:laugh:

My TV and my HDMI monitors I have never had any issues with, audio or otherwise. It truly is EDID problems.
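
For anyone curious enough to check their own display: the base EDID block is just 128 bytes, and it's easy to sanity-check a dump yourself. A rough Python sketch, assuming you've already saved the EDID to a file with whatever tool you prefer (the command-line argument is just whatever path you dumped it to):

[code]
# Minimal EDID sanity check. Expects a file containing the raw 128-byte
# base EDID block, dumped with a monitor-info tool of your choice.
import sys

def check_edid(path):
    block = open(path, "rb").read()[:128]
    if len(block) < 128:
        print("File too short to be an EDID block")
        return
    # Every base EDID block begins with this fixed 8-byte header.
    if block[:8] != bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]):
        print("Bad header -- not a valid EDID dump")
        return
    # All 128 bytes must sum to zero modulo 256.
    if sum(block) % 256 != 0:
        print("Checksum failed -- the EDID is corrupt")
        return
    # Manufacturer ID: three 5-bit letters ('A' = 1) packed into bytes 8-9.
    word = (block[8] << 8) | block[9]
    mfg = "".join(chr(((word >> s) & 0x1F) + 64) for s in (10, 5, 0))
    print("Valid EDID block, manufacturer ID:", mfg)

check_edid(sys.argv[1])
[/code]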
 
I plug my HDMI cable into the card, then into my TV. Those are the steps that I have to do. All the steps you list make absolutely no sense to me.
 
I think you're more comfortable with Nvidia.
I'm so used to doing things ATI's way that whenever I get an Nvidia card I get frustrated.
I may have had some issues with ATI, but I'm so used to dealing with them that nothing is really that bad for me, except some really crappy drivers of late.
 
I plug my HDMI cable into the card, then into my TV. Those are the steps that I have to do. All the steps you list make absolutely no sense to me.

Sound works fine; the only issue for me is underscan when connecting to my TV.
 
I plug my HDMI cable into the card, then into my TV. Those are the steps that I have to do. All the steps you list make absolutely no sense to me.

They are all the steps required to get it to work. You HAVE to change the sound device. However, this might be accomplished automatically if the HDMI port is the only device connected (no speakers). Though this isn't really the issue, as you have to do it with nVidia cards also; it is more of a Windows problem, thanks to them not allowing sound over multiple outputs. The issues with sound are requiring special HDMI adaptors and the poor naming of the outputs. The one labelled HDMI should actually be the HDMI, not the other one labelled ATi DP Output, whatever the hell that means.

The underscan/overscan issue is widely known as a problem with ATi's drivers; sometimes it happens, sometimes it doesn't. It never happens with nVidia, though. If it doesn't make sense to you, someone that I consider to have quite a bit of knowledge of hardware and PCs, then I feel sorry for a normal consumer.

Also, I'm far from the only person with this problem: http://forums.techpowerup.com/showthread.php?t=121134

And yes, it is an annoying issue with games, with some games having no solution at all; apparently you just have to live with the black bars. Though I don't have this problem personally, since I don't game on my HDTV.

Wait..they give you the stuff, you don't use it, and complain?:wtf:


:laugh:

My TV and my HDMI monitors I have never had any issues with, audio or otherwise. It truly is EDID problems.

Yes, because:

1.) I can't use one of the several other adaptors that I have lying around. Instead I have to hunt for that one specific adaptor, if I can even find it (luckily I did).
2.) If I couldn't find it, it would be a pain to find one to buy.

Edit: Just for the hell of it, I pulled out my DVI to HDMI cable just to see what happens...yep, no sound...It works with nVidia cards though...
 
I plugged mine into my TV the day before last. It detected my TV and set the sound to go through HDMI automagically.

I did have to set it to underscan for the correct display though. No biggie.
 
I use my card to run hardware-accelerated video on my 46" Toshiba with no issues; I can run WMC with Netflix just fine, no issues with black borders. I did set the overscan manually, as the TV does a poor job of scaling, so I have a minute black border.
 

Yes, because:

1.) I can't use one of the several other adaptors that I have lying around. Instead I have to hunt for that one specific adaptor, if I can even find it (luckily I did).
2.) If I couldn't find it, it would be a pain to find one to buy.

Edit: Just for the hell of it, I pulled out my DVI to HDMI cable just to see what happens...yep, no sound...It works with nVidia cards though...

Well, the 5-series cards fix that, with onboard HDMI. But I hear ya on needing the right adapter...HD2xxx, HD3xxx, HD4xxx, all different ones. But you don't need one any more, and I have all 3 sitting right here next to my monitor.

Also, not all panels support HDCP at all resolutions, and this is an issue as well, and leads to some users getting black borders. My 3008WFP won't do 2560x1600 over HDMI with the 5-series, due to a lack of HDCP @ that res. It's the only issue I have, and it has NOTHING to do with the cards.
 
Well, the 5-series cards fix that, with onboard HDMI. But I hear ya on needing the right adapter...HD2xxx, HD3xxx, HD4xxx, all different ones. But you don't need one any more, and I have all 3 sitting right here next to my monitor.

Also, not all panels support HDCP at all resolutions, and this is an issue as well, and leads to some users getting black borders. My 3008WFP won't do 2560x1600 over HDMI with the 5-series, due to a lack of HDCP @ that res. It's the only issue I have, and it has NOTHING to do with the cards.

The onboard HDMI is definitely a great solution to the adaptor problem, and having mini-HDMI on the GTX400 series is completely stupid, because you still need a damn adaptor.:shadedshu

Though I believe there have been both nVidia and ATi cards previously with HDMI onboard, so the HD5000 series having it is not new, but it is new to have it as a standard on the card.

And HDCP is not the issue here either, the TV definitely supports HDCP at every resolution.
 
What is said is that ATI has better HDMI sound features.

While ATI has had an integrated Realtek codec inside their chips since the R600, nVidia resorts to external codecs to provide sound to the HDMI output.

Until their DX10.1 cards, nVidia relied on internal S/PDIF connectors.

Now, S/PDIF is old, ancient, wrinkled, etc.
It was meant for high-quality stereo back in the 80s, but Dolby adapted it to carry AC-3 in the early 90s, which is a lossy and highly compressed format.


Since the RV7xx, ATI's integrated codecs can do 8 discrete uncompressed 24-bit 96 kHz channels.
Just connect an ATI card to a 7.1 HDMI receiver and it will do studio-quality 7.1 channels in movies, games, music, anything.
For the Evergreen family, ATI paid the royalties to Dolby and DTS for pass-through of their new high-def formats. This means that you can connect the HTPC to the receiver and have the high-def sound decoded directly in the receiver, which will supposedly result in better compatibility and less audio delay while playing a Blu-ray.


Up till last year, nVidia cards were restricted to S/PDIF. This means that with a normal sound codec, an nVidia card could only do pre-cooked 5.1 (like DVD content) or 2-channel sound, which is too limiting for home cinemas.
With a Dolby Digital or DTS encoding sound codec, the S/PDIF could do 6 compressed 16-bit 48 kHz channels, which is 10 years behind ATI's offerings.
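
The raw bit-rate arithmetic makes the gap obvious (my math, using plain uncompressed LPCM for simplicity):

[code]
# Raw LPCM payload rates in Mbit/s: channels x bits x sample rate.
def lpcm_mbps(channels, bits, sample_rate_hz):
    return channels * bits * sample_rate_hz / 1e6

print("S/PDIF stereo PCM (2ch 16-bit 48 kHz): %.2f Mbit/s" % lpcm_mbps(2, 16, 48000))
print("ATI HDMI LPCM     (8ch 24-bit 96 kHz): %.2f Mbit/s" % lpcm_mbps(8, 24, 96000))
# Prints 1.54 vs 18.43 -- and AC-3 over S/PDIF is capped around 0.64 Mbit/s,
# which is why 5.1 over S/PDIF is always heavily compressed.
[/code]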

nVidia's "new" DX10.1 cards don't use the SPDIF connection anymore. Instead, they pass the entire audio stream through the PCI-Express connection. They still need the presence of an audio codec somewhere, be it in an actual soundcard or a motherboard codec, but it's a lot better than the rushed-in-and-clearly-poor SPDIF solution.
But it's a pain in the ass. The nVidia driver must recognize the sound codec and it must be properly configured, which may or may not happen..




But for HTPCs connected to a decent receiver and a multi-speaker setup, ATI's approach is by far the best.
Just connect the card to the receiver through HDMI, disable the motherboard's audio codec, and Windows will only recognize a single HDMI-driven 7.1 sound device. "It just works".
 
HDMI for a computer I just built using the integrated HD 2100 Xpress graphics was pretty much plug and play (after installing the chipset driver).
 
"It just works".

Clearly it doesn't.:roll:

Disabling the real sound card, be it the onboard or add-in variety, every time I want to use my TV with my PC to make it "just work" seems more convoluted than just selecting the proper output device...

It might be a good solution for someone that always uses the TV, but for a PC that is only occasionally connected to the TV it certainly isn't.
 
How do you get to that over/underscan option? And I've also had some strange problems w/ my ATi HDMI audio too. It would randomly switch my speakers around, so my back speakers were in front, the fronts went to the side, and the subwoofer disappeared lol
 
How do you get to that over/underscan option? And I've also had some strange problems w/ my ATi HDMI audio too. It would randomly switch my speakers around, so my back speakers were in front, the fronts went to the side, and the subwoofer disappeared lol

If you believe some people here, you don't have to..."It just works.":roll:

If you live in the real world, and obviously you do, you go to Catalyst Control Center, switch to Advanced if you haven't already, then go to Displays, right click on your HDTV and go to Properties; the under/overscan option will be under the Scaling tab. (Yeah, it is real easy to find.:rolleyes:)
 
It appears that it does not appreciate scaling on my TV, as the option isn't there. But I'm also running a resolution that my TV is not native to (it just looks and performs the best out of them; if I push it up to 1080i the text gets insanely misaligned).
 
It might be a good solution for someone that always uses the TV, but for a PC that is only occasionally connected to the TV it certainly isn't.

Then just go to the sound control panel and change the output. It's simple.
 
If you believe some people here, you don't have to..."It just works.":roll:

If you live in the real world, and obviously you do, you go to Catalyst Control Center, switch to Advanced if you haven't already, then go to Displays, right click on your HDTV and go to Properties; the under/overscan option will be under the Scaling tab. (Yeah, it is real easy to find.:rolleyes:)

It does "just work" for some people. You are now just making assumptions. I know it doesn't change the fact that you have a hard time with it, but I and others do not. It's truth. Accept it. :toast:
 
My HDMI (audio and video) has been working since I plugged in the cable for the first time, using the provided HDMI adapter from XFX. Yes, you have to set ATI DP Output as the default device, but for me that is really not all that difficult, and considering that I only turn it on when I'm going to watch a movie, I really don't find it annoying.

I find it much nicer that when I want to watch a movie on my TV with my headphones on (so as not to disturb the other people living in my house), I can disable the TV audio from the computer. (I don't have a remote for my current television.)


The problem at the outset of this thread is that you are lumping your personal, non-universal problems with setting up ATI HDMI into the same boat as the one functionality issue that could be a bit more intuitive and accessible.
 
Then just go to the sound control panel and change the output. It's simple.

Again, that is actually not what I'm complaining about; I don't know why everyone seems to think it is. I've mentioned several times that switching the sound output to another device through the Playback Devices manager is not an issue here, as you have to do it with nVidia also, and the problem is more related to Windows than to the graphics cards or their drivers.

I do have a problem with ATi's naming of those devices in the Playback Devices list, as what is labelled as HDMI is not actually what you select.
 