
What's the downside of using a 2nd GPU for a single display?

System Name: Bessy 6.0
Processor: i7-7700K @ 4.8GHz
Motherboard: MSI Z270 KRAIT Gaming
Cooling: Swiftech H140-X + XSPC EX420 + Reservoir
Memory: G.Skill Ripjaws V 32GB DDR4-3200 CL14 (B-die)
Video Card(s): MSI GTX 1080 Armor OC
Storage: Samsung 960 EVO 250GB x2 RAID0, 940 EVO 500GB, 2x WD Black 8TB RAID1
Display(s): Samsung QN90a 50" (the IPS one)
Case: Lian Li something or other
Power Supply: XFX 750W Black Edition
Software: Win10 Pro
So I have a monitor (it's a TV, actually) that supports HDMI 2.1 (4K@120Hz w/VRR), but I'm still using a GTX 1080 with HDMI 2.0, so it'll only do 60Hz@4K. I've been running at 1080p so I can get 120Hz and VRR.

Is there a downside (besides power consumption) to using a 2nd GPU to drive the monitor but still run games on the 1080? Latency? Image quality? Anything?

I was thinking of getting the cheapest HDMI 2.1 capable card I can find for this and I have no desire, or need, to upgrade to a more powerful GPU at this time.

Also, would I run into trouble if the 2nd GPU is AMD with the primary being Nvidia?

Thanks :)

(and yes, I tried to look this up but it kept taking me to dual-monitor articles/posts)
 
Hybrid graphics is not trivial. I don't think that is possible with another dGPU, but others should know more. Please correct me if I'm wrong.

Your best option right now might be either to use the new GPU on its own, or to weigh how much you really want 4K@120Hz. Is there no DisplayPort on your TV?
 

No DisplayPort on the TV, but it just hit me: they have DisplayPort to HDMI 2.1 adapters. Is there a downside to that? The GTX 1080 has DP 1.4.
 
GTX 1080 with HDMI 2.0 so it'll only do 60hz@4k
I wouldn't know where the settings are for Nvidia because I've been using AMD cards for so long, but you should be able to use 4:2:0 chroma subsampling (often loosely called compression) to get 4K 120Hz out of an HDMI 2.0 port. Dropping the chroma to 4:2:0 is enough to fit 4K 120Hz into HDMI 2.0's bandwidth. I would think the option is wherever you select color settings in Nvidia's control panel.

I did this for a while with an RX 5700 XT (HDMI 2.0 port) going into a 4K 120Hz TV (HDMI 2.1 port). I did buy a new cable for it, though I don't recall if that was actually necessary; I doubt it was, because the whole point of 4:2:0 is to fit the signal into the available bandwidth of a 2.0 port/cable. I got an 8K 60Hz capable cable anyway since I knew I was eventually going to upgrade to a GPU with a 2.1 port.
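For anyone curious, here's the rough back-of-the-envelope math on why 4:2:0 is what makes this fit. These are my own ballpark numbers (I'm assuming roughly 14.4 Gbit/s of usable HDMI 2.0 data rate and counting active pixels only), so treat it as a sketch rather than gospel:

# Rough HDMI 2.0 bandwidth check for 4K 120Hz, active pixels only.
# Assumption: ~14.4 Gbit/s of usable video data (18 Gbit/s TMDS with 8b/10b encoding).
def gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

print(f"4:4:4 8-bit: {gbps(3840, 2160, 120, 24):.1f} Gbit/s")  # ~23.9, doesn't fit
print(f"4:2:0 8-bit: {gbps(3840, 2160, 120, 12):.1f} Gbit/s")  # ~11.9, fits under ~14.4

Real timings add blanking on top of the active pixels, so the 4:2:0 case sits closer to the ceiling than it looks here, but it does fit, which is the whole trick.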


Also, would I run into trouble if the 2nd GPU is AMD with the primary being Nvidia?
I've never mixed AMD/Nvidia on Windows before. I did it in macOS with a 2009 Mac Pro, but GPU drivers are very different in that ecosystem. Likely not for the faint of heart to set up in Windows, if it's even possible.

No Displayport on the TV, but it just hit me, they have Displayport to HDMI 2.1 adapters. Is there a downside to that?
I would do the 4:2:0 method I mention above instead. It's cheaper (free) and there are no compatibility worries. I looked into those adapters for the same exact reason when they hit the market a few years back, but obviously went with the free option... because free, lol.
 
No Displayport on the TV, but it just hit me, they have Displayport to HDMI 2.1 adapters. Is there a downside to that? The GTX 1080 has DP 1.4.
Other than the fact that the vast majority of those adapters (not sure about the ones claiming DP 1.4 to HDMI 2.1) don't appear to actually support 4K@120Hz, I don't see much else. I have never used one myself.
 
Doing a quick search, I see that though the 4:2:0 route is possible, you lose HDR and VRR. It seems you lose VRR with an adapter as well (if it works at all!). That sucks.
 
I don't get it. You want a cheap video card to use as a sort of pass-through for your GTX 1080, only to add HDMI 2.1 out? I don't think that's possible.
 
Flipping the GPU on eBay and buying a new one would be the best solution.
 
Is there no DisplayPort on your TV?
I've never seen any TV with DP. They are always HDMI.

Doing a quick search I see that, though this is possible, you lose HDR and VRR. It seems you lose VRR with an adapter as well. That sucks.
That is definitely wrong about losing VRR with 4:2:0. I used VRR like that, with 4:2:0, and at the very least I know VRR was active all the time in every game. HDR I only used in a couple of games, and I may have been running at 1440p because those games were too demanding to hit 120+ FPS (Doom Eternal, Hitman III and the like), but I want to say HDR worked at 4K 120Hz too. Maybe HDR is too much for that bandwidth even with 4:2:0; I don't recall exactly. But VRR was super important to me and the main reason I got the TV (a Samsung QN90A). I wouldn't know about the adapter, but your quick search is wrong about VRR over 4:2:0 (it might be right about HDR, though).

Only HDR might be a lost capability at 4K 120Hz. Any lower resolution would absolutely also fit HDR, like 3200x1800 at 120Hz for example.
 
I don't get it. You want a cheap video card to use as a sort of a pass-through for your GTX1080, only to add HDMI 2.1 out? I don't think that's possible.
I assumed it was, since Windows can now select which GPU is used per app. I have the integrated graphics enabled and can still run programs on it with no monitor connected to it:
(attached screenshot: graphicspref.jpg)
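If anyone wants to script that instead of clicking through Settings, my understanding (an assumption worth double-checking, I haven't verified it against any docs) is that the Settings > System > Display > Graphics page just stores a per-exe entry in the registry, something like this sketch:

# Sketch: set a per-app GPU preference by writing the registry value the
# Settings > Graphics page appears to use. The key path and "GpuPreference=2;"
# (2 = high performance) are assumptions; the exe path below is made up.
import winreg

exe_path = r"C:\Games\SomeGame\game.exe"  # substitute the real game executable

key = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                       r"Software\Microsoft\DirectX\UserGpuPreferences")
winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)

The Settings UI is obviously the saner way to do it; the point is just that the per-app choice is a simple per-exe mapping, which is why it can point at a GPU that has no monitor attached.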

Actually, I meant that VRR won't work with a DP-to-HDMI adapter (but HDR will). This is what I've read so far; I have more research to do.
 
I assumed it was since windows can now select which GPU is used per-app. I'm running with the integrated graphics and can still run programs on it with no monitor connected to it:
View attachment 364626
I meant it's not possible to output games rendered on your GTX 1080 through the second card. Again, not sure that's what you were asking; that's just what I understood.
 
Hybrid graphics is not trivial. I don't think that is possible with another dGPU, but others should know more. Please correct me if I'm wrong.

Your best option right now might be either to use the new GPU on its own, or to weigh how much you really want 4K@120Hz. Is there no DisplayPort on your TV?

Perfectly possible nowadays. Since Windows 7, multi-vendor configurations have been explicitly supported, as multiple graphics drivers can be loaded at the same time. What you cannot do is load two versions of the same driver (e.g. 391.35 for Fermi alongside a current driver version); however, you can have the AMD and NVIDIA drivers coexist and be active at the same time. Some machines even require this by design (such as my AMD+NV laptop).
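If you want to sanity-check what Windows actually has loaded, here's a quick sketch (it just shells out to PowerShell, so it assumes a normal Windows install, nothing vendor-specific). With a mixed AMD + NVIDIA setup you should see both adapters listed, each with its own driver version:

# List the display adapters and driver versions Windows currently has active.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object Name, DriverVersion, Status | Format-Table -AutoSize"],
    capture_output=True, text=True)
print(result.stdout)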

I assumed it was since windows can now select which GPU is used per-app. I'm running with the integrated graphics and can still run programs on it with no monitor connected to it:
View attachment 364626


Actually I meant that VRR won't work with an adapter (but HDR will). This is what I read so far, I have more research to do.

Indeed, it can be done. There are no downsides other than perhaps a struggle for system resources (PCIe bandwidth), depending on your PC specs; sometimes that might not be worth losing 8 of the primary GPU's lanes for, though depending on the hardware you have, it may not matter either. However, expect a slight hit in rendering performance when operating your GPU headless, as Windows has to copy the output to the framebuffer of the adapter connected to the display. Worst case scenario is around 10%; this is the same as on laptops that do not have a MUX switch.
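Rough numbers on the copy itself, for perspective. My assumption here is one finished 32-bit RGBA frame copied across PCIe per refresh, compared against nominal one-way link rates:

# Bandwidth to shuttle a finished 4K 120Hz frame from the render GPU to the
# GPU that owns the display, assuming one 32-bit RGBA copy per refresh.
width, height, refresh, bytes_per_px = 3840, 2160, 120, 4
print(f"frame copies: ~{width * height * refresh * bytes_per_px / 1e9:.1f} GB/s")  # ~4.0 GB/s

# Nominal one-way PCIe bandwidth for comparison
for link, gb_s in {"PCIe 3.0 x8": 7.9, "PCIe 3.0 x16": 15.8, "PCIe 4.0 x16": 31.5}.items():
    print(f"{link}: ~{gb_s} GB/s")

If those assumptions hold, the raw copy fits comfortably even on an x8 link; the hit is more about the extra hop and synchronization than about bandwidth.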

I meant it's not possible to output games rendered on your GTX1080 through the second card. Again, not sure that's what you were asking, that's what I understood.

Headless operation is possible and explicitly supported on Windows 10 and newer. The mobile RTX 3050, in fact, is designed around this: it does not support a MUX switch, so the laptop panel is connected to the integrated graphics and the dedicated GPU's output is handled by Windows.
 
On a side note: I was just looking at benchmark comparisons, and it seems it will cost at least $300 to match the performance of this 7-year-old GTX 1080 with an HDMI 2.1-capable card (I would never buy a used GPU; that's like buying an open jar of mayo, ya just never know, lol). Either that 1080 was an incredible value at $550, or we haven't come very far. Then again, that's around $700 in 2024 dollars, but still...

$130 just for HDMI 2.1 (cheapest new 2.1 card I could find) isn't too appealing either.
 

There has been massive progress since Pascal on all fronts, including performance; what happened in the meantime was the unfortunate death of the low-end GPU and prices that never truly recovered after Covid. I have a 1070 Ti and it's almost rudimentary next to my RTX 4080, to the point that I want to grab a 50-series card (hopefully a 5090) and just retire the 4080 to secondary duty instead of selling it.
 
After screwing around with this, I realized that I can indeed run 4K@120Hz with HDR (and VRR works fine). I'm assuming this capability is what the "Plus" means in "Input Signal Plus". It's limited to 8-bit color and 4:2:0, though. The 4:2:0 I can handle, but 8-bit banding gets on my nerves. Crap. :wtf:
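For what it's worth, the same rough math as earlier in the thread suggests why it tops out at 8-bit (again assuming ~14.4 Gbit/s usable on HDMI 2.0 and counting active pixels only, so just a sketch):

# 4K 120Hz with 4:2:0 over HDMI 2.0: 8-bit vs 10-bit, active pixels only.
pixels_per_second = 3840 * 2160 * 120
for label, bits_per_px in (("8-bit 4:2:0", 12), ("10-bit 4:2:0", 15)):
    print(f"{label}: {pixels_per_second * bits_per_px / 1e9:.1f} Gbit/s vs ~14.4 Gbit/s available")

10-bit lands right at or over the line before blanking is even counted.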
 

Glad you checked it out!

Here is a pointer for HDR with these Samsung TVs (assuming you have one too, since you mention "Input Signal Plus"). I thought HDR looked like SHIT on my TV until I realized the color space settings were wacky: it was not using the correct color space for SDR mode. It was using the "Native" color space, which totally over-saturates everything in SDR, so when I used HDR it looked much, much duller and darker by comparison. The TV shipped like this because they want the "o0o0o0o0.....aaahhHhHhH" effect when it's on display in a store; generally, people will think the brighter/over-saturated TV is better. It's kind of like music: even if a mix is worse, a lot of people will say the louder one is better.

When I change the color space for SDR to either Auto or Custom, it looks appropriate for SDR. If you are used to how over-saturated SDR was before, it will now look a little washed out by comparison, but it's accurate to the source material and makes for an appropriate difference when switching into HDR.
When in HDR mode, it needs to use the Native color space.
With those settings set correctly, SDR and HDR modes both look great.

Wanted to share that because I did not discover the cause of this until at least a year after getting my TV, lol. My TV saves separate settings for each picture mode (Game, Film/Movie, etc.) and even separately for SDR and HDR, so this lets me set the color space appropriately whether a game is running in SDR or HDR. I hope your model works the same way!

My eureka moment was somehow stumbling onto this article one day. It explains the phenomenon better than me regurgitating its points.

Oh, I also do NOT set my PC to be detected as a PC, because I think that makes some additional changes to the setup. I instead make the TV think my PC is a "Game Console", which still allows Game Mode (low input lag) and VRR, but the colors look better in my opinion. I may not be remembering the details, but there is definitely a reason I landed on this choice; it might have been HDR-related too. I want to say PC mode makes everything way too bright/cool. If you haven't played around with that, you should try it out too.
 
Actually, I was wrong. HDR does not work at 4K 120Hz.

When in PC mode, the color space is locked to Native. It has two picture modes though, Graphic and Entertain. Graphic seems fine, but Entertain feels like my retinas are getting burned out of my eyeballs, lol.

I'm not noticing any lag at all in PC mode though.

I realized that I'm pretty much at my limit on power (I like to stay as close to 50% load on the PSU as I can for efficiency) and I don't want to have to buy a new PSU to run two cards.

I'm just going to replace the card with an RX 7600 XT and sell the 1080 and the 1060 I have in a box here somewhere; plus it comes with two games I might be able to sell. It's about 15%-ish faster than the 1080 at stock, so what the hell. I might even spring for a 7700 XT, but I'll have to do some research to see if it's worth it.
 