
Installing a 2nd GPU using a PCIe extension cable?

So, DaVinci Resolve Studio lets you configure one GPU to do only the GUI and one GPU to do only the compute, and I'd like to try this: the 1070 for GUI and the 3090 for compute.

The board is a B550M, which means a 3090 will sit over the 2nd PCIe slot, but using a PCIe extension cable I could extend that slot down and install the card there. Maybe rivet the cable to the back panel of the case and then use the support bracket? The 1070 doesn't weigh much, but maybe there's a better way than ghetto modding.

The case is a mid tower, so it should have the room to move that 1070 (in the picture) down, mounted to the extension cable.
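If I do try it, I figure I can sanity-check the split by watching both cards while a render runs. A rough sketch, assuming NVIDIA's nvidia-ml-py NVML bindings (nothing Resolve-specific; the GPU index order is just whatever the driver reports):

```python
# Rough sketch: poll per-GPU utilization while Resolve renders, to verify the
# GUI card (1070) stays near idle while the compute card (3090) does the work.
# Assumes NVIDIA's NVML Python bindings: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()
    for _ in range(30):  # sample once a second for ~30 s
        readings = []
        for i in range(count):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            if isinstance(name, bytes):  # older bindings return bytes
                name = name.decode()
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            readings.append(f"{name}: {util.gpu}% core, {util.memory}% mem")
        print(" | ".join(readings))
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```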



What's your opinion?
 

Attachments: 20241209_211835.jpg, wew.jpg
What he said.

Going to gimp your 3090 on PCIe lanes too.
 
The slot is only pinned out x4, but it should be fine as long as it's running at Gen 4.
I have a limited understanding of the 3090's VRMs and avoid the card entirely, but it's a perfectly fine compute card.
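If you want to confirm what link the riser-mounted card actually trains at, NVML exposes it. A quick sketch, again assuming the nvidia-ml-py bindings (check it under load, since the cards downshift the link when idle):

```python
# Quick sketch: report the current PCIe generation and lane width per GPU.
# Run it under load; NVIDIA cards drop the link speed at idle.
# Assumes NVIDIA's NVML Python bindings: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(h)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)
        width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)
        print(f"GPU {i} ({name}): PCIe Gen {gen} x{width}")
finally:
    pynvml.nvmlShutdown()
```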

 
I have a hard time believing the GUI is a heavy enough burden to have a dedicated GPU for it.
Cheap to try, though; it's only a cheap extension cable. And it makes a difference in the tests I've seen and in what editors say.

They also support multi-GPU. There must be a reason why they have these options.
 
What exactly does it improve? I haven't found anything to support these claims, unless you are simply talking about CPU preview rendering vs. GPU, in which case you still don't need a 2nd card just for that.

In fact, probably the opposite: a stronger card is required for 8K full resolution. I know Adobe Premiere struggled with a Titan Xp at 6K RED footage just a few years ago. I would assume whatever is not supported by the GTX 1070 will be offloaded to the CPU if you assign the RTX 3090 just to encoding.
 
I can't find the video from the channel "johns films". Export times were faster, and why wouldn't you want to use the feature if you could for a few bucks? I don't think I'll ever do 8K, and I'm not using Premiere.
 
Puget Systems did a benchmark of this with multi-GPU, and it's not worth it. You're better off with a single stronger card.

I also happen to know this because my wife uses this program, and I did some research just to find out it's not worth it.
 
Well, do what you want. Since you said it's free, install the 2nd card, and if you get no performance gains (or see decreases), you can take it back out.
 

Unless you have two RTX 4090s to spare, there is no point for 18% extra. And that's the best outcome, too.
 
Whole lotta extra power for a whole lotta no extra gain.
 
Well, the investment to try it is minimal; just the PCIe cable, and that's cheap. Nothing in the benchmark says anything about one GPU for GUI / one for compute. And when doing NR and some Fusion work, that's heavy on the GPU. Why wouldn't I want to get any performance I can?

28% is not little. People here spend tons yearly for a 10% gain.
 
18% for $5,000 is very little for the return. At that point I would rethink my entire setup.
 
How are you going to run an x16 card in an x4 slot? I do not think this is possible.

You can run x1 and x4 cards in an x16 slot, however, but I don't think it works the other way around.

I am interested and watching.

Do it.
 
What is the 18% / $5,000 you're talking about? 4090 cards?
Look at the Puget scaling: one vs. three RTX 4090s. That's an extra $5,000, assuming $2k each plus tax. Only 18% gains for that much is silly. The extra-card benefit goes down with lower-end cards, so expect zero gains from a 3090 + 1070. It might actually hurt performance because work gets offloaded to the wrong card.
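To make that concrete, here's the back-of-envelope math using the thread's own rough figures (nothing measured here):

```python
# Back-of-envelope using the numbers cited above: two extra RTX 4090s
# (~$2k each plus tax, ~$5,000 total) for Puget's ~18% best-case uplift.
extra_cost = 5_000       # two extra 4090s, rough street price + tax
best_gain_pct = 18       # best-case multi-GPU gain cited in the thread
print(f"~${extra_cost / best_gain_pct:,.0f} per percentage point of gain")
# -> ~$278 per percentage point of gain
```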

Like I said before, since it's a free card and you have everything you need, try it out. Best case, you don't see a difference.
 
Also, take a look at how strong the 1070 is. You'll maybe see a 5% performance bump if you scale the differences.
 
If it ain't broke, don't break it.
 
You'd probably have a better time in Resolve selling both and buying an Ada card with dual NVENC (4070 Ti SUPER or higher). For now, though, stick with the 3090 and plan ahead for an RTX 5090 upgrade.
 
I wrote it before: I ALWAYS buy older-gen top gear. I'm waiting for a 5950X to come in, and waiting to find a GPU at the prices I've set that I'm willing to pay; I'm strict and patient. 2080 Ti / 3080 Ti / 3090, no 40 series: those are crazy thieving prices. I'll move to the 50 series just before the 70 series is about to be released; this way I never pay the beta premium prices. Look at the 5950X: it sold for $800!! I got it now for $320, and it's still a killer CPU.

If you've got a Ferrari from 10 years back, it doesn't mean it's now crap. There's always better, but the problem is that people compare newer to older. I only compare gear against what I want it to do. I never chase new. I don't conform and I don't follow; I do the opposite of what the mainstream does. I feel it's always a ripoff to buy the newest.

In this video they test a few CPUs and a few GPUs, which shows the 3090 is still no slouch.

Even a 2080 Ti (11 GB) or a 3080 Ti (12 GB) will be fine, but they may run out of VRAM for multicam / heavy effects / Fusion / grading / NR. Shame they didn't make a 14 GB 3080 Ti.
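If VRAM is the worry, it's easy to watch headroom during a heavy timeline before settling on an 11-12 GB card. A minimal sketch, assuming the same nvidia-ml-py bindings as above:

```python
# Minimal sketch: print VRAM used vs. total per GPU, e.g. while scrubbing a
# multicam/Fusion/NR-heavy timeline. Assumes: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        mem = pynvml.nvmlDeviceGetMemoryInfo(h)
        print(f"GPU {i}: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB used")
finally:
    pynvml.nvmlShutdown()
```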
 
You mean to say NVIDIA has cornered the production market with CUDA and keeps gimping VRAM on all but their top end cards? What a concept! /s
 
Well, used 4090s (which have about a month left as the current-generation flagship) have stratospheric pricing right now, and I don't expect this to change outright once the 5090 lands (it will likely have patchy availability). I guess you've got it as good as you can if you're imposing that limitation on yourself.

I disagree with the Ferrari analogy, though. Such a sports car will more than likely increase in price, as its scarcity and exclusivity only increase, but an old graphics card is just obsolete technology.
 