
Getting ready for dual 5090s: functional prototype on dual 4090s

This is an amazing build. The dual 4090s are cheaper than the RAM and storage, let alone the CPUs :eek:
 
Some updates:

IMG_3821.jpg

IMG_3825.jpg

IMG_3828.jpg

IMG_3818.jpg

bench.jpg


grav.png
 
Do you have a link to that nvlink bridge?
 
Do you have a link to that nvlink bridge?
There is no NVLink bridge; it's no longer necessary.

The cable you are seeing in the picture for the cards is a PCIe 5.0, 15cm riser cable. My cards are mounted upside down so warm air is blown out of the case, rather than toward the motherboard or the CPUs, which seemed a more logical approach.
 
I posted this in the driver thread, but it kinda belongs here with this project:

On a less serious note, here is a fun side of this project

Chernobylite Game

using 1 GPU - everything including ray tracing set to ULTRA, no DLSS

IMG_3968.jpg


and now 2 GPU (no SLI, no nvlink, using mGPU, DX12) everything including ray tracing set to ULTRA, no DLSS

IMG_3969.jpg
 
I posted this in the driver thread, but it kinda belongs here with this project:

On a less serious note, here is a fun side of this project

Chernobylite Game

using 1 GPU - everything including ray tracing set to ULTRA

View attachment 363653

and now 2 GPU (no SLI, no nvlink, using mGPU, DX12) everything including ray tracing set to ULTRA

View attachment 363655
Wait ?!?!?!?
I have this game from GOG, but I could not get mGPU to work in DX12. SLI works fine for me in DX11.
Maybe I have to disable SLI on my two 2080 Tis for DX12, or set it to single card in the NVIDIA Control Panel.
 
What can I say, it's fantastic, a work of art. Move over, Michelangelo's David.
 
Heatsink and cooling updates:

Well,

Those coolers are on revision 4, going on 5, and work well, especially with some of the threaded algorithms I use; hammered at all-core loads, the temps have been manageable, with no issues. Don't let the outside fool you. The base plate of the heatsinks is now all copper with a nickel-clad frame, and has increased in thickness to 5mm. The fins have been individually placed on the heat pipes and pasted, with attention to full contact. The six heat pipes are pressure-grabbed and have metal-based TIM in the micro-gap areas to improve contact. The fans can spool from 450 to 3,050 rpm; the BMC fan curve has them at 100% at 75C, but according to the BMC logs that has yet to occur, even under an all-core load.

Here is the fin prototype and base that is going on next. I have increased the fin density as well:
IMG_3729.jpg


IMG_4022.jpg


IMG_4023.jpg


IMG_4021.jpg


IMG_4027.jpg
 
Hot gigabit chip:

The dual/tri gigabit chip gets toasty, like 90C toasty, with its flimsy aluminum coin of a heatsink:

IMG_3898.jpg


So I had to move to a homemade all-copper heatsink.

Before final cleanup and deburring:

IMG_3900.jpg

IMG_3905.jpg


A slight acid dip to increase porosity and surface area, and then
a color match, of course; this works on copper:

IMG_3927.jpg


IMG_3926.jpg


final product:

IMG_3934.jpg



The result, from using better TIM, moving to copper, and increasing fin count and surface area:
52C (down from 90C). Mission accomplished.
 
The result of the heatsink effort:

Happy PC

boredPC.jpg
 
Me likey. Do I want server CPUs? Yes. Do I need server CPUs? No.

Also, how did you connect the U.2 drives on this bad boy? MCIO 8i to U.2?
 
Man, that's a hecking nice build.

Sometimes I think about moving to a cheap used Epyc so that I could have more RAM, but then I think about its idle power consumption and give up instantly. For me it's cheaper to just spin up a cloud instance for the not-so-often moments when I need more RAM/VRAM.
 
Sepheronx - Canada? I want to live there! - can I move in with you? I have expensive taste and need a sugar daddy.



Side note, the rig was expensive but not what one thinks:

The board and CPUs are new (OEM eBay deals), but the drives, video cards, PSU, etc. were from the prior dual-Xeon build.

The prior Xeon build was a new board and CPUs plus used components from an earlier build, and so on.

Each build was made from what was already available/owned when possible.

Rinse, repeat.

Some prior piggy-backed components and builds of mine, going back from last year all the way to 1999:
View attachment 349530
View attachment 349521
View attachment 349536
View attachment 349523
View attachment 349525
View attachment 349526
View attachment 349527
View attachment 349537
View attachment 349540
View attachment 349587
That case with the four GeForce RTX cards mounted upright: I wonder if that will become a standardised design in the future, given the current problems with fitting the giant Nvidia GPUs into standard towers.
 
So next on the to-do list is replacing the other cheesy, lightweight little heatsinks that came with the board for the MOSFETs/voltage regulators, etc. This time I was able to find replacements halfway across the globe, specific to this board:

IMG_4210.jpg

wimage0.jpeg

image1.jpeg

IMG_4207.jpg
 
The PC is neat and tidy, but dang. Twin Eames chairs? Nice.
 
The PC is neat and tidy, but dang. Twin Eames chairs? Nice.

because... you never know?




So I added the upgraded heatsinks. The overall result is nothing too spectacular, but those components are running 7-12C cooler, so I think it was worth the effort:

IMG_4257.jpg



Combined with the edited/improved CPU coolers: the CPU heatsinks now have a full copper plate to completely cover the supersized 9684X's footprint.

The copper plate is 5mm thick.

Heat-pipe and fin adhesion was improved.

More fins were added, out to the full 120mm.

Combined with 350-3,000 rpm Phanteks T30 fans.

And the outcome: quiet and cool running; even under load it's not obtrusive.

IMG_4265.jpg


IMG_4275.jpg
 
venturi said:
4090s mounted inverted so that warm air is blown away from CPUs and motherboard:
Amazing config! :love:
Can you tell me some info about your PCIe riser cable?
I might go on a similar path, for my next GPU, keeping the heat away from the rest of the rig.
I'm planning to use one cable for the first PCIe 16x slot, since it is unlikely that anyone will create a 5090/5080-series GeForce that fits into 3 slots, and even if I find one, it will suffocate and overheat,
While I would like to use the other cards in the second and third slot.
 
Amazing config! :love:
Can you tell me some info about your PCIe riser cable?
I might go on a similar path, for my next GPU, keeping the heat away from the rest of the rig.
I'm planning to use one cable for the first PCIe 16x slot, since it is unlikely that anyone will create a 5090/5080-series GeForce that fits into 3 slots, and even if I find one, it will suffocate and overheat,
While I would like to use the other cards in the second and third slot.
Sure,
they are PCIe 5.0, and Amazon has them in various lengths and orientations.

Beyond keeping the GPU away from the CPUs and board, the real consideration is having it suck in cool air and blow hot air away from the CPUs and board; hence mine are actually angled differently for the PCIe cables. Details ;)

Here are links for you, sir.

J


 
they are PCIe 5.0
Even comes in PCIe gen 5! Perfect!
I spent some hours finding *this*, but amazon's search is a##
Thank You!
 
Hi Venturi. Can you tell me what is the make and model of your Epyc SP5 CPU coolers? Google search for SP5 cooler does not show anything resembling to your coolers. Your build is awesome!
 
Hi Venturi. Can you tell me what is the make and model of your Epyc SP5 CPU coolers? Google search for SP5 cooler does not show anything resembling to your coolers. Your build is awesome!
Hi, the final heatsinks and fans I settled on came from Alibaba and eBay.

Heatsinks - from Alibaba and eBay. However, I ended up buying multiples and making returns, because SEVERAL different versions arrive even though eBay, for example, shows one picture. I also had to engage with vendors to make sure they picked certain types. I found that the base plates differed in copper-plate width and thickness, so I sent back the undesirable builds and favored the ones with the larger, thicker copper base plate. AGAIN, I basically had to cherry-pick the available products, even under one SKU and one vendor, as multiple subcontractors were probably involved.
The final versions I ended up with are not in any of the eBay example photos, but shipped as an alternate version.

Heatsinks part 2 - I also had to modify those final heatsinks, e.g. removing all the blades and re-pasting/re-TIMing the heat pipes and cooling fins, as well as wet-sanding and polishing the brass base plate dead even/smooth.

Fans - I did not use any of the fans included with the heatsinks. I selected the Phanteks T30 as the best solution and set the internal switch (included on the back of every T30) to hybrid/advanced, which lets the fan go from 300-400 rpm to over 3,000 rpm.

Hope that helps.
 
I’ll be updating to dual or quad 5090s and the wife can have the 4090s for her PC (maybe)

I've been watching reviews of the 5090 and I am concerned. Apples to apples, my 4090s' non-DLSS, 4K max-settings results are higher than any of the reviewers'.

For example, see DSOG's 4090 vs 5090 review; look at the Chernobylite bench.

And here are my 4090s: higher than the 5090, also in AW2, Cyberpunk, etc. (all non-DLSS, 4K max settings).
Even the lowest 1% is higher than the 5090's, and I don't overclock.

Hope the 5090 is not a disappointment for those of us who don't do the DLSS gimmick.

My single 4090 score in Chernobylite, 4K, max, ray tracing set to ULTRA:
99f960c1d721e5b6270a39c5d85e10c2c6846b47.jpeg


My Dual 4090 score (mGPU not SLI)

60ae12437e46a9f17fc5d4054ccbb1122baa25a6.jpeg


Benchmark for 5090 review at DSOG


72b6262cbadf5233afab7afed10878e2a2d864a7.jpeg



A few logical concessions:
maybe my system was already better optimized and those optimizations will carry over to the 5090s

maybe there are several driver optimizations that need to be released to take better advantage of the 5090s

maybe testers have anomalies in their benchmarks

maybe things like BUS, bandwidth, processor(s), OS optimizations, etc have a lot of impact

…maybe all of the above

It's just that at $2k a card, I want to FEEL the performance of the 5090s.
42e39560f389c1da378c8b84978b0b694c46e53c.jpeg


I have made a loving comfy home for the dual 5090s

56c9469dc41c0b0773607c7af98db9ee34b649d8.jpeg


As I navigate the issues with the 50xx-series cards (architecture, build issues, supply issues, missing ROPs, heat issues, connector issues, and driver issues such as monitor timings, chromatic aberrations, and black screens), I also consider that maybe the supply issues and price gouging have done me a favor.

Bear in mind that I am also waiting on a pair of 5090 FEs. A pair of 4090 FEs, as I have right now, is, quite puzzlingly, outperforming the reported benchmarks of a single 5090 by 130%.

The street pricing of the 5080 is higher than the 4090's.

I would also bear in mind that there have been so many issues with the 5090, the worst being the architectural issues and missing ROPs. I think I could handle anything else, but chip flaws…

mGPU has been a struggle (see ALL the posts above), with some witchcraft, burnt offerings, and some failures.

If you are going to do mGPU, the only way to get it working 100% of the time is to use your own application or algorithm (CNN, NLP, RCNN, etc.), and not be at the mercy of whatever code or features internally made it through "an enthusiastic developer team whose final product and embedded code survived the publisher and lawyers".

(borrowing from Bloom County)

In all the product releases from Nvidia, the tactics and issues this time around seem unheard of for what is supposed to be one of the shiniest companies.

In retrospect, from product release strategies to AIB partner strategies… while I am a fan, I really feel let down and somewhat fooled, making me feel like a sheeple.

As of right now, my 4090 single-card benchmarks without DLSS are higher than the 5090's single-card benchmarks without DLSS. So I am also worried that the hype is geared towards DLSS, which I would not use. Lastly, my dual-GPU config, without DLSS, benches 90-130% better than a single 5090 on several titles.

yes… I am (was) attracted to what I could do with a pair of 5090s… but there are now so many caveats.

So… maybe this botched 5090 release is a favor, and maybe I'll skip this gen and/or wait for the later-cycle updates/releases… Of course, if I found a pair of retail 5090 FEs at the retail price, I would reflex-buy instantly, along with a sincere "shut up and take my money, please!"

If this was a “Dear Abby” it would be signed:

“Jilted”
 
I'm getting a lot of questions on how to get started with mGPU. I should probably make this a separate thread (and probably will) as a teaching moment, but I also understand that savvy folks want to know "how" to do it.

So I will repost here some of my other posts/threads from other sites, to provide a more central starting point.


While embarking on mGPU, rather than going straight to enabling mGPU in UE4 and UE5, easier examples are Strange Brigade, Ashes of the Singularity and GravityMark, as they work without too much fussing and have friendly menus to get you there. GravityMark is a great tool to hone and optimize your setup.

GravityMark should be your very first stop for mGPU:



bc4096a607e0a1ec882b8951164b48e9333982b6.jpeg


Now, please know that what comes next is important:

…but the OS optimizations (HAGS, per-GPU affinity, ReBAR, etc.) and the driver basics only: no GeForce Experience, none of the optional driver install components that interfere, and, importantly, the nv container CANNOT be running in the background.

For example, I do not have any of that in the background; it runs faster and more stable. I don't use the Nvidia app or the Control Panel; I integrated the Inspector and profiles into my right-click menu instead:

NVContainer / the Nvidia service breaks many mGPU attempts.

I have it turned off in Services (set to Manual).
I can't seem to include an image in this message editor, so I will provide that setup in your main thread.
I use a right-click shortcut to control the setup, but if needed I use the start/stop shortcuts.

For starters -here is my desktop right click:

a9b5163eb2f70d6dabe7487259ef0dc84471c432.jpeg


I disable the NVContainer / Nvidia service, but if I ever need to get to it, I start and stop it with shortcuts:

6cd0bbc0b3f641220963d5da7089fe6655b9d06b.jpeg


(Yes, all my tiles are orderly and well behaved, and are transparent without any vendor- or MS-added colour. I also have no ads running.)

For a transparent tile background colour, modify the XML with:
BackgroundColor='transparent'/>

…easy enough, just try it
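For context, that attribute is the tail of a per-app VisualElements manifest that Windows reads for Start tiles. A minimal sketch of the whole file, assuming a hypothetical MyApp.exe (the other attribute values are illustrative, not taken from this post):

```xml
<!-- MyApp.VisualElementsManifest.xml, placed next to MyApp.exe;
     re-pin the tile after editing so Start re-reads the manifest -->
<Application xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <VisualElements
      BackgroundColor="transparent"
      ShowNameOnSquare150x150Logo="on"
      ForegroundText="light"/>
</Application>
```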

nvcontainer set to manual in services as such:

service.png


start.png
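If you prefer the command line over the Services snap-in, a sketch of the same change from an elevated prompt (assuming the service name NVDisplay.ContainerLocalSystem used in the shortcuts below):

```shell
:: check the current state of the NVIDIA display container service
sc query NVDisplay.ContainerLocalSystem

:: set its startup type to Manual (the space after "start=" is required)
sc config NVDisplay.ContainerLocalSystem start= demand
```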



start stop shortcut commands are easy:
net start NVDisplay.ContainerLocalSystem
net stop NVDisplay.ContainerLocalSystem

You may want to disable Shader Cache in the CPL; for me it breaks all my mGPU setups, or at least reduces performance significantly. The CPL Shader Cache has a negative effect on ALL my games.

If you are on Server 2022 or preview Server 2025 Datacenter-as-workstation, you can prevent mitigations via:

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"FeatureSettingsOverride"=dword:00000003
"FeatureSettingsOverrideMask"=dword:00000003

Also disable any mitigations in the BIOS (your choice).
I realize some folks don't understand mitigations and have a rubber-stamp response, but I leave that choice up to you; these are my choices.
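The two registry values quoted above can also be packaged as a .reg file and imported in one step (same key and values; the filename is arbitrary):

```
Windows Registry Editor Version 5.00

; disables CPU mitigations (Spectre/Meltdown class) - apply only if you
; accept the security trade-off discussed above, then reboot
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"FeatureSettingsOverride"=dword:00000003
"FeatureSettingsOverrideMask"=dword:00000003
```

Save it as, e.g., disable-mitigations.reg and double-click it, or run `reg import disable-mitigations.reg` from an elevated prompt.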

Next:
you must have Resizable BAR enabled in the BIOS, and you need to set the Resizable BAR size to "unlimited" (I believe the switch choice is "32" in the setting).


c4079d80cbfe1ef7a721b6cc544855421f345381.jpeg


In Windows, if your hardware is capable, you can use:

ReBarState.exe

Driver:
strip the downloaded driver package so that only these components are left:

bb5d723c2b1e2ff702f2520fb1e2e71cf2deb91d.jpeg


and modify the bottom of the CFG file so that the three lines under the EULA are gone, like this:
7e07546d15b9f9269cf1c5a919c760c587ed25e7.jpeg


ONLY then run the setup exe.

For pre-2015 games that use PhysX, you should install the separate standalone legacy runtime from Nvidia:

PhysX-9.13.0604-SystemSoftware-Legacy.exe

etc etc etc

Yes, there's lots to do to have a fast, stable Windows system. I could not get it to work under Win 10 or Win 11, but it works under Server 2022 and Server 2025 (I use Windows Server 2022 or 2025 Datacenter as a lean workstation).

Then starts the ramp to some of the more involved steps

The next "easier" game to set up is Civilization VI, and it is also a good introduction to what lies ahead.

Step 1: DO NOT USE STEAM VERSIONS OF GAMES. Use only standalone versions without a third-party check-in ("rent-to-own" games); that is, versions that let you launch the game from the real exe in the game folder.

Please include the correct switches in the properties of the launcher for the executable; these have to be enabled here first.

Ashes of the Singularity:

fe320e57da3f721eed4fa8f57b1620c76ccbdf5d.jpeg


Civilization VI: your goal is to enable this checkmark without SLI or NVLink:

b4efa4467d7a301044e484320b84aa286efcc513.jpeg

13eef0c9bea645e7d8b00672355e95f105b8f775.jpeg



In Firaxis's GraphicsOptions.txt,
make these changes:

;Set DX12 compute queue usage on (1), off (0), or platform default(-1).
EnableAsyncCompute 1

;Enable DX12 split-screen optimizations for multi-GPU systems. On (1), Off(0), or platform default (-1) [Platform Default is OFF, but may change in the future.]
EnableSplitScreenMultiGPU 1

[Video]
;Experimental rendering modes
RenderMode 1

;Experimental resolve modes
ResolveMode 1

and save the file as read-only.

You can also go into the other config files and manually set the amount of threading for CPUs and other features; it scales well.

I have to look up which version I have, as one of the updates from Firaxis broke all those GPU and CPU perks.

So, ready to start the journey to a faster, more stable system and enjoy your PC again?

7939449bafa887c509a0cbb3ac2113b68cb3c675.jpeg


Then you can proceed to other titles like Chernobylite and get mGPU scores like this, with max/ultra ray tracing at 4K, NO DLSS and no FG:

60ae12437e46a9f17fc5d4054ccbb1122baa25a6.jpeg


Disclaimer:
I only toy with this stuff as a hobby; I'm more of a cars, motorcycles, Swiss watches, fountain pens, CDs and vinyl, pistols kinda guy. So I may not convey things in the appropriate vernacular and lexicon of geek-speak. It probably doesn't help that English is not one of my primary languages.
Anyhow, it FEELS as if Nvidia and MS deliberately make it complicated; I merely look for solutions around the obstacles. If you're interested, I may post more on using MS Server as your personal PC OS and how to make it fast, private, stable and able to do everything I do.
Some stuff is complex, and some stuff is simple, like using the HOSTS file to give your browsing and day-to-day life more privacy. See, I'm of the opinion that the OS works for me; the OS is not a way of life. Whenever I see Microsoft or Google say that they keep your data private, my first thought is "how do I keep my data private from YOU (MS, Google, Apple, etc.)?"
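On the HOSTS-file point: the mechanism is simply mapping unwanted hostnames to a non-routable address so the lookups die locally. A sketch with placeholder domains (these are illustrative, not a vetted blocklist); on Windows the file lives at C:\Windows\System32\drivers\etc\hosts:

```
# lines appended to the hosts file; 0.0.0.0 makes connections fail fast
# (the example.* names below are placeholders, not real tracking hosts)
0.0.0.0 telemetry.example.com
0.0.0.0 ads.example.net
0.0.0.0 tracking.example.org
```

Editing the file requires an elevated editor, and some resolvers cache results, so flush DNS (ipconfig /flushdns) after changes.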
Anyhow, that’s a separate rant
 