
SLI with different cards

evolucian911

New Member
Joined
Mar 31, 2019
Messages
11 (0.01/day)
Got it, thanks. I am not sure the top slot (slot 5) isn't working because the BIOS assigns the legacy graphics card to CPU1 by default; if I set it to CPU2, will slot 5 work and slot 3 stop working? Then I would just need to cut open one x8 slot (slot 2) to move the bottom card there. Does that sound like a good solution?


I'm afraid I can't even unblock slot 2, because there is a circular part right beside the slot, so it looks like a powered cable is the only option. Can I use a PCIe x8-to-x16 riser on slot 2? Please let me know what you think, thanks a lot.

Remember I told you that the motherboard speaker has to be unsoldered or removed somehow. The circular part is the motherboard speaker. This is where the beeping sounds come from when you turn on your computer.
 

seanzhang72

New Member
Joined
Mar 27, 2019
Messages
11 (0.01/day)
Sorry, I forgot to check your thread link; now I understand the setup you did. But I am still confused: it looks like in your system the graphics card in slot 5 can't be detected in Windows 10? In my case both cards work separately without any issue, so I am guessing some BIOS setting blocks data transfer between slot 3 and slot 5. Using a powered cable may make Windows see a new slot, not just a graphics card, so slot 3 can talk to the new slot and SLI will work. Instead of unblocking and soldering, I am going to try two x8-to-x16 adapters on slots 2 and 4. Do you think that may work?
 

evolucian911

Both cards can only work if you use a powered extension in slot 5 (the upper blue slot). I am not home to show you, so the old thread is the best info I have to share for now. Notice how in the thread I have a USB 3.0 card on a powered PCIe x1-to-x16 extender; that was the only way to get that USB 3.0 card to work. The board was designed to use the two GPU slots for VMs in passthrough, not to be used together on the host OS, so it removes one card from the host OS after boot by cutting off power (that's what I imagine happens). Basically, if you want to use both slots, you need to use the powered extender for slot 5 (upper blue slot). If you don't have an extender, use option 2: remove the motherboard speaker and unblock the PCIe x8 slots.
 

seanzhang72

Got it, so you can only see one graphics card in the VM even though both work on the host (sorry, I haven't set up Hyper-V Manager yet). Thanks for the WhatsApp support; please let me know when you're available.
 

cacooke

New Member
Joined
Mar 30, 2019
Messages
4 (0.00/day)
Location
Franklin NC
System Name Asus
Processor AMD FX-8250 4.33 GHZ
Motherboard Asus M5A99FX Pro R2.0
Cooling Stock
Memory 32GB
Video Card(s) EVGA AX20 FTW (Nvidia GTX 970 4GB DEV_13C2), Gigabyte GV-V970WF30C (Nvidia GTX 970 4GB DEV_13C2)
Power Supply Corsair CS850M

I'm having a problem SLI'ing these two cards. By all the information I have been able to find in a week of looking, they should SLI fine.
I just keep getting "To use maximum 3D performance, connect the SLI-Ready graphics cards with an SLI connector."

I originally used the very old flexible SLI connector that came with the system, but thinking it might be bad, I bought four new ones from two different manufacturers (Asus and Republic of Gamers).
No Go...

I have DDU'd and did a full clean install.
nVidia driver version: 419.35 / 25.21.14.1935

When running DifferentSLIAuto1.4 I get:
Could not find patch #2, #3 and #4.

When running DifferentSLIAuto1.6 I get:
Could not find patch #2 and #4.

I've been working on this for a week.

Does anyone have any ideas to get the 2 boards SLI'd?

Thanks in advance...
 
Joined
Apr 25, 2017
Messages
362 (0.14/day)
Location
Switzerland
System Name https://valid.x86.fr/6t2pb7
Processor AMD Ryzen 5 1600
Motherboard Gigabyte - GA-AB350M-Gaming 3
Memory Corsair - Vengeance LED DDR4-3000 16GB
Video Card(s) https://www.techpowerup.com/gpudb/b4362/msi-gtx-1080-ti-gaming-x
Storage Western Digital - Black PCIe 256GB SSD + 3x HDD
Display(s) 42" TV @1080p (main) + 32" TV (side)
Case Cooler Master HAF X NV-942
Audio Device(s) Line 6 KB37
Power Supply Thermaltake Toughpower XT 775W
Mouse Roccat Kova / Logitech G27 Steering Wheel
Keyboard Roccat Ryos TKL Pro
Software Windows 10 Pro x64 v1803

cacooke

Thanks for the quick reply. I will get right on it.
By the way, does this mean that you have to stay at driver 388.71 or can you update AFTER SLI is working?

Also, where can I get DifferentSLIAuto 1.7.1?

I think I found it in that other thread, but it's for Win 10 and I am running Win 7... the thread still mentions Win 10.
 

cacooke

P!nkPanther, the patching went fine, except the feedback gave me a line in red saying "This is a Win10 Driver File, Running in Win10 mode", then "Patching was successful."
The next step in the instructions, "Edit Install.cmd according to the video above," doesn't really work for me, as I don't have the directory path it is looking to find and replace. My nvlddmkm.sys is in System32\drivers, not System32\DriverStore\FileRepository...

I am assuming this script is for a Win10 installation and I am on Win7

if this is the case, then much more in this script will be incorrect also...
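For reference, the two driver locations involved here can be probed programmatically. This helper is hypothetical (it is not part of DifferentSLIAuto or the install script), and the DriverStore repository folder name varies per driver package, hence the glob:

```python
from pathlib import Path

def find_nvlddmkm(system_root: str = r"C:\Windows") -> list[Path]:
    """Locate nvlddmkm.sys in both the Win7-era and Win10-era locations."""
    root = Path(system_root) / "System32"
    hits = []
    # Win7-style: the driver sits directly in System32\drivers
    direct = root / "drivers" / "nvlddmkm.sys"
    if direct.is_file():
        hits.append(direct)
    # Win10-style: one copy per installed driver package in the DriverStore
    repo = root / "DriverStore" / "FileRepository"
    if repo.is_dir():
        hits.extend(sorted(repo.glob("nv*/nvlddmkm.sys")))
    return hits
```

Whichever paths it returns are the ones an Install.cmd for that machine would have to patch.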

We're almost there...

Any other ideas or pointers?

I need to get into a REALFLOW Ocean simulation tonight but I really need both cards for the rendering.

Thanks...
 

cacooke

P!nkpanther, I redid install.cmd to match my system and it seemed to run correctly.

Install-win7.cmd.PNG


The bcdedits worked.

On reboot I got a balloon popup that I had never seen before, saying something like "Your computer is now SLI capable", and I thought YES, FINALLY! But then when I opened the Nvidia control panel it said the same thing, to connect the SLI connector.

========================================================

I take that back, I think.
I can now check Maximize 3D Performance and it says SLI ENABLED, but below the boxes it still says "For optimal 3D performance, connect the SLI-ready graphics cards with an SLI connector."



Thanks for your help and your time...
 
Joined
Apr 25, 2017
Messages
362 (0.14/day)
This is probably as far as your pair of 970s will take you. They seem to have SLIed, but not by using the bridge; instead, the data traffic runs across the PCIe bus. This seems to be caused by the different memory vendors used in different 970s, which prevents the data from being copied directly over the bridge.
Depending on the application or game, the performance hit from this may or may not be negligible.
You can check by watching Bus Load using a GPU performance monitor of your choice (e.g. nvidiaInspector).
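If you'd rather script this than watch a GUI monitor, a rough alternative is to poll `nvidia-smi` (which ships with the NVIDIA driver) and parse its CSV output. This is my own sketch, not from this thread; note that `nvidia-smi` exposes GPU core load and the negotiated PCIe link, but not the exact "Bus Interface Load" counter that GPU-Z/nvidiaInspector show:

```python
import subprocess

# Fields from nvidia-smi's CSV query interface: core utilization plus the
# currently negotiated PCIe generation and lane width per GPU.
QUERY = "index,name,utilization.gpu,pcie.link.gen.current,pcie.link.width.current"

def parse_smi_csv(output: str) -> list[dict]:
    """Parse `nvidia-smi --query-gpu=... --format=csv,noheader` output."""
    rows = []
    for line in output.strip().splitlines():
        idx, name, util, gen, width = [f.strip() for f in line.split(",")]
        rows.append({
            "index": int(idx),
            "name": name,
            "gpu_util_pct": int(util.rstrip(" %")),  # e.g. "87 %" -> 87
            "link": f"gen{gen} x{width}",
        })
    return rows

def poll_gpus() -> list[dict]:
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    )
    return parse_smi_csv(out)
```

Calling `poll_gpus()` in a loop while a game runs shows whether the second card is actually loaded and what link it negotiated.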
By the way, does this mean that you have to stay at driver 388.71 or can you update AFTER SLI is working?
You have to stay with your patched driver. It's not just a hidden config thing that needs to be flipped once.
 

xderpx

New Member
Joined
Sep 7, 2018
Messages
9 (0.00/day)
Is there much performance difference between a 16x/4x config and a 16x/16x one when using the SLI bridge? Some games have really poor scaling, whereas others scale great. Not sure if it's just the game or my motherboard, or possibly both.
I get a negligible increase in FPS in RE2 with GPU2 usage of about 10%, and a bit of an increase in GTA V. But my fps is doubled in the Unigine benchmark and a few other games. In MSI Afterburner, though, "SLI Sync Limit" is always at 1.
 
Joined
Apr 25, 2017
Messages
362 (0.14/day)
Most bad scaling comes from engines simply not supporting SLI in the first place, so most titles will not scale worse with 16x/4x.
You can check by watching Bus Load using nvidiaInspector. If it's greater than ~10%, it'll definitely hamper SLI scaling.
 

Coler

New Member
Joined
Feb 7, 2018
Messages
25 (0.01/day)
Is there much performance difference between a 16x/4x config and a 16x/16x one when using the SLI bridge? Some games have really poor scaling, whereas others scale great. Not sure if it's just the game or my motherboard, or possibly both.
I get a negligible increase in FPS in RE2 with GPU2 usage of about 10%, and a bit of an increase in GTA V. But my fps is doubled in the Unigine benchmark and a few other games. In MSI Afterburner, though, "SLI Sync Limit" is always at 1.

In my experience, SLI through the x4 slot without a bridge loses a lot of performance in most games (though it still does well in Unigine). Since that slot is wired to the chipset along with all the storage and USB devices, it adds a lot of latency when trying to sync frames, so the driver drops fps to the sync limit to avoid stuttering. The bridge surely helps, but I'm quite sure x8/x8 with both slots linked to the CPU does better, and even better with the bridge if we're talking about more powerful GPUs. I tested with 2x 1050 Tis.

Plus, some games may need a custom SLI profile loaded with Inspector (as you can see here for RE2: https://steamcommunity.com/app/961440/discussions/0/2521353993639030916/), especially when you see that the second GPU is almost unused.
 

xderpx

Most bad scaling comes from engines simply not supporting SLI in the first place, so most titles will not scale worse with 16x/4x.
You can check by watching Bus Load using nvidiaInspector. If it's greater than ~10%, it'll definitely hamper SLI scaling.
Ahh, yeah, in some games the bus usage of the GPU in the 4x slot is 30-40% with two 980s.

Plus, some games may need a custom SLI profile to load with inspector, (as you can see here for RE2 https://steamcommunity.com/app/961440/discussions/0/2521353993639030916/)
especially when you see that second gpu is almost not used.
Thanks :love:. I had a play around with that profile and a few others that I found, but it sure is strange: GPU usage is up to 60-70% for both GPUs, yet the fps is the same as or lower than a single GPU for some reason.
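To put a number on results like that, you can compute the scaling factor and per-GPU efficiency from single-GPU vs SLI fps. This is plain arithmetic; the function and names are my own, not from any tool mentioned in this thread:

```python
def sli_scaling(fps_single: float, fps_sli: float, n_gpus: int = 2) -> dict:
    """Scaling factor (2.0 = perfect for two GPUs) and per-GPU efficiency."""
    scaling = fps_sli / fps_single
    return {
        "scaling": round(scaling, 2),
        # 1.0 means every GPU contributes a full card's worth of frames
        "efficiency": round(scaling / n_gpus, 2),
    }

# Both GPUs busy at 60-70% but fps below the single-card level:
print(sli_scaling(60.0, 55.0))   # scaling < 1.0 means SLI is actively hurting
print(sli_scaling(60.0, 114.0))  # near-ideal scaling, as in Unigine
```

Anything with a scaling factor below 1.0 is worse than simply disabling SLI for that title.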
 
Joined
Feb 1, 2019
Messages
40 (0.02/day)
Location
India
System Name VFX RIG
Processor Dual Xeon E5 2686v3 18c 36T @ 3.67Ghz turbo unlocked
Motherboard Asus Z10PE-D16 WS
Cooling Corsair H45 + Corsair H115i
Memory 6x 16 GB Samsung DDR4 ECC REG 2133Mhz
Video Card(s) 1*Zotac GTX 1080 Amp Extreme, 4* Zotac 1050ti OC
Storage Samsung 970 Pro 512 GB Nvme + 850 Evo 250 GB
Display(s) LG D2342P
Case Thermaltake Level 20 XT
Audio Device(s) Creative
Power Supply Corsair AX1200i
Mouse Razer Naga Trinity , Corsair M65 Pro
Keyboard Tecknet Backlight Gaming Keyboard
Software Windows 10
Benchmark Scores Cinebench R15 :- 5225CB Cinebench R20 :- 10560CB
Ahh, yeah in some games the bus use of the gpu in the 4x slot is 30-40% with 2 980's


Thanks :love:. I had a play around with the profile and a few others that I found, but it sure is strange, GPU usage is up to 60-70% for both GPUs, but the fps is the same or lower than single gpu for some reason.
Try "Alternate Frame Rendering 2" in the Nvidia drivers for the game that uses both GPUs but gives the same or lower fps than a single GPU.

Guys, here are some interesting PCIe benchmarks showing how much performance is lost with less PCIe bandwidth.
For now I did all these tests on a single GTX 1050 Ti + X99 Gaming 5P + i7 6800K @ 4.2 GHz; I'll do the same tests with SLI later when I get time.
I switched the PCIe generation in the BIOS and confirmed the PCIe bandwidth by running the AIDA64 GPGPU test and GPU-Z.

PCIe Bandwidth.PNG
 

Coler

Ahh, yeah in some games the bus use of the gpu in the 4x slot is 30-40% with 2 980's


Thanks :love:. I had a play around with the profile and a few others that I found, but it sure is strange, GPU usage is up to 60-70% for both GPUs, but the fps is the same or lower than single gpu for some reason.

My theory is that the problem is in the x4 slot, same as mine. What motherboard is yours? Does it have a 2.0 x4?
It's partly the limited bandwidth (even a 3.0 x4 could help), but mostly because the slot is routed through the chipset; it depends a lot on the game, since some generate a lot of traffic between CPU and GPU. This should explain the 30-40% bus usage.
I remember in the past an x16/x4 1.1 SLI on an old P35 board got more benefit, because the PCIe lanes were all linked through the chipset, even if one went to the northbridge and the other to the southbridge; now one goes to the CPU and the other to the chipset, which takes more time to sync everything, causing fps loss. I see some CrossFire users are having similar problems.
If you want, try running a game non-SLIed on the GPU in the x4 slot and have a look at the bus usage; I think it should be lower.
I think it's useful to SLI cards which aren't officially capable of it, but for best performance an x8/x8 board with both slots linked to the CPU is recommended.

At least that's my theory; maybe I'm wrong, but I still enjoyed seeing good scaling in Unigine and small improvements in some games!

Try "Alternate Frame Rendering 2" in nvidia drivers for your game which uses both gpus but fps is same or lower than single gpu.

Guyz, here is interesting PCIe benchmarks, to check how much performance is lost by lesser PCIe bandwidth.
For now I did all these test on Single GTX 1050ti + X99 Gaming 5P + i7 6800K@ 4.2Ghz. Will do same test with SLI later when i get time.
I switched PCIe gen from BIOS and I have confirmed PCIe bandwidth by running AIDA64 GPGPU test and GPU-z.

View attachment 120182

Good report! When you have time, could you please test your 1050 Tis with PCIe gen 2, one at x16 and the second at x4 (covering the pins)?
I just have to be sure my performance loss is caused by the chipset slot; I presume yours will do better since both are linked to the CPU.

Thanks in advance
 

xderpx

Guyz, here is interesting PCIe benchmarks, to check how much performance is lost by lesser PCIe bandwidth.
Very interesting! Thanks for sharing. I find it interesting how VRay and Sanctuary barely took a hit, while PerformanceTest 3D and FluidMark absolutely tanked.


my theory is that the problem is in the x4 slot, same as mine, what mb is yours? Does it have a 2.0 x4?
I'm using a Z370-A Pro. GPU-Z reports GPU2 as "PCIe x16 3.0 @ 4x 1.1".
Oh.. 1.1? Crap, does that mean it's not only 4x but also 1.1 and slow as shit, as pointed out in topmysteries5's chart? xD I didn't notice that before!
I am curious, though, about your suggestion to try that GPU on its own and check the bus usage. I'll have a look and see what happens.
 
Joined
Apr 25, 2017
Messages
362 (0.14/day)
Most of the common benchmarks do not stress the PCIe bus at all; instead they load the GPU with shader-intensive tasks.
Those that do, however, expose the true worst-case scaling of an SLI setup.
Same with games: some are written well, others poorly and send a huge number of draw calls over the bus.

As for the difference in bus load when running over the southbridge: I tested that some time ago and it did not make any difference in load or overall performance. The only problem was that as soon as bus traffic got high, data collisions with the sound system caused crackling noises in audio.
 

Coler

I'm using a Z370-A Pro. GPU-Z reports GPU2 is "PCIe x16 3.0 @ 4x 1.1"

It displays 1.1 because the link is in power saving; if you press the little question mark next to it and run the stress test, you'll see the real generation of the slot (unless you set gen 1 manually in the BIOS). Both should be 3.0 on your motherboard!

So I'm probably wrong about that chipset thing.
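If you're logging this, the GPU-Z bus string can be parsed mechanically. A sketch of my own, assuming the format quoted above ("PCIe x16 3.0 @ 4x 1.1", where the part after the "@" is taken to be the currently negotiated link, which drops under power saving exactly as described):

```python
import re

# Tolerates both "x4" and "4x" lane spellings, as seen in this thread.
LINK_RE = re.compile(r"PCIe\s+x?(\d+)x?\s+([\d.]+)\s+@\s+x?(\d+)x?\s+([\d.]+)")

def parse_link(s: str) -> dict:
    """Split a GPU-Z bus string into the slot maximum and the current link."""
    m = LINK_RE.search(s)
    if m is None:
        raise ValueError(f"unrecognized link string: {s!r}")
    max_w, max_g, cur_w, cur_g = m.groups()
    return {
        "max": (int(max_w), float(max_g)),       # (lanes, generation)
        "current": (int(cur_w), float(cur_g)),
    }

link = parse_link("PCIe x16 3.0 @ 4x 1.1")
if link["current"] != link["max"]:
    # Either power saving or a genuinely narrower/slower slot; only a load
    # test (GPU-Z's "?" button) tells the two cases apart.
    print("link downshifted:", link["current"], "of", link["max"])
```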
 
Joined
Feb 1, 2019
Messages
40 (0.02/day)
good report! when you have time could you test please your 1050 Tis with pci ex gen 2. one x16 and second one x4 (covering the pins)?
Just have to be sure that my performance loss is caused to the chipset slot, I presume you are going better since both are linked to the cpu.

Thanks in advance

I will do the same test with PCIe 2.0 x16 + x4 in SLI once I get time. It already took a lot of time to create that chart.
 
Joined
Feb 1, 2019
Messages
40 (0.02/day)
I'm using a Z370-A Pro. GPU-Z reports GPU2 is "PCIe x16 3.0 @ 4x 1.1"
Oh.. 1.1? Crap, does that mean it's not only 4x but it's 1.1 and slow as shit as pointed out in topmysteries5 's chart? xD I didn't notice that before!
I am curious though about your suggestion for trying that gpu and check bus usage. I'll have a look and see what happens.
PCIe 1.1 x4 = PCIe 3.0 x1 = 985 to 1024 MB/s
PCIe 3.0 x4 = PCIe 2.0 x8 = PCIe 1.1 x16 = 4 GB/s (all the Thunderbolt 3 external GPU docks for MacBooks and gaming laptops run at the same PCIe 3.0 x4; Thunderbolt 3 is 40 Gbps, i.e. 40/8 = 5 GB/s raw).
Performance takes a heavy hit when bandwidth drops below 4 GB/s. To confirm it, run the AIDA64 GPGPU test on both GPUs; you'll see memory read and write below 4 GB/s.
You can use the chart to compare performance with your setup, but note that different GPUs and CPUs give different results. For example, running the same test on my 18-core Xeon rig, the benchmark result dropped by 10-12% due to its lower single-thread performance; results can be a little higher with newer CPUs.
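The equalities above follow from the per-lane line rates. A short sketch using the nominal spec rates (2.5/5/8 GT/s with 8b/10b vs 128b/130b encoding), so these are theoretical one-direction maxima, not measured AIDA64 numbers:

```python
# Nominal per-lane line rate (GT/s) and encoding efficiency per PCIe generation
GEN = {
    "1.1": (2.5, 8 / 10),     # 8b/10b encoding
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def pcie_bw_mbs(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth in MB/s for a PCIe link."""
    gts, eff = GEN[gen]
    # GT/s * encoding efficiency = Gb/s of payload; /8 -> GB/s; *1000 -> MB/s
    return gts * eff / 8 * 1000 * lanes

for gen, lanes in [("1.1", 4), ("3.0", 1), ("2.0", 8), ("1.1", 16), ("3.0", 4)]:
    print(f"PCIe {gen} x{lanes}: {pcie_bw_mbs(gen, lanes):.0f} MB/s")
```

This reproduces the chain in the post: 1.1 x4 ≈ 3.0 x1 ≈ 1 GB/s, and 2.0 x8 = 1.1 x16 ≈ 3.0 x4 ≈ 4 GB/s.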

I can imagine! thanks!

Have you tried the Xtreme-G modded drivers? For me they fix stuttering and fps dips in SLI. Overall performance also increases by 2-4 fps, which is a very small gain, but maybe they can fix your SLI performance issues.
 

Coler

Have you tried Xtreme-G modded drivers ? For me it fixes Stuttering and FPS dips in SLI. Overall performance is also increased by 2-4 FPS, which is very small increase. Maybe it can fix your SLI performance issues.

I will, and report back, thanks
 

xderpx

I've just ordered a new motherboard that'll do x8/x8 PCIe 3.0, so I might run a benchmark comparison between it and the PCIe 3.0 x16 / PCIe 1.1 x4 setup to see how much of a difference it makes.
 
Joined
Feb 5, 2019
Messages
21 (0.01/day)
I was looking for a solution for a few days and finally I have managed to load the modified driver without Test Mode enabled :) I've signed the driver with software named DriverSigner that can be downloaded from the UnknownCheats forum. It was tricky, but it worked for me. It's the best solution so far, because a lot of tools for loading unsigned drivers, like DSEFix, can be detected by anticheats and don't work with the modified Nvidia driver.

I was following these steps:
1. Install original Nvidia Driver.
2. Copy nvlddmkm.sys from System32 to DifferentSLIAuto folder.
3. Patch nvlddmkm.sys with DifferentSLIAuto.exe or HexEditor (depending on version of driver).
4. Download and extract DriverSigner from the UnknownCheats forum.
5. Sign the driver by following the steps from pdf manual in DriverSigner folder.
6. Boot into safe mode.
7. Change the driver in System32 to the modified one.
8. Reboot computer.
9. Success!


I have enabled SLI on non-SLI cards and motherboard. As you can see in the screenshots, I'm playing Fortnite. The anticheat doesn't block the game for disabled driver signature enforcement, because I'm not in Test Mode. I was using the old Nvidia driver 342.01 because of my crappy unsupported cards, a GeForce 210 and 310 from 2009. I don't know if it will work on newer drivers, but I think it will.


Hey, Statelo, are you still able to launch the game using this method?

EAC.png
 