
SLI is really not dead if you have an SLI rig

NVIDIA SLI GeForce RTX 2080 Ti and RTX 2080 with NVLink Review
The games that support SLI still do, but there's a very strange trend: the 2080 Ti is now even more expensive than what I paid for mine 6 months ago. Apparently people think a 2080 Ti is worth it for some reason compared to a 4060 Ti 16GB. An RTX 5070 will probably do in 200 W what the hypothetical SLI setup can do in 400 W with the power limit set to 200 W per card, or 600 W without.

Not saying that is a bad review, but... he clearly didn't account for TAA in his testing.

A prime example is BF1 having near-perfect SLI scaling at 4K when TAA is disabled: a 98% performance improvement.

But if you leave TAA enabled, it's only 60-70% scaling, as seen in his test.

That makes the SLI scaling seem a lot worse than it actually is.

Not to mention it was tested on a system that only gives 8 PCIe lanes to each GPU, which does matter with SLI, especially at 4K and beyond.
 
NVIDIA SLI is dead because they did not want people buying two cheaper cards to get the same performance as one expensive one. Also because the real hardcore gamers got soft, wanting the game to support it in its own settings menu. That's the biggest shift I've seen over the last 20 years. I run SLI and all my games support it just fine even though there is no setting in the game itself. It reminds me of when I had my Voodoo2s in SLI: people did not understand how it works, so they never learned how to configure it.

SLI can be supported in almost any game if you do it in the driver, not the game. When you do it in the driver, the game only sees one card; how it uses that card depends on how the driver is configured. Take Cyberpunk as an example. It says it does not support CrossFire or SLI. I run all my games at 4K, so this is a perfect example. I have two 1080 Tis in SLI. With SLI disabled in the driver, I open the game and average 20 FPS, because it's an old card. But when I enable SLI, set the driver to quality, and open Cyberpunk, the game still only sees one card, but my FPS jumps to an average of 45 to 55 at 4K. Which means it's using both cards as one, which is what SLI is supposed to do. Basically, one card renders the bottom half of the frame and the other renders the top half of the 4K image.
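To illustrate the split described above, here is a tiny standalone sketch (my own illustration, not how the NVIDIA driver is actually programmed): under split-frame rendering, a 4K frame is carved into one horizontal slice per GPU, so each card rasterizes roughly half the pixels.

```cpp
#include <cstdio>

// Toy illustration of split-frame rendering (SFR): divide a 3840x2160 frame
// into one horizontal slice per GPU. A real driver can balance the split
// dynamically based on load; this just shows the basic idea.
struct Rect { int x, y, width, height; };

int main() {
    const int width = 3840, height = 2160, gpuCount = 2;
    for (int gpu = 0; gpu < gpuCount; ++gpu) {
        Rect slice{0, gpu * height / gpuCount, width, height / gpuCount};
        printf("GPU %d renders y=%d..%d (%d x %d pixels)\n",
               gpu, slice.y, slice.y + slice.height, slice.width, slice.height);
    }
    return 0;
}
```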

Now I'm looking to upgrade my systems someday, and you notice almost all the motherboards only come with a single PCI-E slot. NVIDIA and AMD dropped dual-card support, and now motherboard companies have dropped the extra PCI-E slots. Gamers buy a 4090 and think the performance is great at 130 FPS with DLSS. But DLSS is not true native gaming; it uses the AI hardware built into the card to render at a lower resolution and then scale it back up to 4K. I was a hardcore gamer, and for the price of a 4090 I would expect it to run native 4K, without any DLSS, in any game at over 130 FPS.

But it does not do that. And I'm like, why would I pay that money for a POS card that cannot render 4K without DLSS and trickery?

Gamers used to demand better. Now it's all about DLSS performance and driver updates to the AI that scale down your games and then scale them back up to 4K or 8K.

I might just buy two 2080 Tis and NVLink them, because I'm quite sure I could smoke a 4090 with ease in an SLI setup.

I'm holding out, though, to see if the NVIDIA 5000 series gets back to their roots of hardcore gamer delights.
Issues with this were shown in 2011, when Scott Wasson of The Tech Report did new testing called "Inside the Second". It showed that multi-GPU setups can crank out more FPS, but the frames weren't smoothly/evenly timed, which people perceive as micro-stuttering or lag.
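For anyone curious what "inside the second" means in practice, here is a minimal sketch (my own, with made-up frame times) of how an apparently healthy average FPS can hide the uneven frame pacing that people feel as micro-stutter:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical frame times in ms: alternating short/long frames, the
    // classic multi-GPU micro-stutter pattern.
    std::vector<double> frameMs = {8, 25, 8, 26, 9, 24, 8, 25, 9, 26};

    double total = 0;
    for (double ms : frameMs) total += ms;
    double avgFps = 1000.0 * frameMs.size() / total;

    std::vector<double> sorted = frameMs;
    std::sort(sorted.begin(), sorted.end());
    double p99 = sorted[static_cast<size_t>(0.99 * (sorted.size() - 1))];

    printf("average FPS: %.1f\n", avgFps);               // looks healthy (~60)
    printf("99th percentile frame time: %.1f ms\n", p99); // reveals the stutter
    return 0;
}
```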



 
Issues with this were shown in 2011, when Scott Wasson of The Tech Report did new testing called "Inside the Second". It showed that multi-GPU setups can crank out more FPS, but the frames weren't smoothly/evenly timed, which people perceive as micro-stuttering or lag.



That guy was head of the marketing department at one point.
I wouldn't trust a word he says.
 
I don't know how you can come to that conclusion. In 2023 alone, more than 13 triple-A games released with stuttering or micro-stuttering while using only a single card. The fact that those problems still exist on a single card should tell you it was never the driver, the cards, or even the CPU causing it for SLI/CrossFire. It's just crappily built game engines; look at the half-baked coding we're getting now, so unoptimized you have to use upscaling even at 1080p.
One does not exclude the other, though. It's a known fact that SLI induced stutter and frametime variance, and this was never fixed nor fixable; you just overcame it with a lot of FPS and felt good about it. But it was never better or smoother gameplay. Ever, anywhere. I played a lot of SLI-driven games when it was actually feasible, and even then it was a shitshow. Delayed support, VRAM bottlenecks, you name it; it was never 100% problem-free. We all wanted it to be better. We believed it to be better. But it was never going to become better than a single card. Just faster. A bit like a drag race: it's fast, but very uncomfortable.
 
What happened to DX12 vendor-agnostic mGPU?
Dead in the water from the moment it was announced that devs were going to have to carry it themselves. DX12 kinda confirmed the end of mGPU by announcing that. The difference between 'you could' and 'you will' is true industry support. Leaving it up to devs will get you nowhere when it comes to features around the games.
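For context, a rough sketch of what "devs carry it themselves" looks like under DX12 explicit multi-adapter (simplified, error handling omitted): the application has to enumerate every adapter, create an independent device on each, and then manage all the queues, memory, synchronization and workload splitting itself; nothing comes from the driver for free.

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

// Enumerate all hardware adapters and create an independent D3D12 device on
// each one. From here on, the *application* owns everything: per-device
// queues, heaps, synchronization, cross-adapter copies, and the workload split.
std::vector<ComPtr<ID3D12Device>> CreateAllDevices() {
    std::vector<ComPtr<ID3D12Device>> devices;

    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return devices;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            devices.push_back(device);
        }
    }
    return devices; // nothing here is automatic: no driver-side AFR exists in DX12
}
```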

That guy was head of the marketing department at one point.
I wouldn't trust a word he says.
That's fine, we can all live in our own bubble of ignorance, but then why share your thoughts on a forum? The results are indisputable.

The 4090 is faster if you use DLSS; without it, 60 FPS is the norm, so in reality the actual card sucks.


Wow, SLI removes stuttering and latency issues. I remember Crysis 3 running on a single card with tearing, pauses, latency issues and stuttering; in SLI it was smooth as glass.


This is 4K on a 4090 running Cyberpunk with no DLSS: 30 FPS. That's what you call godly? Wow, that's what I call a POS. I'm sorry to say this, but WTH?
Pick any game you consider godly, run it at maximum settings and do a comparison, because that's the only way you will be able to compare things. It's not hard. I'm sure someone with a 4090 will be happy to serve you up a comparison.

I'm sorry bud, but progress happens, and there's no real discussion here. The 4090 is easily the fastest GPU you can get, and yes, it will destroy any dual SLI setup you can still cobble together in 2024. Heck, you won't even pass my 7900 XT, not by a long shot. Driver overhead alone might destroy any advantage (dual SLI + NVIDIA on DX12... a painful affair, I reckon).
 
Dead in the water from the moment it was announced that devs were going to have to carry it themselves. DX12 kinda confirmed the end of mGPU by announcing that. The difference between 'you could' and 'you will' is true industry support. Leaving it up to devs will get you nowhere when it comes to features around the games.


That's fine, we can all live in our own bubble of ignorance, but then why share your thoughts on a forum? The results are indisputable.


Pick any game you consider godly, run it at maximum settings and do a comparison, because that's the only way you will be able to compare things. It's not hard. I'm sure someone with a 4090 will be happy to serve you up a comparison.

I'm sorry bud, but progress happens, and there's no real discussion here. The 4090 is easily the fastest GPU you can get, and yes, it will destroy any dual SLI setup you can still cobble together in 2024. Heck, you won't even pass my 7900 XT, not by a long shot. Driver overhead alone might destroy any advantage (dual SLI + NVIDIA on DX12... a painful affair, I reckon).

Quad-SLI 1080 Tis in DX11 Frostbite engine games (2015 Battlefront, for instance) will be faster than a 4090... but for anything DX12, don't even bother with SLI (sadly).
 
I had better luck with CrossFire than SLI, but it always seemed like I was CPU-bound when I ran such setups, so it wasn't very helpful except for benchmarks. In the games where it did seem to work right, I felt like I was either bandwidth-bound or hitting the VRAM limit anyway.

But I do believe in multi-GPU, and it could make a comeback in some form with multiple GPU core chiplet dies. Maybe HBM or 3D cache technologies that we have now could solve some of the latency issues.
I think we will see multi-GPU, but on a single card. Sorta like the Voodoo 3 cards of the time, or the monster Voodoo 5 6000. NVIDIA owns the tech now, so it might happen. I mean, if we gamers would demand more 4K and 8K gaming and not the lame 1440p gaming, I'm sure it would be done.

I feel mostly stabbed in the back by NVIDIA. In 2017 the whole marketing ploy was that 4K gaming was the future, and I dived into it: got a 4K monitor, first tried 4K on my two 660 Tis in SLI and got maybe 30 FPS. So I saved up, got the twin 1080 Tis, and I was like, WOW, this is freaking awesome; the quality of the picture in gaming was mind-blowing. But then the 20 series and 30 series came out, and while I was hoping for better 4K gaming results, every game showed not much improvement. Then NVIDIA switched away from 4K gaming and said 1440p is the best gaming around. It was like NVIDIA sold me on 4K and then said ha ha ha, sucker!
 
I think we will see multi-GPU, but on a single card.
We do, but not quite like multi-GPU; chiplets will likely go there... If the interconnect is fast enough, it can be done, and you're still talking to a single GPU. That's a big advantage over talking to several.
 
Does 3DMark support it? I saw results with RX 6700 + 6700 XT
Which 3DMark?
The 2018 versions should.
The results with the RX 6700 & RX 6700 XT are because they are basically the same tier of card.

Issues with this were shown in 2011, when Scott Wasson of The Tech Report did new testing called "Inside the Second". It showed that multi-GPU setups can crank out more FPS, but the frames weren't smoothly/evenly timed, which people perceive as micro-stuttering or lag.

Intel's "GPU Busy" metric would actually be a better way to view this, and I believe it works on all GPUs, not just Intel Arc.
 
What happened to DX12 vendor-agnostic mGPU?
Answer: it's there, but only like two games support it. Which is exactly what I predicted when they shifted the onus onto game devs to support mGPU. No one did.
 
Answer: it's there, but only like two games support it. Which is exactly what I predicted when they shifted the onus onto game devs to support mGPU. No one did.

Yep, everyone who had been using SLI for years knew that it died with that DX12 mGPU announcement.

With DX11 we could still do hacks etc. to make SLI work in games that didn't officially support it, but with DX12, that was out the window.
 
Do you want the list of which cards support it?

RX 5600 XT
RX 5700
RX 5700 XT

RX 6400
RX 6500 XT
RX 6600
RX 6650 XT
RX 6700
RX 6700 XT
RX 6750 XT
RX 6800
RX 6800 XT
RX 6900 XT
RX 6950 XT

RX 7600
RX 7600 XT
RX 7700 XT
RX 7800 XT
RX 7900 XT
RX 7900 XTX

Or do you want the list of games that support mGPU?
I only know of a few games that support it.

1. Ashes of the Singularity
2. Chasm: The Rift (Vulkan)
3. Deus Ex: Mankind Divided
4. Rise of the Tomb Raider
5. Shadow of the Tomb Raider
6. Quake II RTX



That is not true for today's cards with ReBAR. It's also not true when the VRAM buffer is too small: the GPU will still have to talk to the CPU's memory controller to use system RAM. The CPU will touch the memory bus whenever the GPU calls for RAM once VRAM runs out; it will still have to swap out.
I didn't ask. But if you think it's worth your time, give it a go. IMO, it's not worth anyone's time. But you do you.
 
I have had multiple SLI/CF setups over the years.
Went from dual Radeon 4890s to a single 5870 - loved getting the same performance on a single card with less noise and power consumption.
Moved on to a GTX 480, then upgraded from the 480 to a 580, and then went 580 SLI - that was a great setup, and the scaling was off the hook back then.
Eventually upgraded from 580 SLI to a GTX 780 Ti and got that warm fuzzy feeling of a single card outperforming my previous SLI setup.
Went 780 Ti SLI and rocked that for almost 7 years (!!!) until I wanted to play SoTR and realized SLI was not supported.
Picked up a 1080 Ti back in 2020 and have not missed SLI since.
I still play with 2- and 3-GPU setups on my test bench from time to time, and it still works fine for older games that support it, so from that perspective it's not "dead", but as far as any new games and hardware go, it is dead.
 
I have had multiple SLI/CF setups over the years.
Went from dual Radeon 4890s to a single 5870 - loved getting the same performance on a single card with less noise and power consumption.
Moved on to a GTX 480, then upgraded from the 480 to a 580, and then went 580 SLI - that was a great setup, and the scaling was off the hook back then.
Eventually upgraded from 580 SLI to a GTX 780 Ti and got that warm fuzzy feeling of a single card outperforming my previous SLI setup.
Went 780 Ti SLI and rocked that for almost 7 years (!!!) until I wanted to play SoTR and realized SLI was not supported.
Picked up a 1080 Ti back in 2020 and have not missed SLI since.
I still play with 2- and 3-GPU setups on my test bench from time to time, and it still works fine for older games that support it, so from that perspective it's not "dead", but as far as any new games and hardware go, it is dead.

SLI was supported in SotTR - it just ran horribly unless TAA was disabled.
 
Last multi-GPU experience I had was Borderlands 3 on R9 Fury X CF. Horribly broken.
 
SLI was supported in SotTR - it just ran horribly unless TAA was disabled.
Interesting. I wonder what I was thinking back then. I could have been using a really old driver at the time, because I did have some weird stuff set up with NVIDIA Inspector and was running hacked BIOSes on the 780 Tis, so driver updates were always a high-adrenaline feat. Another case in point against SLI. The 780 Tis were handicapped more by the 3GB frame buffer than by raw raster power. Far Cry 5's memory usage at 1440p was, I think, my second biggest reason to finally upgrade.
 
I had two 1080 Tis for a while.

There were only two titles that ran OK without much magic in nvidiaProfileInspector: RDR2, and... for some weird reason, Kingdom Come: Deliverance ran like magic, utilizing both cards to the max.

Anything else... Deus Ex: Mankind Divided even removed its mGPU support, actually... ran like shit.

Overall system consumption was around 800 W, spiking over 1000 W... It was my last goodbye to SLI.
 
Interesting. I wonder what I was thinking back then. I could have been using a really old driver at the time, because I did have some weird stuff set up with NVIDIA Inspector and was running hacked BIOSes on the 780 Tis, so driver updates were always a high-adrenaline feat. Another case in point against SLI. The 780 Tis were handicapped more by the 3GB frame buffer than by raw raster power. Far Cry 5's memory usage at 1440p was, I think, my second biggest reason to finally upgrade.

Yep, I had the 780 Tis in SLI as well, and the VRAM capacity was a limiting factor from day one, despite the usual crowd being like "it's just allocated VRAM hur durr". There was obvious VRAM-swapping stutter, which went away when lowering texture quality, which of course looked like sh1t and was detrimental to the entire purpose of having SLI in the first place - better performance with better visuals.

One such game for me was Assassin's Creed Unity - low textures kept VRAM usage just below 3 GB at 1440p maxed settings, and increasing textures beyond that gave an increasing amount of VRAM-swapping stutter. But as you said, the cards had plenty of grunt, just way too limited capacity - and that was obviously intended on NVIDIA's part... you were supposed to upgrade!

Honestly, I should just have gotten the 6 GB version of the 780 - it would have been a much better experience, and cheaper as well. That, or the original Titan when it launched a few years prior - ironically it wasn't that much more expensive than a 780 Ti... just those blower coolers and the noise...

But hindsight is a b1tch - lesson learned, though, and I won't ever buy a GPU that skimps on VRAM again.

I had two 1080 Tis for a while.

There were only two titles that ran OK without much magic in nvidiaProfileInspector: RDR2, and... for some weird reason, Kingdom Come: Deliverance ran like magic, utilizing both cards to the max.

Anything else... Deus Ex: Mankind Divided even removed its mGPU support, actually... ran like shit.

Overall system consumption was around 800 W, spiking over 1000 W... It was my last goodbye to SLI.

RDR2?! It ran HORRIBLY with SLI unless you disabled TAA, which made the game an utter shimmering horrorshow...
 
RDR2?! It ran HORRIBLY with SLI unless you disabled TAA, which made the game an utter shimmering horrorshow...

I hate vaseline-vision TAA by definition. RDR2 properly utilized both cards without much noticeable stutter.
 
I hate vaseline-vision TAA by definition. RDR2 properly utilized both cards without much noticeable stutter.

The entire game is designed around it, though - the image becomes unstable to an insane degree without it, more so than in any other game I've seen.

And yes, without TAA, SLI worked fine in RDR2... but with horrible image stability...
 
On Borderlands 3, using a CrossFire setup with the two Fury Xs made the game completely unplayable if volumetric fog was enabled. Massive flickering problems. Disabling it still resulted in negative scaling. It just wasn't worth it.

Most modern rendering techniques are hostile to alternate-frame-rendering systems. Temporal effects, volumetrics... and you still have to factor in raytracing. I bet RT would exhibit major issues.
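As a back-of-the-envelope illustration of that hostility (a toy model I made up, with assumed timings, not measured data): once each frame's temporal pass depends on the previous frame's output, which under AFR lives on the other GPU, the cross-GPU copy lands on the critical path and eats the scaling.

```cpp
#include <cstdio>

int main() {
    const double renderMs = 16.0; // assumed per-GPU time to render one frame
    const double copyMs   = 4.0;  // assumed cross-GPU copy of the TAA history buffer

    // Ideal AFR: two GPUs alternate frames in parallel, so frames come out
    // every renderMs / 2 on average.
    double idealFrameMs = renderMs / 2.0;

    // With a temporal dependency, frame N cannot finish until frame N-1's
    // history has been copied over from the other GPU, so the copy is paid
    // on every single frame.
    double dependentFrameMs = renderMs / 2.0 + copyMs;

    printf("ideal AFR:      %.1f ms/frame (%.0f%% scaling over one GPU)\n",
           idealFrameMs, 100.0 * (renderMs / idealFrameMs - 1.0));
    printf("with TAA dep.:  %.1f ms/frame (%.0f%% scaling over one GPU)\n",
           dependentFrameMs, 100.0 * (renderMs / dependentFrameMs - 1.0));
    return 0;
}
```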
 
On Borderlands 3, using a CrossFire setup with the two Fury Xs made the game completely unplayable if volumetric fog was enabled. Massive flickering problems. Disabling it still resulted in negative scaling. It just wasn't worth it.

Most modern rendering techniques are hostile to alternate-frame-rendering systems. Temporal effects, volumetrics... and you still have to factor in raytracing. I bet RT would exhibit major issues.

Volumetric effects were never an issue with SLI, but temporal effects certainly were.

In Metro Exodus and Hellblade: Senua's Sacrifice, the SSR used temporal effects - but you could hex-edit the exe file to remove the temporal component of the SSR and still use SSR (albeit a slightly worse-looking version), which resulted in nearly perfect SLI scaling in Metro Exodus under DX11.

Edit:

Found an old post of mine with screenshots using the hex "hack" for SLI in Metro Exodus: Metro Exodus - PC 4k ultra with sli hack - metro exodus post - Imgur
 