
G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

Several users have complained lately about performance issues on their SLI systems, and after discussions on the NVIDIA subreddit and on NVIDIA's own forums, the conclusion seems clear: performance drops when SLI and G-Sync are used together. The folks over at ExtremeTech have done a good job exploring the issue and have confirmed that frame rates fall when both features are enabled. Their data suggests the problem is related to frame timing, but there's no clear solution yet.

The problem is huge in titles such as Rising Storm 2 (206 vs. 75 fps with SLI and G-Sync enabled) and less severe in others such as The Witcher 3 (113 vs. 98 fps). The test setup included two GTX 1080 GPUs and an Acer XB280HK monitor, which supports 4K but only at 60 Hz, a good choice for detecting whether the problem was real. Their tests across several games confirmed the problem but found no defined pattern: "Turning G-Sync on and using SLI is not guaranteed to tank your frame rate. [...] Different games showed three different performance models". In Deus Ex: Mankind Divided the gap appeared only in DX11 mode; in Far Cry 5 the penalty grew as the frame rate rose; and in Hitman the results were even more confusing.
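For scale, the quoted figures work out to very different penalties. A minimal Python sketch, using only the numbers reported above:

```python
def gsync_penalty(fps_gsync_off: float, fps_gsync_on: float) -> float:
    """Percent of frame rate lost when G-Sync is enabled alongside SLI."""
    return (fps_gsync_off - fps_gsync_on) / fps_gsync_off * 100

# Numbers from the ExtremeTech tests quoted above
print(f"Rising Storm 2: {gsync_penalty(206, 75):.0f}% drop")   # ~64% drop
print(f"The Witcher 3:  {gsync_penalty(113, 98):.0f}% drop")   # ~13% drop
```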




There is no clear explanation for these issues, but fun fact: they are not new. On the Tom's Hardware forums there are users reporting the very same problem back in 2016, and users were describing similar issues on our own TPU forums a year and a half ago. The problem doesn't seem to be present on Turing, but no one running that kind of configuration (G-Sync + SLI) should, for now, feel completely safe.

At TechPowerUp we recently reviewed the performance of the new RTX 2080 Ti and RTX 2080 connected via NVLink, but we didn't test that configuration with G-Sync, so we cannot confirm whether the problem is present with Turing, NVLink, and G-Sync. New drivers may resolve these performance penalties, but we'll have to wait: NVIDIA hasn't made any official comment on the issue so far.

View at TechPowerUp Main Site
 
I would be interested to know if this occurs with FreeSync (2) and CrossFire, or whether any adaptive sync method has issues with DX12 mGPU.
 
All the more reason to go single card.
 
So with SLI you throw away 50% of the second card's performance, power consumption, and price with a normal monitor if you average across all games, and even more if you use an expensive G-Sync monitor. Well done.
 
This is old news, really. It's been put to nGreedia before, and they don't want anything to do with SLI.
 
I've run SLI since the 8800 GTX, and there have been constant coordination problems.
That's the thing: the technology relies on a trick where the GPU skips proper vertical synchronization and it gets handled afterwards, of course at our cost. AMD really needs to make a much better GPU to overtake nVidia; these tricks only pay off for NVIDIA.
nGreedia is a good name for nVidia now!! :rockout: If anyone caused a stock market crash in China, it's nVidia with its greed, and now we're paying for the collateral damage!
 
glad i picked up a 1080Ti rather than another 1070 to tide me over.
 
I would be interested to know if this occurs with FreeSync (2) and CrossFire, or whether any adaptive sync method has issues with DX12 mGPU.
Haven't noticed such an issue; CrossFire scales as you would expect, tested on RX 580s.
 
It's because you don't need G-Sync and SLI together, you see; you get variable refresh, so "it looks smooth anyway".

This isn't a bug, it's a feature, people.
 
I called this out several years ago. One reason I stopped running SLI is with Gsync there's no need for it.
 
I called this out several years ago. One reason I stopped running SLI is with Gsync there's no need for it.

Exactly, it's the whole bloody point of G-Sync, if your card can stay within said monitors working range, you are in smooth lalaland, no need to chase ultimate framerates anymore.
 
Exactly!

Exactly, it's the whole bloody point of G-Sync, if your card can stay within said monitors working range, you are in smooth lalaland, no need to chase ultimate framerates anymore.
EXACTLY!
 
SLI has fallen so far. I remember the Fermi days, when a pair of OCed 550 Ti GPUs was not only cheaper than a 580 but usually much faster, and SLI support was expected of AAA games. A pair of OCed 560 SEs in SLI with a 550 Ti OC for PhysX produced better frame rates and image quality than an OCed water-cooled 580. Games not supporting SLI was a huge deal.

The days when you could piece a machine together with a minimum budget.
 
So what is the best alternative, Fixed or ULMB?
 
God, and I almost purchased a 2nd GTX 1080...
nGreedia really has become a shitty, callous, and greedy company. I have zero respect for them whatsoever now.
 
Nearly 15 years later, SLI is still a turd and looks worse with each passing day. It could have been awesome tech if it had been done right. My inner tin foil tells me SLI isn't what it could be because they'd rather you buy a GTX 1080 Ti instead of two GTX 1060s...
 
Exactly what he said above. I am wondering if this is "Driver assisted revenue" for the company. They bork SLI to force people to pay for an expensive top of the range card with more profit margin for them.
 
Exactly what he said above. I am wondering if this is "Driver assisted revenue" for the company. They bork SLI to force people to pay for an expensive top of the range card with more profit margin for them.

While I've had better luck with Crossfire vs SLI I must say both are crappy at best....
 
Nearly 15 years later, SLI is still a turd and looks worse with each passing day. It could have been awesome tech if it had been done right. My inner tin foil tells me SLI isn't what it could be because they'd rather you buy a GTX 1080 Ti instead of two GTX 1060s...

AND a Gsync monitor.

Exactly!


EXACTLY!

Are you trolling or truly that gullible? You really seem to miss the irony here. Performance is handicapped and you're saying it was "TWIMTBP"... because an extra investment in a monitor eliminates the problem (even though it only hides it; latency from low FPS doesn't magically disappear with G-Sync).
 
Roll back the driver and this problem will be fixed (at least in the games I usually play); there's nothing worthwhile in updating to 416 anyway.
 
Isn't nvidia trying to kill sli anyway? ...artificial incompatibility would be an effective tool then
 
They'll say it's one reason or another, but I'm sure their sales department ran the numbers and figured they'll make more money by never having SLI work properly. Those who buy a second GPU to boost performance some time later would likely be buying a used one, and that means no money for them.
 
Same reason Surround support sucks: Nvidia just doesn't care to maintain support for their own innovations; they're always on to the next one...
 