Tuesday, October 23rd 2018

G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

Several users have complained lately about performance issues on their SLI systems, and after discussions in the NVIDIA subreddit and on NVIDIA's forums, the conclusion seems clear: performance drops when SLI and G-Sync work together. The folks over at ExtremeTech have done a good job exploring the issue, and they have confirmed that frame rates fall when both features are enabled. According to their data, the problem seems related to timing, but there's no clear solution yet.

The problems are huge in titles such as Rising Storm 2 (206 fps with G-Sync off vs. 75 fps with it on, both with SLI) and less severe in others like Witcher 3 (113 vs. 98 fps). The test setup included two GTX 1080 GPUs and an Acer XB280HK monitor that supports 4K, but only at 60 Hz, a sensible choice for detecting whether the problem was real. Their tests across several games confirmed the problem, but didn't reveal a consistent pattern: "Turning G-Sync on and using SLI is not guaranteed to tank your frame rate. [...] Different games showed three different performance models". In Deus Ex: Mankind Divided the gap appeared only in DX11 mode. In Far Cry 5 the penalty grows as the frame rate rises, and in Hitman the results were even more confusing.
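To put those two data points in perspective, here is a minimal sketch in Python that converts the reported fps pairs into percentage penalties; the numbers are ExtremeTech's as quoted above, while the penalty formula is just our own illustration, not part of their methodology:

```python
# Convert the reported fps pairs (G-Sync off vs. G-Sync on, both with SLI)
# into percentage penalties. The fps numbers come from the article above;
# the calculation itself is only illustrative.
results = {
    "Rising Storm 2": (206, 75),
    "Witcher 3": (113, 98),
}

for game, (fps_off, fps_on) in results.items():
    penalty = (fps_off - fps_on) / fps_off * 100
    print(f"{game}: {fps_off} -> {fps_on} fps, a {penalty:.0f}% penalty")
```

That works out to roughly a 64% drop in Rising Storm 2 versus only about 13% in Witcher 3, which shows how wildly the penalty varies from game to game.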
There is no clear explanation for these issues, but, fun fact, they are not new. On the Tom's Hardware forums there are users reporting the very same problem back in 2016, and users were describing similar issues on our own TPU forums a year and a half ago. The problem doesn't seem to be present on Turing, but for now no one with that kind of configuration (G-Sync + SLI) should feel completely safe.

At TechPowerUp we recently reviewed the performance of the new RTX 2080 Ti and RTX 2080 connected via NVLink, but we didn't test that configuration with G-Sync, so we cannot confirm whether the problem is present with Turing, NVLink, and G-Sync. New drivers may resolve these performance penalties, but we'll have to wait: NVIDIA hasn't made any official comment on the issue as of yet.
Source: ExtremeTech

28 Comments on G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

#1
moproblems99
I would be interested to know if this occurs with FreeSync (2) and CrossFire, or if any adaptive sync method has issues with DX12 mGPU.
#3
B-Real
So SLI: you throw away 50% of the second card's performance, power consumption and price with a normal monitor, averaged across all games, and even more if you use an expensive G-Sync monitor. Well done.
#4
DeathtoGnomes
This is old news, really. It's been put to nGreedia before, and they don't want anything to do with SLI.
#5
bogami
I have run SLI since the GTX 8800, and there are constantly coordination problems.
That's right, the technology falls back on a cheat where the processor doesn't perform vertical synchronization and it's done afterwards, at our cost of course. AMD really needs to make a much better processor to overtake nVidia; cheats only pay off for NVIDIA.
A good name for nVidia now is nGreedia!! :rockout: If anyone has caused a stock market crash in China, it is nVidia with its greed, and now we are paying for the damage!
#6
FreedomEclipse
~Technological Technocrat~
Glad I picked up a 1080 Ti rather than another 1070 to tide me over.
#8
TheoneandonlyMrK
moproblems99: I would be interested to know if this occurs with FreeSync (2) and CrossFire, or if any adaptive sync method has issues with DX12 mGPU.
Haven't noticed such an issue; CrossFire scales as you would expect, tested on RX 580s.
#9
Vayra86
It's because you don't need G-Sync and SLI together, you see; you get variable refresh, so 'it looks smooth anyway'.

This isn't a bug, it's a feature, people.
#10
HM_Actua1
I called this out several years ago. One reason I stopped running SLI is that with G-Sync there's no need for it.
#11
Casecutter
Next article: G-Sync monitors... don't play nice with RTX ray tracing.
#12
Fluffmeister
Hitman_Actual: I called this out several years ago. One reason I stopped running SLI is that with G-Sync there's no need for it.
Exactly, it's the whole bloody point of G-Sync: if your card can stay within said monitor's working range, you are in smooth la-la land; no need to chase ultimate framerates anymore.
#13
HM_Actua1
Exactly!
Fluffmeister: Exactly, it's the whole bloody point of G-Sync: if your card can stay within said monitor's working range, you are in smooth la-la land; no need to chase ultimate framerates anymore.
EXACTLY!
#14
TheinsanegamerN
SLI has fallen so far. I remember the Fermi days, when a pair of OCed 550 Ti GPUs was not only cheaper than a 580 but usually much faster, and SLI support was expected of AAA games. A pair of OCed 560 SEs in SLI with a 550 Ti OC for PhysX produced better framerates and image quality than an OCed, water-cooled 580. Games not supporting SLI was a huge deal.

The days when you could piece a machine together with a minimum budget.
#15
Maleficus
So what is the best alternative, fixed refresh or ULMB?
#16
Prima.Vera
God, and I almost purchased a 2nd GTX 1080...
nGreedia really has become a shitty, callous and greedy company. I now have ZERO respect for them whatsoever.
#17
hat
Enthusiast
Nearly 15 years later, SLI is still a turd, and it's looking worse and worse each day. It could have been awesome tech if it had been done right. My inner tin foil hat tells me SLI isn't what it could be because they'd rather you buy a GTX 1080 Ti instead of two GTX 1060s...
#18
Owen1982
Exactly what he said above. I am wondering if this is "driver-assisted revenue" for the company. They bork SLI to force people to pay for an expensive top-of-the-range card with a higher profit margin for them.
#19
Imsochobo
Owen1982: Exactly what he said above. I am wondering if this is "driver-assisted revenue" for the company. They bork SLI to force people to pay for an expensive top-of-the-range card with a higher profit margin for them.
While I've had better luck with CrossFire than SLI, I must say both are crappy at best...
#20
Vayra86
hat: Nearly 15 years later, SLI is still a turd, and it's looking worse and worse each day. It could have been awesome tech if it had been done right. My inner tin foil hat tells me SLI isn't what it could be because they'd rather you buy a GTX 1080 Ti instead of two GTX 1060s...
AND a G-Sync monitor.
Hitman_Actual: Exactly! EXACTLY!
Are you trolling or truly that gullible? You really seem to miss the irony here. Performance is handicapped and you're saying it was "TWIMTBP"... because an extra investment in a monitor eliminates the problem (even though it only hides it; latency from low FPS doesn't magically disappear with G-Sync).
#21
jmcosta
Roll back the driver and this problem will be fixed (at least in the games that I usually play); there's nothing worthwhile in updating to 416.
#22
enxo218
Isn't NVIDIA trying to kill SLI anyway? ...artificial incompatibility would be an effective tool, then.
#23
CheapMeat
They'll say it's one reason or another, but I'm sure their sales department did the math and figured they'll make more money by never having SLI work properly. Those who would buy a second GPU to boost performance some time later would likely be buying a used one, and that means no money for them.
#24
Maleficus
Same reason Surround support sucks: NVIDIA just doesn't care to maintain support for their own innovations; they are always on to the next one...
#25
Fluffmeister
What happened to mGPU? It seems to have only been used by Raja to show that two RX 480s can just about beat a single 1080 in a game they bought and paid for.