Tuesday, October 23rd 2018

G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

Several users have complained lately about performance issues on their SLI systems, and after discussions on the NVIDIA subreddit and on NVIDIA's forums, the conclusion seems clear: performance drops when SLI and G-Sync work together. The folks over at ExtremeTech have done a good job exploring the issue and have confirmed that frame rates fall when both features are enabled. According to their data, the problem seems related to frame timing, but there's no clear solution yet.

The penalty is huge in some titles, such as Rising Storm 2 (206 fps with SLI alone vs. 75 fps with SLI and G-Sync enabled), and far smaller in others, like The Witcher 3 (113 vs. 98 fps). The test setup included two GTX 1080 GPUs and an Acer XB280HK monitor, which supports 4K but only at 60 Hz, a good choice for detecting whether the problem was real. Their tests across several games confirmed the problem but found no consistent pattern: "Turning G-Sync on and using SLI is not guaranteed to tank your frame rate. [...] Different games showed three different performance models". In Deus Ex: Mankind Divided the gap appeared only in DX11 mode. In Far Cry 5 the penalty grew as the frame rate rose, and in Hitman the results were even more confusing.
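To put those figures in perspective, here is a quick back-of-the-envelope calculation of the penalty (a minimal sketch in Python; the fps numbers are ExtremeTech's, the helper function is our own):

```python
# Back-of-the-envelope: the SLI + G-Sync penalty from the fps pairs
# ExtremeTech reported (SLI alone vs. SLI with G-Sync enabled).

def penalty_pct(fps_sli: float, fps_sli_gsync: float) -> float:
    """Frame-rate loss as a percentage of the SLI-only result."""
    return (fps_sli - fps_sli_gsync) / fps_sli * 100

for game, (sli, sli_gsync) in {
    "Rising Storm 2": (206, 75),
    "The Witcher 3": (113, 98),
}.items():
    print(f"{game}: {penalty_pct(sli, sli_gsync):.0f}% slower with G-Sync on")
# Rising Storm 2: 64% slower; The Witcher 3: 13% slower.
```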
There is no clear explanation for these issues, but notably, they are not new. Users on the Tom's Hardware forums reported the very same problem back in 2016, and there were similar reports on our own TPU forums a year and a half ago. The problem doesn't seem to be present on Turing, but for now, no one running that kind of configuration (G-Sync + SLI) should feel completely safe.

At TechPowerUp we recently reviewed the performance of the new RTX 2080 Ti and RTX 2080 connected via NVLink, but we didn't test that configuration with G-Sync, so we cannot confirm whether the problem is present with Turing, NVLink, and G-Sync. New drivers may resolve these performance penalties, but we'll have to wait: NVIDIA hasn't made any official comment on the issue as of yet.
Source: ExtremeTech

28 Comments on G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

#26
HM_Actua1
Vayra86: AND a G-Sync monitor.

Are you trolling or truly that gullible? You really seem to miss the irony here. Performance is handicapped and you're saying it was "TWIMTBP"... because an extra investment in a monitor eliminates the problem (even though it only hides it; latency from low FPS doesn't magically disappear with G-Sync).
You have ZERO idea what you're even saying...
Maleficus: Same reason Surround support sucks. Nvidia just doesn't care to maintain support for their own innovations; they're always on to the next one...
Sounds like a team green hater... like most of the biased BS posted on this website.
#27
Vayra86
Hitman_Actual: You have ZERO idea what you're even saying...
Care to elaborate? The substance is lacking here. Perhaps you just missed the essence of what I said.

You're saying it's fine that they drop proper SLI support because G-Sync offers variable refresh, so who cares if your games drop to 30 fps. Did you forget that low FPS also translates to higher latency and slower pixel response? Did you forget that G-Sync also comes with ULMB? ULMB excels at high refresh rates with high, fixed framerates. So you're paying a premium on a monitor to get G-Sync, and half of its featureset isn't properly supported when you buy TWO cards from the same supplier. That sounds like some truly awesome aftersales support! Where can I sign up?
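To put numbers on that latency point (a quick illustrative sketch; the frame rates are arbitrary examples):

```python
# Frame time is the inverse of frame rate. Variable refresh (G-Sync)
# changes *when* a finished frame is displayed, not how long it took
# to render, so the latency inherent in a low frame rate remains.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 75, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):4.1f} ms per frame")
# 30 fps -> 33.3 ms per frame; 144 fps -> 6.9 ms. G-Sync can't shrink that gap.
```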

In case it is lost on you, note the sarcasm. The same sarcasm I posted earlier in this topic, which you happily confirmed as 'it's fine, you got G-Sync anyway'. You really don't get it, and the proof is in the pudding.
Maleficus: So what is the best alternative, fixed refresh or ULMB?
I don't think these two exclude one another; in fact, a fixed refresh rate is a prerequisite for strobing to work properly. Which makes sense.

When it comes to motion clarity, which G-Sync will never fix, a ULMB/strobing backlight is king.
#28
John Naylor
Hmmm, interesting concept... using G-Sync over 100 fps... why would anyone do that? Kind of like saying: "On our trip to Disneyland, I took my rental Jeep from Orlando to the Keys, and on the way back I locked the hubs and drove in 4WD, and I noticed my mileage dropped... WTH?"

Yeah, WTH... why the hell would you drive in 4WD on a Florida highway?

G-Sync works from 30 fps and has its most significant impact from 30 - 70 fps, since the benefit trails off after 60. FreeSync works the same way (40 - 70 fps), as again the impact trails off once you get above 60. Where they differ is that, once you can maintain > 60 fps, you turn G-Sync off and switch to ULMB.

Let's make it simple (see the sketch after the lists) ....

NVidia G-Sync Monitor
30 - 60 / 70 fps - Use G-Sync
60 / 70 fps on up - Use ULMB

AMD FreeSync Monitor
30 - 60 / 70 fps - Use FreeSync
60 / 70 fps on up - No alternatives unless Monitor Manufacturer provided one
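In code, that rule of thumb looks roughly like this (an illustrative sketch only; the exact 70 fps crossover and the names are mine):

```python
# A sketch of the sync-mode rule of thumb from the lists above.
# The 70 fps crossover and these names are illustrative only.

def pick_sync_mode(sustained_fps: float, monitor: str) -> str:
    crossover = 70  # fps above which variable refresh stops helping much
    if sustained_fps < crossover:
        return "G-Sync" if monitor == "gsync" else "FreeSync"
    if monitor == "gsync":
        return "ULMB"  # strobing backlight for motion clarity
    return "vendor blur reduction, if the monitor offers one"

print(pick_sync_mode(55, "gsync"))      # -> G-Sync
print(pick_sync_mode(120, "gsync"))     # -> ULMB
print(pick_sync_mode(120, "freesync"))  # -> vendor blur reduction, if the monitor offers one
```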

My son has SLI on his 1440p monitor and rarely turns off G-Sync since he's above 80 fps most of the time.

G-Sync monitors do include a cost premium to cover the strobing module that provides ULMB. AMD, as has been their modus operandi, chose to compete on price, and that strategy has served them well. Some monitor vendors have filled that void by offering their own strobing / blur reduction alternatives:

Asus ROG XG27VQ - ASUS ELMB
BenQ XL2730Z - BENQ Blur Reduction
EIZO Foris FS2735 - EIZO Blur Reduction
LG 24GM79B - LG Blur Reduction
LG 34UC79G - LG Blur Reduction
Samsung C49HG90 - Samsung
Samsung C24FG70 - Samsung
Samsung C27FG70 - Samsung

But yes, no doubt as cards get more powerful, 144 Hz 4K becomes affordable, and games add more load, we are going to see games that get weird in SLI in the 40 - 60 fps zone... I've been saying for three generations now that NVIDIA is nerfing SLI because there is no competition in the top tiers, so SLI was hurting sales of their top-tier cards. I can find no other explanation for why, across the test suite, scaling is 18% at 1080p, 33% at 1440p, and, amazingly, 54% at 2160p. While it has a very narrow range of impact, hopefully, if SLI has been intentionally nerfed, this will force NVIDIA to address it.