
G-Sync and SLI Don't Get Along: Framerate Falls If You Combine Them

Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600 MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
What happened to mGPU? It seems to have only been used by Raja to show that two RX 480s can just about beat a single GTX 1080 in a game they bought and paid for.
 
Joined
Apr 17, 2014
Messages
228 (0.06/day)
System Name GSYNC
Processor i9-10920X
Motherboard EVGA X299-FTW
Cooling Custom water loop: D5
Memory G.Skill RipJawsZ 16GB 2133 MHz 9-11-10-28
Video Card(s) RTX 2080
Storage OCZ Vector, Samsung 950 EVO, Intel M.2 1TB SSDs
Display(s) ROG Swift PG278Q, Acer Z35 and Acer XB270H (NVIDIA G-SYNC)
Case 2x Corsair 450D, Corsair 540
Audio Device(s) sound blaster Z
Power Supply EVGA SuperNOVA 1300 G2 Power
Mouse Logitech proteus G502
Keyboard Corsair K70R cherry red
Software WIN10 Pro (UEFI)
Benchmark Scores Bench scores are for people who don't game.
AND a Gsync monitor.



Are you trolling or truly that gullible? You really seem to miss the irony here. Performance is handicapped and you're saying it was "TWIMTBP"... because an extra investment in a monitor eliminates the problem (even though it only hides it; latency from low FPS doesn't magically disappear with G-Sync).


you have ZERO idea what you're even saying.......

Same reason Surround support sucks: Nvidia just doesn't care to maintain support for their own innovations; they are always on to the next one...


Sounds like a team green hater... like most of the biased BS posted on this website.
 
Joined
Sep 17, 2014
Messages
20,949 (5.97/day)
Location
The Washing Machine
Processor i7 8700K 4.6 GHz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
you have ZERO idea what you're even saying.......

Care to elaborate? The substance is lacking here. Perhaps you just missed the essence of what I said.

You're saying it's fine that they drop proper SLI support because G-Sync offers variable refresh, so who cares if your games drop to 30 fps. Did you forget that low FPS also translates to higher latency and worse pixel response? Did you forget that G-Sync monitors also come with ULMB? ULMB excels with high refresh rates and high, fixed framerates. So you're paying a premium on a monitor to get G-Sync, and half of its feature set isn't properly supported when you buy TWO cards from the same supplier. That sounds like some truly awesome aftersales support! Where can I sign up?
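To put a number on the latency point: frame time, which is the floor on added input lag, is just the reciprocal of the frame rate, and variable refresh removes tearing and stutter but not that delay. A minimal sketch in Python (plain arithmetic, numbers are illustrative):

```python
# Frame time is the floor on added latency: 1000 ms / fps.
# Variable refresh (G-Sync) removes tearing/stutter, not this delay.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):4.1f} ms per frame")
#  30 fps -> 33.3 ms per frame
#  60 fps -> 16.7 ms per frame
# 144 fps ->  6.9 ms per frame
```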

In case it is lost on you, note the sarcasm. It's the same sarcasm I posted earlier in this topic, which you happily confirmed with 'it's fine, you got G-Sync anyway'. You really don't get it, and the proof is in the pudding.

So what is the best alternative, Fixed or ULMB?

I don't think these two exclude one another; in fact, a fixed refresh rate is a prerequisite for strobing to work properly. Which makes sense.

When it comes to motion clarity, which G-Sync will never fix, ULMB / a strobing backlight is king.
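A rough sketch of why (the persistence figures below are ballpark assumptions on my part, not measurements from this thread): on a sample-and-hold panel each frame stays lit for the whole frame time, and perceived blur scales with that persistence, whereas a strobed backlight lights the frame only for a short pulse.

```python
# Perceived motion blur (px) ~ pan speed (px/s) * pixel persistence (s).
# Sample-and-hold: persistence = full frame time.
# Strobed backlight (ULMB-style): persistence = strobe pulse width.
def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0                        # px/s, a fast pan
hold = blur_px(speed, 1000.0 / 120)  # 120 Hz sample-and-hold: ~8.3 ms lit
strobe = blur_px(speed, 1.5)         # ~1.5 ms pulse, a typical ULMB ballpark
print(f"sample-and-hold: {hold:.1f} px blur, strobed: {strobe:.1f} px blur")
# sample-and-hold: 8.0 px blur, strobed: 1.4 px blur
```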
 
Joined
Mar 18, 2015
Messages
2,960 (0.89/day)
Location
Long Island
Hmmm, interesting concept... using G-Sync over 100 fps... why would anyone do that? Kind of like saying: "On our trip to Disney World, I took my Jeep rental from Orlando to the Keys, and on the way back I locked the hubs and drove back in 4WD, and I noticed mileage dropped... WTH?"

Yeah, WTH... why the hell would you drive in 4WD on a Florida highway?

G-Sync works from 30 fps and has its most significant impact from 30 to 70 fps, since the benefit trails off after 60. FreeSync works the same way (40 to 70 fps), as again the impact trails off once you get above 60. Where they differ is that, once you can maintain > 60 fps, you turn G-Sync off and switch to ULMB.

Let's make it simple (there's a quick sketch in code after the list)....

Nvidia G-Sync monitor:
30 - 60/70 fps - use G-Sync
60/70 fps on up - use ULMB

AMD FreeSync monitor:
30 - 60/70 fps - use FreeSync
60/70 fps on up - no alternative unless the monitor manufacturer provides one
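Here is that rule of thumb encoded in Python; the 70 fps cutoff and the mode names are just this guideline expressed as code, not any actual driver API:

```python
# Encode the guideline: variable refresh below the cutoff,
# strobing above it if the monitor offers it.
def pick_mode(avg_fps: float, vendor: str, has_strobe: bool = False) -> str:
    vrr = "G-Sync" if vendor == "nvidia" else "FreeSync"
    if avg_fps <= 70:
        return vrr                  # 30-70 fps: variable refresh shines
    if vendor == "nvidia":
        return "ULMB"               # the G-Sync module includes strobing
    return "vendor blur reduction" if has_strobe else vrr

print(pick_mode(55, "nvidia"))      # G-Sync
print(pick_mode(110, "nvidia"))     # ULMB
print(pick_mode(110, "amd"))        # FreeSync
print(pick_mode(110, "amd", True))  # vendor blur reduction
```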

My son has SLI on his 1440p monitor and rarely turns off G-Sync, since he's above 80 fps most of the time.

G-Sync monitors do include a cost premium, which covers the cost of the strobing module that provides ULMB. AMD, as has been their modus operandi, chose to compete on price, and that strategy has served them well. Some monitor vendors have filled that void by offering their own strobing / blur reduction alternatives:

Asus ROG XG27VQ - ASUS ELMB
BenQ XL2730Z - BENQ Blur Reduction
EIZO Foris FS2735 - EIZO Blur Reduction
LG 24GM79B - LG Blur Reduction
LG 34UC79G - LG Blur Reduction
Samsung C49HG90 - Samsung
Samsung C24FG70 - Samsung
Samsung C27FG70 - Samsung

But yes, no doubt as cards get more powerful, 144 Hz 4K becomes affordable, and games add more load, we are going to see games that get weird in SLI in the 40 - 60 fps zone... I've been saying for 3 generations now that Nvidia is nerfing SLI: there is no competition in the top tiers, so SLI was hurting the sales of their top-tier cards. I can find no other explanation for why, across the test suite, scaling is 18% at 1080p, 33% at 1440p and, amazingly, 54% at 2160p. While it has a very narrow range of impact, hopefully, if SLI has been intentionally nerfed, this will force Nvidia to address it.
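For what those scaling figures mean in practice, a quick illustration (only the scaling percentages come from the post above; the single-card baselines are hypothetical):

```python
# Effective dual-card fps = single-card fps * (1 + scaling).
scaling = {"1080p": 0.18, "1440p": 0.33, "2160p": 0.54}    # quoted above
baseline = {"1080p": 144.0, "1440p": 95.0, "2160p": 48.0}  # hypothetical

for res, gain in scaling.items():
    single = baseline[res]
    dual = single * (1 + gain)
    print(f"{res}: {single:.0f} fps -> {dual:.0f} fps in SLI (+{gain:.0%})")
# 1080p: 144 fps -> 170 fps in SLI (+18%)
# 1440p: 95 fps -> 126 fps in SLI (+33%)
# 2160p: 48 fps -> 74 fps in SLI (+54%)
```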
 