
Low GPU Usage with SLI Titan X & G-SYNC

Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
When in Full Screen mode I get lower GPU usage (80%/75%) than in Windowed mode (87%/98%). I made the video below to show the difference. Anyone got any ideas on how to fix this? Below is a list of things I've tried (with a quick usage-logging sketch after it), followed by my system specs.

Thanks
BB
  1. Reinstalled Nvidia drivers using a clean install
  2. Disabled G-SYNC
  3. Tried an old 60Hz monitor; same lower GPU usage in Full Screen
  4. Tried different refresh rates (currently set to 144Hz)
  5. Tried turning off all but one of my monitors
  6. With SLI off, using only one Titan X, GPU usage is fine at 99% regardless of Full Screen or Windowed
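For anyone who wants to reproduce the comparison, here is a minimal logging sketch. It assumes the stock nvidia-smi tool is on PATH; the query fields used are standard nvidia-smi options.

Code:
# Poll per-GPU utilization and power draw once a second via nvidia-smi,
# so fullscreen vs. windowed runs can be compared afterwards.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,power.draw",
         "--format=csv,noheader"]

def log_usage(seconds=60):
    for _ in range(seconds):
        out = subprocess.run(QUERY, capture_output=True, text=True).stdout
        print(out.strip())  # one line per GPU, e.g. "0, 81 %, 190.5 W"
        time.sleep(1)

if __name__ == "__main__":
    log_usage()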

Windows 10 Pro
3 ROG Swift PG278Q 27" 2560x1440 144Hz
Skylake i7-6700K @ 4.4GHz with NZXT Kraken X61 Liquid Cooler
SLI GTX Titan X @ 1354MHz
Asus Maximus VIII Hero Mobo
Samsung 850 EVO 500GB SSD
Crucial Ballistix Sport LT DDR4 2400 16GB
GeForce Driver 361.43


-----UPDATE-----
Thanks for the suggestions. I've determined the problem has to do with G-SYNC + SLI; it's possible it is limited to the Asus ROG PG278Q. If someone has another G-SYNC monitor and SLI cards, please let me know if you see poor GPU usage in Witcher 3.
 
Joined
Jan 9, 2016
Messages
64 (0.02/day)
Location
Omaha
System Name Betty
Processor i7-5930K @ 4.5GHz
Motherboard ASUS X99 PRO. USB 3.1
Cooling Corsair H110i GTX AIO
Memory 16GB Corsair LPX 2666MHz DDR4
Video Card(s) Zotac AMPED EXTREME GTX 980TI X2
Storage Intel 250GB SSD/KINGSTON 500GB SSD
Case Corsair 760T
Power Supply Corsair HX1200i
Software Windows 10
Have you tried 4k?

I know I get about the same usage with my 980 Tis in SLI in The Witcher 3 @ 1440p.
When I ramp things up to 4K, GPU usage is pretty much pegged at 99% all the time. This is with an i7-5930K @ 4.5GHz and my GPUs boosting to 1410MHz.

As crazy as it sounds, 1440p on one Titan X/980 Ti isn't a serious hurdle for these cards, even in a game like W3, so with two of them, ramping up the resolution will give them more of a workout.

EDIT: Also, from looking at your vid, one of your cards is actually hitting 99% usage at points while the other is ~80-90%. That isn't bad scaling at all. You also aren't playing with HairWorks on. Peg all the HairWorks options to the max and you should also get slightly higher usage @ 1440p.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Yep, the CPU is bottlenecking it a bit. Also, the difference between Fullscreen and Windowed could be because windowed mode works less efficiently (as a matter of fact, it does). Have you compared the FPS of the two modes, or just the usage? If the FPS is the same in both modes or worse in windowed, you have your answer. But anyway: what Ascalaphus said is true, 4K will push your GPUs to the limit; 1440p can't because of the CPU bottleneck. And you likely can't do anything about it, as you already have the best gaming CPU (4 cores + Skylake IPC + OC). Another solution would simply be to play something else that is more demanding on the GPUs, but I think you have more than enough FPS anyway, am I right? ;) This is just about maximizing, I guess.
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
its not just a cpu bottleneck it can be other things.. driver sli utilization being the most likely one.. i often see less than full gpu power being used and less than full cpu power being used at the same time..

i dont mind less than full gpu power (or cpu power come to that) being used because to be honest mostly it isnt needed.. only when playing the benchmark game..

i have even down-clocked my cards back to the default.. i cant see much point in more fps for the sake of more fps.. my monitor will run at 165 hz.. i run it at 120 because i think 120 is enough..

there is nothing wrong with having some grunt in reserve or running things less than flat out..

trog
 
Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
Thanks for the replies guys, I don't have a 4k monitor so can't try 4k, but I did try my 2560x1600 monitor and it did the same thing.
@Kanan that's what's strange: Windowed mode gives faster/better usage than Full Screen. Windowed: 97fps, 98%/88% usage; Full Screen: 92fps, 80%/75% usage. But yeah, I have plenty of fps, I just want to get the most out of my cards :)
@trog100 I was thinking SLI driver optimization as well, but DudeRandom84 on YouTube has the same Skylake CPU (6700K), and his 980 Ti SLI usage in Witcher 3 is almost always above 95% on both cards.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Thanks for the replies guys, I don't have a 4k monitor so can't try 4k, but I did try my 2560x1600 monitor and it did the same thing.
@Kanan that's what's strange: Windowed mode gives faster/better usage than Full Screen. Windowed: 97fps, 98%/88% usage; Full Screen: 92fps, 80%/75% usage. But yeah, I have plenty of fps, I just want to get the most out of my cards :)
@trog100 I was thinking SLI driver optimization as well, but DudeRandom84 on YouTube has the same Skylake CPU (6700K), and his 980 Ti SLI usage in Witcher 3 is almost always above 95% on both cards.
You can check whether anything other than the game is using the CPU (eating away performance you need for the GPUs/game). Also, maybe he is using different settings or drivers than you; there are a lot of variables to consider. But yeah, windowed mode shouldn't be faster. I think it's the driver or something else in your system that screws things up "a bit". The YT guy plays in fullscreen, I guess?

PS. But you can try DSR. DSR at the 4.00x factor (+100% resolution on each axis) results in 5K gaming on a 1440p monitor. With DSR your GPUs should always be at 100% usage. Give it a try. ;)
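For reference, a quick sketch of the arithmetic behind that suggestion (assuming the usual DSR convention, where the factor scales the total pixel count, so each axis scales by its square root):

Code:
# DSR factors scale total pixel count; each axis scales by sqrt(factor).
# Illustrative arithmetic only, not an NVIDIA API.
import math

def dsr_resolution(width, height, factor):
    """Internal render resolution for a given DSR factor."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(2560, 1440, 4.0))  # (5120, 2880): "5K" on a 1440p panel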
 
Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
Thanks @Kanan, your suggestion to try DSR helped me narrow the problem down to a G-SYNC + SLI issue with my Asus ROG PG278Q.

It turns out DSR is not supported with G-SYNC + SLI, so I hooked up my old LG W3000h (2560x1600 @ 60Hz), tried DSR, and voila: near 99% usage on both GPUs. Backed the resolution down to 2560x1600 and still near 99% on both cards. The minute I hook up even one of my G-SYNC PG278Q monitors, even without extending my desktop to it, I get the poor GPU usage issue again (approx. 75% on GPU1, 80% on GPU2).

So it looks like the problem is definitely G-SYNC + SLI, perhaps just an issue with the Asus ROG PG278Q. Does anyone else have a G-SYNC monitor and SLI GPUs who can test Witcher 3 and see if you get poor GPU usage?
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Thanks @Kanan, your suggestion to try DSR helped me narrow the problem down to a G-SYNC + SLI issue with my Asus ROG PG278Q.

It turns out DSR is not supported with G-SYNC + SLI, so I hooked up my old LG W3000h (2560x1600 @ 60Hz), tried DSR, and voila: near 99% usage on both GPUs. Backed the resolution down to 2560x1600 and still near 99% on both cards. The minute I hook up even one of my G-SYNC PG278Q monitors, even without extending my desktop to it, I get the poor GPU usage issue again (approx. 75% on GPU1, 80% on GPU2).

So it looks like the problem is definitely G-SYNC + SLI, perhaps just an issue with the Asus ROG PG278Q. Does anyone else have a G-SYNC monitor and SLI GPUs who can test Witcher 3 and see if you get poor GPU usage?
That doesn't sound rock solid to me. Isn't the ASUS monitor 2560x1440 and the LG 2560x1600? Those extra 160 lines of pixels add roughly 11% more pixels to be rendered. It's entirely possible that the higher resolution is putting enough stress on your GPUs to make them the bottleneck where 2560x1440 wouldn't. This isn't to say that G-Sync isn't to blame, however; just be careful not to assume it's G-Sync, since the two monitors don't technically have the same resolution and that difference could partially explain it.

I don't have nVidia or a G-Sync display, but is it not possible to disable G-Sync when using the ASUS display to rule it out?
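For reference, the resolution arithmetic is easy to check (a quick sketch):

Code:
# Quick arithmetic check of the resolution difference discussed above.
asus = 2560 * 1440  # PG278Q
lg = 2560 * 1600    # W3000h

extra = lg - asus
print(extra, f"{extra / asus:.1%}")  # 409600 extra pixels, ~11.1% more to render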
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
i have the asus swift g-sync.. what i see running the heaven benchmark in windowed mode.. monitor set to 120 hz with g-sync enabled in sli mode.. well less than 100% power usage.. in non sli mode i would see nearer 100% power usage.. my cpu usage is also very low.. maybe around 50%.. basically for whatever reason nothing is working hard..

if i run the furmark benchmark i would see full gpu power usage but not with anything else..

i dont think its g-sync because i got similar results with an acer 144 hz monitor that didnt have it..



i am currently playing metal gear solid.. this does have a 60 fps built in limit and i only see around 30% gpu usage with that. :)

basically the only thing that pushes my sli gpu power up to its limits is furmark.. and with that the cards dont boost as high.. in short they run much slower..

all i am going to say is i think such behaviour is normal.. these cards get regulated in many ways.. some a bit mysterious.. in this example its the max boost speed that is the limiter.. the cards hit this speed at lower than full power.. when they do they throttle down the power usage.. if i run higher clocks the boost will try and go up until it either trips over its own feet or hits another limit.. this could be voltage.. temps.. or power usage..

these things dont just run at a fixed speed.. they try and "boost" tis the boost that matters nothing else.. tis all one big variable..



trog

ps.. a furmark run.. full gpu power being used but note the much lower boost speed.. with just one card running full gpu power would still be used but the boost rate drops even more..



in sli mode the slower or weaker card governs the other one.. this may explain why they both run slower.. i dont know.. but not seeing full gpu power from both cards is normal the way i see it..

the top card will always run hotter than the bottom card.. it could be as simple as this in some cases..

also note the boost speed in furmark is lower than the claimed boost speed and it isnt the same on both cards.. with heaven.. the boost speed is higher than claimed but the power usage drops off.. either way these things dont just bang out full power regardless.. tis not the way they work..
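As a toy sketch of that limiter behaviour (all numbers and thresholds invented for illustration; this is not NVIDIA's actual boost algorithm):

Code:
# Toy model of boost behaviour: step the clock up until any limiter
# (power, temperature, voltage) is hit, then settle there.
# Every number below is made up purely for illustration.
LIMITS = {"power_pct": 100.0, "temp_c": 83.0, "voltage_v": 1.2}

def settle_boost(base_mhz, step=13):
    clock = base_mhz
    while True:
        d = clock - base_mhz
        power = 60 + d * 0.08      # pretend telemetry: higher clocks
        temp = 55 + d * 0.04       # cost more power, heat and voltage
        volts = 1.05 + d * 0.0004
        if (power >= LIMITS["power_pct"] or temp >= LIMITS["temp_c"]
                or volts >= LIMITS["voltage_v"]):
            return clock           # a limiter was hit; boost settles here
        clock += step

print(settle_boost(1000))  # 1377: voltage-limited in this toy model,
                           # with power and temps still below their caps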
 
Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
That doesn't sound rock solid to me. Isn't the ASUS monitor 2560x1440 and the LG 2560x1600? Those extra 160 lines of pixels add roughly 11% more pixels to be rendered. It's entirely possible that the higher resolution is putting enough stress on your GPUs to make them the bottleneck where 2560x1440 wouldn't. This isn't to say that G-Sync isn't to blame, however; just be careful not to assume it's G-Sync, since the two monitors don't technically have the same resolution and that difference could partially explain it.

I don't have nVidia or a G-Sync display, but is it not possible to disable G-Sync when using the ASUS display to rule it out?
Thanks @Aquinus. I would think the extra ~11% of pixels might be the difference, except that while running my non-G-SYNC LG @ 2560x1600 with usage close to 99% on both cards, all I need to do is plug in one of my ASUS G-SYNC monitors and the LG @ 2560x1600 then shows the same low GPU usage problem.
The instant I plug the G-SYNC monitor into the DisplayPort, Windows chimes as it detects the new hardware, and I assume the Nvidia drivers also detect the new G-SYNC monitor; whatever they change causes the low GPU usage problem. Also, I am only referring to Witcher 3; I am now learning that every program behaves differently with SLI, G-SYNC, and all the other thousands of technologies packed into our computers nowadays.

@trog100 thanks trog, one thing to note: I am referring to low GPU Usage, not Power Usage; from your screenshot it looks like Thunder Master calls it Core Usage. And this whole SLI + G-SYNC issue will vary from game to game and benchmark to benchmark. For example, Crysis 3 runs fine with my SLI/G-SYNC setup at 99% usage on both cards. I am focusing on Witcher 3 GPU Usage in SLI with G-SYNC.

So I think I've figured it out: the low GPU usage in Witcher 3 is caused by G-SYNC plus SLI. Thanks for the help guys!
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
running witcher full screen sli g-sync fully maxed out at 1440, 120 hz refresh rate.. just below 80 fps.. pic taken after the game was shut down but you can see where it was.. ignore the numbers look at the graph.. see where it dropped off from..



full boost was running but nothing else was near maximum.. power usage core usage temps and voltage were all lower than max.. the power usage was around 60%.. i am going to put it down to poor nvidia sli implementation for witcher..

you are chasing a unicorn if you expect to see both your titan cards running at full power with most games.. they wont be..

benchmarks have better sli driver implementation.. 80 fps aint bad for witcher and its butter smooth with g-sync..

trog

ps.. my actual read out from gpu-z running witcher.. as i said earlier the cards will hit some cap or other.. in this case i think its the max voltage the cards are allowed to draw before throttling down or boosting less.. note my power usage is only 74.4 percent..

2016-01-25 22:22:04
core speed ............. 1429.8 MHz
memory speed ........... 1873.1 MHz
gpu temp ............... 61.0 C
fan speed .............. 70 % (1547 rpm)
memory used ............ 1963 MB
gpu load ............... 81 %
memory controller load . 27 %
video engine load ...... 14 %
bus interface load ..... 36 %
power usage ............ 74.4 %
voltage ................ 1.1930 V

exact figures as per gpu-z.. the card (cards) is hitting the max voltage cap.. tis the only figure even vaguely high or near a max..
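For anyone who wants to scan a whole session rather than a single snapshot, a minimal sketch that tallies low-usage samples from a GPU-Z-style CSV sensor log (the file name and column header here are assumptions; check the header line of your own log):

Code:
# Count how often GPU load dipped below a threshold in a GPU-Z-style
# CSV sensor log. Column name and file name are assumptions.
import csv

def low_usage_share(path, column="GPU Load [%]", threshold=90.0):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    low = sum(1 for r in rows if float(r[column].strip()) < threshold)
    return low / len(rows) if rows else 0.0

print(f"{low_usage_share('GPU-Z Sensor Log.txt'):.0%} of samples below 90% load")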
 
Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
running witcher full screen sli g-sync fully maxed out at 1440, 120 hz refresh rate.. just below 80 fps.. pic taken after the game was shut down but you can see where it was.. ignore the numbers look at the graph.. see where it dropped off from..

full boost was running but nothing else was near maximum.. power usage core usage temps and voltage were all lower than max.. the power usage was around 60%.. i am going to put it down to poor nvidia sli implementation for witcher..

you are chasing a unicorn if you expect to see both your titan cards running at full power with most games.. they wont be..

benchmarks have better sli driver implementation.. 80 fps aint bad for witcher and its butter smooth with g-sync..

ps.. my actual read out from gpu-z running witcher.. as i said earlier the cards will hit some cap or other.. in this case i think its the max voltage the cards are allowed to draw before throttling down or boosting less.. note my power usage is only 74.4 percent..

Thanks @Trog, but see the screenshot below: I can achieve about 98% usage on both cards when using my old non-G-SYNC 60Hz monitor. I am almost positive G-SYNC combined with SLI is messing up GPU usage in Witcher 3.
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
do you think the situation (what you see) is peculiar to your system or a general rule.. ie.. g-sync lowers performance.. ??

my system is not exactly the same as yours but close..

running witcher i am seeing only 74.4% power usage with the voltage at its max.. my cards are set at stock limits.. if i was to raise the stock voltage limits i would see higher power usage.. it is all adjustable.. i dont like over-volting my cards because i do have problems keeping the temps in check on the top card..

my temps are okay but i do have to run the fans flat out to keep them this way.. my top card runs 20 C hotter than my bottom card.. i am not getting temp throttling but i would be with the stock fan profile..

i dont notice any performance loss from g-sync but then again i dont expect to see 100 percent or close power usage all the time.. i do expect one limiter or other to be hit.. there are at least four of them and they are all adjustable with software..

i have posted my own figures and thoughts.. which is about all i can do..:)

trog

ps.. one thing i will say is that i havnt noticed any quality gains from g-sync either.. compared to my acer 144 hz non g-sync monitor that is.. tearing was never a problem.. i changed panels because i wanted an ips type panel for better colour reproduction and viewing angles not because i wanted g-sync..
 
Joined
Aug 15, 2008
Messages
5,941 (1.04/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
It's Gsync and this is why I sold my Swift within a couple months. I had so many issues with Gsync and this was one of them. Same cards, same game too.
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
Joined
Jul 2, 2015
Messages
117 (0.04/day)
Location
USA
System Name The Mini Nuke
Processor Intel i9-9900K at 4.9GHz, -1 AVX offset, 1.27V
Motherboard Asus Maximus XI Hero WiFi
Cooling NZXT - Kraken X62 Liquid CPU Cooler
Memory G.SKILL TridentZ 32 GB (4 x 8 GB) DDR4-4000 Memory Model F4-4000C17Q-32GTZRB
Video Card(s) ASUS RTX 3080 TUF-RTX3080-O10G-GAMING
Storage Samsung 850 EVO 500GB SSD, Intel 660p M.2 2280 2TB NVMe
Display(s) 1 LG 27GL850-B and 2 ROG Swift PG278Q
Case NZXT - H700 (Black) ATX Mid Tower Case
Power Supply EVGA SuperNOVA 1000 P2 - 1000W Platinum
Mouse Razer Naga Trinity
Keyboard Razer BlackWidow Chroma
Software Windows 10 Pro
Thanks for the confirmation PP Mguire. trog100, I'm not really concerned with Power Usage right now; my main concern is GPU Usage. And I am almost positive it is not something peculiar to only my system, because as my screenshot above shows, I can get 98% GPU Usage on both cards with my old monitor. I am almost 100% sure the GPU Usage issue in Witcher 3 is G-SYNC and SLI. I have also found other Witcher 3 YouTube videos from people with G-SYNC & SLI, and their GPU Usage is low.

Now with that said, I love G-SYNC and have only found this issue in Witcher 3. For example, Crysis 3 runs at 99% GPU Usage with SLI and G-SYNC, so I think your mileage is going to vary from game to game. And even though my old monitor gets higher FPS and great GPU Usage in Witcher 3, I still prefer the way the G-SYNC monitor looks, because to my eyes it provides a much smoother experience.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Thanks for the confirmation PP Mguire. trog100, I'm not really concerned with Power Usage right now; my main concern is GPU Usage. And I am almost positive it is not something peculiar to only my system, because as my screenshot above shows, I can get 98% GPU Usage on both cards with my old monitor. I am almost 100% sure the GPU Usage issue in Witcher 3 is G-SYNC and SLI. I have also found other Witcher 3 YouTube videos from people with G-SYNC & SLI, and their GPU Usage is low.

Now with that said, I love G-SYNC and have only found this issue in Witcher 3. For example, Crysis 3 runs at 99% GPU Usage with SLI and G-SYNC, so I think your mileage is going to vary from game to game. And even though my old monitor gets higher FPS and great GPU Usage in Witcher 3, I still prefer the way the G-SYNC monitor looks, because to my eyes it provides a much smoother experience.
Good that you know what the problem is now. G-Sync is nice to have, but nothing more. What @trog100 says about G-Sync providing no benefit to him compared to plain 144Hz is perfectly reasonable, because high Hz/FPS doesn't suffer from tearing etc. G-Sync is good if you have low to medium FPS (30-80, maybe), and that's obviously no problem for either of your PCs: both of you have very strong SLI setups, and both of you play at 1440p, which is a relatively undemanding resolution for such a setup (ideally you would use 4K, I'd say, because you have the power for good FPS at that resolution). So, Agent, maybe G-Sync compares well against your old 60Hz monitor, but I highly doubt a new 144Hz (or 160Hz+) monitor would look any worse. If you really want that 100%/99% GPU usage all the time, what you could do is switch to a normal 144Hz+ monitor and just play without G-Sync/V-Sync at high FPS, giving you no limiters and no tearing (because of the high refresh rate/FPS). Btw, I've had this discussion many times with a friend, and he told me essentially the same thing: if you have a high-Hz monitor with high FPS, you don't need a FreeSync/G-Sync monitor, because it changes nothing, or only helps your minimum FPS. And low minimum FPS isn't a problem either of you could have, because your GPU setups are simply too strong for that.
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
purely from a gaming perspective i was happy with my 144 hz 1080 tn panel acer predator.. cost around £200 quid..

but i am into photography and my rather large collection of carefully edited pics just did not look f-cking right to me.. and i wasnt keen on the poor viewing angles.. it just niggled me and kept on niggling me.. :)

i really dont want a 4 K monitor.. i think a 1440 IPS type panel with a high refresh rate is the gaming sweet spot even with top end hardware.. there are not that many such panels out there and there are no 4 K ones.. the monitor i have (cost £700) gives me pretty much everything i want from a monitor.. it aint perfect but no monitor is..

to go 4 K i would lose other things i think more important.. to be honest i wasnt over keen on paying £700 quid for a monitor but to get what i wanted i had no choice so i thought f-ck it and coughed up the dosh..

4 K might be the be all and end all for some but not for me.. the performance hit is simply too much even with high end cards.. 1440 is my gaming sweet spot..

but i am happy to say i dont see much gain from g-sync.. people that move up from 60 hz maybe will but i dont.. purely from a gaming perspective my 144 hz 1080 acer was perfectly okay.. tearing was never a problem..

power usage and core usage are linked.. i am seeing 75 fps at 1440 in witcher with everything fully maxed out.. full power usage or not both my cards run at maximum boost or very close to it.. i also dont think g-sync has any effect on this either.. but other opinions will maybe differ..

i have witcher but so far havnt played it much only to see how pretty it is and it isnt a game that really needs or benefits from super high frame rates.. :)

power usage.. core usage.. voltage.. all these things control boost speed.. boost speed governs frame rates.. so does sli optimization.. even at only 74.4 power usage and 80% core usage both my cards are boosting higher than what gpu-z says they should be at the clocks i have them set at.. i am pretty sure one of my cards is hitting its max voltage limit and this is what is lowering the power and core usage.. but that is how it should be from my perspective.. i see no problems..

i have offered my own experience in an attempt to help the OP but like is often the case i dont think i am saying what he wants to hear.. he he

he wants to see both his super cards at full power usage.. if they aint he thinks he is losing potential performance.. i have seen such comments before and such thinking.. :)

the way these cards work is that they will attempt to boost.. they will boost until they hit one of the several limits built into the bios.. with more than one card its the weaker card that controls the pair.. my two cards although both the same make and type are not identical.. one has a slightly higher voltage limit than the other.. different silicon.. or one needs more voltage than the other at any given power usage.. when the voltage limit is hit the cards stop boosting any more.. in short they lower the power usage.. all perfectly normal and how it should be..

trog
 
Joined
Aug 15, 2008
Messages
5,941 (1.04/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
This is why I use a modified bios on my cards. 1392 24/7 and I still had GPU usage issues on some games with Titan X SLI on an ROG Swift. It's also why I got rid of the Swift in favor for what I'm using now. I had issues with Gsync on Photoshop CS6 and Vegas Pro 13, including a few other programs I use on a regular basis. For what it's worth Gsync when it's working properly is great, but if you use your machine for more than gaming and want max performance it's pretty buggy still.

I'm one that doesn't mind if GPU usage is lower if frames are being capped but if I'm not doing over 60fps and GPU usage is lower than 90% on both cards I start to have a problem.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
purely from a gaming perspective i was happy with my 144 hz 1080 tn panel acer predator.. cost around £200 quid..

but i am into photography and my rather large collection of carefully edited pics just did not look f-cking right to me.. and i wasnt keen on the poor viewing angles.. it just niggled me and kept on niggling me.. :)

i really dont want a 4 K monitor.. i think a 1440 IPS type panel with a high refresh rate is the gaming sweet spot even with top end hardware.. there are not that many such panels out there and there are no 4 K ones.. the monitor i have (cost £700) gives me pretty much everything i want from a monitor.. it aint perfect but no monitor is..

to go 4 K i would lose other things i think more important.. to be honest i wasnt over keen on paying £700 quid for a monitor but to get what i wanted i had no choice so i thought f-ck it and coughed up the dosh..

4 K might be the be all and end all for some but not for me.. the performance hit is simply too much even with high end cards.. 1440 is my gaming sweet spot..

but i am happy to say i dont see much gain from g-sync.. people that move up from 60 hz maybe will but i dont.. purely from a gaming perspective my 144 hz 1080 acer was perfectly okay.. tearing was never a problem..

power usage and core usage are linked.. i am seeing 75 fps at 1440 in witcher with everything fully maxed out.. full power usage or not both my cards run at maximum boost or very close to it.. i also dont think g-sync has any effect on this either.. but other opinions will maybe differ..

i have witcher but so far havnt played it much only to see how pretty it is and it isnt a game that really needs or benefits from super high frame rates.. :)

power usage.. core usage.. voltage.. all these things control boost speed.. boost speed governs frame rates.. so does sli optimization.. even at only 74.4 power usage and 80% core usage both my cards are boosting higher than what gpu-z says they should be at the clocks i have them set at.. i am pretty sure one of my cards is hitting its max voltage limit and this is what is lowering the power and core usage.. but that is how it should be from my perspective.. i see no problems..

i have offered my own experience in an attempt to help the OP but like is often the case i dont think i am saying what he wants to hear.. he he

he wants to see both his super cards at full power usage.. if they aint he thinks he is losing potential performance.. i have seen such comments before and such thinking.. :)

the way these cards work is that they will attempt to boost.. they will boost until they hit one of the several limits built into the bios.. with more than one card its the weaker card that controls the pair.. my two cards although both the same make and type are not identical.. one has a slightly higher voltage limit than the other.. different silicon.. or one needs more voltage than the other at any given power usage.. when the voltage limit is hit the cards stop boosting any more.. in short they lower the power usage.. all perfectly normal and how it should be..

trog
Yeah, for me 1440p is still "the" gaming resolution, too. 4K simply has no 120Hz+ monitors, and that disqualifies it for me, because I want high refresh rates and would trade them for nothing. But I wouldn't drive an SLI 980 Ti or Titan X setup for just 1440p; I think that's simply way too much for it. Or I would use DSR with it, then it's okay I guess.

GPU-Z only shows the "lowest GPU core boost", the rated figure listed on Newegg and other websites. It's not wrong.

I don't think your comment was senseless; the insight you gave about Gsync versus no Gsync was helpful for all of us. Thanks
 
Joined
Aug 15, 2008
Messages
5,941 (1.04/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
Went from 144Hz to 4K 60Hz PLS. No regrets, but everybody likes their gaming experience differently.

As for an SLI setup at 1440p: nah, I went from 980s to Titan Xs, and even that wasn't enough in some games, but I like to max everything out. I still needed copious amounts of AA at 1440p, which I don't need at 4K.
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Time Spy 14000..
"This is why I use a modified bios on my cards."

heat is the limiter for me with my set up.. my two over large cards sit very close together.. the top card runs 20 C hotter than the bottom card.. 55 C and 75 C respectively.. this is the best i can do with fan profiles set to run the fans flat out at 60 C bottom card and 70 C top card.. pushing up the power and core speeds would simply make my heat problem worse.. so basically i dont over do it.. in fact i leave my power voltage and temp limit settings at stock bios default..

one joy with having over kill at 1440 is that i dont have to push my cards to the limits.. lesser cards would not have the same heat problem and i would push them harder.. i do have two (essential) side fans blowing room ambient air directly on to my cards so in essence my case fans help the card fans.. short of water cooling my top card heat is my limiter..

witcher at 75 fps 1440 fully maxed out isnt exactly over kill either.. i would be happier to see 120 fps :) but witcher with all its moving tree branches and grass just for the sake of moving tree branches and grass is a bit of an extreme example so i can live with only 75 fps.. he he

playing metal gear solid phantom pain with its built in 60 fps limit i only see around 30% gpu power usage.. this suits me because nothing is working hard or getting hot.. he he..

one day i may water cool my top card but i would need another case.. the one i have cant mount the rad anywhere sensible..

trog
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
"This is why I use a modified bios on my cards."

heat is the limiter for me with my set up.. my two over large cards sit very close together.. the top card runs 20 C hotter than the bottom card.. 55 C and 75 C respectively.. this is the best i can do with fan profiles set to run the fans flat out at 60 C bottom card and 70 C top card.. pushing up the power and core speeds would simply make my heat problem worse.. so basically i dont over do it.. in fact i leave my power voltage and temp limit settings at stock bios default..

one joy with having over kill at 1440 is that i dont have to push my cards to the limits.. lesser cards would not have the same heat problem and i would push them harder.. i do have two (essential) side fans blowing room ambient air directly on to my cards so in essence my case fans help the card fans.. short of water cooling my top card heat is my limiter..

witcher at 75 fps 1440 fully maxed out isnt exactly over kill either.. i would be happier to see 120 fps :) but witcher with all its moving tree branches and grass just for the sake of moving tree branches and grass is a bit of an extreme example so i can live with only 75 fps.. he he

playing metal gear solid phantom pain with its built in 60 fps limit i only see around 30% gpu power usage.. this suits me because nothing is working hard or getting hot.. he he..

one day i may water cool my top card but i would need another case.. the one i have cant mount the rad anywhere sensible..

trog
The MGS 60fps limit can be removed (then you have a 96 fps limit, haha); TPU uses that too in their GPU benchmarks (but don't ask me how, I guess google it ;) ). Yeah, water cooling would be a lot better in your system; it's not very balanced as it is now, with that temperature difference, and it's somewhat bottlenecking your overclocking.
 
Joined
Aug 15, 2008
Messages
5,941 (1.04/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NeXXos UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
"This is why I use a modified bios on my cards."

heat is the limiter for me with my set up.. my two over large cards sit very close together.. the top card runs 20 C hotter than the bottom card.. 55 C and 75 C respectively.. this is the best i can do with fan profiles set to run the fans flat out at 60 C bottom card and 70 C top card.. pushing up the power and core speeds would simply make my heat problem worse.. so basically i dont over do it.. in fact i leave my power voltage and temp limit settings at stock bios default..

one joy with having over kill at 1440 is that i dont have to push my cards to the limits.. lesser cards would not have the same heat problem and i would push them harder.. i do have two (essential) side fans blowing room ambient air directly on to my cards so in essence my case fans help the card fans.. short of water cooling my top card heat is my limiter..

witcher at 75 fps 1440 fully maxed out isnt exactly over kill either.. i would be happier to see 120 fps :) but witcher with all its moving tree branches and grass just for the sake of moving tree branches and grass is a bit of an extreme example so i can live with only 75 fps.. he he

playing metal gear solid phantom pain with its built in 60 fps limit i only see around 30% gpu power usage.. this suits me because nothing is working hard or getting hot.. he he..

one day i may water cool my top card but i would need another case.. the one i have cant mount the rad anywhere sensible..

trog
I don't really feel the need to raise the limit on TPP actually. Feels fine @ 60.

If you're doing 60 and 70 on your cards then you have room to spare.
 
Joined
Aug 2, 2011
Messages
1,451 (0.31/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Acer X34S, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502 Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
It's been well documented. There are issues with using SLI and G-Sync panels. I sold my second GTX780 because of this.

Everything runs great on one GPU with G-Sync enabled on my computer. The titles that WOULD run SLI with G-Sync did not run well at all (lots of stutter). Note that these titles ran great with SLI enabled before I bought a G-Sync panel.
 