
GTX 980 Ti SLI - Low Frame Rate on 4K Monitor vs 4K TV - Witcher 3 - 5960X - Windows 10

Holy crap... this smells like a scandal :slap:... with Nvidia skunkworks in the main role lol :D
 
Have you tried to mirror the screens (so they both display the same image) and then play from the save point? What is the FPS then? A GPU utilization log with GPU-Z would also rule out SLI problems (run GPU-Z and check that both GPUs have the same utilization on both setups).
 
Have you tried to mirror the screens (so they both display the same image) and then play from the save point? What is the FPS then? A GPU utilization log with GPU-Z would also rule out SLI problems (run GPU-Z and check that both GPUs have the same utilization on both setups).

I tried using clone mode, and with chroma set to 4:4:4 I get the same 40 FPS limitation. If I set chroma to 4:2:0 I get 40 FPS too, but with a flickering screen.

I also tried something else: with the DVI to HDMI cable I set my TV to 4K, but instead of using 4:2:0 chroma at 60 Hz I tried 4:4:4 chroma at 30 Hz, and I got around 62-64 FPS with vsync off. Of course the game wasn't smooth because of the 30 Hz refresh rate.

By the way, I was having another similar problem before. It was actually the same problem, but it affected almost all games (now it only affects The Witcher 3 and a few others). I managed to fix it by setting the PCIe speed in my BIOS to manual (Gen3 x8 mode for both cards) instead of auto.
Maybe the PCIe bandwidth isn't enough for 4K@60Hz with chroma 4:4:4, I dunno...
 
I managed to fix it by setting the PCIe speed in my BIOS to manual (Gen3 x8 mode for both cards) instead of auto.
Maybe the PCIe bandwidth isn't enough for 4K@60Hz with chroma 4:4:4, I dunno...

PCIe speed affecting the FPS tax of a color space... that is sooo cuckoo it defies logic.

Out of curiosity... are you using both SLI bridges?
 
PCIe speed affecting the FPS tax of a color space... that is sooo cuckoo it defies logic.

Out of curiosity... are you using both SLI bridges?

I'm using an MSI SLI bridge (I bought one certified for 4K because of my problem, but it didn't fix it).

By the way, I just tried changing the PCIe speed from Gen3 to Gen2 in the BIOS, and I got about 30 FPS instead of the 40 I had before in chroma 4:2:0.
I thought PCIe speed didn't have much effect on performance, but it looks like this game is very dependent on PCIe speed, at least at very high resolutions.
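
Here's the back-of-envelope math I was going by; the per-lane figures are the published PCIe rates, but the per-frame traffic is just my guess at what SLI/AFR pushes over the bus, so treat it as a rough sketch:

    # Usable bandwidth per PCIe lane (after encoding overhead):
    # Gen3: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s
    # Gen2: 5 GT/s with 8b/10b encoding    -> ~0.5 GB/s
    gen3_x8 = 0.985 * 8              # ~7.9 GB/s per card
    gen2_x8 = 0.5 * 8                # ~4.0 GB/s per card

    # One 3840x2160 frame at 32 bits per pixel:
    frame_bytes = 3840 * 2160 * 4            # ~33 MB
    frame_traffic = frame_bytes * 60 / 1e9   # ~2.0 GB/s if a full frame
                                             # crosses the bus every refresh
    print(gen3_x8, gen2_x8, frame_traffic)

So raw frame transfers alone would fit even in Gen2 x8, but if texture and geometry streaming share the same lanes, the headroom shrinks fast, which would fit what I'm seeing.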
 
I'm using an MSI SLI bridge (I bought one certified for 4K because of my problem, but it didn't fix it).

By the way, I just tried changing the PCIe speed from Gen3 to Gen2 in the BIOS, and I got about 30 FPS instead of the 40 I had before in chroma 4:2:0.
I thought PCIe speed didn't have much effect on performance, but it looks like this game is very dependent on PCIe speed, at least at very high resolutions.

I guess it applies only to SLI setups... but yours is an x8 + x8, not a true x16 + x16?

SLI bridges have certification based on resolution? Well, this is new to me... too long in the red camp.
 
I signed up on this site because I'm having the same issue as Ryan with my 4K LG TV and my 980 Ti SLI.
When I play The Witcher 3 at 3840x2160 @ 60 Hz with chroma set to 4:2:0 on my TV, I get a smooth 60 FPS with all settings at max (HairWorks is activated too).
But when I set my TV to chroma 4:4:4 mode, with the same settings, I get only 40 FPS.
If I try to disable anti-aliasing and HairWorks I'm still getting 40 FPS with chroma 4:4:4; it's really weird.
For 4K@60Hz with chroma 4:2:0 I'm using a DVI to HDMI cable plugged into my TV.
For 4K@60Hz with chroma 4:4:4 I'm using a 4K-certified HDMI cable plugged into my TV.

It looks like several games have this 40 FPS limitation in 4K, but not all (in GTA 5 I get 60 FPS with 4:4:4 chroma).

If I try to connect my TV with an HDMI cable to get chroma 4:2:0 at 4K@60Hz, I get a flickering bug and only 40 FPS too.
Can you reduce the quality to get more FPS? If not, it may be a different issue, as I am able to get 60 FPS on my monitor in 4:4:4; I just have to reduce quality. I get an average of 40 FPS, mostly when it is raining, but with 4:2:0 on my TV I can get 60 FPS in the rain.

I wonder if the card is saving on compute when working with 4:2:0... I've been told that it shouldn't matter.

I feel like I need to speak with an Nvidia engineer to figure this one out.
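
For what it's worth, I worked through the link math to see why the TV insists on 4:2:0 at 60 Hz. This only covers cable bandwidth, not the GPU's compute, so on its own it shouldn't explain the FPS drop:

    # CTA timing for 3840x2160@60 uses a 4400x2250 total raster:
    pixel_clock = 4400 * 2250 * 60       # 594 MHz
    # 4:4:4 carries a full 24 bits per pixel, so it needs the whole
    # 594 MHz TMDS clock -- HDMI 2.0 territory (up to 600 MHz).
    # 4:2:0 averages 12 bits per pixel, packing two pixels per clock:
    effective_420 = pixel_clock / 2      # 297 MHz
    # 297 MHz fits under HDMI 1.4's 340 MHz limit. 4:4:4 at 30 Hz is
    # also 297 MHz, which matches it working over the older link.
    print(pixel_clock / 1e6, effective_420 / 1e6)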
 
Is your GPU usage around 70-80% when you are set to 4:4:4 as well?
What GPUs are you running? MSI? I'm using 2x MSI GTX 980 Ti 6G.
 
I guess it applies only to SLI setups... but yours is an x8 + x8, not a true x16 + x16?

SLI bridges have certification based on resolution? Well, this is new to me... too long in the red camp.

I have an x8 + x8 SLI setup (3770K with a P8Z77-V motherboard).
I know MSI and EVGA both released 4K-certified SLI bridges. I actually had little hope that it would fix my problem, but I had to try, just in case. I guess it's just like 4K-certified HDMI cables: they're the same as standard cables, but the 4K-certified ones have a better chance of working at 4K because they're better shielded.
 
Hi,

Please do the following:

Download GPU-Z; at the bottom of the Sensors tab, click on "Log to file", run the game, and when it hits the 40 FPS limit, take a screenshot of the Graphics Card tab and of the Sensors tab.

Also, what is the Bus Interface reading when you start the render test (in full-screen)?
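
If you want to sanity-check the log yourself, a few lines of Python will average the load column. I'm assuming the column header is "GPU Load [%]"; open the log in a text editor first and adjust the name to whatever your copy of GPU-Z actually writes:

    import csv

    # The GPU-Z sensor log is plain comma-separated text: one header
    # row, then one row per sample.
    with open("GPU-Z Sensor Log.txt", newline="") as f:
        rows = list(csv.reader(f))

    header = [h.strip() for h in rows[0]]
    col = header.index("GPU Load [%]")   # assumed column name
    loads = [float(r[col]) for r in rows[1:] if r[col].strip()]
    print("avg GPU load: %.1f%%" % (sum(loads) / len(loads)))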
 
Can you reduce the quality to get more FPS? If not, it may be a different issue, as I am able to get 60 FPS on my monitor in 4:4:4; I just have to reduce quality. I get an average of 40 FPS, mostly when it is raining, but with 4:2:0 on my TV I can get 60 FPS in the rain.

I wonder if the card is saving on compute when working with 4:2:0... I've been told that it shouldn't matter.

I feel like I need to speak with an Nvidia engineer to figure this one out.


If I set the quality preset to medium I get smooth 60 FPS gameplay with chroma 4:4:4 at 4K@60Hz.
By the way, my results were with vsync enabled. If I disable vsync I get about 50 FPS instead of 40. It looks like the frame rate is capped at 40 whenever it is below 60. (That's why I didn't see any difference when I only disabled anti-aliasing.)
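
That 40 may not be as random as it looks. With vsync at 60 Hz every frame has to wait for a 16.7 ms refresh tick, so if the cards alternate between finishing in one tick and in two, you land exactly on 40. A quick sanity check of the arithmetic:

    refresh = 1 / 60                  # one 60 Hz tick = ~16.7 ms
    # vsync quantizes frame times to whole ticks; alternating between
    # a 1-tick frame and a 2-tick frame averages out to 25 ms:
    avg_frame = (1 * refresh + 2 * refresh) / 2
    print(1 / avg_frame)              # -> 40.0 FPS

That would also explain why disabling anti-aliasing alone changed nothing: as long as a frame misses the 16.7 ms budget at all, it costs a whole extra tick.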
 
Is your GPU usage around 70-80% when you are set to 4:4:4 as well?
What GPUs are you running? MSI? I'm using 2x MSI GTX 980 Ti 6G.

I haven't checked yet, but I'll look into it.
My graphics cards are 2x Gigabyte 980 Ti WindForce 3 OC.
 
Hi,

Please do the following:

Download GPU-Z; at the bottom of the Sensors tab, click on "Log to file", run the game, and when it hits the 40 FPS limit, take a screenshot of the Graphics Card tab and of the Sensors tab.

Also, what is the Bus Interface reading when you start the render test (in full-screen)?

Here are the files :)
For about 15 seconds I ran at 40 FPS with vsync on, then another 15 seconds with vsync off at about 45-50 FPS.

The bus interface is PCIe 3.0 x8 for both cards.
 

Attachments

  • GPU-Z Sensor Log.txt (22.3 KB)
  • GPUZ.gif (17.4 KB)
Is your GPU usage around 70-80% when you are set to 4:4:4 as well?
What GPUs are you running? MSI? I'm using 2x MSI GTX 980 Ti 6G.

I just checked with MSI Afterburner: GPU load is 85-87% on GPU1 and 92-94% on GPU2 with 4:4:4 chroma (vsync is disabled). With vsync enabled it will probably be lower.

Edit: with vsync enabled GPU usage drops to 72-73%, which is quite logical because of the 40 FPS limit.
 
PerfCap SLI? It should be there? Temps are a bit high...
 
If you took the screenshot when it was at 40 FPS, your GPU load was 60%, which means that something else lowered your FPS.

What is your approx. CPU load when it hits 40 FPS?

To me it looks like:
A) Nvidia drivers at their finest
B) A bug in the game
C) A hardware limitation in the monitor

EDIT: D) A hardware limitation in the cards

Second EDIT: What happens if you set the screen to 1920x1080 with DSR@3840x2160?
 
If you took the screenshot when it was at 40 FPS, your GPU load was 60%, which means that something else lowered your FPS.

What is your approx. CPU load when it hits 40 FPS?

To me it looks like:
A) Nvidia drivers at their finest
B) A bug in the game
C) A hardware limitation in the monitor

To take the screenshot, I paused the game, pressed Alt+Tab to switch from the game to GPU-Z, then took the screenshot.
I'm more familiar with MSI Afterburner, and I can see that GPU load is around 72-73% on GPU1 and 75-77% on GPU2 when I'm playing at 40 FPS with vsync on. If I disable vsync I get about 48-52 FPS, and GPU usage is 85-87% on GPU1 and 92-94% on GPU2.

Edit:
With chroma 4:2:0, GPU usage is about 97% for both GPUs and I get about 62-64 FPS.
 
PerfCap SLI? It should be there? Temps are a bit high...
It looks perfect, but performance is still lower than in chroma 4:2:0 mode.
I noticed the temps were a bit high too, but only on GPU2 (maybe because it's close to the CPU, which is overclocked to 4.5 GHz).
 
Go to your Documents folder, open The Witcher 3 folder, open the user.settings file with Notepad, and make sure that the line LimitFPS=?? is not set to 40 but to 60, or whatever your refresh rate is.
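
For reference, in my user.settings the line sits in a block like this (the section name is from my copy of the game and may differ between versions):

    [Rendering]
    LimitFPS=60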
 
Go to your Documents folder, open The Witcher 3 folder, open the user.settings file with Notepad, and make sure that the line LimitFPS=?? is not set to 40 but to 60, or whatever your refresh rate is.
I just checked and I have LimitFPS=0 (I think 0 means unlimited).
 
I just checked and I have LimitFPS=0 (I think 0 means unlimited).
And what happens if you manually type 60 instead?
 
I wonder if there is some sort of serious f**kery around the color compression... it chokes at 4K, and the 4:4:4 setting turns into delay... the engines are waiting on the darn compression, and thus the load is not at 100%. With 4:2:0 there is way less work to do, and the clusters are fed enough...

It would turn out really ugly if it's true... that may be the reason HBM was chosen. AMD doesn't suffer from it thanks to the wide bus, and that may actually be why AMD works better at higher resolutions. The time budget is exceeded, it seems. I hope I am wrong, though.
 
I wonder if there is some sort of serious f**kery around the color compression... it chokes at 4K, and the 4:4:4 setting turns into delay... the engines are waiting on the darn compression, and thus the load is not at 100%. With 4:2:0 there is way less work to do, and the clusters are fed enough...

It would turn out really ugly if it's true... that may be the reason HBM was chosen. AMD doesn't suffer from it thanks to the wide bus, and that may actually be why AMD works better at higher resolutions. The time budget is exceeded, it seems. I hope I am wrong, though.

If it's the 4:4:4 setting making the problems, then why does it still give 40 FPS at 4:2:0 with cloned screens?
I tried using clone mode, and with chroma set to 4:4:4 I get the same 40 FPS limitation. If I set chroma to 4:2:0 I get 40 FPS too, but with a flickering screen.

Looks more like a problem with the screen OR the NV drivers to me. Trying to run it at 1080p with DSR at 4K should give the same load as 4K; if it's still a problem then it's the drivers (or maybe the card), but if you then get 60 FPS on both screens it's the scaler or something else in the Philips screen.
 
And what happens if you manually type 60 instead?

I tried entering 60 instead of 0, but it didn't change anything.

But I have found something interesting: if I disable vsync and wait a little, performance seems to increase suddenly after a few seconds. I get about 10 more FPS. GPU1 utilization increases from around 87% to 98%, and GPU2 utilization stays the same at about 98%. The performance is then the same as with chroma 4:2:0.
If I activate vsync again performance is still good, but if the FPS drops too low it gets stuck at 40 FPS again, and I have to deactivate vsync and wait again to get the good performance back.

I tried GTA V to see if there is a similar problem, but GTA V worked perfectly in both chroma 4:2:0 and 4:4:4 modes (GPU utilization was about 98% for both GPUs, with vsync off).
 