NVIDIA Reflex Tested with LDAT v2 - Making you a Better Gamer

LDAT v2 and Monitor Input Lag Testing

Before concluding, there's one more aspect of the Latency Display Analysis Tool I'd like to go over. While it doesn't relate directly to the NVIDIA Reflex technology, it's still an interesting "added benefit" of the tool. Since the LDAT v2 accurately and reliably measures end-to-end system latency, it can also be used for monitor input lag testing. It can't extract the input lag of the display itself unless you're using one of the two 360 Hz Full HD monitors mentioned in the introduction, but as long as every other part of the chain (meaning my test system) stays the same, the display becomes the only variable that can change the measured system latency. In other words, if I run 100 iterations of the mouse-to-photon test on monitor A and then switch to monitor B and repeat my tests, the resulting numbers tell us how those two monitors compare in terms of input lag.
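To illustrate the idea, here's a minimal sketch (my own illustration, not anything from NVIDIA's LDAT software) of how two sets of mouse-to-photon readings could be compared; the sample values below are hypothetical:

```python
# Sketch: estimate the input lag difference between two monitors from
# repeated mouse-to-photon measurements taken on the same test system.
# Sample values are hypothetical; a real run would use 100 readings each.
from statistics import mean

monitor_a = [14.2, 15.1, 13.8, 14.9, 14.5]  # ms
monitor_b = [18.7, 19.4, 18.1, 19.0, 18.6]  # ms

# Everything else in the chain is identical, so the difference between the
# averages approximates how much slower monitor B is than monitor A.
print(f"Estimated input lag difference: {mean(monitor_b) - mean(monitor_a):.2f} ms")
```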

Until now, to test monitor input lag I've resorted to the high-speed camera method, combined with a modified gaming mouse. This method also measures total system latency and yields relevant, real-world results, albeit with a few additional factors to take into consideration, such as camera lag and the mouse LED reaction time. It is, however, both tedious and insanely time-consuming.


This method starts by connecting a modified Logitech G9x gaming mouse to my PC. The mouse has a blue LED wired directly to its primary button, so it illuminates the instant the button is pressed. The USB sample rate is set to 1,000 Hz via Logitech Gaming Software. I then mount the Nikon 1 J5, a mirrorless camera capable of recording video at 1,200 FPS, in front of the monitor. After that, I run Counter-Strike: Global Offensive and load a custom map consisting of nothing but a huge white square suspended in a black void. The camera is set up so that it records the entire screen. Vertical Sync and G-Sync/FreeSync, if available, are also turned off because we don't want anything delaying the drawing of frames; the goal is to have the first frame reach the screen as fast as the monitor itself allows rather than limiting it with various syncing methods. The test is conducted by starting the video recording and pressing the left mouse button, which is bound to the in-game command "Strafe Left," after which the LED lights up and an in-game movement occurs. I repeat this 20 times and then open the recorded videos in QuickTime, which has a nice option for browsing through a video frame by frame. I then find the frame where the LED first turns on and carefully look for the frame where the first glimpse of on-screen movement can be seen. The number of frames between those two events is then multiplied by 0.8333 ms because I'm recording at 1,200 FPS (1 frame = 0.8333 ms). To get the final result, I subtract 5 ms from the calculated time because that's the average click latency of the Logitech G9x, which measures between 4 and 6 ms. There are a couple of other factors that slightly influence the score, such as the LED reaction time (1 ms or less), camera lag (1 ms), and the USB polling rate (1 ms), but those aren't constant, so I'm not subtracting them from the final calculated result. That's also one of the reasons why I'm doing as many as 20 measurements; the impact of the aforementioned error margins is reduced with each additional sample. In the end, we get the so-called button-to-pixel lag: the time that passes between an action with your mouse and said action first registering on the screen.
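To make the arithmetic explicit, here's a small sketch of the calculation described above. The helper function and the 30-frame example are hypothetical; the 0.8333 ms frame time and 5 ms click latency are the values mentioned in the text:

```python
# Convert a high-speed camera frame count into button-to-pixel lag.
FRAME_TIME_MS = 1000 / 1200      # one 1,200 FPS video frame ≈ 0.8333 ms
MOUSE_CLICK_LATENCY_MS = 5       # average click latency of the Logitech G9x

def button_to_pixel_lag(frames_between_led_and_movement):
    # frames counted between the LED lighting up and the first on-screen movement
    return frames_between_led_and_movement * FRAME_TIME_MS - MOUSE_CLICK_LATENCY_MS

# Example: 30 frames -> 30 * 0.8333 - 5 = 20 ms of button-to-pixel lag
print(f"{button_to_pixel_lag(30):.1f} ms")
```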

All that is nice and dandy, but analyzing the recorded samples takes a crazy amount of time, and there's plenty of room for operator error. You're basically pixel-peeping a low-res video, doing your best to spot the first occurrence of movement, which happens on a different part of the screen for every single iteration of the test. Getting 100 usable measurements with the high-speed camera method would take a couple of hours.


The LDAT v2 can do the same with higher accuracy and reliability in less than a minute. NVIDIA made this possible by implementing the flash indicator feature in some Reflex-supported games and adding the Auto Fire feature to LDAT's software. The Auto Fire feature lets me select the number of "shots" (test iterations) as well as the "shot delay" (the pause between mouse clicks). All I have to do is run the desired game, align the LDAT sensor with the flash indicator area, and press the "mouse" button on the LDAT. The sensor and the accompanying software take care of everything else. Less than a minute later, I have my test results: the minimum, average, and maximum measured end-to-end system latency, as well as the standard deviation.
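As a rough illustration, reducing one run of readings to those four figures could look something like the sketch below; it only mimics the reported output, doesn't use any actual LDAT software interface, and the readings are made up:

```python
# Summarize one Auto Fire run the way the results are reported:
# minimum, average, maximum, and standard deviation of the latencies.
from statistics import mean, stdev

# Made-up placeholder readings; a real run collects 100 of them.
readings_ms = [17.9, 18.4, 17.6, 19.1, 18.2]

print(f"min {min(readings_ms):.1f} ms | avg {mean(readings_ms):.1f} ms | "
      f"max {max(readings_ms):.1f} ms | stdev {stdev(readings_ms):.2f} ms")
```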

The game I'm going to use for my future LDAT-powered monitor input lag testing is Overwatch, for several reasons. It has an integrated 400 FPS framerate limit, which my test system has no trouble hitting and maintaining at low settings at any given resolution. By keeping the framerate at a constant 400 FPS (a frame time of just 2.5 ms), I'm maintaining a low and constant GPU render latency of around 2 ms. Overwatch also has an easily accessible practice range, which allows me to do all of my tests in a controlled environment. Finally, its game engine detects inputs even while a gun is reloading, meaning I don't have to worry about which character I pick or the size of their ammo magazine; once I trigger the Auto Fire feature, the test will run through all 100 iterations regardless of what's happening in the game.

My tests will be conducted with the NVIDIA Reflex technology turned off (at 400 FPS, it wouldn't make any difference to system latency anyway), with adaptive refresh rate technologies (G-Sync, FreeSync, VRR) deactivated, and with V-Sync off.