ViewSonic ELITE XG240R 144 Hz FreeSync Monitor Review


Gaming Performance

Gaming performance is this monitor's bread and butter. It's where everything comes together for the ViewSonic XG240R: its 144 Hz panel, combined with a 1 ms GtG response time (with overdrive active), makes for exceptionally smooth gameplay with sharp moving visuals and no perceivable input lag. That is, as long as your PC can deliver 100 or more frames per second in the games you're playing at Full HD resolution. At lower frame rates, the perceived smoothness of the crosshair and the sharpness of fast-moving objects both suffer, although adaptive sync still keeps the gameplay buttery smooth. Of course, the same can be said for most other similarly specced gaming monitors on the market.

Even though the ViewSonic XG240R "only" supports FreeSync, you can still tap into its adaptive sync goodness if you own an NVIDIA graphics card. As you surely know, NVIDIA recently started supporting adaptive sync on FreeSync monitors through their drivers. As long as you own a GTX 10- or RTX 20-series graphics card, the NVIDIA Control Panel will recognize the monitor as it would any "true" G-Sync monitor. You'll have to open the "Set up G-SYNC" menu and tick the "Enable G-SYNC, G-SYNC Compatible" and "Enable settings for the selected display model" check boxes. Even though NVIDIA hasn't validated the ViewSonic XG240R as "G-SYNC Compatible", you'll have no issues using it as one; the missing badge only means it didn't get NVIDIA's official (and mostly meaningless) stamp of approval. For all intents and purposes, the ViewSonic XG240R behaves the same in high refresh rate, adaptively synchronized gaming whether it's driven by an NVIDIA GTX 10/RTX 20-series card or an AMD graphics card.

One minor detail might confuse you when first using this monitor: FreeSync is turned off by default. To activate it, which is necessary for adaptive sync on both AMD and NVIDIA graphics cards, open the OSD and go to Gaming Settings > AMD FreeSync.

Response Time and Overdrive

The ViewSonic XG240R has a 1 ms GtG response time. The panel uses overdrive to speed up pixel transitions; the setting can be found under "Response Time OD" in the OSD. The response time can be adjusted to one of five available settings: Standard, Fast, Faster, Ultra Fast, and Fastest.


I extensively tested all settings using the so-called pursuit camera method developed by the good people of Blur Busters, namely Mark D. Rejhon. The idea of the procedure is to use a standard DSLR camera to capture the motion blur exactly as your eyes see it. That's achieved by mounting the camera on a smooth slider, setting the exposure time to four times the length of a refresh cycle, and loading the Blur Busters Ghosting Test Pattern with the Pursuit Camera Sync Track. Then, the camera has to be slid sideways at the same speed as the on-screen motion. The sync track tells you if you're moving the camera too fast or too slow, or if it shakes too much. The procedure takes some practice and getting used to, but yields great results and lets us examine the differences between the various overdrive settings at various monitor refresh rates.
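As a quick illustration of the exposure math (my own sketch, not part of the Blur Busters tooling), the shutter stays open for four refresh cycles, so the required exposure time depends only on the refresh rate:

```python
# Pursuit-camera exposure: the shutter stays open for four refresh cycles,
# so the photo integrates motion blur the same way the eye does.
def pursuit_exposure_ms(refresh_hz: float, cycles: int = 4) -> float:
    """Exposure time in milliseconds for a pursuit-camera shot."""
    return cycles * 1000.0 / refresh_hz

for hz in (60, 100, 120, 144):
    print(f"{hz} Hz -> {pursuit_exposure_ms(hz):.2f} ms exposure")
```

At 144 Hz, for example, one refresh cycle lasts roughly 6.94 ms, so the camera exposes for about 27.78 ms per shot.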

I took a series of photos at 60, 100, 120, and 144 Hz, at all five available overdrive settings. Let's take a look at the results to figure out what the ideal overdrive setting would be.



It's a close call between Faster and Ultra Fast, but overall, I'd recommend going with the Ultra Fast overdrive setting. It results in the sharpest moving visuals at all refresh rates, with the least amount of visible ghosting, trailing, or overshoot.

Input Lag


To measure the input lag of a monitor, I use a high-speed camera and a modified gaming mouse. Here's a detailed explanation of my testing procedure. It's important that you're aware of it in order to interpret the results properly.

I start by connecting a modified Logitech G9x gaming mouse to my PC. The mouse has a blue LED wired directly to its primary button that illuminates the instant the button is pressed. The USB sample rate is set to 1,000 Hz via the Logitech Gaming Software. Then, I mount the Nikon 1 J5, a mirrorless camera capable of recording video at 1,200 FPS, in front of the monitor. After that, I run Counter-Strike: Global Offensive and load a custom map (Map_Flood, made by a member of the Blur Busters community) consisting of nothing but a huge white square suspended in a black void. The camera is set up so that it records the entire screen.

Every video setting in CS:GO is either switched to its lowest possible value or turned off, and the console command "fps_max 0" is used to disable the built-in FPS limiter and get as many frames per second as possible, which removes any input lag caused by the game engine from the equation. My system is equipped with an overclocked Core i9-9900K and a GTX 1080 Ti, so it has no trouble hitting 2,000 FPS in that scenario. Vertical Sync and G-Sync/FreeSync are also turned off because we don't want anything delaying the drawing of frames; the goal is to have the first frame reach the screen as fast as the monitor itself allows rather than being limited by a syncing method. You're probably wondering how much additional input lag G-Sync or FreeSync introduces, since adaptive sync is a feature practically every owner of a supporting monitor will enable; it's why you buy one in the first place. I tested several G-Sync and FreeSync monitors extensively with the feature on and off and found that G-Sync/FreeSync introduces an additional 2 ms of input lag on average.

The test is conducted by starting the video recording and pressing the left mouse button, which is bound to the in-game command "Strafe Left", after which the LED blinks and an in-game movement occurs. I repeat this process twenty times and open the recorded videos in QuickTime, which has a nice option of stepping through a video frame by frame. Then, I find the frame where the LED first turned on and carefully look for the frame with the first glimpse of on-screen movement. The number of frames between those two events is multiplied by 0.8333 because I'm recording at 1,200 FPS (1 frame = 0.8333 ms). To get the final result, I subtract 5 ms because that's the average click latency of the Logitech G9x (it measures between 4 and 6 ms). A couple of other factors slightly influence the score, such as the LED reaction time (1 ms or less), camera lag (1 ms), and USB polling rate (1 ms), but none of those are consistent enough to subtract from the final calculated result. That's also one of the reasons why I take as many as twenty measurements: averaging over that many samples reduces the impact of these error margins.
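The frame-to-milliseconds conversion above can be sketched as a few lines of Python (a simple illustration of the arithmetic, not the actual measurement tooling):

```python
# Button-to-pixel lag from a 1,200 FPS recording: count the frames between
# the LED lighting up and the first on-screen movement, convert them to
# milliseconds, then subtract the mouse's average click latency (~5 ms for
# the Logitech G9x).
FRAME_MS = 1000.0 / 1200.0  # one captured frame = 0.8333 ms

def button_to_pixel_ms(frames_between: int, click_latency_ms: float = 5.0) -> float:
    """Convert a counted frame gap into button-to-pixel lag in milliseconds."""
    return frames_between * FRAME_MS - click_latency_ms

# Example: a gap of 19 captured frames works out to about 10.83 ms of lag.
print(round(button_to_pixel_ms(19), 2))  # → 10.83
```

A gap of 10 frames, for instance, yields 10 × 0.8333 − 5 ≈ 3.33 ms, which matches the minimum result reported below.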

In the end, we get the so-called button-to-pixel lag value: the time between when you input an action with your mouse and when that action first registers on screen. Anything below 16 ms (one frame of lag at 60 Hz) can be considered gaming-grade, and such a monitor is suitable for even the most demanding gamers and eSports professionals. If the input lag falls between 16 and 32 ms (1–2 frames of lag at 60 Hz), the monitor is suitable for almost everyone but the most hardcore gamers, especially those playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms (over 2 frames of lag at 60 Hz), even casual gamers should notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
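The three tiers above amount to a simple threshold check; a minimal sketch (the labels are my own shorthand, not official categories):

```python
def lag_class(lag_ms: float) -> str:
    """Rough classification of button-to-pixel lag, per the thresholds above."""
    if lag_ms < 16:    # under one frame of lag at 60 Hz
        return "gaming-grade"
    elif lag_ms <= 32:  # one to two frames of lag at 60 Hz
        return "fine for most gamers"
    else:               # over two frames of lag at 60 Hz
        return "not recommended for serious gaming"

print(lag_class(10.24))  # → gaming-grade
```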

Here's how the ViewSonic XG240R holds up in terms of input lag:



As we can see by looking at these numbers, the ViewSonic XG240R offers excellent gaming performance. Minimum input lag was as low as 3.33 ms and never went above 15.83 ms, with an average of 10.24 ms. This truly is a gaming-grade monitor perfectly capable of meeting even the highest demands of eSports professionals.