Tuesday, August 24th 2021

ViewSonic ELITE Launches New 32" Gaming Monitors with the Latest Gamer-Centric Technologies

A leading global provider of visual solutions, ViewSonic Corp., announces a new collection of ViewSonic ELITE 32" professional gaming monitors geared up with the latest display technologies. The new models allow gamers to experience next-level gaming with quantum-dot technology on the class-leading ELITE XG320Q monitor, extract the full potential of next gen consoles with the ELITE XG320U with HDMI 2.1 capabilities, or enter true cinematic immersion with the flagship Mini-LED-backlit ELITE XG321UG.

"At ViewSonic ELITE, we continuously seek to deliver high-end gaming monitors to suit every gamer from enthusiast to professional. While 27" monitors are typical for mainstream gaming, we recognized the demand for widescreens and expanded our 32" product line," says Oscar Lin, Head of Monitor Business Unit at ViewSonic. "Equipped with cutting-edge technologies to revolutionize the way gamers see, play, and experience games, the ELITE 32" monitor series deliver an immersive viewing experience alongside ultra-smooth gameplay and incredible color accuracy."
All monitors are packed with gamer-centric design features. Their ELITE Design Enhancements (EDE) elevate the battlestation - from ambient RGB LED lighting to set the perfect atmosphere, to a cable-drag-free mouse anchor and a reinforced headphone hook for a clutter-free desk setup. Backed by TÜV-certified eye comfort, these monitors support marathon gaming sessions without eye strain. Engineered with tilt, swivel, and height adjustments, the displays provide a wide range of movement for the ideal viewing position.

ELITE XG320Q: Realistic Colors with Quantum-Dot Technology
With the wildly sleek ELITE XG320Q monitor, it is all about color, clarity, and speed. The 2K QHD Fast IPS display boasts hyper-accurate colors and high-contrast detail thanks to quantum-dot technology. When hit with the LED backlight, each 'dot', or nanoparticle, produces superbly precise color and deeper contrast for crisp, lifelike images.

The display is equipped with NVIDIA G-Sync compatible technology for an overclockable refresh rate of 175 Hz, delivering extremely smooth frame rates and a 0.5ms (MPRT) response time. Players can say goodbye to input lag, ghosting, and image stutter with PureXP Motion Blur Reduction capabilities, and enjoy fast-moving visuals in captivating detail.

ELITE XG320U: High Fidelity Console Gaming with HDMI 2.1 Connectivity
Designed for next-generation console gaming such as Xbox Series X/S and PlayStation 5, the uber-stylish 4K ELITE XG320U display includes a single-cable HDMI 2.1 connection that unlocks an unparalleled gaming experience. The XG320U supports 4K resolution with an expanded 99% Adobe RGB color gamut and lightning-fast overclockable 150 Hz refresh rate, making every landscape and battle sequence appear in pixel perfection.

Certified with AMD FreeSync Premium Pro, PureXP Motion Blur Reduction, a 1 ms (MPRT) response time, and VESA DisplayHDR 600, this monitor allows gamers to surge through fast-paced FPS and action-adventure games at the highest visual settings.

ELITE XG321UG: Mini LED Backlight Technology
The 4K ultra-high definition ELITE XG321UG monitor utilizes industry-leading Mini-LED backlight technology, and is combined with 144 Hz refresh rate for ultra-low latency, stutter-free gameplay.

Key Features

ELITE XG320Q
  • 32" 2K QHD VESA DisplayHDR 600 Fast IPS display
  • Hyper-realistic colors from quantum-dot technology and a 99% Adobe RGB color gamut
  • 165 Hz refresh rate overclockable up to 175 Hz and 0.5 ms (MPRT) response time
  • NVIDIA G-Sync Compatible technology and PureXP Motion Blur Reduction
ELITE XG320U
  • 32" 4K UHD VESA DisplayHDR 600 IPS display
  • Expand gameplay onto next-gen consoles with single-cable HDMI 2.1 connectivity
  • Brilliant, vibrant imagery from a 99% Adobe RGB color gamut
  • Refresh rate of 144 Hz (overclockable to 150 Hz) and 1 ms (MPRT) response time
  • AMD FreeSync Premium Pro technology and PureXP Motion Blur Reduction
Availability
ViewSonic ELITE XG320Q and XG320U monitors will be available worldwide in Q3 2021; the ELITE XG321UG will follow worldwide in Q4 2021.

24 Comments on ViewSonic ELITE Launches New 32" Gaming Monitors with the Latest Gamer-Centric Technologies

#1
Kohl Baas
OMG, will it finally happen? 4K_144Hz@32"?!?

Can't believe it!
#3
Valantar
TheLostSwede: Time delayed press release? Although it seems like they should've already been available, so this is a press release to let us know they've been delayed, for a second time.
tftcentral.co.uk/news/viewsonic-elite-xg320u-with-4k-resolution-144hz-refresh-rate-and-hdmi-2-1-included
Yeah, these HDMI 2.1 monitors do seem to get delayed time and time again. Hopefully this means that it's actually arriving at some point in the not too distant future. I do hope it has a USB-C input and KVM functionality too (the lack of exhaustive spec sheets leaves some hope at least) - that would put it on par with the Eve Spectrum in terms of features but at 32", making it pretty near perfect for what I want. It would be pretty good even without that though. That XG321UG seems fancy, but it's likely way too expensive for me.
#4
john_
Finally, TVs and monitors are starting to become attractive enough to consider an upgrade. Going from Full HD to 2K seemed like a half step, and going to a 4K monitor with a 60 Hz refresh rate like no change at all. But maybe at the end of '21 and later we will have options that have all the necessary goodies:
  • 4K resolution
  • high brightness/HDR
  • 32"-40" diagonal
  • over 100 Hz refresh rate
  • FreeSync Premium (G-Sync Compatible)
  • low latency
  • accurate colors
  • acceptable price
#5
Chomiq
There have been delays in panel manufacturing; that's why the displays have been pushed back. The 32" 144 Hz HDMI 2.1 VA models that were announced before the IPS ones are still in limbo - some brands have even unlisted them from their sites (at least Philips did).
#6
Tardian
Unicorn ready? PS5 is listed as an endangered species. Mining is believed to be the culprit.
#7
trsttte
Finally! Let's see what cost and availability look like. I do wonder, though, why they don't give many details for the XG321UG (the Mini LED one) - not even the key features are listed like for the others.

It's unfortunate that they're very unlikely to support DisplayPort 2.0 though; the monitor industry always moves so fucking slowly. I know there is no hardware with support for it yet either, but I don't buy a monitor every year, and the next GPU releases are set to use DP 2.0 (even the Alder Lake iGPU).
#8
Valantar
trsttte: Finally! Let's see what cost and availability look like. I do wonder, though, why they don't give many details for the XG321UG (the Mini LED one) - not even the key features are listed like for the others.

It's unfortunate that they're very unlikely to support DisplayPort 2.0 though; the monitor industry always moves so fucking slowly. I know there is no hardware with support for it yet either, but I don't buy a monitor every year, and the next GPU releases are set to use DP 2.0 (even the Alder Lake iGPU).
That's a bit of an odd take. PCs need to be forward compatible with future monitors, as the resolution and refresh rate of the monitor determines the necessary bandwidth. This also means that DP 1.4 with DSC is perfectly fine for 2160p144 - it is all this monitor will ever need. Adding DP 2.0 would increase costs for no benefit. And it will of course be compatible with any future PC with DP 2.0 as DP 2.0 is backwards compatible with 1.4. DP 2.0 is only really needed for 2160p240Hz or 2160p144 with 12-bit color.
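The bandwidth argument here is easy to sanity-check with quick arithmetic. A minimal sketch (active pixels only; blanking overhead and the exact DSC target rate are simplifications, so the real uncompressed requirement is somewhat higher):

```python
# Back-of-the-envelope check: does 2160p144 with 10-bit color fit through
# DP 1.4, with and without DSC? Active pixels only - blanking intervals are
# ignored, so the real requirement is somewhat higher than this.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed active-video data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14_DATA_RATE = 25.92  # Gbit/s: HBR3's 32.4 Gbit/s raw minus 8b/10b overhead
DSC_RATIO = 3           # DSC typically targets roughly 3:1 compression

raw = video_bitrate_gbps(3840, 2160, 144, 30)  # 10-bit RGB = 30 bit/px
print(f"uncompressed: {raw:.1f} Gbit/s")              # ~35.8, exceeds DP 1.4
print(f"with 3:1 DSC: {raw / DSC_RATIO:.1f} Gbit/s")  # ~11.9, fits easily
```

The uncompressed figure overshoots DP 1.4's usable data rate by a wide margin, while the DSC-compressed stream fits with room to spare - consistent with the point made above.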
#9
Tardian
Valantar: That's a bit of an odd take. PCs need to be forward compatible with future monitors, as the resolution and refresh rate of the monitor determines the necessary bandwidth. This also means that DP 1.4 with DSC is perfectly fine for 2160p144 - it is all this monitor will ever need. Adding DP 2.0 would increase costs for no benefit. And it will of course be compatible with any future PC with DP 2.0 as DP 2.0 is backwards compatible with 1.4. DP 2.0 is only really needed for 2160p240Hz or 2160p144 with 12-bit color.
DSC is a compression algorithm that reduces the size of the data stream by up to a 3:1 ratio.[22] Although not mathematically lossless, DSC meets the ISO 29170 standard for "visually lossless" compression in most images, which cannot be distinguished from uncompressed video.[25][26] Using DSC with HBR3 transmission rates, DisplayPort 1.4 can support 8K UHD (7680 × 4320) at 60 Hz or 4K UHD (3840 × 2160) at 120 Hz with 30 bit/px RGB color and HDR. 4K at 60 Hz 30 bit/px RGB/HDR can be achieved without the need for DSC. On displays which do not support DSC, the maximum limits are unchanged from DisplayPort 1.3 (4K 120 Hz, 5K 60 Hz, 8K 30 Hz).[27]

en.wikipedia.org/wiki/DisplayPort#1.4
'which cannot be distinguished from uncompressed video' ... by whom? People have a wide range of visual abilities. Some can't see any benefit from beyond 60hz whilst others find 120hz insufficient.

Some family members can't distinguish SD video from HD video whilst others see the clear advantage of high bit rate HDR 4K video @60fps or better.

Most photographers edit on 10 bit or better monitors.
#10
Valantar
Tardian: 'which cannot be distinguished from uncompressed video' ... by whom? People have a wide range of visual abilities. Some can't see any benefit from beyond 60hz whilst others find 120hz insufficient.

Some family members can't distinguish SD video from HD video whilst others see the clear advantage of high bit rate HDR 4K video @60fps or better.

Most photographers edit on 10 bit or better monitors.
"10 bit or better" - unless they're using $40,000 reference monitors (which they don't, outside of some extremely rich gear fetishists), they're using 10-bit or 8-bit + FRC monitors. As long as you're at 10-bit color depth, the gamut is more important than the specific bit depth. And 10-bit color is perfectly fine on DP 1.4 at 60Hz without DSC or 144Hz with DSC (including HDR, not that that matters much to photographers).

As for whether the compression of DSC is visible: I have never, ever heard of anyone claiming to be able to see a difference. Not a single person. Now, that is clearly anecdotal and not representative, and the amount of people with DSC-equipped monitors is relatively low, but it's designed to be indistinguishable from uncompressed, which makes it unlikely to have significantly missed that goal. And it's rather obvious that they'll have aimed for it being invisible even with perfect visual acuity. And, thanks to the wonders of displays updating at least 60 times a second, compression errors or artifacts are rendered invisible through their short time on screen as long as they are sufficiently small and infrequent. You can't pixel peep an image that's visible for 16.667ms. Remember, this isn't a massively compressed format like most video codecs (h.264 is ~2000:1, DSC is 3:1 or 4:1 depending on the version). If you've got any evidence to the contrary, feel free to provide it, but if not, then I choose to trust that DSC works as advertised and widely reported.

Besides, I for one would much rather have a visually lossless compressed image than spend $100+ (and likely a lot more at >2m/6ft) for new DP 2.0 cables. Given the bandwidth requirements, those cables are likely to be active, and that means they'll be ridiculously expensive, difficult to get a hold of, and likely unreliable for the first production runs.
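The order-of-magnitude gap between codec compression and DSC can be put in rough numbers; the 8 Mbit/s stream bitrate below is an assumed, illustrative figure, not something from the post:

```python
# Rough scale comparison: a video codec's compression ratio vs DSC's ~3:1.
# The 8 Mbit/s h.264 stream bitrate is an assumed, illustrative figure.

def compression_ratio(width, height, fps, bits_per_pixel, stream_mbps):
    """Raw video bitrate divided by the compressed stream's bitrate."""
    raw_bps = width * height * fps * bits_per_pixel
    return raw_bps / (stream_mbps * 1e6)

# 4K60 8-bit RGB (24 bit/px) against an assumed 8 Mbit/s h.264 stream
ratio = compression_ratio(3840, 2160, 60, 24, 8)
print(f"h.264: ~{ratio:.0f}:1 vs DSC's fixed ~3:1")  # roughly 1500:1
```

Exact ratios depend on the stream's bitrate, but even a generous one leaves lossy video codecs two to three orders of magnitude more aggressive than DSC's light, fixed-rate compression.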
#11
Tardian
Valantar: "10 bit or better" - unless they're using $40,000 reference monitors (which they don't, outside of some extremely rich gear fetishists), they're using 10-bit or 8-bit + FRC monitors. As long as you're at 10-bit color depth, the gamut is more important than the specific bit depth. And 10-bit color is perfectly fine on DP 1.4 at 60Hz without DSC or 144Hz with DSC (including HDR, not that that matters much to photographers).

As for whether the compression of DSC is visible: I have never, ever heard of anyone claiming to be able to see a difference. Not a single person. Now, that is clearly anecdotal and not representative, and the amount of people with DSC-equipped monitors is relatively low, but it's designed to be indistinguishable from uncompressed, which makes it unlikely to have significantly missed that goal. And it's rather obvious that they'll have aimed for it being invisible even with perfect visual acuity. And, thanks to the wonders of displays updating at least 60 times a second, compression errors or artifacts are rendered invisible through their short time on screen as long as they are sufficiently small and infrequent. You can't pixel peep an image that's visible for 16.667ms. Remember, this isn't a massively compressed format like most video codecs (h.264 is ~2000:1, DSC is 3:1 or 4:1 depending on the version). If you've got any evidence to the contrary, feel free to provide it, but if not, then I choose to trust that DSC works as advertised and widely reported.

Besides, I for one would much rather have a visually lossless compressed image than spend $100+ (and likely a lot more at >2m/6ft) for new DP 2.0 cables. Given the bandwidth requirements, those cables are likely to be active, and that means they'll be ridiculously expensive, difficult to get a hold of, and likely unreliable for the first production runs.
I was woke with the 60hz paradigm until my 205cm son told me I was incorrect!

Now I am wrong maybe: never ... except in the mind of my wife ... however, she is a demigod.

I conducted a test of my perception and he was (of course) right.

Wave your mouse across the page diagonally really quickly.

You can see the refresh rate in terms of the number of arrows you see.

Yep, that hertz.

I am sure if I visited another site I frequent and occasionally comment on, AVSForum.com, I would find someone who is about 11 sigmas to the right on eyesight who would disagree.

Yes, point taken - the plebeians wouldn't know if they were stuck in the eye with a blunt stick, and especially after that!
#12
Valantar
Tardian: I was woke with the 60hz paradigm until my 205cm son told me I was incorrect!

Now I am wrong maybe: never ... except in the mind of my wife ... however, she is a demigod.

I conducted a test of my perception and he was (of course) right.

Wave your mouse across the page diagonally really quickly.

You can see the refresh rate in terms of the number of arrows you see.

Yep, that hertz.

I am sure if I visited another site I frequent and occasionally comment on, AVSForum.com, I would find someone who is about 11 sigmas to the right on eyesight who would disagree.

Yes, point taken the plebians wouldn't know if they were stuck in the eye with a blunt stick and especially after that!
The difference between 60Hz and 120Hz or higher (or really even 75Hz, though that depends more on the application) is clearly perceptible to pretty much anyone. The human eye is extremely good at capturing minute, rapid motion. What we aren't good at is spotting a tiny group of pixels that are a tiny bit off from their intended color for a few milliseconds. That's the kind of visual acuity you'd need to spot the difference between DSC and non-DSC DisplayPort output. We're not talking banding, large-scale compression artifacts or anything like that. It might of course be possible to create highly specific test patterns or something similar that bring about visible artifacting with DSC (through identifying specific weaknesses in the compression system), but outside of entirely unrealistic cases like that I have yet to hear of a single example of it not being entirely imperceptible. Our senses can also be trained to be especially attuned towards specific things; you'll always be able to find AV enthusiasts claiming the ability to see the invisible or hear things that aren't sound, but those subcultures are so full of placebo and self-deception it's essentially impossible to differentiate reality from delusion. So for a monitor with a color depth, resolution and refresh rate combination that doesn't exceed the bandwidth provided by DP 1.4+DSC, there is no reason for them to use DP 2.0. It would drive up prices and limit availability with zero benefits to show for it.
#13
Tardian
Valantar: The difference between 60Hz and 120Hz or higher (or really even 75Hz, though that depends more on the application) is clearly perceptible to pretty much anyone. The human eye is extremely good at capturing minute, rapid motion. What we aren't good at is spotting a tiny group of pixels that are a tiny bit off from their intended color for a few milliseconds. That's the kind of visual acuity you'd need to spot the difference between DSC and non-DSC DisplayPort output. We're not talking banding, large-scale compression artifacts or anything like that. It might of course be possible to create highly specific test patterns or something similar that bring about visible artifacting with DSC (through identifying specific weaknesses in the compression system), but outside of entirely unrealistic cases like that I have yet to hear of a single example of it not being entirely imperceptible. Our senses can also be trained to be especially attuned towards specific things; you'll always be able to find AV enthusiasts claiming the ability to see the invisible or hear things that aren't sound, but those subcultures are so full of placebo and self-deception it's essentially impossible to differentiate reality from delusion. So for a monitor with a color depth, resolution and refresh rate combination that doesn't exceed the bandwidth provided by DP 1.4+DSC, there is no reason for them to use DP 2.0. It would drive up prices and limit availability with zero benefits to show for it.
We mostly agree.*

* Douglas Adams reference
#14
lepudruk
32" 4K at 144 Hz would be perfect for me. With the upcoming 42" 4K OLEDs from LG, it will all depend on the price.
#15
Tardian
lepudruk: 32" 4K at 144 Hz would be perfect for me. With the upcoming 42" 4K OLEDs from LG, it will all depend on the price.
Based on the C1 48" price, the 42" won't be cheap. I could have got a 55" for the same price, and I really shopped and got a sweet deal.
#16
trsttte
Valantar: That's a bit of an odd take. PCs need to be forward compatible with future monitors, as the resolution and refresh rate of the monitor determines the necessary bandwidth. This also means that DP 1.4 with DSC is perfectly fine for 2160p144 - it is all this monitor will ever need. Adding DP 2.0 would increase costs for no benefit. And it will of course be compatible with any future PC with DP 2.0 as DP 2.0 is backwards compatible with 1.4. DP 2.0 is only really needed for 2160p240Hz or 2160p144 with 12-bit color.
My issue is not with DSC being or not being fine. IMO it's fine (might even have some advantages because cables are getting real expensive and bulky already without the massive bandwidth of dp 2.0 - although we shouldn't strive for "enough" anyway) but support for it is flaky as fuck.

If future GPUs using DP 2.0 supported it without question (hell, even current ones with DP 1.4) I'd be OK with it, but that's not the reality, and this will be a top-of-the-line product, so it's a bit sad is all.
#17
Valantar
trsttte: My issue is not with DSC being or not being fine. IMO it's fine (might even have some advantages because cables are getting real expensive and bulky already without the massive bandwidth of dp 2.0 - although we shouldn't strive for "enough" anyway) but support for it is flaky as fuck.

If future gpus using dp 2.0 supported it without questions (hell, even current ones with dp 1.4) i'd be ok with it but that's not the reality and this will be a top of the line product so it's a bit sad is all.
But that's really no different from how things have been previously. Either your GPU supports an output resolution/refresh rate combo, or it doesn't. There have been lots of instances throughout history when a GPU has had a max output resolution lower than the peak capability of its interfaces. The fact that DSC isn't specific to an interface but an extension of one only makes it marginally more confusing. It's still a binary specification - either it's supported or it isn't. I guess they could have called DP 1.4+DSC DP 1.5 for the sake of simplicity, but ... creating new standards arbitrarily is hardly simple. And besides, all GPUs with even a remote hope of rendering games at 2160p144 support DP 1.4+DSC. The only issue is that the fact that DSC support is optional was poorly communicated, and that GPU vendors did a terrible job of listing this in their specs. But AFAIK all RX 5000-series and later AMD GPUs and all RTX 20-series and later Nvidia GPUs support it.
#18
trsttte
Valantar: And besides, all GPUs with even a remote hope of rendering games at 2160p144 support DP 1.4+DSC.
Gaming is not all that people do; with 10-bit color you'll basically be limited to under 100 Hz, which is not terrible, but c'mon, let's not excuse companies who cheap out when they can do better.

DSC even predates DP 1.4, so this mess was very much avoidable.
#19
Valantar
trsttte: Gaming is not all that people do, with 10bit color you'll basically be limited to under 100hz which is not terrible but c'mon let's not excuse companies who cheap out when they can do better.

DSC even predates DP 1.4 so this mess was very much avoidable.
...but what are you doing at 144Hz besides gaming? I guess 120Hz content creation could be a thing, but ... editing at 60Hz would be perfectly fine for that, as would stepping down to 4:2:2 for 120Hz previews. Perfect? Of course not. That's why crazy expensive professional hardware exists. And besides, you don't need to game on a GPU just because it's from the past two generations. But if your requirements for whatever you are doing are 2160p144 with a single cable, then you need hardware that supports DSC. Blame GPU makers for being slow, I guess.

Also, they don't really seem to be cheaping out - from what I can tell, there isn't yet any DP 2.0 hardware on the market, and you need that to make a monitor supporting it. A standard being finalized is not equal to hardware being designed, tested, and put into mass production, after all. It took a long time for HDMI 2.1 to get out there, and that has the massive TV and game console markets to aim for - monitors are far more limited.
#20
lepudruk
Tardian: Based on the C1 48" price, the 42" won't be cheap. I could have got a 55" for the same price, and I really shopped and got a sweet deal.
Some sources believe that LG's 42" won't exceed $1k (whathifi) and I don't expect to find the new ViewSonic monitors any cheaper than that, so... Even if they came in at the same price I would still choose OLED.
#21
Tardian
Valantar: ...but what are you doing at 144Hz besides gaming? I guess 120Hz content creation could be a thing, but ... editing at 60Hz would be perfectly fine for that, as would stepping down to 4:2:2 for 120Hz previews. Perfect? Of course not. That's why crazy expensive professional hardware exists. And besides, you don't need to game on a GPU just because it's from the past two generations. But if your requirements for whatever you are doing are 2160p144 with a single cable, then you need hardware that supports DSC. Blame GPU makers for being slow, I guess.

Also, they don't really seem to be cheaping out - from what I can tell, there isn't yet any DP 2.0 hardware on the market, and you need that to make a monitor supporting it. A standard being finalized is not equal to hardware being designed, tested, and put into mass production, after all. It took a long time for HDMI 2.1 to get out there, and that has the massive TV and game console markets to aim for - monitors are far more limited.
However, editing 4K 120fps video does require significant bandwidth and has become popular and easily accessible.
lepudruk: Some sources believe that LG's 42" won't exceed $1k (whathifi) and I don't expect to find the new ViewSonic monitors any cheaper than that, so... Even if they came in at the same price I would still choose OLED.
The same sources that suggested a PS5 was obtainable? This is code for prepare to be disappointed about the price, but not the image quality.
#22
Valantar
Tardian: However, editing 4K 120fps video does require significant bandwidth and has become popular and easily accessible.
Sure, that happens, and I guess you could describe it as "growing". But "popular" or "easily accessible"? I really wouldn't say so. There are a few mirrorless cameras supporting it, and some phones, though ... if you're editing 4k120 video shot on a phone, you're losing nothing at all even working at 4:2:2, let alone with DSC (and tbh you're likely doing your editing on an iPad). Not to mention that the actual use cases for 120fps video are pretty limited. If you're using it for slow motion you don't need a 120Hz display, and beyond that there are few genres of video production where going above 60Hz makes sense. Then there's the low adoption rate of >60Hz displays and whether it's worth it to produce for a tiny minority of viewers. And again, it's not unreasonable to need the newest generations of equipment in order to utilize the newest generations of I/O, even if previous generations cover every other need. That's just not how time or product development works.

Getting back to the point, all of this will work fine with DSC. It works equally well for static images as for video - on a PC there's no difference after all, each frame is transmitted in full every time, so any possible compression artifacts (of which there aren't supposed to be any on any level visible to humans) would be present for a few milliseconds at worst.

The handful of people doing this use the same displays and the same hardware as everyone else. And if you're one of the few working on a 12-bit HDR reference monitor, then having sufficient GPU bandwidth is trivial, as you clearly have a massive budget for equipment. And you're likely using a Radeon Pro or Quadro anyhow, and are comfortable with multi-cable connections if necessary.

DP 2.0 will arrive when it's ready and there's a need for it. That will likely be with 2160p240 displays, whenever they arrive. Until then, we simply don't have a need for it. And at least the past two generations of GPUs support DSC - no GPUs support DP 2.0, forcing even further upgrades.
#23
Tardian
Valantar: Sure, that happens, and I guess you could describe it as "growing". But "popular" or "easily accessible"? I really wouldn't say so. There are a few mirrorless cameras supporting it, and some phones, though ... if you're editing 4k120 video shot on a phone, you're losing nothing at all even working at 4:2:2, let alone with DSC (and tbh you're likely doing your editing on an iPad). Not to mention that the actual use cases for 120fps video are pretty limited. If you're using it for slow motion you don't need a 120Hz display, and beyond that there are few genres of video production where going above 60Hz makes sense. Then there's the low adoption rate of >60Hz displays and whether it's worth it to produce for a tiny minority of viewers. And again, it's not unreasonable to need the newest generations of equipment in order to utilize the newest generations of I/O, even if previous generations cover every other need. That's just not how time or product development works. Getting back to the point, all of this will work fine with DSC. It works equally well for static images as for video - on a PC there's no difference after all, each frame is transmitted in full every time, so any possible compression artifacts (of which there aren't supposed to be any on any level visible to humans) would be present for a few milliseconds at worst.

The handful of people doing this use the same displays and the same hardware as everyone else. And if you're one of the few working on a 12-bit HDR reference monitor, then having sufficient GPU bandwidth is trivial, as you clearly have a massive budget for equipment. And you're likely using a Radeon Pro or Quadro anyhow, and are comfortable with multi-cable connections if necessary.

DP 2.0 will arrive when it's ready and there's a need for it. That will likely be with 2160p240 displays, whenever they arrive. Until then, we simply don't have a need for it. And at least the past two generations of GPUs support DSC - no GPUs support DP 2.0, forcing even further upgrades.
4K @120fps
  1. Canon EOS R5
  2. Sony A7S III
  3. Canon EOS C70
  4. Kandao QooCam 8K
  5. Samsung Galaxy S20 Ultra
  6. Z Cam E2
  7. Kandao Obsidian S
  8. Insta360 Pro 2
Using the 180-degree shutter rule
Let’s move on to why you shouldn’t always film at 60fps – even though looking at the basics it would seem sensible. This is where the technical aspect comes in. You need to think about the 180-degree shutter rule. This dictates that the fps should always be half of the shutter speed. At 29.7fps it’s relatively easy to find a balance of gain (ISO) and IRIS (Aperture) to enable that all-important 1/60th shutter speed. But, crank that to 60fps and you suddenly need a shutter speed of 1/120th – and that’s quite a jump. Shooting a frame at 1/120th of second is going to eliminate any motion blur, whereas shooting at 1/60th will give motion blur in the frame. Motion blur is important for video as it helps with that all-important persistence of vision. The blur actually helps with the smoothness of playback. That 4K footage shot at 60fps and 1/120th of a second played back at 60fps will look fine. There’s enough content there for the motion to look silky smooth. Stretch that footage out over 2 seconds and the optical illusion and lack of blur still fools the eye, but, reduce it to 30fps, and you start to break the 180-degree shutter rule. The effect is admittedly slight, but it is noticeable. Break the 180-degree shutter rule with faster shutter speeds and you get the action-packed 300 gladiator-style look. Slow it right down and it all becomes a bit romantic. Ultimately 60fps is a sought-after feature as it enables you to shoot smooth slow motion, and these days it’s an effect that you as a filmmaker can’t be without, but do be careful as using it wrongly can have a dramatic effect.

Why shoot 4K at 120fps?
Recording at 4K at 120fps is pretty niche, but it’s still a fantastic feature to have in a camera. There are a few points to take into consideration when looking at a camera that shoots 4K at 120fps, such as the amount of data recorded. This is usually measured as 100mb/s (megabits per second) rather than 100mb per frame. Most cameras, especially mirrorless and DSLR, will therefore split the bit rate across frames as VBR (variable bit rate). If the bit rate was fixed, then at 30fps 100mb/s, each frame would be 3.3Mbs. Crank it to 60fps and that drops to 1.6Mbs. This isn’t the way it works due to VBR. The more frames you have, the less change you have between frames and the less data needs recording. So one frame may max out at 3.3mbs where there’s tonnes of movement, and the next may be 1mbs where there’s less change in the frame. Shoot at 30fps and the likelihood is that more will change within the frame so the data recorded will be greater. Still, as the frame rate increases even with that flexibility of variable bitrate, it can still push the limits of the camera’s max 100mb/s. Take the GoPro as an example. You’d be hard pushed to see the difference in 1080p action footage shot at 30 or 60fps, but push it to 120fps and you start to see some pixelation. At 240fps you really start to see the drop. Use the same settings for a static scene and the quality of the high fps footage will still look good. Less has changed in the frame so less data needs to be recorded. Likewise you’ll often see cameras such as the Canon EOS 5D Mark IV that will record 1080p at 60fps, but you need to drop the resolution to 720p to record at 120fps. It’s probably limited by Canon as the 4K mb/s is 500mb/s, which seems insanely high. 4K at 120fps is amazing, but can the camera actually cope? What’s the mb/s at 4K? Is there enough scope to capture decent footage? 
The Sony RX0 can shoot 1080p at up to 1000fps, for instance, which seems impressive until you see the quality of the footage in normal lighting conditions. Then there’s the heat issue – all that processing can come at a price. Finally, there’s shutter speed. Shooting 4K at 120fps, the 180-degree rule calls for 1/240th of a second; most cameras don’t offer 1/240th, so you round to the closest setting, 1/250th. That’s all well and good, but lighting then really becomes an issue. You also need to consider file size and processing power: 4K already consumes storage, and slow-motion footage still requires rendering and processing. This is where you really need to start looking at NVMe M.2 SSDs.
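The rounding step mentioned above (1/240th snapping to 1/250th) can be illustrated with a small sketch. The list of shutter denominators here is an assumption – a typical subset of the stops cameras actually offer:

```python
# Snap the 180-degree-rule ideal shutter speed to the nearest speed a
# camera actually offers. STANDARD_SHUTTERS is an assumed list of common
# shutter denominators (i.e. 1/x of a second), not from any one camera.

STANDARD_SHUTTERS = [30, 60, 125, 250, 500, 1000]

def nearest_shutter(fps: float) -> int:
    ideal = 2 * fps  # 180-degree rule: denominator is twice the frame rate
    return min(STANDARD_SHUTTERS, key=lambda d: abs(d - ideal))

print(nearest_shutter(120))  # ideal 1/240 -> snaps to 250
print(nearest_shutter(30))   # ideal 1/60  -> exactly 60
```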

camerajabber.com/buyersguides/which-cameras-shoot-4k-at-120fps/
Not covered above is the need for Secure Digital Ultra Capacity (SDUC) UHS-III, SD Express, or CFexpress cards.
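Card choice follows directly from the bit rate: the SD Association's Video Speed Classes guarantee a minimum sustained sequential write (V30 = 30 MB/s, V60 = 60 MB/s, and so on). A rough sketch of picking the smallest class that keeps up (the helper name is mine):

```python
# Find the smallest SD Video Speed Class whose guaranteed minimum
# sustained write (in MB/s) covers a given video bit rate (in Mb/s).

VIDEO_SPEED_CLASSES = {"V6": 6, "V10": 10, "V30": 30, "V60": 60, "V90": 90}

def minimum_class(bitrate_mbps: float) -> str:
    needed_mb_per_s = bitrate_mbps / 8  # megabits -> megabytes
    for name, mbs in sorted(VIDEO_SPEED_CLASSES.items(), key=lambda kv: kv[1]):
        if mbs >= needed_mb_per_s:
            return name
    return "beyond SD speed classes - SD Express / CFexpress territory"

print(minimum_class(100))  # 12.5 MB/s sustained -> V30 is enough
print(minimum_class(400))  # 50 MB/s sustained  -> needs V60
```

A 100Mb/s camera is comfortably within V30, but the 500Mb/s 4K mode mentioned above needs 62.5 MB/s sustained – V90 or CFexpress territory, which is the point of the note.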
ValantarSure, that happens, and I guess you could describe it as "growing". But "popular" or "easily accessible"? I really wouldn't say so. There are a few mirrorless cameras supporting it, and some phones, though ... if you're editing 4k120 video shot on a phone, you're losing nothing at all even working at 4:2:2, let alone with DSC (and tbh you're likely doing your editing on an iPad). Not to mention that the actual use cases for 120fps video are pretty limited. If you're using it for slow motion you don't need a 120Hz display, and beyond that there are few genres of video production where going above 60Hz makes sense. Then there's the low adoption rate of >60Hz displays and whether it's worth it to produce for a tiny minority of viewers. And again, it's not unreasonable to need the newest generations of equipment in order to utilize the newest generations of I/O, even if previous generations cover every other need. That's just not how time or product development work. Getting back to the point, all of this will work fine with DSC. It works equally well for static images as for video - on a PC there's no difference after all, each frame is transmitted in full every time, so any possible compression artifacts (of which there aren't supposed to be any on any level visible to humans) would be present for a few milliseconds at worst.

The handful of people doing this use the same displays and the same hardware as everyone else. And if you're one of the few working on a 12-bit HDR reference monitor, then having sufficient GPU bandwidth is trivial, as you clearly have a massive budget for equipment. And you're likely using a Radeon Pro or Quadro anyhow, and are comfortable with multi-cable connections if necessary.

DP 2.0 will arrive when it's ready and there's a need for it. That will likely be with 2160p240 displays, whenever they arrive. Until then, we simply don't have a need for it. And at least the past two generations of GPUs support DSC, whereas no GPU supports DP 2.0 at all - adopting it now would force even further upgrades.
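The bandwidth argument above is easy to check with rough numbers. The figures here are the commonly quoted nominal rates (DP 1.4's four HBR3 lanes give 32.4 Gb/s raw, ~25.92 Gb/s after 8b/10b encoding), and blanking overhead is ignored, so the real requirement is slightly higher still:

```python
# Rough check of why 4K120 at 10-bit RGB needs DSC on DP 1.4: the raw
# pixel data alone exceeds the link's usable payload. Blanking intervals
# are ignored here, which only makes the real requirement worse.

def uncompressed_gbps(w: int, h: int, fps: int, bits_per_pixel: int) -> float:
    """Uncompressed video data rate in gigabits per second."""
    return w * h * fps * bits_per_pixel / 1e9

DP14_PAYLOAD_GBPS = 25.92  # 4 lanes x 8.1 Gb/s HBR3, after 8b/10b encoding

need = uncompressed_gbps(3840, 2160, 120, 30)  # 10 bits per channel, RGB
print(f"4K120 10-bit RGB: ~{need:.1f} Gb/s vs {DP14_PAYLOAD_GBPS} Gb/s payload")
# ~29.9 Gb/s > 25.92 Gb/s, so it only fits with DSC (or chroma subsampling)
```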
Cameras that can shoot 4K at 120fps, per the Camera Jabber guide linked above:
  1. Canon EOS R5
  2. Sony A7S III
  3. Canon EOS C70
  4. Kandao QooCam 8K
  5. Samsung Galaxy S20 Ultra
  6. Z Cam E2
  7. Kandao Obsidian S
  8. Insta360 Pro 2
ELITE XG320U
32” 4K UHD VESA DisplayHDR 600 IPS display
Expand gameplay onto next-gen consoles with single-cable HDMI 2.1 connectivity
Brilliant, vibrant imagery from 99% Adobe Color Gamut
Refresh rate of 144Hz (overclockable to 150Hz) and 1ms (MPRT) response time
AMD FreeSync Premium Pro technology and PureXP Motion Blur Reduction
Availability
The flagship ViewSonic ELITE XG321UG will be made available worldwide in Q4 2021.

www.viewsonic.com/fr/newsroom/content/ViewSonic ELITE Launches New 32%E2%80%9D Gaming Monitors with the Latest Gamer-Centric Technologies_4118
Posted on Reply
#24
Valantar
Tardian:
4K @120fps
Not covered above is the need for Secure Digital Ultra Capacity (SDUC) UHS-III, SD Express, or CFexpress cards.
Yep, exactly - a very niche undertaking. 120fps video in general is mainly applicable to slow motion, and outside of amateur filmmakers overly enamored with Zack Snyder or Michael Bay, there really aren't that many reasonable applications (and in most of those cases, 1080p slow motion upscaled into a 2160p timeline is perfectly adequate). Plus that slow motion of course doesn't involve playback at high frame rates. On that list of cameras (which I have no idea whether it's up to date), there are two mirrorless cameras, two cine cameras (one "affordable", one mid-range), three 360-degree VR/action cams that don't really follow standard resolutions (you can't do 360° video in 16:9 or similar), and one smartphone. In other words, you need to be looking at very specific gear in order to film in 2160p120. And it's rather hilarious that the "Why shoot 4K at 120fps?" paragraph literally doesn't list a single argument for doing so (but quite a few against it!), except for "it's still a fantastic feature to have in a camera" and (paraphrasing here) "it might not look that much worse". Kind of made me giggle :p