
ViewSonic ELITE Launches New 32" Gaming Monitors with the Latest Gamer-Centric Technologies

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,362 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
A leading global provider of visual solutions, ViewSonic Corp., announces a new collection of ViewSonic ELITE 32" professional gaming monitors geared up with the latest display technologies. The new models allow gamers to experience next-level gaming with quantum-dot technology on the class-leading ELITE XG320Q monitor, extract the full potential of next gen consoles with the ELITE XG320U with HDMI 2.1 capabilities, or enter true cinematic immersion with the flagship Mini-LED-backlit ELITE XG321UG.

"At ViewSonic ELITE, we continuously seek to deliver high-end gaming monitors to suit every gamer from enthusiast to professional. While 27" monitors are typical for mainstream gaming, we recognized the demand for widescreens and expanded our 32" product line," says Oscar Lin, Head of Monitor Business Unit at ViewSonic. "Equipped with cutting-edge technologies to revolutionize the way gamers see, play, and experience games, the ELITE 32" monitor series deliver an immersive viewing experience alongside ultra-smooth gameplay and incredible color accuracy."



All three monitors are packed with gamer-centric design features: their ELITE Design Enhancements (EDE) elevate the battlestation, from ambient RGB LED lighting that sets the perfect atmosphere, to a drag-free mouse anchor and a reinforced headphone hook for a clutter-free desk setup. Backed by TÜV-certified eye comfort, these monitors support marathon gaming sessions without eye strain. Engineered with tilt, swivel, and height adjustments, the displays offer a wide range of movement for the ideal viewing position.

ELITE XG320Q: Realistic Colors with Quantum-Dot Technology
With the wildly sleek ELITE XG320Q monitor, it is all about color, clarity, and speed. The 2K QHD Fast IPS display boasts hyper-accurate colors and high-contrast detail thanks to quantum-dot technology. When hit by the LED backlight, each 'dot', or nanoparticle, produces superbly precise color and deeper contrast for crisp, lifelike images.

The display is equipped with NVIDIA G-Sync Compatible technology and an overclockable 175 Hz refresh rate, delivering extremely smooth frame rates alongside a 0.5 ms (MPRT) response time. Players can say goodbye to input lag, ghosting, and image stutter with PureXP Motion Blur Reduction, and enjoy fast-moving visuals in captivating detail.

ELITE XG320U: High Fidelity Console Gaming with HDMI 2.1 Connectivity
Designed for next-generation consoles such as the Xbox Series X/S and PlayStation 5, the uber-stylish 4K ELITE XG320U display includes a single-cable HDMI 2.1 connection that unlocks an unparalleled gaming experience. The XG320U supports 4K resolution with an expanded 99% Adobe RGB color gamut and a lightning-fast overclockable 150 Hz refresh rate, making every landscape and battle sequence appear in pixel perfection.

Certified for AMD FreeSync Premium Pro and VESA DisplayHDR 600, and featuring PureXP Motion Blur Reduction with a 1 ms (MPRT) response time, this monitor lets gamers surge through fast-paced FPS and action-adventure games at the highest visual settings.

ELITE XG321UG: Mini LED Backlight Technology
The 4K ultra-high-definition ELITE XG321UG monitor utilizes industry-leading Mini-LED backlight technology, combined with a 144 Hz refresh rate for ultra-low-latency, stutter-free gameplay.

Key Features

ELITE XG320Q
  • 32" 2K QHD VESA DisplayHDR 600 Fast IPS display
  • Hyper-realistic colors from quantum-dot technology and a 99% Adobe RGB color gamut
  • 165 Hz refresh rate (overclockable up to 175 Hz) and 0.5 ms (MPRT) response time
  • NVIDIA G-Sync Compatible technology and PureXP Motion Blur Reduction
ELITE XG320U
  • 32" 4K UHD VESA DisplayHDR 600 IPS display
  • Expand gameplay onto next-gen consoles with single-cable HDMI 2.1 connectivity
  • Brilliant, vibrant imagery from a 99% Adobe RGB color gamut
  • Refresh rate of 144 Hz (overclockable to 150 Hz) and 1 ms (MPRT) response time
  • AMD FreeSync Premium Pro technology and PureXP Motion Blur Reduction
Availability
ViewSonic ELITE XG320Q and XG320U monitors will be available worldwide in Q3 2021. The ELITE XG321UG will follow worldwide in Q4 2021.

View at TechPowerUp Main Site
 
Joined
Sep 10, 2015
Messages
496 (0.16/day)
System Name My Addiction
Processor AMD Ryzen 7950X3D
Motherboard ASRock B650E PG-ITX WiFi
Cooling Alphacool Core Ocean T38 AIO 240mm
Memory G.Skill 32GB 6000MHz
Video Card(s) Sapphire Pulse 7900XTX
Storage Some SSDs
Display(s) 42" Samsung TV + 22" Dell monitor vertically
Case Lian Li A4-H2O
Audio Device(s) Denon + Bose
Power Supply Corsair SF750
Mouse Logitech
Keyboard Glorious
VR HMD None
Software Win 10
Benchmark Scores None taken
OMG, will it finally happen? 4K_144Hz@32"?!?

Can't believe it!
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,056 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
Time-delayed press release? It seems like they should already be available, so is this a press release to let us know they've been delayed, for a second time?
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Time-delayed press release? It seems like they should already be available, so is this a press release to let us know they've been delayed, for a second time?
Yeah, these HDMI 2.1 monitors do seem to get delayed time and time again. Hopefully this means that it's actually arriving at some point in the not-too-distant future. I do hope it has a USB-C input and KVM functionality too (the lack of exhaustive spec sheets leaves some hope at least) - that would put it on par with the Eve Spectrum in terms of features, but at 32", making it pretty near perfect for what I want. It would be pretty good even without that, though. That XG321UG seems fancy, but it's likely way too expensive for me.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Finally, TVs and monitors are starting to become attractive enough to consider an upgrade. Going from Full HD to 2K seemed like half a step, and going to a 4K monitor with a 60 Hz refresh rate like no change at all. But maybe by the end of 2021 and later we will have options with all the necessary goodies:
4K resolution
high brightness/HDR
32''-40'' diagonal
over 100Hz refresh rate
FreeSync Premium(G-Sync compatible)
low latency
accurate colors
acceptable price
 
Joined
Feb 23, 2019
Messages
5,623 (2.99/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
There have been delays in panel manufacturing; that's why the displays have been pushed back. The 32" 144 Hz HDMI 2.1 VA models that were announced before the IPS ones are still in limbo - some brands have even unlisted them from their sites (at least Philips did).
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
System Name Work in progress
Processor AMD Ryzen 5 3600
Motherboard Asus PRIME B350M-A
Cooling Wraith Stealth Cooler, 4x140mm Noctua NF-A14 FLX 1200RPM Case Fans
Memory Corsair 16GB (2x8GB) CMK16GX4M2A2400C14R DDR4 2400MHz Vengeance LPX DIMM
Video Card(s) GTX 1050 2GB (for now) 3060 12GB on order
Storage Samsung 860 EVO 500GB, Lots of HDD storage
Display(s) 32 inch 4K LG, 55 & 48 inch LG OLED, 40 inch Panasonic LED LCD
Case Cooler Master Silencio S400
Audio Device(s) Sound: LG Monitor Built-in speakers (currently), Mike: Marantz MaZ
Power Supply Corsair CS550M 550W ATX Power Supply, 80+ Gold Certified, Semi-Modular Design
Mouse Logitech M280
Keyboard Logitech Wireless Solar Keyboard K750R (works best in summer)
VR HMD none
Software Microsoft Windows 10 Home 64bit OEM, Captur 1 21
Benchmark Scores Cinebench R20: 3508 (WIP)
Unicorn ready? PS5 is listed as an endangered species. Mining is believed to be the culprit.
 
Joined
Jun 18, 2021
Messages
2,282 (2.20/day)
Finally! Let's see what cost and availability look like. I do wonder, though, why they don't mention many details for the XG321UG (the Mini-LED one) - not even the key features are listed like for the others.

It's unfortunate that they're very unlikely to support DisplayPort 2.0 though; the monitor industry always moves so fucking slowly. I know there is no hardware with support for it yet either, but I don't buy a monitor every year, and the next GPU releases are set to use DP 2.0 (even the Alder Lake iGPU).
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
Finally! Let's see what cost and availability look like. I do wonder, though, why they don't mention many details for the XG321UG (the Mini-LED one) - not even the key features are listed like for the others.

It's unfortunate that they're very unlikely to support DisplayPort 2.0 though; the monitor industry always moves so fucking slowly. I know there is no hardware with support for it yet either, but I don't buy a monitor every year, and the next GPU releases are set to use DP 2.0 (even the Alder Lake iGPU).
That's a bit of an odd take. PCs need to be forward compatible with future monitors, as the resolution and refresh rate of the monitor determine the necessary bandwidth. This also means that DP 1.4 with DSC is perfectly fine for 2160p144 - it is all this monitor will ever need. Adding DP 2.0 would increase costs for no benefit. And it will of course be compatible with any future PC with DP 2.0, as DP 2.0 is backwards compatible with 1.4. DP 2.0 is only really needed for 2160p240 or 2160p144 with 12-bit color.
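The bandwidth claim above can be sanity-checked with some back-of-the-envelope arithmetic. A rough sketch follows; these are pixel-payload figures only, and real link timings add blanking overhead, so treat the margins as approximate:

```python
# Rough check: does 2160p144 fit in DP 1.4 with and without DSC?
# Payload figures only; real timings add blanking overhead (CVT-RB etc.).

def stream_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed pixel data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# HBR3: 32.4 Gbit/s raw, ~25.92 Gbit/s payload after 8b/10b encoding overhead
DP14_PAYLOAD_GBPS = 25.92

uncompressed = stream_gbps(3840, 2160, 144, 30)  # 10 bits per channel, RGB
with_dsc = uncompressed / 3                      # DSC targets roughly 3:1

print(f"2160p144 10-bit uncompressed: {uncompressed:.1f} Gbit/s")
print(f"same stream with 3:1 DSC:     {with_dsc:.1f} Gbit/s")
print(f"DP 1.4 payload budget:        {DP14_PAYLOAD_GBPS:.2f} Gbit/s")
print("fits without DSC:", uncompressed <= DP14_PAYLOAD_GBPS)   # False
print("fits with DSC:   ", with_dsc <= DP14_PAYLOAD_GBPS)       # True
```

So an uncompressed 10-bit 2160p144 stream (~35.8 Gbit/s of pixel data) overshoots DP 1.4's payload, while the same stream under 3:1 DSC (~11.9 Gbit/s) fits with plenty of headroom - which is exactly why this class of monitor doesn't need DP 2.0.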
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
That's a bit of an odd take. PCs need to be forward compatible with future monitors, as the resolution and refresh rate of the monitor determines the necessary bandwidth. This also means that DP 1.4 with DSC is perfectly fine for 2160p144 - it is all this monitor will ever need. Adding DP 2.0 would increase costs for no benefit. And it will of course be compatible with any future PC with DP 2.0 as DP 2.0 is backwards compatible with 1.4. DP 2.0 is only really needed for 2160p240Hz or 2160p144 with 12-bit color.

DSC is a compression algorithm that reduces the size of the data stream by up to a 3:1 ratio.[22] Although not mathematically lossless, DSC meets the ISO 29170 standard for "visually lossless" compression in most images, which cannot be distinguished from uncompressed video.[25][26] Using DSC with HBR3 transmission rates, DisplayPort 1.4 can support 8K UHD (7680 × 4320) at 60 Hz or 4K UHD (3840 × 2160) at 120 Hz with 30 bit/px RGB color and HDR. 4K at 60 Hz 30 bit/px RGB/HDR can be achieved without the need for DSC. On displays which do not support DSC, the maximum limits are unchanged from DisplayPort 1.3 (4K 120 Hz, 5K 60 Hz, 8K 30 Hz).[27]


'which cannot be distinguished from uncompressed video' ... by whom? People have a wide range of visual abilities. Some can't see any benefit from beyond 60hz whilst others find 120hz insufficient.

Some family members can't distinguish SD video from HD video whilst others see the clear advantage of high bit rate HDR 4K video @60fps or better.

Most photographers edit on 10 bit or better monitors.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
'which cannot be distinguished from uncompressed video' ... by whom? People have a wide range of visual abilities. Some can't see any benefit from beyond 60hz whilst others find 120hz insufficient.

Some family members can't distinguish SD video from HD video whilst others see the clear advantage of high bit rate HDR 4K video @60fps or better.

Most photographers edit on 10 bit or better monitors.
"10 bit or better" - unless they're using $40,000 reference monitors (which they don't, outside of some extremely rich gear fetishists), they're using 10-bit or 8-bit + FRC monitors. As long as you're at 10-bit color depth, the gamut is more important than the specific bit depth. And 10-bit color is perfectly fine on DP 1.4 at 60 Hz without DSC, or at 144 Hz with DSC (including HDR, not that that matters much to photographers).

As for whether the compression of DSC is visible: I have never, ever heard of anyone claiming to be able to see a difference. Not a single person. Now, that is clearly anecdotal and not representative, and the number of people with DSC-equipped monitors is relatively low, but it's designed to be indistinguishable from uncompressed, which makes it unlikely to have significantly missed that goal. And it's rather obvious that they'll have aimed for it being invisible even with perfect visual acuity. And, thanks to the wonders of displays updating at least 60 times a second, compression errors or artifacts are rendered invisible by their short time on screen, as long as they are sufficiently small and infrequent. You can't pixel-peep an image that's visible for 16.667 ms. Remember, this isn't a massively compressed format like most video codecs (h.264 is ~2000:1; DSC is 3:1 or 4:1 depending on the version). If you've got any evidence to the contrary, feel free to provide it, but if not, then I choose to trust that DSC works as advertised and widely reported.

Besides, I for one would much rather have a visually lossless compressed image than spend $100+ (and likely a lot more at >2m/6ft) for new DP 2.0 cables. Given the bandwidth requirements, those cables are likely to be active, and that means they'll be ridiculously expensive, difficult to get a hold of, and likely unreliable for the first production runs.
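For scale, the 16.667 ms frame-time figure and the codec comparison above are easy to verify with a couple of lines of arithmetic. A rough sketch; the h.264 ratio obviously varies hugely with bitrate, so the 20 Mbit/s stream used here is just an illustrative assumption:

```python
# Frame time: how long a single frame (and any artifact in it) stays on screen.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.3f} ms per frame")   # 60 Hz -> 16.667 ms

# Compression ratios: DSC is mild compared with distribution video codecs.
# A 2160p60 10-bit stream is ~14.9 Gbit/s of raw pixel data; a typical
# h.264/HEVC encode of the same content might run 15-25 Mbit/s, i.e. a
# ratio in the hundreds to thousands, versus DSC's fixed ~3:1.
raw_gbps = 3840 * 2160 * 60 * 30 / 1e9
print(f"2160p60 10-bit raw: {raw_gbps:.1f} Gbit/s")
print(f"ratio vs a 20 Mbit/s stream: {raw_gbps * 1000 / 20:.0f}:1")
```

The point stands either way: a transient artifact has at most one frame interval to be noticed, and DSC discards orders of magnitude less information per frame than the codecs people watch video through every day.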
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
"10 bit or better" - unless they're using $40,000 reference monitors (which they don't, outside of some extremely rich gear fetishists), they're using 10-bit or 8-bit + FRC monitors. As long as you're at 10-bit color depth, the gamut is more important than the specific bit depth. And 10-bit color is perfectly fine on DP 1.4 at 60 Hz without DSC, or at 144 Hz with DSC (including HDR, not that that matters much to photographers).

As for whether the compression of DSC is visible: I have never, ever heard of anyone claiming to be able to see a difference. Not a single person. Now, that is clearly anecdotal and not representative, and the number of people with DSC-equipped monitors is relatively low, but it's designed to be indistinguishable from uncompressed, which makes it unlikely to have significantly missed that goal. And it's rather obvious that they'll have aimed for it being invisible even with perfect visual acuity. And, thanks to the wonders of displays updating at least 60 times a second, compression errors or artifacts are rendered invisible by their short time on screen, as long as they are sufficiently small and infrequent. You can't pixel-peep an image that's visible for 16.667 ms. Remember, this isn't a massively compressed format like most video codecs (h.264 is ~2000:1; DSC is 3:1 or 4:1 depending on the version). If you've got any evidence to the contrary, feel free to provide it, but if not, then I choose to trust that DSC works as advertised and widely reported.

Besides, I for one would much rather have a visually lossless compressed image than spend $100+ (and likely a lot more at >2m/6ft) for new DP 2.0 cables. Given the bandwidth requirements, those cables are likely to be active, and that means they'll be ridiculously expensive, difficult to get a hold of, and likely unreliable for the first production runs.
I was woke with the 60hz paradigm until my 205cm son told me I was incorrect!

Now I am wrong maybe: never ... except in the mind of my wife ... however, she is a demigod.

I conducted a test of my perception and he was (of course) right.

Wave your mouse across the page diagonally really quickly.

You can see the refresh rate in terms of the number of arrows you see.

Yep, that hertz.

I am sure if I visited another site I frequent and occasionally comment on, AVSForum.com, I would find someone who is about 11 sigmas to the right on eyesight who would disagree.

Yes, point taken - the plebeians wouldn't know if they were stuck in the eye with a blunt stick, and especially after that!
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
I was woke with the 60hz paradigm until my 205cm son told me I was incorrect!

Now I am wrong maybe: never ... except in the mind of my wife ... however, she is a demigod.

I conducted a test of my perception and he was (of course) right.

Wave your mouse across the page diagonally really quickly.

You can see the refresh rate in terms of the number of arrows you see.

Yep, that hertz.

I am sure if I visited another site I frequent and occasionally comment on, AVSForum.com, I would find someone who is about 11 sigmas to the right on eyesight who would disagree.

Yes, point taken - the plebeians wouldn't know if they were stuck in the eye with a blunt stick, and especially after that!
The difference between 60 Hz and 120 Hz or higher (or really even 75 Hz, though that depends more on the application) is clearly perceptible to pretty much anyone. The human eye is extremely good at capturing minute, rapid motion. What we aren't good at is spotting a tiny group of pixels that are a tiny bit off from their intended color for a few milliseconds. That's the kind of visual acuity you'd need to spot the difference between DSC and non-DSC DisplayPort output. We're not talking banding, large-scale compression artifacts, or anything like that.

It might of course be possible to create highly specific test patterns that bring about visible artifacting with DSC (by identifying specific weaknesses in the compression system), but outside of entirely unrealistic cases like that, I have yet to hear of a single example of it not being entirely imperceptible. Our senses can be trained to be especially attuned to specific things - you'll always be able to find AV enthusiasts claiming the ability to see the invisible or hear things that aren't sound - but those subcultures are so full of placebo and self-deception that it's essentially impossible to differentiate reality from delusion.

So for a monitor whose color depth, resolution, and refresh rate combination doesn't exceed the bandwidth provided by DP 1.4 + DSC, there is no reason to use DP 2.0. It would drive up prices and limit availability with zero benefits to show for it.
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
The difference between 60 Hz and 120 Hz or higher (or really even 75 Hz, though that depends more on the application) is clearly perceptible to pretty much anyone. The human eye is extremely good at capturing minute, rapid motion. What we aren't good at is spotting a tiny group of pixels that are a tiny bit off from their intended color for a few milliseconds. That's the kind of visual acuity you'd need to spot the difference between DSC and non-DSC DisplayPort output. We're not talking banding, large-scale compression artifacts, or anything like that. It might of course be possible to create highly specific test patterns that bring about visible artifacting with DSC (by identifying specific weaknesses in the compression system), but outside of entirely unrealistic cases like that, I have yet to hear of a single example of it not being entirely imperceptible. Our senses can be trained to be especially attuned to specific things - you'll always be able to find AV enthusiasts claiming the ability to see the invisible or hear things that aren't sound - but those subcultures are so full of placebo and self-deception that it's essentially impossible to differentiate reality from delusion. So for a monitor whose color depth, resolution, and refresh rate combination doesn't exceed the bandwidth provided by DP 1.4 + DSC, there is no reason to use DP 2.0. It would drive up prices and limit availability with zero benefits to show for it.
We mostly agree.*

* Douglas Adams reference
 
Joined
Oct 6, 2020
Messages
62 (0.05/day)
Processor Intel Core i7 12700KF
Motherboard Asus Prime Z690-P D4
Cooling Arctic Liquid Freezer II 280
Memory G.Skill TRIDENT Z 32GB 4000MHz CL16 DDR4
Video Card(s) MSI Suprim X RTX 4080 16GB
Storage Kingston KC3000 1TB
Display(s) Alienware AW3423DWF 34" 21:9 OLED
Case Lian Li Lancool III
Power Supply Fractal Design ION 860+ Platinum
Mouse Roccat Kone Air
Keyboard OZONE StrikePro Spectra (CherryMX Red)
Software Windows 11 Pro 64bit
32" 4K at 144 Hz would be perfect for me. With the upcoming 42" 4K OLEDs from LG, it will all depend on the price.
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
32" 4K at 144 Hz would be perfect for me. With the upcoming 42" 4K OLEDs from LG, it will all depend on the price.
Based on the price of the 48" C1, the 42" won't be cheap. I could have got a 55" for the same price, and I really shopped around and got a sweet deal.
 
Joined
Jun 18, 2021
Messages
2,282 (2.20/day)
That's a bit of an odd take. PCs need to be forward compatible with future monitors, as the resolution and refresh rate of the monitor determines the necessary bandwidth. This also means that DP 1.4 with DSC is perfectly fine for 2160p144 - it is all this monitor will ever need. Adding DP 2.0 would increase costs for no benefit. And it will of course be compatible with any future PC with DP 2.0 as DP 2.0 is backwards compatible with 1.4. DP 2.0 is only really needed for 2160p240Hz or 2160p144 with 12-bit color.

My issue is not with DSC being or not being fine. IMO it's fine (it might even have some advantages, because cables are getting really expensive and bulky already without the massive bandwidth of DP 2.0 - although we shouldn't strive for just "enough" anyway), but support for it is flaky as fuck.

If future GPUs using DP 2.0 supported it without question (hell, even current ones with DP 1.4), I'd be OK with it, but that's not the reality, and this will be a top-of-the-line product, so it's a bit sad is all.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
My issue is not with DSC being or not being fine. IMO it's fine (it might even have some advantages, because cables are getting really expensive and bulky already without the massive bandwidth of DP 2.0 - although we shouldn't strive for just "enough" anyway), but support for it is flaky as fuck.

If future GPUs using DP 2.0 supported it without question (hell, even current ones with DP 1.4), I'd be OK with it, but that's not the reality, and this will be a top-of-the-line product, so it's a bit sad is all.
But that's really no different from how things have been previously. Either your GPU supports an output resolution/refresh rate combo, or it doesn't. There have been lots of instances throughout history when a GPU has had a maximum output resolution lower than the peak capability of its interfaces. The fact that DSC isn't a new interface but an extension of one only makes it marginally more confusing. It's still a binary specification: either it's supported or it isn't. I guess they could have called DP 1.4 + DSC "DP 1.5" for the sake of simplicity, but creating new standards arbitrarily is hardly simple. And besides, all GPUs with even a remote hope of rendering games at 2160p144 support DP 1.4 + DSC. The only issue is that the optional nature of DSC support was poorly communicated, and GPU vendors did a terrible job of listing it in their specs. But AFAIK all RX 5000-series and later AMD GPUs, and all RTX 20-series and later Nvidia GPUs, support it.
 
Joined
Jun 18, 2021
Messages
2,282 (2.20/day)
And besides, all GPUs with even a remote hope of rendering games at 2160p144 support DP 1.4+DSC.

Gaming is not all that people do; with 10-bit color you'll basically be limited to under 100Hz, which is not terrible, but c'mon, let's not excuse companies that cheap out when they can do better.

DSC even predates DP 1.4, so this mess was very much avoidable.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Gaming is not all that people do, with 10bit color you'll basically be limited to under 100hz which is not terrible but cmon let's not excuse companies who cheap out when they can do better.

DSC even predates DP 1.4 so this mess was very much avoidable.
...but what are you doing at 144Hz besides gaming? I guess 120Hz content creation could be a thing, but ... editing at 60Hz would be perfectly fine for that, as would stepping down to 4:2:2 for 120Hz previews. Perfect? Of course not. That's why crazy expensive professional hardware exists. And besides, you don't need to game on a GPU just because it's from the past two generations. But if your requirement for whatever you are doing is 2160p144 over a single cable, then you need hardware that supports DSC. Blame GPU makers for being slow, I guess.

Also, they don't really seem to be cheaping out - from what I can tell, there isn't yet any DP 2.0 hardware on the market, and you need that to make a monitor supporting it. A standard being finalized is not equal to hardware being designed, tested, and put into mass production, after all. It took a long time for HDMI 2.1 to get out there, and that has the massive TV and game console markets to aim for - monitors are far more limited.
 
Joined
Oct 6, 2020
Messages
62 (0.05/day)
Processor Intel Core i7 12700KF
Motherboard Asus Prime Z690-P D4
Cooling Arctic Liquid Freezer II 280
Memory G.Skill TRIDENT Z 32GB 4000MHz CL16 DDR4
Video Card(s) MSI Suprim X RTX 4080 16GB
Storage Kingston KC3000 1TB
Display(s) Alienware AW3423DWF 34" 21:9 OLED
Case Lian Li Lancool III
Power Supply Fractal Design ION 860+ Platinum
Mouse Roccat Kone Air
Keyboard OZONE StrikePro Spectra (CherryMX Red)
Software Windows 11 Pro 64bit
Based on the C1 48" price, the 42" won't be cheap. I could have got a 55" for the same price, and I really shopped and got a sweet deal.
Some sources believe that LG's 42" won't exceed $1k (What Hi-Fi?), and I don't expect to find the new ViewSonic monitors any cheaper than that, so... Even if they come in at the same price I would still choose OLED.
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
System Name Work in progress
Processor AMD Ryzen 5 3600
Motherboard Asus PRIME B350M-A
Cooling Wraith Stealth Cooler, 4x140mm Noctua NF-A14 FLX 1200RPM Case Fans
Memory Corsair 16GB (2x8GB) CMK16GX4M2A2400C14R DDR4 2400MHz Vengeance LPX DIMM
Video Card(s) GTX 1050 2GB (for now) 3060 12GB on order
Storage Samsung 860 EVO 500GB, Lots of HDD storage
Display(s) 32 inch 4K LG, 55 & 48 inch LG OLED, 40 inch Panasonic LED LCD
Case Cooler Master Silencio S400
Audio Device(s) Sound: LG Monitor Built-in speakers (currently), Mike: Marantz MaZ
Power Supply Corsair CS550M 550W ATX Power Supply, 80+ Gold Certified, Semi-Modular Design
Mouse Logitech M280
Keyboard Logitech Wireless Solar Keyboard K750R (works best in summer)
VR HMD none
Software Microsoft Windows 10 Home 64bit OEM, Captur 1 21
Benchmark Scores Cinebench R20: 3508 (WIP)
...but what are you doing at 144Hz besides gaming? I guess 120Hz content creation could be a thing, but ... editing at 60Hz would be perfectly fine for that, as would stepping down to 4:2:2 for 120Hz previews. Perfect? Of course not. That's why crazy expensive professional hardware exists. And besides, you don't need to game on a GPU just because it's from the past two generations. But if your requirements for whatever you are doing is 2160p144 with a single cable, then you need hardware that supports DSC. Blame GPU makers for being slow, I guess.

Also, they don't really seem to be cheaping out - from what I can tell, there isn't yet any DP 2.0 hardware on the market, and you need that to make a monitor supporting it. A standard being finalized is not equal to hardware being designed, tested, and put into mass production, after all. It took a long time for HDMI 2.1 to get out there, and that has the massive TV and game console markets to aim for - monitors are far more limited.
However, editing 4K 120fps video does require significant bandwidth and has become popular and easily accessible.

Some sources believe that LG's 42" won't exceed 1k$ (whathifi) and I don't expect to find new Viesonic monitors any cheaper then that so... Even if they come with same price I would still choose oled.
The same sources that suggested a PS5 was obtainable? This is code for "prepare to be disappointed about the price, but not the image quality."
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
However, editing 4K 120fps video does require significant bandwidth and has become popular and easily accessible.
Sure, that happens, and I guess you could describe it as "growing". But "popular" or "easily accessible"? I really wouldn't say so. There are a few mirrorless cameras supporting it, and some phones, though ... if you're editing 4k120 video shot on a phone, you're losing nothing at all even working at 4:2:2, let alone with DSC (and tbh you're likely doing your editing on an iPad).

Not to mention that the actual use cases for 120fps video are pretty limited. If you're using it for slow motion you don't need a 120Hz display, and beyond that there are few genres of video production where going above 60Hz makes sense. Then there's the low adoption rate of >60Hz displays and the question of whether it's worth producing for a tiny minority of viewers. And again, it's not unreasonable to need the newest generations of equipment to utilize the newest generations of I/O, even if previous generations cover every other need. That's just not how time or product development works.

Getting back to the point, all of this will work fine with DSC. It works equally well for static images as for video - on a PC there's no difference, after all: each frame is transmitted in full every time, so any possible compression artifacts (of which there aren't supposed to be any at any level visible to humans) would be present for a few milliseconds at worst.

The handful of people doing this use the same displays and the same hardware as everyone else. And if you're one of the few working on a 12-bit HDR reference monitor, then having sufficient GPU bandwidth is trivial, as you clearly have a massive budget for equipment. And you're likely using a Radeon Pro or Quadro anyhow, and are comfortable with multi-cable connections if necessary.

DP 2.0 will arrive when it's ready and there's a need for it. That will likely be with 2160p240 displays, whenever they arrive. Until then, we simply don't have a need for it. And at least the past two generations of GPUs support DSC, whereas no GPUs support DP 2.0 - requiring that would force even further upgrades.
 
Joined
Oct 16, 2014
Messages
671 (0.19/day)
System Name Work in progress
Processor AMD Ryzen 5 3600
Motherboard Asus PRIME B350M-A
Cooling Wraith Stealth Cooler, 4x140mm Noctua NF-A14 FLX 1200RPM Case Fans
Memory Corsair 16GB (2x8GB) CMK16GX4M2A2400C14R DDR4 2400MHz Vengeance LPX DIMM
Video Card(s) GTX 1050 2GB (for now) 3060 12GB on order
Storage Samsung 860 EVO 500GB, Lots of HDD storage
Display(s) 32 inch 4K LG, 55 & 48 inch LG OLED, 40 inch Panasonic LED LCD
Case Cooler Master Silencio S400
Audio Device(s) Sound: LG Monitor Built-in speakers (currently), Mike: Marantz MaZ
Power Supply Corsair CS550M 550W ATX Power Supply, 80+ Gold Certified, Semi-Modular Design
Mouse Logitech M280
Keyboard Logitech Wireless Solar Keyboard K750R (works best in summer)
VR HMD none
Software Microsoft Windows 10 Home 64bit OEM, Captur 1 21
Benchmark Scores Cinebench R20: 3508 (WIP)
Sure, that happens, and I guess you could describe it as "growing". But "popular" or "easily accessible"? I really wouldn't say so. There are a few mirrorless cameras supporting it, and some phones, though ... if you're editing 4k120 video shot on a phone, you're losing nothing at all even working at 4:2:2, let alone with DSC (and tbh you're likely doing your editing on an iPad). Not to mention that the actual use cases for 120fps video are pretty limited. If you're using it for slow motion you don't need a 120Hz display, and beyond that there are few genres of video production where going above 60Hz makes sense. Then there's the low adoption rate of >60Hz displays and whether it's worth it to produce for a tiny minority of viewers. And again, it's not unreasonable to need the newest generations of equipment in order to utilize the newest generations of I/O, even if previous generations cover every other need. That's just not how time or product development work. Getting back to the point, all of this will work fine with DSC. It works equally well for static images as for video - on a PC there's no difference after all, each frame is transmitted in full every time, so any possible compression artifacts (of which there aren't supposed to be any on any level visible to humans) would be present for a few milliseconds at worst.

The handful of people doing this use the same displays and the same hardware as everyone else. And if you're one of the few working on a 12-bit HDR reference monitor, then having sufficient GPU bandwidth is trivial, as you clearly have a massive budget for equipment. And you're likely using a Radeon Pro or Quadro anyhow, and are comfortable with multi-cable connections if necessary.

DP 2.0 will arrive when it's ready and there's a need for it. That will likely be with 2160p240 displays, whenever they arrive. Until then, we simply don't have a need for it. And at least the past two generations of GPUs support DSC - no GPUs support DP 2.0, forcing even further upgrades.
4K @120fps
  1. Canon EOS R5
  2. Sony A7S III
  3. Canon EOS C70
  4. Kandao QooCam 8K
  5. Samsung Galaxy S20 Ultra
  6. Z Cam E2
  7. Kandao Obsidian S
  8. Insta360 Pro 2
Using the 180-degree shutter rule
Let's move on to why you shouldn't always film at 60fps - even though, looking at the basics, it would seem sensible. This is where the technical aspect comes in. You need to think about the 180-degree shutter rule, which dictates that the shutter speed should be double the frame rate - in other words, each frame is exposed for half its duration. At 29.97fps it's relatively easy to find a balance of gain (ISO) and iris (aperture) to enable that all-important 1/60th shutter speed. But crank that to 60fps and you suddenly need a shutter speed of 1/120th - and that's quite a jump. Shooting a frame at 1/120th of a second is going to eliminate any motion blur, whereas shooting at 1/60th will give motion blur in the frame. Motion blur is important for video as it helps with that all-important persistence of vision; the blur actually helps with the smoothness of playback. 4K footage shot at 60fps and 1/120th of a second, played back at 60fps, will look fine - there's enough content there for the motion to look silky smooth. Stretch that footage out over 2 seconds and the optical illusion and lack of blur still fool the eye, but reduce it to 30fps and you start to break the 180-degree shutter rule. The effect is admittedly slight, but it is noticeable. Break the 180-degree shutter rule with faster shutter speeds and you get the action-packed, 300-style gladiator look. Slow it right down and it all becomes a bit romantic. Ultimately, 60fps is a sought-after feature as it enables you to shoot smooth slow motion - these days an effect that you as a filmmaker can't be without - but do be careful, as using it wrongly can have a dramatic effect.
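The rule described above boils down to shutter = 1/(2×fps), rounded to the nearest shutter stop the camera actually offers. A minimal sketch - the list of available stops here is illustrative, not taken from any specific camera:

```python
from fractions import Fraction

# Illustrative set of shutter-stop denominators; real cameras vary.
STOPS = (30, 50, 60, 100, 125, 250, 500, 1000)

def shutter_speed_180(fps, stops=STOPS):
    """180-degree rule: expose each frame for half its duration,
    i.e. 1/(2*fps), rounded to the nearest available stop."""
    ideal_denominator = 2 * fps
    closest = min(stops, key=lambda d: abs(d - ideal_denominator))
    return Fraction(1, closest)

print(shutter_speed_180(30))   # 1/60
print(shutter_speed_180(120))  # 1/250 - there is no 1/240 stop, so round to the closest
```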

Why shoot 4K at 120fps?
Recording 4K at 120fps is pretty niche, but it's still a fantastic feature to have in a camera. There are a few points to take into consideration when looking at a camera that shoots 4K at 120fps, such as the amount of data recorded. This is usually measured in Mb/s (megabits per second) rather than megabits per frame. Most cameras, especially mirrorless and DSLRs, will therefore split the bit rate across frames as VBR (variable bit rate). If the bit rate were fixed, then at 30fps and 100Mb/s each frame would get about 3.3Mb; crank it to 60fps and that drops to about 1.7Mb. Due to VBR, that isn't quite how it works: the more frames you have, the less changes between frames and the less data needs recording. So one frame may max out at 3.3Mb where there's tonnes of movement, and the next may be 1Mb where there's less change in the frame. Shoot at 30fps and the likelihood is that more will change between frames, so more data gets recorded per frame. Still, as the frame rate increases, even with the flexibility of variable bit rate it can push the limits of the camera's max 100Mb/s. Take the GoPro as an example. You'd be hard pushed to see the difference in 1080p action footage shot at 30 or 60fps, but push it to 120fps and you start to see some pixelation. At 240fps you really start to see the drop. Use the same settings for a static scene and the quality of the high-fps footage will still look good - less has changed in the frame, so less data needs to be recorded. Likewise you'll often see cameras such as the Canon EOS 5D Mark IV that will record 1080p at 60fps, but you need to drop the resolution to 720p to record at 120fps. It's probably limited by Canon, as the 4K bit rate is 500Mb/s, which seems insanely high. 4K at 120fps is amazing, but can the camera actually cope? What's the bit rate at 4K? Is there enough scope to capture decent footage?
The Sony RX0 can shoot 1080p at 1000fps, for instance, which seems impressive until you see the quality of the footage in normal lighting conditions. Then there's the heat issue - all that processing can come at a price. Finally, there's shutter speed: shooting 4K at 120fps, the shutter speed needs to be set to 1/250th of a second (there's no 1/240th, so you round to the closest). That's all well and good, but then lighting really becomes an issue. You also need to consider file size and processing power. 4K already consumes storage, and slow-motion footage requires rendering and processing. This is where you really need to start looking at NVMe M.2 SSDs.
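The fixed-versus-variable bit rate arithmetic above is easy to sketch: with a fixed budget the per-frame share simply shrinks as the frame rate climbs, while VBR, as the text notes, redistributes it rather than splitting evenly.

```python
def fixed_bits_per_frame(bitrate_mbps, fps):
    """Per-frame budget in megabits if a fixed bit rate were split evenly."""
    return bitrate_mbps / fps

for fps in (30, 60, 120):
    # At 100 Mb/s: ~3.33 Mb/frame at 30fps, ~1.67 at 60fps, ~0.83 at 120fps
    print(fps, round(fixed_bits_per_frame(100, fps), 2))
```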


Not covered above is the need for Secure Digital Ultra Capacity (SDUC) UHS-III, SD Express, or CFexpress cards.


ELITE XG320U
32” 4K UHD VESA DisplayHDR™ 600 IPS display
Expand gameplay onto next-gen consoles with single-cable HDMI 2.1 connectivity
Brilliant, vibrant imagery from 99% Adobe Color Gamut
Refresh rate of 144Hz (overclockable to 150Hz) and 1ms (MPRT) response time
AMD FreeSync Premium Pro technology and PureXP Motion Blur Reduction
Availability
In Q4 2021, ViewSonic ELITE XG321UG will be made available worldwide.

https://www.viewsonic.com/fr/newsroom/content/ViewSonic%20ELITE%20Launches%20New%2032%E2%80%9D%20Gaming%20Monitors%20with%20the%20Latest%20Gamer-Centric%20Technologies_4118
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
4K @120fps


Not covered above, is the need for Secure Digital Ultra Capacity (SDUC) UHS-III or SD Express format or CFexpress cards.
Yep, exactly - a very niche undertaking. 120fps video in general is mainly applicable to slow motion, and outside of amateur filmmakers overly enamored with Zack Snyder or Michael Bay, there really aren't that many reasonable applications (and in most of those cases, 1080p slow motion upscaled into a 2160p timeline is perfectly adequate). Plus, that slow motion of course doesn't involve playback at high frame rates. On that list of cameras (which may or may not be up to date), there are two mirrorless cameras, two cine cameras (one "affordable", one mid-range), three 360-degree VR/action cams that don't really follow standard resolutions (you can't do 360° video in 16:9 or similar), and one smartphone. In other words, you need to be looking at very specific gear in order to film in 2160p120. And it's rather hilarious that the "Why shoot 4K at 120fps?" paragraph literally doesn't list a single argument for doing so (but quite a few against it!), except for "it's still a fantastic feature to have in a camera" and (paraphrasing here) "it might not look that much worse". Kind of made me giggle :p
 
Top