Processor | Intel Core i9 9900K @ 5.1 GHz all-core load (8c/16t) |
---|---|
Motherboard | MSI MEG Z390 ACE |
Cooling | Corsair H100i v2 240mm |
Memory | 32GB Corsair 3200 MHz C16 (2x16GB) |
Video Card(s) | PowerColor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6 GHz core, 2.1 GHz mem |
Storage | 256GB WD Black NVMe drive, 4TB across various SSDs/NVMe drives, 4TB HDD |
Display(s) | Asus 32" PG32QUX (4K 144 Hz mini-LED backlit IPS with FreeSync & G-Sync & 1400-nit HDR) |
Case | Corsair 760T |
Power Supply | Corsair HX850i |
Mouse | Logitech G502 Lightspeed on Powerplay mouse mat |
Keyboard | Logitech G910 |
VR HMD | Wireless Vive Pro & Valve Knuckles |
Software | Windows 10 Pro |
I'm sorry I'm having a hard time getting this... it's just that it sounds good on paper, but the actual technology that's supposedly doing these operations doesn't seem viable or possible.
In order to know where something is going to be, you'd need to buffer it... but with a digital broadcast or a DVD the processing happens inside the decoder or player, not the TV... then that signal is played on the screen... there's no time to mess with the signal, or it would be out of sync with the sound coming out of the decoder/DVD player that's attached to your 5.1 surround system...
If it's going to estimate where something is going to be, it'll have to have it buffered to know what the next frame looks like in order to fill in the rest... you can't buffer in real time.
But it does have the time - while a 60Hz TV would be idling between frames, a 100Hz TV is able to calculate the extra frames and send them for rendering. There is buffering involved - but you wouldn't notice a 1ms delay on a frame, and this is also why most decent source decoders (e.g. Blu-ray players, digital TV decoders, surround sound kits) have options to delay audio output by 5, 10, 50, etc. milliseconds so the video and audio are perfectly in sync (if the audio comes from the TV, then the TV delays the audio automatically).
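To put rough numbers on that idle time, here's a back-of-the-envelope sketch (my own illustration, not figures from the post; it assumes a 50Hz PAL-style source on a 100Hz panel):

```python
# Illustrative frame-rate arithmetic for motion interpolation.
# Assumption (not from the post): a 50Hz source shown on a 100Hz panel.

def frame_period_ms(hz: float) -> float:
    """Time between successive frames at a given refresh rate."""
    return 1000.0 / hz

source_hz, panel_hz = 50.0, 100.0

source_period = frame_period_ms(source_hz)  # 20.0 ms between source frames
panel_period = frame_period_ms(panel_hz)    # 10.0 ms between displayed frames

# The panel shows one interpolated frame for every real frame, so the
# interpolation chip has roughly one panel period (~10 ms) to build each one.
extra_frames = panel_hz / source_hz - 1
print(f"source frame every {source_period:.0f} ms, panel frame every {panel_period:.0f} ms")
print(f"{extra_frames:.0f} interpolated frame(s) per source frame")

# Interpolating "between" frames needs the next source frame too, so the TV
# buffers at least one frame ahead: roughly one source period (~20 ms) of
# latency, the same order as the 5/10/50 ms audio-delay settings mentioned above.
print(f"latency from buffering one source frame ahead: ~{source_period:.0f} ms")
```

The exact numbers change with the source and panel rates, but the point stands: the headroom between displayed frames is where the interpolation work happens.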
This interpolation is all extremely fast and is done with a chip designed to interpolate and only interpolate - dedicated silicon with carefully designed algorithms and instruction sets makes it very efficient to process the data, as it only compares bitmaps for similar shapes which don't move far between source frames. It is clearly possible, because it is being done today.
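That "compare bitmaps for similar shapes which don't move far" step is essentially block-matching motion estimation. Here's a deliberately naive sketch of the idea in Python/NumPy (my own illustration - the function name and parameters are made up, and real interpolation silicon is far more sophisticated, with sub-pixel motion vectors and occlusion handling):

```python
import numpy as np

def interpolate_midframe(prev: np.ndarray, nxt: np.ndarray,
                         block: int = 8, search: int = 4) -> np.ndarray:
    """Build a frame halfway between two greyscale frames via naive
    block matching (sum of absolute differences over a small search window)."""
    h, w = prev.shape
    mid = np.zeros_like(prev)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            ref = prev[y:y + block, x:x + block].astype(np.int32)
            best_sad, best_dy, best_dx = None, 0, 0
            # Only search nearby: shapes don't move far between source frames.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = nxt[yy:yy + block, xx:xx + block].astype(np.int32)
                        sad = int(np.abs(ref - cand).sum())
                        if best_sad is None or sad < best_sad:
                            best_sad, best_dy, best_dx = sad, dy, dx
            # Place the block halfway along its motion vector, blending both frames.
            my = min(max(y + best_dy // 2, 0), h - block)
            mx = min(max(x + best_dx // 2, 0), w - block)
            blend = (ref + nxt[y + best_dy:y + best_dy + block,
                               x + best_dx:x + best_dx + block].astype(np.int32)) // 2
            mid[my:my + block, mx:mx + block] = blend
    return mid

# Example: interpolate between two random "frames" (stand-ins for real video).
f0 = np.random.randint(0, 256, (48, 64), dtype=np.uint8)
f1 = np.random.randint(0, 256, (48, 64), dtype=np.uint8)
half = interpolate_midframe(f0, f1)
```

A dedicated chip does effectively this for thousands of blocks per frame in parallel, which is why it's cheap in silicon but would be wasteful to bolt onto every PC monitor.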
The reason PC monitors don't do it is to keep cost down - 100Hz TVs easily cost £600+ even for bottom-end models, and PC monitor makers don't want extra chips consuming power, creating heat and eating into their margins when the PC should be creating a source that matches the screen's capability. Broadcast TV is limited to 25-30fps at most, and the source can't be improved because the transmission technology would also need an overhaul for little gain, so interpolation has to happen in the TV, where the source can't be changed. On a PC, the source can be changed to match the monitor, so it is better to do that than have the monitor approximate the extra frames.