But it does have the time: while a 60Hz TV would be idling between frames, a 100Hz TV can use that time to calculate the extra frames and send them for rendering. There is some buffering involved, but you wouldn't notice a 1ms delay on a frame. This is also why most decent source decoders (e.g. Blu-ray players, digital TV decoders, surround sound kits) have options to delay audio output by 5, 10, 50 etc. milliseconds so the video and audio stay perfectly in sync (if the audio comes from the TV, then the TV delays the audio automatically).

This interpolation is extremely fast because it is done by a chip designed to interpolate and only interpolate: dedicated silicon with carefully designed algorithms and instruction sets makes the processing very efficient, as it only compares bitmaps for similar shapes which don't move far between source frames. It is clearly possible, because it is done today.

The reason PC monitors don't do it is to keep cost down; 100Hz TVs easily cost £600+ for bottom-end models, and monitor makers don't want extra chips consuming power, creating heat and eating into their margins when the PC should be producing a source that matches the screen's capability. Broadcast TV is limited to 25-30fps max, and that source can't be improved because the transmission technology would also need an overhaul for little gain, so the interpolation has to happen on the TV, where the source can't be changed. On a PC, however, the source can be changed to match the monitor, so it is better to do that than to have the monitor approximate the extra frames.
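To give a feel for the idea, here is a minimal toy sketch in Python. Real TV chips do motion-compensated interpolation (matching shapes between frames and moving them along their motion vectors) in dedicated silicon; this hypothetical example just linearly blends each pair of source frames to produce one in-between frame, turning a 50fps stream into a 100fps one. The function and variable names are mine, not from any real product:

```python
def blend_frames(frame_a, frame_b, t=0.5):
    """Linearly interpolate pixel values between two frames.

    frame_a / frame_b: lists of rows of greyscale pixel values (0-255).
    t: position of the new frame between frame_a (0.0) and frame_b (1.0).
    """
    return [
        [round((1 - t) * pa + t * pb) for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]


def double_frame_rate(frames):
    """Insert one blended frame between each pair of source frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend_frames(a, b))  # the synthesised in-between frame
    out.append(frames[-1])
    return out


# Two tiny 2x2 "frames": a dark one and a brighter one.
source = [
    [[0, 0], [0, 0]],
    [[100, 100], [100, 100]],
]
print(double_frame_rate(source))
# The middle frame comes out as [[50, 50], [50, 50]] - halfway between the two.
```

A plain blend like this would look smeary on real video, which is exactly why the dedicated chips do the much harder motion-compensated version instead, but the buffering principle is the same: you need the *next* frame in hand before you can synthesise the frame that goes between it and the current one, hence the small delay mentioned above.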