
100Hz LCD TV Screens (Worth It?)

Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
OK, as the title implies... 100Hz LCD TV screens: worth it? I want to hear your opinions on this topic.
Standard LCD TVs are 60Hz (capable of 60fps max, to my understanding)... 100Hz screens are now all the rage, with advertising claiming you'll see faster frames for sports etc., even though the received signal only runs at 24, 29.97 or 30fps.

Personally I don't see how this makes any sense... Blu-ray movies are generally 24fps, with different movie formats coming out at 29.97 or 30fps... there are also other formats straight from digital cameras (like the one I own), like 50, 60fps etc., but these are not commercial formats that get broadcast or put on DVD or Blu-ray.

So my question is: why spend the extra on a 100Hz TV screen? I say marketing gimmick, but I'd like to prove myself wrong, and I hope there is a point to it ;)

(Edit: PLEASE KEEP THIS TOPIC ON TV SCREENS... not on using the screen for gaming etc., as that's obviously going to make a difference with the higher refresh rate and higher FPS.)
 

Galaxy

New Member
Joined
Jul 31, 2009
Messages
36 (0.01/day)
System Name Galaxy X7
Processor Intel I7 920 (C0/C1) 4GHz @ 1.32v w/HT
Motherboard Asus ROG Rampage II Extreme
Cooling Scythe Mugen 2
Memory Corsair Dominator 3GB 1600MHz
Video Card(s) XFX4890 1GB @ 950/1030
Storage WD6400AAKS
Display(s) Samsung 2232BW
Case CoolerMaster HAF 932
Audio Device(s) SupremeFX X-Fi
Power Supply Corsair TX 750W
Software Windows 7 Ultimate
I really don't know why, but the picture seems a lot smoother on a 100Hz TV. My friend owns a Philips 100Hz 42" LCD, and it seems that everything is "faster" - just a little...

I own an LG Scarlet 32LG6000. Maybe I'm just used to it, but you do see a difference between an "ordinary" LCD and a 100Hz LCD.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,703 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
My mate just bought a 40"... I can't remember the brand, but it advertised 100Hz for sure.

I was completely unable to manually set 100Hz anywhere on my PC; however, there is a 100Hz button on the remote that made games MORE blurry, imagine that.

I had high hopes, but really I think it's a gimmick; they try to emulate what 100Hz is like or something :shadedshu

EDIT: sorry, this was more gaming related. TV looks sweet on it, 100Hz or 60Hz.
 
Joined
Feb 21, 2008
Messages
4,985 (0.85/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
OK, as the title implies... 100Hz LCD TV screens: worth it? I want to hear your opinions on this topic.
Standard LCD TVs are 60Hz (capable of 60fps max, to my understanding)... 100Hz screens are now all the rage, with advertising claiming you'll see faster frames for sports etc., even though the received signal only runs at 24, 29.97 or 30fps.

Personally I don't see how this makes any sense... Blu-ray movies are generally 24fps, with different movie formats coming out at 29.97 or 30fps... there are also other formats straight from digital cameras (like the one I own), like 50, 60fps etc., but these are not commercial formats that get broadcast or put on DVD or Blu-ray.

So my question is: why spend the extra on a 100Hz TV screen? I say marketing gimmick, but I'd like to prove myself wrong, and I hope there is a point to it ;)

(Edit: PLEASE KEEP THIS TOPIC ON TV SCREENS... not on using the screen for gaming etc., as that's obviously going to make a difference with the higher refresh rate and higher FPS.)


Are you using it for HD TV?

Latency is more important than refresh rate for TV viewing. Some Vizio-branded TVs are around 14ms or worse. I suggest reading independent, in-depth reviews of a model before buying one. The range of quality is vast.
 
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
No, I'm not using it; I have a very nice Samsung 40" 1080p 60Hz. I have been advising people that, to my knowledge, for TV usage (DVD, Blu-ray & digital satellite transmission) it's not going to make any difference compared to normal 60Hz televisions, and that they should save their cash instead, or buy a better quality 60Hz panel...

Now this is something that makes perfect sense to me, but reading all the marketing bullshit online and in ads, I wanted to hear from others who could give me technical reasons why it's actually going to be better, or why I'm wrong to think this ;)
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
http://forums.techpowerup.com/showthread.php?t=106528

Follow the link above for an interesting look at screens with above-60Hz refresh rates.

A personal opinion/suggestion... Evo, the Street Fighter tournament, uses the ASUS VH223H LCD as its competition screen, as it has 0 command latency, which is imperative for gaming. This is far more important to me than a 2ms GTG response or refresh rates above 60Hz.

You could measure command latency by running a CRT and an LCD in a dual-monitor setup and then watching/recording/photographing the two with a large clock on each screen that reads down into the milliseconds. You'd notice the CRT and the LCD show the same time, but the CRT may get ahead of the LCD at times. When this happens, it means the display signal is being preprocessed too much and you lose some frames. :toast:
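If you want to try that clock test yourself, here is a minimal millisecond timer you could stretch across both displays and photograph. This is just a throwaway sketch using Python's built-in tkinter; any on-screen timer with millisecond resolution would do the same job:

import time
import tkinter as tk

# Millisecond counter to photograph on a CRT and an LCD side by side.
root = tk.Tk()
label = tk.Label(root, font=("Courier", 96))
label.pack(expand=True, fill="both")

start = time.perf_counter()

def tick():
    ms = int((time.perf_counter() - start) * 1000)
    label.config(text=f"{ms} ms")
    root.after(1, tick)  # reschedule as fast as tkinter allows

tick()
root.mainloop()

A photo that catches different numbers on the two screens shows roughly how far the slower panel lags behind.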
 
Joined
Nov 10, 2008
Messages
1,982 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
I have 60Hz and 100Hz TV setups, and the 100Hz is better. The way it works is that the TV knows the previous frame and it knows the next frame, so it can interpolate where the various visual components should be in the intermediate frames (the delay is minimal, especially if you're not playing games, in which case the sound is also delayed and you'd never know). This means the image looks smoother, as the "jumps" in position of moving objects are much smaller if they cover the same distance over 100 frames instead of 60.

Now, 100Hz is not a feature I'd say you need; you won't miss it if you get a 60Hz screen. However, once you've got it, or have had it, you grow accustomed to it and will notice when it's gone. It is something that's nice to have in a TV, as it does make everything a lot "smoother" in motion.
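To put the interpolation idea in code: a minimal sketch, purely an illustration of the previous-frame/next-frame idea, not how any particular TV's processing actually works:

import numpy as np

# Naive "in-between" frame: a weighted blend of two known frames.
# t = 0.5 is halfway between them. Real motion-compensated interpolation
# moves objects along estimated motion vectors rather than cross-fading
# pixels, but the blend shows the basic previous+next principle.
def interpolate_frame(prev_frame, next_frame, t):
    return (1.0 - t) * prev_frame + t * next_frame

prev_frame = np.zeros((1080, 1920, 3))        # stand-in for source frame N
next_frame = np.ones((1080, 1920, 3)) * 255   # stand-in for source frame N+1
mid = interpolate_frame(prev_frame, next_frame, 0.5)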
 
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
I have 60Hz and 100Hz TV setups, and the 100Hz is better. The way it works is that the TV knows the previous frame and it knows the next frame, so it can interpolate where the various visual components should be in the intermediate frames (the delay is minimal, especially if you're not playing games, in which case the sound is also delayed and you'd never know). This means the images look smoother.

Now, 100Hz is not a feature I'd say you need; you won't miss it if you get a 60Hz screen. However, once you've got it, or have had it, you grow accustomed to it and will notice when it's gone. It is something that's nice to have in a TV, as it does make everything a lot "smoother" in motion.

OK, but if the source material is only Blu-ray (24fps) or TV (30fps)... then why would having 100fps as opposed to 60fps be visibly different? Surely if the source is 24/30fps you're not getting any extra information, as there aren't any extra frames to display?
 
Joined
Nov 10, 2008
Messages
1,982 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
OK, but if the source material is only Blu-ray (24fps) or TV (30fps)... then why would having 100fps as opposed to 60fps be visibly different? Surely if the source is 24/30fps you're not getting any extra information, as there aren't any extra frames to display?

You do: you get the current frame and the next frame. By looking at the different visual parts of the image, the TV can identify which parts move between the frames and add in where each would be if there were a frame in between.

For example, take the letter f to be an object (e.g. a ball in a sport), and each time it is shown below is its position in a new frame. The TV can see where the ball was at frame 1, and it knows where it is in frame 2 from the source, so it can divide the distance travelled between those frames by four, add that step to the frame-1 position to get frame 1¼, add it again to get frame 1½, and again to get frame 1¾, before showing source frame 2.

This is how it would look: each printed letter marks the object's position on the screen, and only one f is visible at a time. As you increase the number of frames shown, with calculated positions for the object, you can see it would look a lot smoother, as the distance travelled between each frame is reduced.

<---visual distance across a screen-->
30fps:
f---f---f---f---f---f---f---f---f---f--
60fps:
f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-
100fps:
ffffffffffffffffffffffffffffffffffffffffffffffffff

To make the movement look smoother, the aim is to reduce the distance the object visually moves between each frame, which is how 100Hz and even faster screens look smoother from the same source. As I said above, they can predict where an object is halfway between 2 source frames and can divide that up further to get the desired positions for the added-in frames.

I hope the above makes sense; if not, I'll try to explain better.
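The diagrams map directly onto that divide-the-step logic. A quick sketch that prints where the "f" would sit on each displayed frame, assuming steady motion (illustration only; widths picked arbitrarily):

# Print where the "f" sits on each displayed frame, assuming steady
# motion from the left edge to the right edge of a 40-column "screen".
def timeline(width, displayed_frames):
    lines = []
    for i in range(displayed_frames):
        # linear interpolation from column 0 to column width-1
        col = round(i * (width - 1) / (displayed_frames - 1))
        lines.append("-" * col + "f" + "-" * (width - 1 - col))
    return lines

for line in timeline(width=40, displayed_frames=5):
    print(line)

Raise displayed_frames and the per-frame jump shrinks; that shrinking jump is the whole "smoother" effect.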
 
Last edited:
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
Yeah, that does make sense, but only if the material being played had all of those frames to begin with... if you know what I mean?
So let's say, as in your diagram, you record something at 30fps:
30fps:
f---f---f---f---f---f---f---f---f---f--

When you play it back on a television... you can't change the fact that the "f" is only on those frames, because those frames have specifically set the f in those locations, if you get what I mean?
If the original source was recorded (as in your diagram) at:
100fps:
ffffffffffffffffffffffffffffffffffffffffffffffffff

Then I could see the point in playing it back at the same framerate, but it's not... it's recorded and broadcast at 29.97fps... Having more frames isn't going to make it look better... kind of like taking an MP3 and rendering it to 24-bit WAV: it's not going to sound any better even though 24-bit WAV is a better format. The source material is what it is.
 

Clouds4brains

New Member
Joined
Oct 9, 2009
Messages
13 (0.00/day)
100/120Hz TVs are awesome; I love the way everything you watch on one looks like a soap opera. My opinion is that these faster TVs make good actors look bad, because nothing looks real anymore.
My gf's dad has one, and it looks so silly and unwatchable. Makes me wonder if it's only people with these TVs that big them up, because the alternative is to say, "yeah, I spent loads of money on a crap TV and it looks like crap; hey, come watch my film, it looks like a soap opera gone bad".

Games console looked good ;), TV was total rubbish!!!!
 
Joined
Nov 10, 2008
Messages
1,982 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
Yeah, that does make sense, but only if the material being played had all of those frames to begin with... if you know what I mean?

So let's say, as in your diagram, you record something at 30fps:
30fps:
f---f---f---f---f---f---f---f---f---f--

When you play it back on a television... you can't change the fact that the "f" is only on those frames, because those frames have specifically set the f in those locations, if you get what I mean?

If the original source was recorded (as in your diagram) at:
100fps:
ffffffffffffffffffffffffffffffffffffffffffffffffff

Then I could see the point in playing it back at the same framerate, but it's not... it's recorded and broadcast at 29.97fps...

It is very easy to predict where everything would be between 2 frames. The TV can look at frame 1, which was:
(each new line is a new frame in these examples)

f----

and then look at frame 2, which was

----f

and from that it can see that it can add (assuming object f is moving at a steady speed, for ease in this example)
-f---
--f--
---f-
between frames 1 and 2 (this can be done extremely quickly, with practically no delay, as a 60Hz TV would just be waiting to render the next frame anyway, while the 100Hz TV is doing the calculating and adding the extra frames).

So your end result from above, instead of:
f----
----f

is:
f----
-f---
--f--
---f-
----f

It's all simple maths: you know where an object was, and you know where it is x time later, so you have the distance moved as well, meaning that to get an extra frame in you do:

original position + (distance moved / 2) = new frame not from source

You just divide it up further to add as many extra frames as needed. Obviously the system is slightly more complicated, dealing with varying-speed objects and scene-change detection so it doesn't mess any images up, but essentially this is how the concept works.
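The same maths, generalised to n inserted frames, fits in a couple of lines (a sketch under the same steady-speed assumption):

# Positions for n frames inserted between two known positions, assuming
# steady speed (n_inserted=1 reproduces "original position + distance/2").
def inserted_positions(pos1, pos2, n_inserted):
    step = (pos2 - pos1) / (n_inserted + 1)
    return [pos1 + step * i for i in range(1, n_inserted + 1)]

print(inserted_positions(0, 4, 3))   # [1.0, 2.0, 3.0] -> the -f---, --f--, ---f- frames
print(inserted_positions(0, 4, 1))   # [2.0] -> the midpoint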
 
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
Nice explanation, thanks... but by that explanation it's almost like saying it doesn't matter what the source input is... What if it's only 15fps? Will the 100Hz screen still make objects appear to move more smoothly? Surely it couldn't, as you're still only getting 15fps. (Like if you were playing a game on a screen and only getting 15fps: it doesn't matter if your screen is set to 100Hz, it won't appear to move any smoother.)

If it does, then my next question is:
Can you really notice a difference with your eyes? For instance, if in your explanation the source material is 30fps, then:

a 60Hz screen will draw the screen at 2X the frame rate (60fps)
a 100Hz screen will draw at just over 3X the frame rate (100fps)
(a 30Hz screen would draw at 1X, for example (30fps))

This seems very marginal really... and if this technology is better, then 200Hz would be a much bigger difference (almost 7X), no matter what the source input material, again?
I'm not trying to prove anyone wrong here; I'm just trying to understand why, for myself :)
 
Joined
Nov 21, 2007
Messages
3,688 (0.62/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
I think twicksisted has a valid point, and I understand what he's saying, as that's what makes sense to me as well. I've read this whole page, and some parts again, and it's interesting.

But I too don't see how a TV can create frames 1.25, 1.5, 1.75, etc. based off frames 1 and 2. As twick said, by making these other frames it is generating more frames, so technically, if it does do as you (human_error) say, then it would give whatever you're viewing a greater FPS than what the source outputs. I don't believe it would make them 100FPS, because the TV has to create these new frames itself, which in turn, with higher resolutions like 1080p, would require more processing power and more time to create than, say, a 480p video.

I personally had no idea a TV had the processing power to create frames. In fact, if a TV can do this, I'd think there'd be software out for PC by now that could do it; know of any? And as twick stated, I'm not saying you're wrong; I'm very intrigued by what I've read and simply want to understand it better.
 
Joined
Feb 19, 2006
Messages
6,270 (0.95/day)
Location
New York
Processor INTEL CORE I9-9900K @ 5Ghz all core 4.7Ghz Cache @1.305 volts
Motherboard ASUS PRIME Z390-P ATX
Cooling CORSAIR HYDRO H150I PRO RGB 360MM 6x120mm fans push pull
Memory CRUCIAL BALLISTIX 3000Mhz 4x8 32gb @ 4000Mhz
Video Card(s) EVGA GEFORECE RTX 2080 SUPER XC HYBRID GAMING
Storage ADATA XPG SX8200 Pro 1TB 3D NAND NVMe,Intel 660p 1TB m.2 ,1TB WD Blue 3D NAND,500GB WD Blue 3D NAND,
Display(s) 50" Sharp Roku TV 8ms responce time and Philips 75Hz 328E9QJAB 32" curved
Case BLACK LIAN LI O11 DYNAMIC XL FULL-TOWER GAMING CASE,
Power Supply 1600 Watt
Software Windows 10
Higher Hz = less ghosting... less blur... crisper fast movement... this is in theory!
 
Joined
Nov 10, 2008
Messages
1,982 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
Nice explanation, thanks... but by that explanation it's almost like saying it doesn't matter what the source input is... What if it's only 15fps? Will the 100Hz screen still make objects appear to move more smoothly? Surely it couldn't, as you're still only getting 15fps. (Like if you were playing a game on a screen and only getting 15fps: it doesn't matter if your screen is set to 100Hz, it won't appear to move any smoother.)

If it does, then my next question is:
Can you really notice a difference with your eyes? For instance, if in your explanation the source material is 30fps, then:

a 60Hz screen will draw the screen at 2X the frame rate (60fps)
a 100Hz screen will draw at just over 3X the frame rate (100fps)
(a 30Hz screen would draw at 1X, for example (30fps))

This seems very marginal really... and if this technology is better, then 200Hz would be a much bigger difference (almost 7X), no matter what the source input material, again?
I'm not trying to prove anyone wrong here; I'm just trying to understand why, for myself :)

Well, the TV does have limits. To avoid ruining scenes there is a limit to the distance the TV will interpolate across, so if the fps is too low the TV may as well run at the same speed as the source, as it can't add in the extra detail: the "jumping" of the objects may be intentional. I only notice it when something is moving smoothly; an actor turning their head, or a tennis ball, is a good example. Also, due to the higher refresh rate, the "ghosting" which can happen on screens goes away faster: it may take 3 or 4 refreshes to completely remove the previous position of the object, so at 100Hz this happens a lot faster than at 60Hz, helping reduce blurriness as well.

You won't notice the difference if it only showed 1 frame; you can only see the difference on things which are moving a relatively small distance across the screen between frames. If it were a 15fps source, the TV couldn't calculate the motion between:
f---------------------
and
---------------------f
properly, as there is too much unknown detail there for the TV to work with; the scene will have changed too much, with too much unknown data, for the TV to accurately calculate what should go where. So it does have its limits.

As I said in my original post, my opinion on 100Hz is that it is nice, but it is not a "killer feature", so other factors such as screen clarity, contrast ratio (to a limit), colour reproduction and sticking within your budget must all come first. If you can meet all of those and still get the choice between 100Hz and 60Hz within your budget, then go for the 100Hz, as you will get some benefit from it; but you won't miss it if you can't get it, as you can't notice it not being there (if that makes sense).

Really, if you want to see whether you think it is worth it, go to a brick-and-mortar store and look at 2 screens next to each other (one 100Hz, one 60Hz) with the same source and see if you notice any difference. These things always come down to personal preference at the end of the day, as some people will say it's very important while others won't notice the difference.

**edit**

As for the complexity: the TV is looking at 2 bitmaps, quickly scanning for similar patterns very close to each other in the two, and creating intermediary bitmaps to display. With dedicated silicon designed around the problem it isn't too complex, but as I said it does have limits, and a lot of the bonus in sports especially is the reduction in ghosting (the more refreshes per second, the faster the previous image's ghosting is removed, reducing blurriness). This is of course how I understand the technology to work; I could be mistaken.
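That "scanning for similar patterns" step is essentially block-matching motion estimation. A toy version of the concept (a plain grid search, nothing like real dedicated silicon; the block and search sizes are made up):

import numpy as np

# Toy block matching: for one block in frame 1, find where it moved to
# in frame 2 by searching a small window and minimising pixel difference.
def find_motion(frame1, frame2, top, left, block=8, search=4):
    ref = frame1[top:top + block, left:left + block]
    best_err, best_vec = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > frame2.shape[0] or x + block > frame2.shape[1]:
                continue
            cand = frame2[y:y + block, x:x + block]
            err = np.abs(ref.astype(int) - cand.astype(int)).sum()
            if err < best_err:
                best_err, best_vec = err, (dx, dy)
    return best_vec  # motion vector; halve it to place the block in the mid frame

An interpolating set would then draw the block at roughly (left + dx/2, top + dy/2) in the in-between frame, and skip interpolation entirely when even the best match is poor (scene change, or motion too large).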
 
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
Thanks for your input here... I do appreciate it, as I'm trying to understand this myself... and I see where you're coming from... but it's not really explaining the technology itself, more explaining the concept and what you think it's doing...

Saying that at 15fps (or another non-smooth video format) it's not going to work its magic, but at 30fps it's going to make things smoother by adding in extra frames where it thinks they should go, doesn't really explain anything (if you know what I mean)... Surely if the technology can work at 30fps and make it appear like 100fps, it could make 15fps 3X better? There are 200Hz tellies out now... does that mean those should technically do even better?

Also, if this technology works as you've explained... what about playing video that has its frames purposely offset as a video effect? Surely it would cancel that out and make it smooth, messing up what the video intended.

This is what I'm getting at... I'm hearing a lot of hype and ad-speak, but no actual proof of how it could be technically better... Obviously a screen running at 100Hz is going to be better than the same screen running at 60Hz, that's indisputable... but will you actually be able to notice it with the current formats out today?

Surely 30fps is 30fps... it's not going to get any faster just because your screen is capable of a higher framerate... and if it could, then the future wouldn't be in graphics cards; you'd just buy a faster screen and get higher fps instead... (obviously that makes no sense).
 
Joined
Nov 21, 2007
Messages
3,688 (0.62/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
I understand better now, though it still amazes me that, even if only under certain circumstances, a TV can do this. But now I wonder why they don't have computer monitors doing this... you'd think it'd be a hit for gamers. Well, never mind; I forgot that in a game the next frame isn't set in stone.

EDIT:
I love your first sentence, twick: "it's not explaining the technology itself, only the concept", which is very true. Surely there'd be some technical differences that could be dug up on one of those 100Hz TVs compared to a 60Hz TV. Though if you think about it, why would a 100Hz TV have this tech, but the 60Hz, which is still at least 2x the FPS of anything you'd view, not? Shouldn't 60Hz TVs be able to up the FPS of a sports game from 30 to at least 45? And if 60Hz TVs don't have the tech implemented, then these 100Hz TVs must have specific hardware components that can do this new-frame magic, as I'll call it. Which should mean we can look up what that hardware chip is and get more details.
 
Last edited:

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
The feature most TVs with 100Hz refresh rates have is a processing element that interpolates the missing frames. Most 60Hz LCDs change colours more slowly than TVs with higher refresh rates, and this causes a blur effect. Today, even without the extra interpolation of frames, a 100Hz set will show a moving object with less blurring/streaking than a set that runs at 60Hz. This is not to say the sets of tomorrow couldn't change colours faster and eliminate blur on a 60Hz set, but as things are, I doubt we'll see that without a reason. I believe the manufacturers want to sell a product, so they'll give you 600Hz along with the fast-changing pixels.
 
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
Hmmm, I see no blur on my 1080p 40" Samsung... and the colour vibrancy and everything is perfect (best out of the whole shop when I bought it)... The pixel-changing speed is down to the pixel latency (GTG), not how many Hz the panel is running at...

I believe my screen is 5ms GTG... Now, if what you're saying is correct, then a 100Hz panel would be faster... and a 200Hz+ would be even quicker?

It's surely the same technology as an LCD panel for computer gaming, and those go as low as 2ms GTG at 60Hz... so that would imply that at 200Hz it would be, what, 0.5ms GTG? I don't think so, as those 100Hz tellies are also 5ms GTG etc...
 
Joined
Nov 10, 2008
Messages
1,982 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
Thanks for your input here... I do appreciate it, as I'm trying to understand this myself... and I see where you're coming from... but it's not really explaining the technology itself, more explaining the concept and what you think it's doing...

Saying that at 15fps (or another non-smooth video format) it's not going to work its magic, but at 30fps it's going to make things smoother by adding in extra frames where it thinks they should go, doesn't really explain anything (if you know what I mean)... Surely if the technology can work at 30fps and make it appear like 100fps, it could make 15fps 3X better? There are 200Hz tellies out now... does that mean those should technically do even better?

Also, if this technology works as you've explained... what about playing video that has its frames purposely offset as a video effect? Surely it would cancel that out and make it smooth, messing up what the video intended.

This is what I'm getting at... I'm hearing a lot of hype and ad-speak, but no actual proof of how it could be technically better... Obviously a screen running at 100Hz is going to be better than the same screen running at 60Hz, that's indisputable... but will you actually be able to notice it with the current formats out today?

Surely 30fps is 30fps... it's not going to get any faster just because your screen is capable of a higher framerate... and if it could, then the future wouldn't be in graphics cards; you'd just buy a faster screen and get higher fps instead... (obviously that makes no sense).

With the 15fps example, where I said it wouldn't do this, it's because I assumed that at 15fps the objects would have moved quite a distance between frames. If the objects were moving very short distances between the frames (like in slow motion), then the TV would add in the extra frames; you just wouldn't notice, as the source is moving so slowly that the extra frames would be adding very tiny distances (<1 pixel per frame, perhaps). The limitation on whether the TV adds the frames is the distance between the objects in the source frames: the TV only adds the extra frames if the objects moved very small distances, otherwise, as you said, it would ruin scenes where things jump around for dramatic effect (it's a limitation placed on the algorithm to prevent adding frames which should not be added).

The distance the TV is prepared to calculate across has to be so small between refreshes that you couldn't point at a 60Hz TV and show where the added frames would go on a 100Hz TV (without slowly going through each frame and looking at which objects had moved within the maximum distance). This is why people say it looks smoother but can't easily explain why: the effect is cumulative over a lot of frames (24-30 per second from the original source), so the human eye can see it is smoother, but the viewer couldn't say why, as the movement between individual source frames is too small to identify as the image plays back at normal speed.
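A sketch of that distance gate, bolted onto the earlier toy block matcher (the threshold numbers are invented for illustration; real sets tune this inside their processing silicon):

# Gate interpolation on motion distance and match quality.
MAX_MOTION_PX = 12      # beyond this, treat it as a cut/jump: invent nothing

def should_interpolate(dx, dy, match_error, max_error=5000):
    small_motion = (dx * dx + dy * dy) ** 0.5 <= MAX_MOTION_PX
    good_match = match_error <= max_error    # a poor match suggests a scene change
    return small_motion and good_match

# When this returns False, a cautious set just repeats the source frame,
# behaving like a plain 60Hz panel for that moment.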

**edit**

I love your first sentence, twick: "it's not explaining the technology itself, only the concept", which is very true. Surely there'd be some technical differences that could be dug up on one of those 100Hz TVs compared to a 60Hz TV. Though if you think about it, why would a 100Hz TV have this tech, but the 60Hz, which is still at least 2x the FPS of anything you'd view, not? Shouldn't 60Hz TVs be able to up the FPS of a sports game from 30 to at least 45? And if 60Hz TVs don't have the tech implemented, then these 100Hz TVs must have specific hardware components that can do this new-frame magic, as I'll call it. Which should mean we can look up what that hardware chip is and get more details.

Absolutely. The TVs have extra dedicated chips for this purpose, which is why 100Hz TVs cost more than 60Hz ones (beyond just marketing): more hardware is needed to make the interpolation possible, as well as faster-switching panels. This technology is inside things like Sony's "Bravia Engine" and the other "brands" each company puts on its TVs (a 60Hz Bravia and a 100Hz Bravia set have different chips, for example).
 
Last edited:
Joined
Sep 25, 2006
Messages
2,312 (0.36/day)
Location
Norn Iron
Processor Q9550 @3.8
Motherboard Asus Maximus Extreme
Cooling Custom water cooling
Memory 4GB Patriot Viper DDR3 1600MHz
Video Card(s) 2x HD4870 512MB
Storage 2x 500GB
Display(s) 3x LG L226WTQ 22" Widescreen LCD
Case Modded TJ07
Audio Device(s) On board
Power Supply PC P&C Silencer 750
Software Windows 7 Ultimate
but the 60Hz, which is still at least 2x the FPS of anything you'd view, not? Shouldn't 60Hz TVs be able to up the FPS of a sports game from 30 to at least 45? ...


I remember reading this -

NTSC is 29.97 frames per second, which is 59.94 interlaced fields per second. Standard US TVs are therefore usually said to have a 60Hz field rate; and as European TV is 25 frames per second, which is 50 fields per second, it is usually referred to as 50Hz.

So maybe confusing frames and fields per second is where some people go wrong.
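For the record, the arithmetic (the field rate of an interlaced format is simply twice the frame rate):

# Interlaced field rate is twice the frame rate.
ntsc_frames = 30000 / 1001    # ~29.97 frames/s (the exact NTSC ratio)
pal_frames = 25.0
print(ntsc_frames * 2)        # ~59.94 fields/s -> marketed as "60Hz"
print(pal_frames * 2)         # 50.0 fields/s   -> marketed as "50Hz"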
 
Joined
Dec 5, 2006
Messages
7,704 (1.22/day)
System Name Back to Blue
Processor i9 14900k
Motherboard Asrock Z790 Nova
Cooling Corsair H150i Elite
Memory 64GB Corsair Dominator DDR5-6400 @ 6600
Video Card(s) EVGA RTX 3090 Ultra FTW3
Storage 4TB WD 850x NVME, 4TB WD Black, 10TB Seagate Barracuda Pro
Display(s) 1x Samsung Odyssey G7 Neo and 1x Dell u2518d
Case Lian Li o11 DXL w/custom vented front panel
Audio Device(s) Focusrite Saffire PRO 14 -> DBX DriveRack PA+ -> Mackie MR8 and MR10 / Senn PX38X -> SB AE-5 Plus
Power Supply Corsair RM1000i
Mouse Logitech G502x
Keyboard Corsair K95 Platinum
Software Windows 11 x64 Pro
Benchmark Scores 31k multicore Cinebench - CPU limited 125w
I might make a side note here that not all Blu-ray is 24fps!
100Hz is an odd number; it's 120Hz LCDs now.
120Hz is a pretty sweet number to be at: it's divisible by 24/30/60 for various movies.

Now remember, plasmas are between 400-600+ Hz.
They are well known to outperform LCDs in many aspects (except power draw and cost).

The higher the Hz the better, but you generally want it to be a multiple of your common viewing fps as well.
Some sets will also step down for 24fps media.

As always, if you are looking for a lot of in-depth info, you may be better off looking at avsforum.com.
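The divisibility point is easy to check: an even cadence means every source frame is held for the same number of refreshes, while an uneven split forces pulldown judder (the well-known 3:2 pulldown for 24fps film on 60Hz sets). A quick sketch:

# How many refreshes each source frame gets at common refresh rates.
for refresh in (60, 100, 120):
    for fps in (24, 30, 60):
        repeats = refresh / fps
        note = "even cadence" if repeats == int(repeats) else "uneven -> pulldown judder"
        print(f"{fps}fps on {refresh}Hz: {repeats:.2f} refreshes/frame ({note})")

120Hz comes out even for all three rates, while 100Hz is really aimed at 25/50fps PAL material (100/25 = 4, 100/50 = 2).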
 
Joined
Oct 4, 2007
Messages
2,452 (0.41/day)
System Name PC
Processor i7 9700KF
Motherboard MSI Z390 A PRO
Cooling Noctua NH-U14S
Memory 32GB Corsair Vengeance DDR4 3000mhz
Video Card(s) PALIT RTX 4070 Dual 12Gb
Storage 2X Crucial MX500 2TB SSD, Samsung 850 pro 512gb SSD
Display(s) DELL C34H89x 34" Ultrawide
Case Corsair Obsidian 550D
Audio Device(s) Audioengine A5+ Speakers
Power Supply Corsair RM750
Mouse Logitech G403
Keyboard Corsair Vengeance K70
Software Windows 10 64bit
I'm sorry I'm having a hard time getting this... it's just that it sounds good on paper, but the actual technology that's supposedly doing these operations doesn't seem viable or possible.

In order to fill in the missing frames, it needs to know the next frame it's working towards, and it'd need to buffer it... but with a digital broadcast or a DVD the processing happens inside the decoder or player, not the TV... then that signal is played on the screen directly... no time to mess with the signal, or it would be out of sync with the sound coming out of the decoder/DVD player that's attached to your 5.1 surround system...

If it's going to assume where something is going to be, it'll have to have frames buffered up to know where the next frame lands in order to fill in the rest... and you can't buffer real-time media.
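For what it's worth, the look-ahead in question is tiny: holding back a single source frame is enough to interpolate, at the cost of a fixed delay (~33ms at 30fps) that a set can hide by delaying the audio by the same amount. A sketch of the idea, not any particular TV's pipeline:

# Hold each source frame until its successor arrives, then emit the held
# frame plus one interpolated frame. Fixed latency = one source frame.
class Interpolator:
    def __init__(self):
        self.held = None

    def push(self, frame):
        out = []
        if self.held is not None:
            mid = (self.held + frame) / 2   # stand-in for real interpolation
            out = [self.held, mid]
        self.held = frame
        return out

ip = Interpolator()
for f in [0.0, 10.0, 20.0]:    # frames represented as numbers for illustration
    print(ip.push(f))          # [], [0.0, 5.0], [10.0, 15.0]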
 
Joined
Dec 5, 2006
Messages
7,704 (1.22/day)
System Name Back to Blue
Processor i9 14900k
Motherboard Asrock Z790 Nova
Cooling Corsair H150i Elite
Memory 64GB Corsair Dominator DDR5-6400 @ 6600
Video Card(s) EVGA RTX 3090 Ultra FTW3
Storage 4TB WD 850x NVME, 4TB WD Black, 10TB Seagate Barracuda Pro
Display(s) 1x Samsung Odyssey G7 Neo and 1x Dell u2518d
Case Lian Li o11 DXL w/custom vented front panel
Audio Device(s) Focusrite Saffire PRO 14 -> DBX DriveRack PA+ -> Mackie MR8 and MR10 / Senn PX38X -> SB AE-5 Plus
Power Supply Corsair RM1000i
Mouse Logitech G502x
Keyboard Corsair K95 Platinum
Software Windows 11 x64 Pro
Benchmark Scores 31k multicore Cinebench - CPU limited 125w
I'm sorry I'm having a hard time getting this... it's just that it sounds good on paper, but the actual technology that's supposedly doing these operations doesn't seem viable or possible.

In order to know where something is going to be, you'd need to buffer it... but with a digital broadcast or a DVD the processing happens inside the decoder or player, not the TV... then that signal is played on the screen... no time to mess with the signal, or it would be out of sync with the sound coming out of the decoder/DVD player that's attached to your 5.1 surround system...

If it's going to assume where something is going to be, it'll have to have frames buffered up to know where the next frame lands in order to fill in the rest... and you can't buffer in real time.

What you don't realise is that it IS out of sync. It doesn't need to be far out of sync; you are sending digital signals over 2 different cables, being processed by two different decoders.

This is truly why analog is FTW to any true audio/videophile: even if it's not noticeable, it is still simple fact.
 