
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

It's persistence of vision, or lack thereof. Reaction time isn't just the visual component; it's understanding what you're looking at, then the brain telling the muscles to move, and then the muscles actually moving. Persistence of vision just being *slightly* off can cause motion sickness or dizziness.
Slightly off from what? It's not like you're also seeing a reference frame while gaming.
I don't often agree with Vya, but making a big fuss over a 16 ms delay when your reaction time is 15x that is a little out there.
The only time that would make a difference would be if your mouse movement lagged perceivably behind. But you can test that on the desktop: it doesn't. The only reason these things are advertised is that professional monitors value color accuracy above everything else, so they don't tend to use overdrive and can have latencies in the 30-50 ms range (or worse). That could cause the mouse to lag a little behind fast movements.
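For a sense of where display latency sits relative to everything else in the chain, here's a minimal back-of-the-envelope sketch; every number in it is an assumed, illustrative figure (not a measurement of any particular mouse, game, or monitor):

```python
# Rough "click to photon" latency budget in milliseconds.
# All values are assumed examples for illustration only.
latency_ms = {
    "mouse polling (1000 Hz, average)": 0.5,
    "game simulation tick (average)": 8.0,
    "GPU render at ~120 fps (average)": 8.3,
    "display processing + pixel response (pro monitor)": 40.0,  # the 30-50 ms case mentioned above
}

total = sum(latency_ms.values())
for stage, ms in latency_ms.items():
    print(f"{stage:<52} {ms:5.1f} ms  ({ms / total:5.1%} of total)")
print(f"{'total':<52} {total:5.1f} ms")
```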
 
Is it true HDR? And is it ironic that Freesync is a trademark?

The reality is that a lot of this display nonsense is expensive and will get cheaper over time; I just can't get invested in the salty tears over the dodgy supply chain.
 
I think Intel is missing an opportunity here. Intel owns Altera.

All G-Sync monitors should have
[image: Intel Inside sticker]
 
Overpriced Intel Altera tech and DRAM companies price-fixing. We spend years waiting for competition to turn up, then suddenly some of them get into bed together for some weird Kaby G Vega/Polaris M Frankenstein lark.

We all have Intel inside, God bless them.
 
Persistence of vision just being *slightly* off can cause motion sickness or dizziness.

And single-digit ms variations can cause that? We are talking a few 1/1000ths of a second. I was unable to find any study or piece of information that can confirm humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You are telling me something like 16 ms can have a measurable effect?
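Putting those two numbers side by side is simple arithmetic; the sketch below just converts refresh rates to frame times and compares them against the ~250 ms average reaction time quoted in this thread (no claim about perceptibility either way):

```python
# Frame time per refresh rate vs. a ~250 ms average human reaction time.
reaction_ms = 250.0  # figure quoted in the thread

for hz in (60, 72, 120, 144, 240):
    frame_ms = 1000.0 / hz
    print(f"{hz:3d} Hz -> {frame_ms:5.2f} ms per frame "
          f"({frame_ms / reaction_ms:4.1%} of a 250 ms reaction)")
```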
 
Just check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.
Yeah, that's what I'm saying, FreeSync is becoming the de facto standard. NVIDIA better wake up and smell the roses before it's too late.
 
Gsync and Freesync are a joke, unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps on a 120+ Hz display.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast-paced games.


I used to play exclusively at 120 fps / 120 Hz on a huge CRT back in the day, and even then I needed vsync. Get a clue, tearing is monumentally distracting and drives me nuts at any rate. You must be blind. Adaptive sync is awesome and the future for all gaming. Go get some more milk from mommy.
 
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta: using v-sync at 60 fps and a shitty wireless mouse, I was pretty much always in the top 3. With all that "unbearable lag", go figure.

Yeah, and why do you think the professional gaming market plays on systems running at 144 FPS?

I can feel a difference between Vsync on (capped at 72 Hz), 72 Hz without Vsync, and 110 FPS produced by my GPU. The difference is that I can literally see the input lag when sweeping left to right in PUBG, for example. Your GPU is being constrained to your refresh rate, and that constraint adds input lag. Maybe you or your neighbour doesn't notice it, but anyone playing an FPS game will notice/feel the difference.

Play long enough and you'll understand that at some point 60 Hz / 60 FPS becomes a limitation in any fast-paced FPS game.
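For what it's worth, the "constraining the GPU adds input lag" claim can be sketched with a very simplified double-buffer model: with v-sync on, a finished frame waits for the next scan-out (about half a refresh interval on average), while an uncapped frame tears onto the screen almost immediately. The numbers below are illustrative assumptions, not measurements from PUBG or any specific rig:

```python
# Simplified v-sync latency model (double buffering, no render-ahead queue).
# Ignores scan-out time, driver buffering, etc. -- illustration only.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def avg_vsync_wait_ms(refresh_hz: float) -> float:
    # A frame that finishes at a random point in the refresh interval
    # waits, on average, half an interval for the next scan-out.
    return (1000.0 / refresh_hz) / 2.0

capped = frame_time_ms(72) + avg_vsync_wait_ms(72)   # 72 Hz with v-sync
uncapped = frame_time_ms(110)                        # 110 fps, no sync (with tearing)
print(f"72 Hz + v-sync   : ~{capped:.1f} ms from render start to display (simplified)")
print(f"110 fps, no sync : ~{uncapped:.1f} ms from render start to display (simplified)")
```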
 
Yet G-Sync feels like an overrated technology on the whole. I'm not overly impressed by my own screen, just glad I didn't pay full price for it.
The fact that they're using an FPGA suggests Nvidia doesn't expect to sell the kind of volume of these screens where a custom ASIC would make sense from a cost perspective, which further shows how overrated G-Sync is.
I completely agree. I also have a G-Spot monitor, but to be honest, playing at 1440p with a GTX 1080 I can't go above 100 fps anyway, and it's always averaging between 40-80 fps. So I never get any tearing whether those x-SYNCs are enabled or not... Plus fluidity... hmm
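On the FPGA-versus-ASIC point quoted above, the usual break-even reasoning is easy to sketch. Every dollar figure below is a made-up placeholder, purely to show the shape of the calculation rather than NVIDIA's actual costs:

```python
# Hypothetical FPGA vs. custom ASIC break-even volume.
# All costs are invented placeholders, not real figures.
fpga_unit_cost = 500.0      # per-module cost with an off-the-shelf FPGA
asic_unit_cost = 50.0       # per-module cost once a custom ASIC exists
asic_nre = 20_000_000.0     # one-time ASIC design/mask cost (NRE)

# The ASIC only pays off once its NRE is spread over enough units.
break_even_units = asic_nre / (fpga_unit_cost - asic_unit_cost)
print(f"ASIC starts to make sense above ~{break_even_units:,.0f} modules sold")
```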
 
And single-digit ms variations can cause that? We are talking a few 1/1000ths of a second. I was unable to find any study or piece of information that can confirm humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You are telling me something like 16 ms can have a measurable effect?
https://en.wikipedia.org/wiki/Subliminal_stimuli

Just because the higher orders of the brain ignore something doesn't mean it didn't register with lower orders of brain function.

Higher refresh rate and more frames translate to a smoother, clearer picture. Example: get in a vehicle and drive the speed limit, then focus on grass just beyond the road out the passenger window. Your brain will take all of that data, force your eyes to hook on to a specific reference point, and take a snapshot of it. In that instance, you'll have clarity. Try to do the same thing with video and the clarity simply isn't there. There are huge gaps in the data between frames. Your brain will end up hooking on to a single frame and recalling that picture.

Again, this has nothing to do with reaction time and everything to do with how the brain handles eyesight:
https://en.wikipedia.org/wiki/Persistence_of_vision
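One way to put a rough number on that "gaps between frames" effect is the standard sample-and-hold estimate: perceived smear is roughly how far the image travels across the screen while a single frame stays lit. A minimal sketch with assumed example values:

```python
# Approximate sample-and-hold motion blur:
#   blur (px) ~= scroll speed (px/s) * frame persistence (s)
# Scroll speed and refresh rates are assumed example values.
speed_px_per_s = 1000.0  # e.g. a fast pan moving 1000 pixels per second

for hz in (60, 120, 240):
    persistence_s = 1.0 / hz  # full persistence on a non-strobed display
    blur_px = speed_px_per_s * persistence_s
    print(f"{hz:3d} Hz sample-and-hold -> ~{blur_px:4.1f} px of smear")

# A 1 ms backlight strobe cuts persistence to ~1 ms regardless of refresh rate:
print(f"  1 ms strobe          -> ~{speed_px_per_s * 0.001:4.1f} px of smear")
```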
 
Just check how many Freesync and how many G-Sync monitors are on the market. Even the second biggest TV manufacturer allowed Freesync support on some of their 2018 models. People can taunt AMD but getting a much cheaper monitor with the same specifications is good for everyone.
Freesync equivalents lack strobing, which is probably the best technology for gaming ATM as long as you can keep a constant, high fps.
And single-digit ms variations can cause that? We are talking a few 1/1000ths of a second. I was unable to find any study or piece of information that can confirm humans can spot variations that small and respond to them in a consistent way. Not to mention humans can't even maintain a consistent reaction time in the first place; it always varies by 20, 30, 50 ms. You are telling me something like 16 ms can have a measurable effect?
You're confusing reaction time with what one can observe. Still confused. And why do you need a study? There are a million people who can tell you this and, better still, you can get a high-refresh display yourself instead of relying on some digits taken out of context. What "human reaction" varies by 50 ms? Certainly not players' muscle memory when they play shooters. You're taking stuff out of context again.
 
Ah, good find.

A ‘1 ms MPRT’ (Moving Picture Response Time) is specified, achieved using the ‘Impulsive Scanning’ strobe backlight mode.

Though I wonder how dark a VA panel gets in strobing mode.
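There's a simple first-order reason for that worry: if the backlight is only lit for the strobe pulse each refresh, average brightness scales with the duty cycle (panels typically overdrive the backlight during the pulse to win some of it back). A sketch with assumed figures, not specs for any actual VA panel:

```python
# First-order brightness loss from backlight strobing.
# Luminance, refresh rate and pulse width are assumed example values.
steady_nits = 400.0   # hypothetical brightness with the backlight always on
refresh_hz = 144
pulse_ms = 1.0        # ~1 ms pulse, matching the quoted 1 ms MPRT

duty_cycle = pulse_ms / (1000.0 / refresh_hz)
print(f"duty cycle: {duty_cycle:.1%}")
print(f"naive strobed brightness: {steady_nits * duty_cycle:.0f} nits "
      f"(before any backlight boost)")
```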
 
I used to play exclusively at 120 fps / 120 Hz on a huge CRT back in the day, and even then I needed vsync. Get a clue, tearing is monumentally distracting and drives me nuts at any rate. You must be blind. Adaptive sync is awesome and the future for all gaming. Go get some more milk from mommy.
You used VSYNC? Hahaha... You like input lag? Casual gamer detected. Case closed.
 
You used VSYNC? Hahaha... You like input lag? Casual gamer detected. Case closed.

Ban this retard, please. He knows nothing. He's never even used a CRT nor even knows the amount of input lag incurred from anything. And thinks that you can actually see anything while gaming with tearing. Mommy should have aborted this mouth breather.
 
Ban this retard, please. He knows nothing. He's never even used a CRT nor even knows the amount of input lag incurred from anything. And thinks that you can actually see anything while gaming with tearing. Mommy should have aborted this mouth breather.

Read and learn - https://www.blurbusters.com/faq/motion-blur-reduction/

Professional gamers don't use Gsync or Freesync. Why do you think?

Btw how old are you? Talking about CRTs yet acting like a teen :laugh:
 
Read and learn - https://www.blurbusters.com/faq/motion-blur-reduction/

Professional gamers don't use Gsync or Freesync. Why do you think?

Btw how old are you? Talking about CRTs yet acting like a teen :laugh:

Do you think you're a pro gamer? Lolololololololololololololololololol. I bet you also think you're a world class driver in mommy's Civic, dumb ass. Tearing ruins gaming a lot more than 10ms.

And it's funny, vsync didn't stop me from ROFL stomping wannabes like yourself online in every FPS.
 
Gsync and Freesync are a joke, unless you play in the 30-60 fps range. Tearing is not an issue at 120+ fps on a 120+ Hz display.

No serious gamer should use VSYNC. Adds input lag.

I have Gsync. I use ULMB instead. Way better. Any gaming LCD should use black frame insertion. Much less blur in fast-paced games.

The problem is that even on my 3440x1440 monitor, 120+ fps requires SLI and good game support for SLI, not to mention 4K monitors.

To my surprise, I can't find any ULMB ultrawides either.
 
Do you think you're a pro gamer? Lolololololololololololololololololol. I bet you also think you're a world class driver in mommy's Civic, dumb ass. Tearing ruins gaming a lot more than 10ms.

And it's funny, vsync didn't stop me from ROFL stomping wannabes like yourself online in every FPS.

You are clueless. Pro gamers use what's best, and they don't use Gsync or Freesync.
Nah, I'm not a pro, but I know how games are supposed to run: no motion blur and the lowest possible input lag.

Seriously, how old are you? Hahaha. You act like a mad teen. Ragekid?
 
You are clueless. Pro gamers use what's best, and they don't use Gsync or Freesync.

That is where you're wrong. In the absence of statistics showing that gamers using fast-refresh screens consistently win more than the others, the more likely explanation is that gamers use whatever the sponsors make available to them.
If refresh were paramount, there'd be a plethora of CRTs at every competition.

Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to see what car you need to buy next.
 
That is where you're wrong. In the absence of statistics showing that gamers using fast-refresh screens consistently win more than the others, the more likely explanation is that gamers use whatever the sponsors make available to them.
If refresh were paramount, there'd be a plethora of CRTs at every competition.

Also, judging what's best for you based on what professional gamers use is like looking at Formula 1 or NASCAR to see what car you need to buy next.

Not really the same. They are not superhumans. Anyone will benefit from no motion blur. LCD/OLED has tons of motion blur without a proper blur reduction mode.
Have you tried gaming with a motion blur reduction mode? It's like playing on a CRT again.

 
Not really the same. They are not superhumans. Anyone will benefit from no motion blur. LCD/OLED has tons of motion blur without a proper blur reduction mode.
Have you tried gaming with a motion blur reduction mode? It's like playing on a CRT again.
You continue to assume gaming can only happen on the very high end...
 