
A 1 ms GTG monitor vs. a G-Sync enabled monitor

  • Thread starter: Deleted member 138597

Deleted member 138597

Guest
Is the smoothness of a 1 ms GTG monitor equal to the smoothness of a G-Sync enabled monitor? If it is, or very nearly so, then that's good news for AMD fans. Also, will a 1440p monitor be better or worse than a 1080p monitor for desktop use? (I mean, the monitor sits quite close to your eyes, so will a 27" be too awkward?)
 
Grey-to-grey (GTG) time has nothing to do with syncing the monitor's refresh rate to the game's frame rate, which is what G-Sync does.
 
If you play games that run at 60 fps all the time on a 60 Hz monitor (or 120 fps on a 120 Hz monitor), then G-Sync will make no difference; you are better off with a faster response time. If the frame rate dips below 60, then G-Sync will provide a much better experience.
In any case, even 6 ms GTG is fine for most games.
... and by the way, in order to use G-Sync you'll need an Nvidia card from at least the 600 series, a 650 Ti Boost or better.
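(As a rough illustration of the dip-below-60 case: a minimal Python sketch, not from the thread, where the 22 ms frame time is just an assumed example of a GPU that can't hold 60 fps. With fixed-refresh vsync the frame is held for a whole extra refresh; with a variable-refresh display it is shown as soon as it is ready.)

```python
# Minimal sketch (assumed numbers): how long one rendered frame stays on screen
# when the GPU needs 22 ms per frame (~45 fps).
RENDER_MS = 22.0           # hypothetical GPU frame time
REFRESH_MS = 1000.0 / 60   # one 60 Hz refresh interval (~16.7 ms)

def held_fixed_vsync(render_ms, refresh_ms):
    """With vsync on a fixed refresh, a frame can only be swapped on a refresh
    tick, so it is held until the next tick after rendering finishes."""
    ticks = -(-render_ms // refresh_ms)   # ceiling division
    return ticks * refresh_ms

def held_variable_refresh(render_ms):
    """With variable refresh (G-Sync style), the panel refreshes when the
    frame is ready, so on-screen time simply tracks the render time."""
    return render_ms

print(f"fixed 60 Hz + vsync: {held_fixed_vsync(RENDER_MS, REFRESH_MS):.1f} ms per frame (~30 fps, judder)")
print(f"variable refresh   : {held_variable_refresh(RENDER_MS):.1f} ms per frame (even pacing)")
```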
 
If you play games that run at 60 fps all the time on a 60 Hz monitor (or 120 fps on a 120 Hz monitor), then G-Sync will make no difference; you are better off with a faster response time. If the frame rate dips below 60, then G-Sync will provide a much better experience.
In any case, even 6 ms GTG is fine for most games.
... and by the way, in order to use G-Sync you'll need an Nvidia card from at least the 600 series, a 650 Ti Boost or better.
What if I want to go for an AMD card for its lower price (e.g. an R9 290X) over Nvidia's? Will G-Sync be a waste of money? And why should I choose a relatively expensive Nvidia card except for G-Sync? Will AMD bring similar tech later, or will Nvidia be kind enough to open it up to its competitors?
 
That G-Sync BS is about as useful as physics.
What you need for gaming is a monitor with low input lag (check this).
A 1440p monitor will give you about 78% more pixels than 1080p, so your desktop will be that much bigger.
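(Quick arithmetic check of the pixel-count claim above; nothing beyond the two standard resolutions is assumed.)

```python
# 2560x1440 vs 1920x1080 pixel counts
qhd = 2560 * 1440   # 3,686,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels

print(f"1440p has {qhd / fhd - 1:.0%} more pixels than 1080p")        # ~78%
print(f"...and {2560 / 1920 - 1:.0%} more in each linear dimension")  # ~33%
```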
 
That G-Sync BS is about as useful as physics.
What you need for gaming is a monitor with low input lag (check this).
A 1440p monitor will give you about 78% more pixels than 1080p, so your desktop will be that much bigger.

So you've seen one irl?????
 
Completely off topic, but how did you get an animated avatar?

I have seen someone with an animated avatar (tatty_one?) and tried some GIFs... apparently GIFs under 50 kB that don't get automatically resized (under 200x200) work fine... trial and error :laugh:
 
So you've seen one irl?????

I have, and it's a HUGE difference (when running between 35 and 59 FPS). As others have mentioned before, it has nothing to do with pixel response time, but low response time = less blur. G-Sync = less stuttering.

Not sure if it's worth the price point though.
 
So you've seen one irl?????
Nope, I haven't. I haven't seen physics either.
Oh wait, there was a game that asked me for physics (Two Worlds, if I remember well). I haven't seen anything different in it.
Oh, and I have seen the Nvidia ad about physics in Metro: Last Light. It was showing that when you shoot at walls, concrete particles fly around. Very impressive. When I play games, my only goal is to shoot at walls so I can enjoy how cool the flying concrete particles look.

I have, and it's a HUGE difference (when running between 35 and 59 FPS). As others have mentioned before, it has nothing to do with pixel response time, but low response time = less blur. G-Sync = less stuttering.

Not sure if it's worth the price point though.
If I have problems with frame rates in a game, I would rather invest in a better graphics card (which will also let me use my rig longer), or drop video settings if I can't afford an upgrade at the moment, than pay for something that will limit my choice when upgrading the video card later.
 
Thank you, all of you, for helping me figure this out. I really appreciate your suggestions. Thanks again for your kind support.
 
Nope, I haven't. I haven't seen physics either.
Oh wait, there was a game that asked me for physics (Two Worlds, if I remember well). I haven't seen anything different in it.
Oh, and I have seen the Nvidia ad about physics in Metro: Last Light. It was showing that when you shoot at walls, concrete particles fly around. Very impressive. When I play games, my only goal is to shoot at walls so I can enjoy how cool the flying concrete particles look.

Ah, you mean PhysX.

Physics is a science.
 
Ageia... it was PhysX before Nvidia bought it. Level up your anti-NV knowledge.

G-Sync is hardly BS; it's the future, if there's ever a new public standard for DisplayPort. LCDs have spent too long wasting time at fixed refresh rates and acting like CRTs to the OS.
 
Nope, I haven't. I haven't seen physics either.
Oh wait, there was a game that asked me for physics (Two Worlds, if I remember well). I haven't seen anything different in it.
Oh, and I have seen the Nvidia ad about physics in Metro: Last Light. It was showing that when you shoot at walls, concrete particles fly around. Very impressive. When I play games, my only goal is to shoot at walls so I can enjoy how cool the flying concrete particles look.

You clearly miss the point of PhysX. It's not to make shooting walls pretty; it's to add realistic physics simulations to games. Anything that can add realism adds depth to the game and makes you less aware you're playing in a fictional world. Also, you have an AMD video card, which means PhysX doesn't run properly on your system; it's emulated on the CPU using an instruction set created by cavemen.

If I have problems with frame rates in a game, I would rather invest in a better graphics card (which will also let me use my rig longer), or drop video settings if I can't afford an upgrade at the moment, than pay for something that will limit my choice when upgrading the video card later.

So you can run every game on the market at absolute max settings and maintain a constant 60 fps? I call BS looking at those system specs.
 
Well, if you're getting high FPS at 1080p then I suggest just getting a new monitor, like a 1440p one; that way your FPS will go down a bit more. I like IPS more than TN, and so far I'm not sure if there are any 120 Hz IPS monitors. But I prefer going to a bigger resolution rather than 1080p at 120 Hz, since I like big resolutions myself.
 
I have, and it's a HUGE difference (when running between 35 and 59 FPS). As others have mentioned before, it has nothing to do with pixel response time, but low response time = less blur. G-Sync = less stuttering.

Not sure if it's worth the price point though.
The way I understand it, the gold standard, as it were, is to have the card locked at 120 Hz vsync with no dropped frames? G-Sync helps massively in the situation where the frame rate cannot maintain 120 fps. I'll likely end up buying this when they bring out a 27" G-Sync monitor and I have the appropriate graphics card.
 
Well, if you're getting high FPS at 1080p then I suggest just getting a new monitor, like a 1440p one; that way your FPS will go down a bit more. I like IPS more than TN, and so far I'm not sure if there are any 120 Hz IPS monitors. But I prefer going to a bigger resolution rather than 1080p at 120 Hz, since I like big resolutions myself.

IPS panels are better for image quality, but they tend to lack a bit in response time. I know there was a 120 Hz IPS panel floating around (it made the TPU news feed), but I'm not sure it was ever released outside of Korea or China (wherever it was made). 120 Hz isn't all it's cracked up to be; then again, when I play on monitors that only offer 60 Hz it seems less enjoyable...
 
Ageia... it was PhysX before Nvidia bought it. Level up your anti-NV knowledge.
?!? :confused:

G-Sync is hardly BS; it's the future, if there's ever a new public standard for DisplayPort...
Ugh... what are you trying to say? What is the connection between a new DP standard and G-Sync?

You clearly miss the point of PhysX. It's not to make shooting walls pretty; it's to add realistic physics simulations to games. Anything that can add realism adds depth to the game and makes you less aware you're playing in a fictional world.
Really.
I remember Valve said the same thing about "realistic physics simulations" back when Source was revealed, but that still didn't stop Crytek and Epic from making their own engines.
It may work for you, but I'd pity the game that needs to count on "realistic physics simulation" in order to keep immersion.
One can make realistic physics simulations with any current game engine; it's just a question of willingness and time + money. Physics simulation doesn't start and end with PhysX. The PhysX SDK makes it easier, but it isn't THE ONE AND ONLY :respect: condition for having physics simulations.
If you enjoy flying particles that much, pay extra for your PhysX card. It's your money; you decide how to spend it. That doesn't change the fact that PhysX is the last reason for buying an Nvidia card.
I personally don't care how heads explode, as long as there are enough heads to shoot at, so when I look for a new video card I'd rather look at the price/performance ratio in my price range than at secondary features implemented in just a few games.

Also, you have an AMD video card, which means PhysX doesn't run properly on your system; it's emulated on the CPU using an instruction set created by cavemen.
Ever thought that I might own (or have owned) Nvidia video cards too?

So you can run every game on the market at absolute max settings and maintain a constant 60 fps? I call BS looking at those system specs.
I couldn't care less what you call it.
I haven't said anything about my current specs so far, so I don't see how my specs have anything to do with G-Sync, or why they should be the deciding factor for buying a video card and/or monitor. Are my system specs and game settings your strongest argument for G-Sync?
I run all the games I like on settings that make them look the way I like, at frame rates that let me play without lag, stuttering or any other negative "effects" you might think of. In fact, I quite often lower max settings intentionally, because those "realistic effects" are implemented in a way that, for me, breaks gameplay instead of adding immersion.


edit:
The way I understand it, the gold standard, as it were, is to have the card locked at 120 Hz vsync with no dropped frames? G-Sync helps massively in the situation where the frame rate cannot maintain 120 fps. I'll likely end up buying this when they bring out a 27" G-Sync monitor and I have the appropriate graphics card.
You can't lock a card that can't keep a constant 60+ fps to 120 fps. G-Sync lowers the monitor's refresh rate to match the video card's frame rate, so you don't keep seeing the same frame until the card manages to render the next one. That is how they create the illusion of fluid, stutter-free fps. G-Sync won't boost your card's fps, so if your card can't reach a constant 30+ fps your game will still lag or stutter (call it whatever you like), and the only thing that can fix that is a better video card.
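(A small sketch of the point being made here, under a simplified assumption of double-buffered vsync and a steady GPU frame time: with a fixed refresh, the displayed rate snaps down to refresh/n, while a variable-refresh display just follows whatever the GPU can actually deliver.)

```python
# Simplified model (assumed: strict double-buffered vsync, constant frame times)
def effective_fps_with_vsync(gpu_fps, refresh_hz):
    """Largest refresh_hz / n rate that the GPU can still keep up with."""
    n = 1
    while refresh_hz / n > gpu_fps:
        n += 1
    return refresh_hz / n

for gpu_fps in (110, 90, 70, 45):
    shown = effective_fps_with_vsync(gpu_fps, 120)
    print(f"GPU at {gpu_fps} fps -> 120 Hz vsync shows {shown:.0f} fps, "
          f"variable refresh shows ~{gpu_fps} fps")
```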
 
1-2 ms 120 Hz monitors are very sweet; buy one of them, or wait 4-5 months for G-Sync. There's no other way if you have money and love gaming. IPS or any other "1600p bomb" is shit compared to 1 ms / 120 Hz. My advice is not to go for higher resolutions for at least a few years; there's no good reason to lose fps for some hyper-saturated colors.
 
You clearly miss the point of PhysX. It's not to make shooting walls pretty; it's to add realistic physics simulations to games. Anything that can add realism adds depth to the game and makes you less aware you're playing in a fictional world. Also, you have an AMD video card, which means PhysX doesn't run properly on your system; it's emulated on the CPU using an instruction set created by cavemen.



So you can run every game on the market at absolute max settings and maintain a constant 60 fps? I call BS looking at those system specs.

I can!
 
IPS panels are better for image quality, but they tend to lack a bit in response time. I know there was a 120 Hz IPS panel floating around (it made the TPU news feed), but I'm not sure it was ever released outside of Korea or China (wherever it was made). 120 Hz isn't all it's cracked up to be; then again, when I play on monitors that only offer 60 Hz it seems less enjoyable...

There is no native 120 Hz IPS panel; the Korean panels are being overclocked because they use a very basic PCB that is so dumb it doesn't even prevent you from sending a 120 Hz signal, since it doesn't know it's not supposed to accept one. They also only work as gaming monitors because they lack a scaler and have only a single input, which means that, contrary to normal 1440p IPS monitors, they don't have input lag.

As for the previous comment about 120 fps and VSYNC: even if you run a solid 120 fps, VSYNC is always a bad idea because it introduces significant processing lag into your gameplay. It's not a big deal for single-player games, but it's virtually unplayable for multiplayer. If you can't tell the difference then you're already hopeless; it's really, really noticeable. I recently got my 1440p monitor and overclocked it, and for some reason when I switched resolutions it enabled VSYNC. It took me about 10 minutes to figure out why I was lagging so badly in CS:GO with my beast of a computer; then I noticed VSYNC was on, and the second I turned it off everything was perfect again.

G-Sync also introduces lag; it's just much smarter than VSYNC, so it's a lot less lag. Personally, I'll make sure I'm always running 120 fps and just leave it off, even if I have a monitor capable of using it. Any processing you do introduces lag... more inputs, more lag... more scaling, you guessed it, more lag. Lag = evil. Don't have lag!
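(Roughly how much delay vsync-style buffering can add is easy to ballpark. This is a simplified worst-case estimate with assumed buffer depths; real numbers depend on the game engine and driver queue.)

```python
# Worst-case extra delay: a finished frame waits for the next refresh tick,
# plus one refresh interval for every additional queued frame.
def added_vsync_lag_ms(refresh_hz, queued_frames=1):
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (1 + queued_frames)

for hz in (60, 120):
    print(f"{hz} Hz, double buffered: up to ~{added_vsync_lag_ms(hz, 1):.0f} ms extra")
    print(f"{hz} Hz, triple buffered: up to ~{added_vsync_lag_ms(hz, 2):.0f} ms extra")
```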
 