
How G-Sync and FreeSync differ right deep down inside

Which is better at doing the job, G-Sync or FreeSync?


  • Total voters
    55
Joined
Sep 17, 2014
Messages
20,949 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Gsync increases input lag, so for any kind of competitive gaming it is a BIG no-no. (If you don't have the FPS and still show it 'as such', you inherently introduce artificial input lag, just like Vsync does.) Since I also notice input lag with Vsync from time to time, Gsync will NEVER get into my house :) Let alone the price premium.

FreeSync may also introduce input lag, but it seems like a more direct, 'WYSIWYG' approach. If frames drop too low, you see low FPS, and the input latency matches it: your input shows up at the next available frame, just like the frame delivery itself. This makes gameplay choppy at low FPS, but it does keep input on that choppy gameplay relative to what you see.
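To put a rough number on that, here's a quick back-of-the-envelope sketch (nothing more than the frame-interval arithmetic; the FPS values are just examples):

Code:
# Rough sketch: at low FPS, the soonest an input can show up is the next frame,
# so the latency floor is roughly one frame interval (ignoring everything else
# in the input/render/display chain).
for fps in (144, 60, 30, 20):
    frame_interval_ms = 1000 / fps
    print(f"{fps:>3} fps -> next-frame latency floor ~ {frame_interval_ms:.1f} ms")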

All things considered, I think they are both relatively useless tech for any high-end gaming rig. They are great for laptops and other lower-specced machines, though, where variable FPS is definitely an issue. But really, for high-end systems that aim for 60/120/144 fps gaming, what the *F* do you need Gsync or FreeSync for? You just tweak settings to get a solid framerate and you're done, for exactly 0 eur/dollar and complete freedom in your choice of monitor. That's a win in my book. Especially since there is FINALLY a market for high-end gaming monitors as well: just grab a solid 144 Hz IPS monitor and lock it at a framerate/Hz setting you can maintain in-game, and done.
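For what it's worth, the "lock it at a framerate you can maintain" part is trivial to do in software. A minimal frame-limiter sketch, assuming a simple sleep-based cap (a real engine would use a higher-resolution wait, but the idea is the same):

Code:
import time

TARGET_FPS = 120
FRAME_INTERVAL = 1.0 / TARGET_FPS

def run_capped(render_one_frame, num_frames=600):
    """Render frames no faster than TARGET_FPS by sleeping out each interval."""
    next_deadline = time.perf_counter() + FRAME_INTERVAL
    for _ in range(num_frames):
        render_one_frame()
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # cap: don't start the next frame early
        next_deadline += FRAME_INTERVAL

# usage sketch: run_capped(lambda: None)  # stand-in for the game's render call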

Let's not forget that first and foremost these implementations are marketing above anything else. It's not like we had a crappy game experience until the year 2015, and (adaptive-) Vsync works fine in non-competitive environments, while competitive environments generally just want the highest possible FPS at the lowest input lag, eliminating any kind of Sync.

Nvidia is going to laugh their ass off at all those l33t gamers that burn another couple hundred dollars on a small piece of silicon. And AMD is likely going to benefit only in its own imagination while forgetting about their sales figures and returns on investment.
 
Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
Gsync dips below the monitor's 50-144 Hz limits. You're paying a $200 premium for your monitor to be underdriven below spec.
It doesn't dip below the monitor's limits; that's just the lowest officially supported Hz (probably because of VESA standards). If you buy a 4790K, do you pay the extra money just to run it at the stock 4 GHz, or will you damage it if you go beyond spec and run it at 4.5 GHz? The flicker happens when a game suddenly drops to 0 fps (the game engine holds a frame), and that could surely be avoided from game code, or perhaps from the drivers. Those monitors work just fine at the refresh rates G-Sync is asking for, but I also think Nvidia can fine-tune those values as time goes on if some flickering remains.
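On the "could surely be avoided from the drivers" point: one obvious way a driver could keep the panel inside its supported window when a game drops too low is to simply show each frame more than once. A rough sketch of that idea, assuming a hypothetical 50-144 Hz window like the monitor discussed above (this is my own illustration, not how Nvidia actually implements it):

Code:
import math

PANEL_MIN_HZ = 50    # hypothetical VRR window, per the 50-144 Hz monitor above
PANEL_MAX_HZ = 144

def refresh_plan(game_fps):
    """Return (scans per game frame, effective panel Hz) for a given frame rate."""
    if game_fps >= PANEL_MIN_HZ:
        return 1, min(game_fps, PANEL_MAX_HZ)
    # Below the window: repeat each frame enough times to stay above the minimum.
    repeats = math.ceil(PANEL_MIN_HZ / game_fps)
    return repeats, game_fps * repeats

for fps in (120, 60, 40, 25, 10):
    repeats, panel_hz = refresh_plan(fps)
    print(f"{fps:>3} fps -> show each frame {repeats}x, panel refreshes at {panel_hz:.0f} Hz")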

Gsync increases input lag, so for any kind of competitive gaming it is a BIG no-no. (If you don't have the FPS and still show it 'as such', you inherently introduce artificial input lag, just like Vsync does.) Since I also notice input lag with Vsync from time to time, Gsync will NEVER get into my house
Let alone the price premium.
It's not like Vsync at all, and I would like to see your test showing how much input lag it really creates. Blur Busters did a nice test about this last year; you might want to read it.

What is happening here is that Nvidia is (finally) syncing the GPU and the monitor together via hardware. It's the first time we see this being attempted with gaming in mind. I wonder: since when do we oppose new hardware solutions this badly?

PS: Oh... and I can't help it, but I just smile at some redneck "pro" gamers who think they are better because of a 5-10 ms advantage, a 9-button laser mouse, or whatnot. It's just funny.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
It doesn't dip below the monitor's limits; that's just the lowest officially supported Hz (probably because of VESA standards). If you buy a 4790K, do you pay the extra money just to run it at the stock 4 GHz, or will you damage it if you go beyond spec and run it at 4.5 GHz? The flicker happens when a game suddenly drops to 0 fps (the game engine holds a frame), and that could surely be avoided from game code, or perhaps from the drivers. Those monitors work just fine at the refresh rates G-Sync is asking for, but I also think Nvidia can fine-tune those values as time goes on if some flickering remains.

Bad analogy.

A more fitting one is a 4790K, which has a base frequency of 4 GHz and a boost clock of 4.4 GHz and retails at $339.99, being sold as a "4790K G-Sync edition" underclocked to 3 GHz at $539.99, marketed as a smoother transition back to the original 4 GHz base clock.
 
Joined
Nov 9, 2010
Messages
5,654 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
Sorry to be so curious, but do you even know what you are talking about? Why do you think there is a minimum refresh rate set for every LCD monitor on the market? If you light a pixel for too long on the panel without refreshing it, colors and whatnot start to drift pretty badly. There is a minimum refresh rate there to ensure image quality (sorry to break it to you like this, but some of us still find that important here). If technology allows it, they will make monitors with lower minimum refresh rates ofc, but that day is yet to come for the mainstream market. Until then, enjoy your jerkiness and judder from fps drops in heavy multiplayer games.

Even with that "curious" look on your Koala's face, he looks smarter than you sound. It's not that low refresh displays are hard to make or impractical, it's just that they've not been needed until now. This is a tech worthy of supporting with them, and if VESA had thought otherwise, they'd not have adopted it in the first place.

So, I'm curious, do you actually think you know more about the display market and what's good for it than VESA does? If you do, I'd like to see what kind of suckers join you in that fantasy.
 
Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
Even with that "curious" look on your Koala's face, he looks smarter than you sound. It's not that low refresh displays are hard to make or impractical, it's just that they've not been needed until now. This is a tech worthy of supporting with them, and if VESA had thought otherwise, they'd not have adopted it in the first place.

So, I'm curious, do you actually think you know more about the display market and what's good for it than VESA does? If you do, I'd like to see what kind of suckers join you in that fantasy.
I spent more than 4 decades of my life with computer technologies, I might be far from being the best on the field, but I can safely say that I know my way around in the subject. Vesa is a display standard. There are standards for everything from car tires to glasses, or whatever. Something not being part of a standard has nothing to do with its practicality; LCD panels and their pixels and refreshes would work exactly the same way without VESA being around. But that's not the issue anymore. The fact is that you went flat-out childish and resorted to a personal insult based on my avatar or whatnot. I have a 10-year-old kid here who has more manners and sense than what you just demonstrated, so I don't ever want to hear from you again. I put you on ignore, I wish you all the best!

Bad analogy.

A more fitting one is a 4790K, which has a base frequency of 4 GHz and a boost clock of 4.4 GHz and retails at $339.99, being sold as a "4790K G-Sync edition" underclocked to 3 GHz at $539.99, marketed as a smoother transition back to the original 4 GHz base clock.

There might be better analogies indeed, but I just wanted to say that using something outside of its spec doesn't mean anything, and it certainly doesn't mean you are using it beyond its limits. I agree with you that Nvidia is expensive, they are indeed, but at least they give you the fastest on the planet in return, something only they have done for many years now. Some find that fair, some don't.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
PS: Oh... and I can't help it, but I just smile at some redneck "pro" gamers who think they are better because of a 5-10 ms advantage, a 9-button laser mouse, or whatnot. It's just funny.
What's more funny is how science has shown that movement initiated by the brain can have a latency north of 300ms. So people who think they are benefitting from 5-10 ms are probably just trying to bolster their egos with no real empirical evidence to support their claims. I would sooner trust science than grandiose claims.

Wikipedia said:
Simple reaction time is the time required for an observer to respond to the presence of a stimulus. For example, a subject might be asked to press a button as soon as a light or sound appears. Mean RT for college-age individuals is about 160 milliseconds to detect an auditory stimulus, and approximately 190 milliseconds to detect a visual stimulus.[2][3] The mean reaction times for sprinters at the Beijing Olympics were 166 ms for males and 189 ms for females, but in one out of 1,000 starts they can achieve 109 ms and 121 ms, respectively.[4]
Source
I spent more than 4 decades of my life with computer technologies, I might be far from being the best on the field, but I can safely say that I know my way around in the subject.
Don't be so quick to invoke experience as a fallback. Research, links, and sources do a lot more for an argument than "I have 40 years of experience." I say this because:
Vesa is a display standard.
VESA is a company that has produced many standards.
Wikipedia said:
VESA (/ˈviːsə/), or the Video Electronics Standards Association, is an international non-profit corporation standards body for computer graphics formed in 1988 by NEC Home Electronics, maker of the MultiSync monitor line, and eight video display adapter manufacturers: ATI Technologies, Genoa Systems, Orchid Technology, Renaissance GRX, STB Systems, Tecmar, Video 7 and Western Digital/Paradise Systems.[1]
Something not being part of a standard has nothing to do with its practicality; LCD panels and their pixels and refreshes would work exactly the same way without VESA being around.
What are you trying to say here? All monitors are the same because standards have pushed display manufacturers that way. So I have some serious reservations about them working the same anyway, since most hardware in the past hasn't demonstrated the ability to stay in sync with what all the other companies are doing; hence the requirement for a specification.
But that's not the issue anymore. The fact is that you went flat-out childish and resorted to a personal insult based on my avatar or whatnot. I have a 10-year-old kid here who has more manners and sense than what you just demonstrated, so I don't ever want to hear from you again. I put you on ignore, I wish you all the best!
Sorry you feel offended, but he's right about the VESA bit. You're making some assumptions about how the world works (not even with computers) when talking about hardware evolution with or without standards.
Even with that "curious" look on your Koala's face, he looks smarter than you sound.
Stop trying to start an argument. You know the rules with respect to name calling... Don't do it.
What is happening here is that Nvidia is (finally) syncing the GPU and the monitor together via hardware. It's the first time we see this being attempted with gaming in mind.
Actually, VSYNC was already doing this. The difference is that, in addition to synchronizing frames like V-Sync does, G-Sync/FreeSync adjusts the refresh rate to match the frame rate. V-Sync doesn't alter the refresh rate in any form, but it does synchronize frame draws to the beginning of each refresh (if a frame is available). If frame latencies were steady, we wouldn't need G-Sync or FreeSync.
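A toy simulation of that difference, with made-up render times and ignoring scan-out time and the panel's own refresh limits: under fixed-refresh V-Sync a finished frame waits for the next tick, under variable refresh it goes out as soon as it's ready.

Code:
import math

REFRESH_HZ = 60
TICK_MS = 1000 / REFRESH_HZ                 # 16.67 ms between fixed refreshes
frame_times_ms = [14, 22, 17, 30, 12, 19]   # hypothetical per-frame render times

def vsync_display_times(frame_times):
    """Fixed refresh: a finished frame waits for the next refresh tick."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft                             # frame finishes rendering at time t
        shown.append(math.ceil(t / TICK_MS) * TICK_MS)
    return shown

def vrr_display_times(frame_times):
    """Variable refresh: the panel refreshes as soon as the frame is ready."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(t)
    return shown

print("V-Sync:", [f"{t:.1f}" for t in vsync_display_times(frame_times_ms)])
print("VRR:   ", [f"{t:.1f}" for t in vrr_display_times(frame_times_ms)])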
 
Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
TheTechReport's Scott Wasson talking about the G-Sync & FreeSync comparisons (time-linked to the discussion)

He too thinks the BenQ and Asus share the same panel, but besides that, he is much more sound in his assessment. He even takes a small shot at PCPerspective on the matter.
They have a valid point, I think I said something similar a few posts earlier:

Don't get me wrong please, it's far from the end of the world ofc, I understand it's blown way out of proportion when you think about it like that, and I think FreeSync is really great, props to AMD for taking a stand on the issue. All I'm saying (and said) is that if you compare the two technologies (on a tech enthusiast site like our beloved TPU): "I personally find Nvidia's approach much more elegant again".

And there is the point of this whole conversation, the bottom line, where you reach a conclusion, as I already tried to point out. As far as we know at this moment, Nvidia's approach/solution is better if you only look at it from the technology side, but it's much more expensive and it's also a closed system, which is clearly bad.

It comes down to personal preference: whether you are willing to support such a company, and whether paying the extra is worth it to you or not.


There was a critical and quite long period (about 5 years ago) when I was thinking a lot about this Nvidia vs AMD "first world problem", and I was in real shock when I realized that I tend to prefer the Apple of GPU makers over the "good" and free guys. And believe me, I really dislike almost everything Apple does, their overpriced products with their stupidly closed-down ecosystems... and yet, I have preferred Nvidia and Intel over AMD for almost a decade.

But whatever I buy from the greens works and keeps me satisfied, while I have had a lot of bad times with AMD products and their drivers, so I can't help it (even if I really want to). There were two separate tech forums where I constantly got laughed at and ridiculed by AMD fanboys because I consistently told them that the games just felt like shit on their cards. I played a lot of fast-paced games at high refresh rates (on CRT monitors in those days), and I could just feel that something was wrong.

They told me for years that I was making things up and that I was a delusional green fanboy, but the truth was that I had been testing everything for months and years and I knew I was right, until the AMD stuttering issue finally came to light and proved me right.


So I guess I might be a little bit biased by now, because I trust Nvidia that G-Sync will give me a smooth experience all the time, and I trust that it will work with any SLI configuration, with any custom resolution I want to make, etc., so it will just work how I think it should work, while I just can't trust AMD at the same level. I swear I try, but I just can't.


I don't think that Nvidia's closed-down and elitist "way" is the right track in the long run vs the "free" and user-friendly ways of AMD's approach, and as soon as AMD makes something better than Nvidia, I will buy that without any hesitation! But for now, I just can't help it when Nvidia's products and solutions are simply better all the time (G-Sync is better after all, and Maxwell is also better and more efficient again), and the only thing they ask in return is a little bit more money, which I'm willing to pay.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,461 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Interesting debate but falls down due to proprietary implementation on both sides. It would be bliss if Nvidia just plain accepted the adaptive sync implementation and a simple generic process/driver could be used so no gfx hardware was preferred.
For now, having to commit to a gfx brand (doesn't matter which camp) means I'll never buy a monitor with either tech.
Nothing as useless as a second piece of expensive hardware rendered pointless because you change a gfx card brand.
Screw them both.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Interesting debate but falls down due to proprietary implementation on both sides. It would be bliss if Nvidia just plain accepted the adaptive sync implementation and a simple generic process/driver could be used so no gfx hardware was preferred.
For now, having to commit to a gfx brand (doesn't matter which camp) means I'll never buy a monitor with either tech.
Nothing as useless as a second piece of expensive hardware rendered pointless because you change a gfx card brand.
Screw them both.
If I can't have the option, I don't want it. :toast:
This is why specs and standards are a good thing.
 
Joined
Feb 18, 2011
Messages
1,259 (0.26/day)
What's more funny is how science has shown that movement initiated by the brain can have a latency north of 300ms. So people who think they are benefitting from 5-10 ms are probably just trying to bolster their egos with no real empirical evidence to support their claims. I would sooner trust science than grandiose claims

There was a post about this I made some time ago here, and I even found a test I made just recently. I don't know where your science is coming from, but if I can do 160 ms on my fifth try on a slow Samsung TV (an LE40A656, which has quite big input lag), then I'm sure a young and fit person does much better without a problem (I certainly was much quicker in my twenties).
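For anyone who wants to try the same kind of test, here's a minimal sketch, assuming a terminal and the Enter key as the button (keyboard, terminal, and display latency all inflate the measured number, just like the TV's input lag does):

Code:
import random
import time

def reaction_test(trials=5):
    results = []
    for i in range(trials):
        input(f"Trial {i + 1}: press Enter, then wait for GO...")
        time.sleep(random.uniform(1.0, 3.0))        # random delay so you can't anticipate it
        start = time.perf_counter()
        input("GO! Press Enter as fast as you can: ")
        elapsed_ms = (time.perf_counter() - start) * 1000
        results.append(elapsed_ms)
        print(f"  {elapsed_ms:.0f} ms")
    print(f"Mean over {trials} trials: {sum(results) / len(results):.0f} ms")

if __name__ == "__main__":
    reaction_test()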

The Vesa thing is correct, but I'm sure you know well what he and I meant there, don't you? I was talking about the standard related to the conversation... as my sig says, sorry for my English, please accept my apology. And no, there are no monitors out there with a 5 Hz minimum refresh rate, not because there was no demand until now, but because LCD technology simply doesn't allow such monitors at our current level of understanding. I'm not saying it's not possible that LG or somebody else will make one next week ofc, but even if they do, that would have nothing to do with VESA, and right now there is no such panel and I don't know about any in the making.

There is no way frame latencies can be made "steady" in a dynamically rendered scene, so that "if" is meaningless, and V-Sync also induces latency (with double or triple buffering, doesn't matter), while the GPU doesn't "wait" anymore with G-Sync.
 
Joined
Nov 9, 2010
Messages
5,654 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
I spent more than 4 decades of my life with computer technologies, I might be far from being the best on the field, but I can safely say that I know my way around in the subject. Vesa is a display standard.

LOL, and in ALL that time, apparently you haven't even learned what some of the common acronyms associated with it actually MEAN. VESA itself is not a "standard", it's the Video Electronics Standards Association. They're the group that DECIDE on what techs to adopt as standards for video electronics in general, esp regarding displays. That is why the VESA display mounting system is abbreviated VESA. It doesn't mean VESA itself IS a standard.

Thus obviously when manufacturers start making displays lower in refresh, it WILL have a lot to do with VESA having adopted Freesync, just like display manufacturers having adhered to VESA mounting patterns has EVERYTHING to do with VESA. VESA decides on these standards, particularly WITH manufacturers in mind, and manufacturers know there's a lot of research and unbiased opinions that go into VESA's decisions. And VESA are non profit, they don't gain anything from what they decide on. Manufacturers DO take them seriously.

Furthermore, there have been a lot of idiotic assumptions regarding ultra-low-refresh displays being made just because FreeSync supports down to 9 Hz. Obviously those are not playable frame rates, so expect most low-refresh displays to have a bottom end of more like 30 Hz.

In short, it doesn't matter how long anyone's worked in PC tech; if you don't stay in the loop and use some common sense, it doesn't mean a thing.
 
Joined
Dec 17, 2011
Messages
359 (0.08/day)
it's not Freesync itself, it's that it's brand new
We are having this discussion to decide what someone is supposed to buy today. If AMD comes up with a better technology tomorrow, then they will get my vote of favor. But they'll get that tomorrow. Today, G-Sync is better (that's why you pay more for it), and that's why I support it.
 
Joined
Jun 13, 2012
Messages
1,328 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
We are having this discussion to decide what someone is supposed to buy today. If AMD comes up with a better technology tomorrow, then they will get my vote of favor. But they'll get that tomorrow. Today, G-Sync is better (that's why you pay more for it), and that's why I support it.

Nvidia put time and effort into making it. Like it or not, they took their time to work out most issues. AMD made FreeSync in response and rushed it out the door. Even if that one fool said "it's not Freesync itself, it's that it's brand new", it may be new, but it was rushed out the door and the end result is the issues it has. They need to be fixed and will be in time.
 
Joined
Nov 9, 2010
Messages
5,654 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
We are having this discussion to decide what someone is supposed to buy today. If AMD comes up with a better technology tomorrow, then they will get my vote of favor. But they'll get that tomorrow. Today, G-Sync is better (that's why you pay more for it), and that's why I support it.


Well, that's YOUR take on this thread. The thread author left it open, with 50% of the choices being Not sure or Other, meaning he clearly made the poll with the understanding that these things don't have to, nor should have to, be decided by what we see today.

In fact you're the only one I see implying otherwise, yet when someone makes a solid argument against current displays being good enough to call a winner, you easily cave and say Freesync may pull ahead once such things ARE available.

So what the hell is your point really? It seems to me you are just pointlessly arguing to try to make a point that doesn't even seem to stand on solid ground. It's as if you're calling my point of view wrong while offering both points of view yourself just to try and cover all bases. I'm sorry but debate doesn't work that way. You pick a point of view and stick with it, or you begin to sound confused.

And for the record, despite your attempt to claim Nvidia as the current winner, the poll results show otherwise.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
HAHA the poll doesn't mean shit frankly. :p

Common consensus is Freesync is "good enough", but when it comes down to the nitty gritty of a full VRR implementation, G-SYNC wins. AMD leave it to the monitor manufacturers to do the dirty work, and no surprise, they also do the minimum required.

At the end of the day Nvidia will continue to do what they want regardless, and AMD will do what they want to too (within their means).

Awesome poll though, very conclusive.
 
Joined
Jun 13, 2012
Messages
1,328 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
HAHA the poll doesn't mean shit frankly. :p
Common consensus is Freesync is "good enough", but when it comes down to the nitty gritty of a full VRR implementation, G-SYNC wins. AMD leave it to the monitor manufacturers to do the dirty work, and no surprise, they also do the minimum required.
At the end of the day Nvidia will continue to do what they want regardless, and AMD will do what they want to too (within their means).
Awesome poll though, very conclusive.

I agree, AMD puts 95% of the work on the monitor maker to do the job. Nvidia did most of the work in their version and it costs extra. I have no problem with paying extra to know that the effort to make something work right was put into it.
 
Joined
Nov 9, 2010
Messages
5,654 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
HAHA the poll doesn't mean shit frankly. :p

I'm well aware of that and have maintained that myself, arguing that there aren't sufficient displays to test FreeSync yet. The only reason I brought it up is because blan keeps acting like the poll is based on his interpretation of things, vs the thread author's.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I agree, AMD puts 95% of the work on the monitor maker to do the job. Nvidia did most of the work in their version and it costs extra. I have no problem with paying extra to know that the effort to make something work right was put into it.

You're paying a $200 premium for a different method that does the same thing in the VRR window. The kit didn't work right, but people still paid $150+ for it. It's still using the same SoC for the G-Sync module. The firmware was updated in the retail version, but the kinks were not solved, they were minimized. You're still getting flicker/stutter, and it depends on game engine compatibility, like Nvidia has stated all along in their G-Sync FAQ. How long was it between the kit and the retail version? 8 months from kit to retail, and since retail, another 10 months. You would think they would have solved the issues for that extra cost by now.
 
Joined
Oct 5, 2008
Messages
1,802 (0.32/day)
Location
ATL, GA
System Name My Rig
Processor AMD 3950X
Motherboard X570 TUFF GAMING PLUS
Cooling EKWB Custom Loop, Lian Li 011 G1 distroplate/DDC 3.1 combo
Memory 4x16GB Corsair DDR4-3466
Video Card(s) MSI Seahawk 2080 Ti EKWB block
Storage 2TB Auros NVMe Drive
Display(s) Asus P27UQ
Case Lian Li 011-Dynamic XL
Audio Device(s) JBL 30X
Power Supply Seasonic Titanium 1000W
Mouse Razer Lancehead
Keyboard Razer Widow Maker Keyboard
Software Window's 10 Pro
I have only ever seen Gsync on a UHD monitor. I know I used to play on a UD590, and screen tearing was awful in high-action/movement games. Driving through GTA V, it's MUCH more apparent what Gsync does for the experience. It does look like you can tune your game to stay above 48 FPS and save some cash. That's harder to do without SLI at UHD though, not impossible, just harder (more settings have to be sacrificed).

That being said, I love my XB280HK, and don't regret the purchase at all.
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
having seen both work i am inclined to say gsync worked better right now. but (and this is the but that will matter for most) the gsync setup cost over £1500 for the gpu and screen setup while the freesync was running on about £600 worth of hardware.

now i think part of what let the free sync down was the screen, as this solution does place the emphasis on what the screen wants. while the nvidia solution has the screen set to show what the gpu can give. this means that nvidia are trying to show their gpu in the best light, while amd are trying to give the best experience you can get from the screen.

from what i have seen i think it is still too early to say either is the winner, but from history 1 thing is always true; free standards get wider adoption than expensive proprietary ones.

if you had not noticed i was not blown away by either enough to make me want to upgrade from my korean pls panel at 110hz anyway :D
 
Joined
Oct 5, 2008
Messages
1,802 (0.32/day)
Location
ATL, GA
System Name My Rig
Processor AMD 3950X
Motherboard X570 TUFF GAMING PLUS
Cooling EKWB Custom Loop, Lian Li 011 G1 distroplate/DDC 3.1 combo
Memory 4x16GB Corsair DDR4-3466
Video Card(s) MSI Seahawk 2080 Ti EKWB block
Storage 2TB Auros NVMe Drive
Display(s) Asus P27UQ
Case Lian Li 011-Dynamic XL
Audio Device(s) JBL 30X
Power Supply Seasonic Titanium 1000W
Mouse Razer Lancehead
Keyboard Razer Widow Maker Keyboard
Software Window's 10 Pro
having seen both work i am inclined to say gsync worked better right now. but (and this is the but that will matter for most) the gsync setup cost over £1500 for the gpu and screen setup while the freesync was running on about £600 worth of hardware.

now i think part of what let the free sync down was the screen, as this solution does place the emphasis on what the screen wants. while the nvidia solution has the screen set to show what the gpu can give. this means that nvidia are trying to show their gpu in the best light, while amd are trying to give the best experience you can get from the screen.

from what i have seen i think it is still too early to say either is the winner, but from history 1 thing is always true; free standards get wider adoption than expensive proprietary ones.

if you had not noticed i was not blown away by either enough to make me want to upgrade from my korean pls panel at 110hz anyway :D

Cost, and being a part of the VESA DP standard, will allow AMD to become more widely adopted in the long run. However, I do think there is a market for both technologies. To me it's akin to the smartphone OS market; Android and iOS are two different approaches to the same question. Both have their merits, both have their targeted markets and approaches to said market. While there certainly are more Android users, the Apple app market seems to be significantly more profitable, and thus attractive. What might kill Gsync in my mind is if they can ever prove that theory that was floating around the internet that you can enable Gsync on any DP 1.2a compliant monitor without the chip in the display.
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
Cost, and being a part of the VESA DP standard, will allow AMD to become more widely adopted in the long run. However, I do think there is a market for both technologies. To me it's akin to the smartphone OS market; Android and iOS are two different approaches to the same question. Both have their merits, both have their targeted markets and approaches to said market. While there certainly are more Android users, the Apple app market seems to be significantly more profitable, and thus attractive. What might kill Gsync in my mind is if they can ever prove that theory that was floating around the internet that you can enable Gsync on any DP 1.2a compliant monitor without the chip in the display.

prove?

you mean the fact that at least 3 different sources i read have already done it does not prove that to you?

i kinda agree with the iphone analogy, iphone zealots are about the only group i know more deluded than the nvidia fanbois xD
 
Joined
Oct 5, 2008
Messages
1,802 (0.32/day)
Location
ATL, GA
System Name My Rig
Processor AMD 3950X
Motherboard X570 TUFF GAMING PLUS
Cooling EKWB Custom Loop, Lian Li 011 G1 distroplate/DDC 3.1 combo
Memory 4x16GB Corsair DDR4-3466
Video Card(s) MSI Seahawk 2080 Ti EKWB block
Storage 2TB Auros NVMe Drive
Display(s) Asus P27UQ
Case Lian Li 011-Dynamic XL
Audio Device(s) JBL 30X
Power Supply Seasonic Titanium 1000W
Mouse Razer Lancehead
Keyboard Razer Widow Maker Keyboard
Software Window's 10 Pro
prove?

you mean the fact that at least 3 different sources i read have already done it does not prove that to you?

i kinda agree with the iphone analogy, iphone zealots are about the only group i know more deluded than the nvidia fanbois xD

See, you say "that at least 3 different sources I read have already done it", but then don't list any of them. Are you refering to "Mobile Gsync" ? That doesn't suprise me at a all they'd develop an DP 1.2a complaint offering to compete.

http://www.extremetech.com/extreme/198603-leaked-nvidia-driver-offers-taste-of-mobile-g-sync
http://wccftech.com/nvidia-gsync-mobility-confirmed-require-dedicated-module-raises-questions/

These sites...I am not sure I'd consider credible...I'll believe it when Wizzard validates their claims, or someone of similar chops does.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I think it falls on VESA. FreeSync exists because VESA didn't go far enough with DisplayPort 1.2a, and it may have stemmed from a lack of understanding of the issues in pushing an unfinished frame out to the monitor. In order to compensate, AMD had to create a proprietary standard (FreeSync) which requires monitors not only to support Adaptive Sync (to send the signal), but also to enforce a minimum refresh rate internally, so that should the GPU fail to deliver it, the panel doesn't destroy itself. VESA needs to require Adaptive Sync monitors to have a memory buffer that can hold the previous frame in full and pull from that buffer whenever the input falls below the minimum for the panel.

Why should the buffer be in the monitor? Two reasons:
1) Reduces DisplayPort bandwidth usage which may be needed elsewhere.
2) It is extremely simple to calculate how large a frame buffer needs to be in a monitor based on its specs (height * width * bits per pixel), whereas a single GPU running 6 displays would have to maintain 6 separate buffers of (potentially) ridiculous size to satisfy the need (assuming it isn't pulled from its existing memory); see the sketch below.
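A back-of-the-envelope version of that height * width * bits-per-pixel calculation, assuming 24 bits per pixel, for a few common resolutions:

Code:
def frame_buffer_bytes(width, height, bits_per_pixel=24):
    """Memory needed to hold one full frame inside the monitor."""
    return width * height * bits_per_pixel // 8

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K UHD", 3840, 2160)]:
    print(f"{name}: {frame_buffer_bytes(w, h) / 1024**2:.1f} MiB at 24 bpp")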

If VESA did this, FreeSync and GSync would go the way of the dodo and GPU design wouldn't get more complicated.
 