
FreeSync or G-Sync or both?

Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 Evo nvme M.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Timespy 14000..
it seems to me that currently AMD is a "poor man's choice" (hardcore fan base apart).. people buy AMD (cpu or gpu) because it's the cheaper option, not because it's the better option..

looked at this way g-sync costing more than free-sync just follows the otherwise normal pattern..

i think most people tend to look at a monitor as an extra and not as an integral part of their system.. its the way i used to be.. carefully plan out a system and then chuck in a cheap keyboard mouse and monitor as an after-thought more than anything else..

i have g-sync but never "bought into it".. it just came as part and parcel of the high end monitor i decided to buy.. i ain't gonna knock it but i ain't gonna enthuse over it either..

i run g-sync and a frame rate cap.. a practice that doesn't seem to be that common.. i ain't quite figured out what the best frame rates to run at are but i am getting there.. :)
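For anyone wanting to copy the practice, the usual rule of thumb is to cap a few fps below the max refresh so every frame stays inside the variable refresh window. A minimal sketch (the 3 fps margin is an assumption, not an official figure):

# cap a few fps under the max refresh so frames stay inside the VRR window
# (the 3 fps margin is an assumption; opinions differ on the exact value)
def suggested_cap(max_refresh_hz, margin_fps=3):
    return max_refresh_hz - margin_fps

print(suggested_cap(165))  # 162 fps on a 165 Hz panel like the PG279Q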

trog
 
Joined
Dec 17, 2015
Messages
115 (0.04/day)
Did you really think about what you've just said?
You're really going to make me do all the work for you, aren't you?

Your comparison to Apple...
- With an Apple ecosystem you are hard locked to all software you can use on that computer
- With an Apple ecosystem you are hard locked to any hardware it does and does not accept
- You cannot forego anything to remove those limitations

Your comparison to Consoles...
- With a console all your hardware is locked (except possible HDD upgrades)
- With a console you can only play games designed for that console
- You cannot forego anything to remove those limitations

G-Sync
- You need an Nvidia GPU to take advantage of the G-Sync technology
- You can forego an Nvidia GPU and simply use the monitor as a 144Hz (or whatever the refresh rate is) monitor
- It does not impact your ability to use any software you want
- It does not remove your ability to use any hardware you want

You would first go out, pay a premium for either FreeSync or G-Sync, and then opt for a GPU of the competing brand that won't support the very feature you just paid a premium for?

Is it the best or even the smartest move? Of course not, but the option is still available. But that wasn't the point. The point is your analogies don't apply here. They are flawed. Either you are trying too hard to think of analogies or your understanding is lacking. I'm guessing the former despite your rudeness, but perhaps you should take your own advice.
 
Joined
Sep 17, 2014
Messages
20,773 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Of course it is póssible, but it is not sénsible... We are posting in a thread where the question posed is 'FreeSync or G-Sync'. I'll agree the analogy is slightly flawed, but you get the point. It makes no sense to pay a premium for features that you cannot even use, and that can be a reason to not buy into either technology and wait, or to not pay any premium for it at all. At thát point, buying either tech is sensible. And that lack of premium will only happen when it gets adopted as standard by both GPU vendors.

With regard to ecosystems, this is also a function of the mind/perception, because you can easily mod an Apple comp to run Windows or build a Hackintosh; it just takes some extra effort. When you buy G-Sync, there is a very slim chance you are ever buying an AMD gpu to go with it; it doesn't fit the perception of what you bought earlier. That is effectively buying into an ecosystem as well. I can also connect my Apple stuff to a non-Apple PC, can't I? It just won't work as well, and that effectively pushes you into buying same-brand stuff.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
You don't need to buy G-Sync games so the analogy with Betamax is not the same. Investing in Betamax and then switching to VHS meant a library overhaul.
You have to buy a Betamax player (NVIDIA card) and Betamax cassettes (G-sync equipped monitor). Switching to adaptive sync means new monitors and cards (at least until NVIDIA supports it). The similarities are undeniably there.

Going with G-Sync now and then switching to FreeSync with a future monitor purchase is just that - a monitor switch, and people do that all the time. The "competing with its own heart" also eludes me. DisplayPort 1.3 won't make G-Sync's job any harder, it will just make FreeSync's job easier. How much, who knows at this point.
Presently, G-sync does not support VESA adaptive sync but G-sync only exists on the back of VESA's DisplayPort standard (where AMD already extended the VESA standard to HDMI as well). The analogy stems from the fact that no competition can exist because VESA DisplayPort is not going to compete with VESA DisplayPort. G-sync is a cobblejob placed on top of the adaptive sync standard that has to go away because it requires a device that is not directly compliant with the eDP standard. The G-sync brand could appear on eDP hardware but doing so would create a lot of confusion in terms of GPU support. It would be best if the G-sync brand was abandoned but I don't know if NVIDIA will do that--kind of moot until they debut a DisplayPort 1.3 card.

What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's min refresh rate?
Reduce settings to get reasonable framerates. Who, seriously, thinks <30 frames is acceptable? Additionally, there are some indicators that eDP does take minimum refresh rate into consideration, but manufacturers are forced to provide a minimum refresh rate via EDID, so they give a higher number than the eDP can handle. More info here.
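If you want to see what a monitor actually advertises, the EDID Range Limits Descriptor can be read directly. A minimal sketch in Python, assuming a raw 128-byte EDID base block (offsets follow the standard EDID 1.3/1.4 layout; the sysfs path below is just an example):

# find the Range Limits Descriptor (tag 0xFD) in a raw EDID block and
# return the advertised min/max vertical refresh in Hz
def refresh_range(edid: bytes):
    for off in (54, 72, 90, 108):            # the four 18-byte descriptors
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                # bytes 5/6: min/max vertical rate
    return None                              # no range limits descriptor found

# example (Linux): refresh_range(open("/sys/class/drm/card0-DP-1/edid", "rb").read())
# a typical FreeSync panel might report something like (40, 144)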

The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long term investment vs short) and can prove to monitor manufacturers that it's worth their time and money to have a G-Sync lineup.
They cannot. G-sync is not a standard whereas adaptive sync is. The former cannot exist in a market where the latter does--at least not without huge and continued investments from NVIDIA, which they'll have to take a loss on to keep going.


There's an important distinction to be made here: AMD FreeSync branding is moot. FreeSync is an implementation of the adaptive sync standard. It's kind of like how the NX bit is called "XD bit" by Intel and "Enhanced Virus Protection" by AMD. They're different names for the same thing. G-sync, presently, does not fit VESA's adaptive sync mold. Sure, monitor manufacturers advertise "FreeSync" to catch the eye of potential buyers but that is a misnomer. If you buy a FreeSync monitor and plug it into an Intel DisplayPort 1.3 port, it will work just like it would on an AMD card. Adaptive sync is vendor neutral--just like plug and play functionality of all DisplayPort monitors.


guess working hand in hand with vesa on dp like no one else did means nothing at all
AMD did the same with Mantle and Vulkan. Consumers benefit hugely from AMD advancing all of these open standards because they can use any hardware they want (even NVIDIA). How is that a bad thing?

NVIDIA can't ignore DisplayPort 1.3 because it comes with a huge boost to bandwidth 4K displays need. If they support DP 1.3 and deliberately axe/ignore adaptive sync in the name of G-sync, they deserve to burn in hell.
 
Joined
Apr 29, 2014
Messages
4,179 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1GHz (24/7) / 2x Xeon E5-2650 v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Too much is in the air at this point. What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's min refresh rate? On lesser importance, how will it catch up on overshoot? How will Nvidia address the proprietary concerns and price premium? Right now I will agree with you that AMD has the upper hand but it's not as simplistic as you think it is. The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long term investment vs short) and can prove to monitor manufacturers that it's worth their time and money to have a G-Sync lineup.
I am confused, because they both do that: once you exit the range you lose FreeSync/G-Sync. G-Sync just goes lower, at the moment, in the monitors available.


You're really going to make me do all the work for you, aren't you?

Your comparison to Apple...
- With an Apple ecosystem you are hard locked to all software you can use on that computer
- With an Apple ecosystem you are hard locked to any hardware it does and does not accept
- You cannot forego anything to remove those limitations

Your comparison to Consoles...
- With a console all your hardware is locked (except possible HDD upgrades)
- With a console you can only play games designed for that console
- You cannot forego anything to remove those limitations

G-Sync
- You need an Nvidia GPU to take advantage of the G-Sync technology
- You can forego an Nvidia GPU and simply use the monitor as a 144Hz (or whatever the refresh rate is) monitor
- It does not impact your ability to use any software you want
- It does not remove your ability to use any hardware you want



Is it the best or even the smartest move? Of course not, but the option is still available. But that wasn't the point. The point is your analogies don't apply here. They are flawed. Either you are trying too hard to think of analogies or your understanding is lacking. I'm guessing the former despite your rudeness, but perhaps you should take your own advice.
However, the problem in purchasing a G-Sync monitor is the expense, as they cost extra (quite a bit) over a non-sync or Adaptive Sync (FreeSync) monitor. Even though you can use it as a normal monitor, that is also more difficult, as most (I believe I have seen one so far that has more than one) have only 1 DP input, making them harder to use for other tasks without adaptors and such (or if you want to have multiple things hooked into it). They are strictly designed to be a one-computer gaming monitor for Nvidia cards, which is fine, but it makes them hard to recommend unless people don't mind selling off an expensive monitor if they decide to switch vendors, or keeping it without being able to use the tech they paid extra for.

NVIDIA can't ignore DisplayPort 1.3 because it comes with a huge boost to bandwidth 4K displays need. If they support DP 1.3 and deliberately axe/ignore adaptive sync in the name of G-sync, they deserve to burn in hell.

I have a feeling they are going to block it, for no other reason than that they want to push G-Sync even when they add DP 1.3.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I have a feeling they are going to block it, for no other reason than that they want to push G-Sync even when they add DP 1.3.
I hope not but that possibility exists. I have a feeling they won't block it because display manufacturers won't play ball unless NVIDIA greases their palms. NVIDIA goes where the profits are and I don't see profits in G-sync's future--I see losses.

There are only 9 G-sync monitors available for sale now:
http://www.geforce.com/hardware/technology/g-sync/where-to-buy-g-sync-monitors-and-modules

There are over 50 adaptive sync monitors (9 are HDMI):
http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

Note the number of manufacturers too. Adaptive sync already has broad industry backing whereas G-Sync does not.


Additionally, the FreeSync page says right on it that Polaris will have DisplayPort 1.3 support. 90% sure Pascal will too.

I think it is very possible 2016 is the end for G-sync. It will be put on legacy support.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX-6350 @ 4.2GHz / i5-4670K @ 4GHz
Video Card(s) HD 7850 / R9 290
I hope not but that possibility exists. I have a feeling they won't block it because display manufacturers won't play ball unless NVIDIA greases their palms. NVIDIA goes where the profits are and I don't see profits in G-sync's future--I see losses.

There are only 9 G-sync monitors available for sale now:
http://www.geforce.com/hardware/technology/g-sync/where-to-buy-g-sync-monitors-and-modules

There are over 50 adaptive sync monitors (9 are HDMI):
http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

Note the number of manufacturers too. Adaptive sync already has broad industry backing whereas G-Sync does not.


Additionally, the FreeSync page says right on it that Polaris will have DisplayPort 1.3 support. 90% sure Pascal will too.

I think it is very possible 2016 is the end for G-sync. It will be put on legacy support.
the next nv fad will be an operating system :oops: the next amd venture will be getting another hsa member to make gpus and push out nv.. poor nv needs to wear a helmet o_O even apple is making jokes about them :laugh:
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,851 (3.08/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
You and me both... it's not right. It's like calling an apple an orange...

2K = 2048x1080, for the record. 1080p is 1920x1080. I've never heard of 1K... because it doesn't seem to exist.

2K is NOT 2560x1440!!!! 1440p or QHD is the correct way to refer to it.

Here is a reference: https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg

2K is 1920x1080, just as 4K is 3840x2160, which you can see is less than 4000, so 4K is further away from its mark than 2K is.

That said, 1920x1080 has every right to be so-called 2K, as 4K is way off its mark with 3840. He meant 1920x1080 being 1K when really it's not; they just started a new naming scheme is all.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
1K (1024 x 576) -- Probably only exists on cheap phones.
Full HD/2K (1920 x 1080)
Ultra HD/4K (3840 x 2160)
5K (5120 x 2880)

Note they are all 16:9 and they all round to their #K.


For the record, I think calling a resolution by any name other than its resolution is stupid.
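For what it's worth, the rounding is trivial to sanity check; a throwaway sketch (the resolutions are the ones listed above, and the "#K" labels are marketing shorthand rather than anything official):

# round each horizontal resolution to the nearest thousand to get its "#K"
for w, h in [(1024, 576), (1920, 1080), (3840, 2160), (5120, 2880)]:
    print(f"{w}x{h} -> {round(w / 1000)}K")
# 1024x576 -> 1K, 1920x1080 -> 2K, 3840x2160 -> 4K, 5120x2880 -> 5K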
 
Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 Evo nvme M.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Timespy 14000..
you are probably right.. maybe even more so now that extra wide screens are becoming popular.. the term 4K is accepted and commonly used.. it is easy to remember and write down..

it's become the new must-have buzz word for some.. 4K is roughly 8 million pixels.. 1920 x 1080 is roughly 2 million pixels.. 2560 x 1440 is roughly or a little less than 4 million pixels..

the ratio of 1 2 4 makes sense from a "power needed to move them all about" or "amount of pixels on a screen" point of view, hence my incorrect use of 1K 2K and 4K.. tis just the way my poor old brain makes sense of it.. :)
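For reference, the raw pixel math behind that 1 2 4 ratio (a quick sketch; the "roughly" is doing some work, since the true ratio is closer to 1 : 1.8 : 4):

# megapixel counts behind the rough 1 : 2 : 4 scaling described above
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h} = {w * h / 1e6:.2f} million pixels")
# 1920x1080 = 2.07, 2560x1440 = 3.69, 3840x2160 = 8.29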

i go back to the 640 x 480 days.. he he

trog
 
Joined
Dec 17, 2015
Messages
115 (0.04/day)
@Vayra86
Even if Nvidia says FreeSync wins, they will still provide backward compatibility support for G-Sync. You only have two options for graphics cards anyway. It's one or the other. Not like you are losing out on a breadth of options. The reason Nvidia dominates the market share is that most of its user base is loyal and will keep buying Nvidia cards. This was a thing before G-Sync and will most likely still be a thing. So being locked into Nvidia isn't as big of a negative to many (read: not all) of them.

While the price premium does suck (not arguing this), at least you get something for that premium - superior tech. Many people are quite OK with paying a premium for better technology. I am. I supported SSDs when they were new and expensive. If I get something for my money it's not as evil as many make it out to be. To be clear, since people are analogy happy, I am not comparing G-Sync to SSDs. Only saying I personally don't mind paying a price premium for better tech.


@FordGT90Concept
The Betamax player analogy is really pointless. You should not need me to tell you Betamax Cassettes != a friggin monitor. The similarities exist only in your head. Stop with the analogies. We get it - you hate G-Sync and Nvidia. Instead of trying to think of analogies, focus on differences in the technology.

Now if you go back and read my posts you will see that I agree with you that AMD has Nvidia by the short ones and that Nvidia has a track record of bad habits. What I don't agree with you on is the simplicity: it's not as simple as you make it out to be. There is a very clear and distinct difference there.

First you make it sound like some recent information suddenly came out that put the final nail in G-Sync's coffin - using DisplayPort 1.3 as your argument. Truth is, what you are referring to is old news - well, for the computer industry. VESA created Adaptive Sync in 2009 but it was not implemented. Nvidia took the opportunity to develop and release G-Sync. In response to that AMD announced FreeSync, which was VESA's Adaptive Sync. So all of a sudden a technology that wasn't pushed at all was used to combat Nvidia's release of G-Sync. Adaptive Sync was actually supported by DisplayPort 1.2a - so no... 1.3 is not when Adaptive Sync was first supported. It should be noted that 1.2a was released in 2014 ...so was the spec for 1.3.

Even with that, you still need a FreeSync enabled monitor, as it needs the chip. FreeSync monitors win on price. G-Sync monitors win on tech. Since Nvidia is not supporting Adaptive Sync we are back at square one... you need an AMD card for FreeSync and you need an Nvidia card for G-Sync. So we can discuss DisplayPort all we want but in the end very little has changed in that regard. In spite of the DisplayPort changes, guess what... monitor manufacturers are still releasing G-Sync monitors. :eek:

A noteworthy change is Intel's integrated graphics will now support AdaptiveSync, but how many people running integrated graphics will be buying a gaming monitor?

You kind of have AMD's triumph on DisplayPort 1.3 a little off. The big win with DisplayPort 1.3 is that it enables 5120x2880 displays at a 60Hz refresh rate; 1080p monitors will go up to a 240Hz refresh rate, 170Hz for HDR 1440p screens, and 144Hz for 3840x2160. The upper-end displays will most likely have an announcement date toward the end of the year, in case anyone is curious.
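Those figures fall out of simple bandwidth math. A rough sketch, assuming 24 bits per pixel and DP 1.3's commonly cited ~25.92 Gbit/s of effective payload (blanking overhead is ignored, so real ceilings sit a bit lower, which is why figures like 4K at 144Hz lean on reduced blanking):

# rough ceiling on refresh rate per resolution over DP 1.3's ~25.92 Gbit/s
# effective payload at 24 bits per pixel (blanking overhead ignored)
DP13_EFFECTIVE_GBITS = 25.92

def max_refresh_hz(w, h, bits_per_pixel=24):
    return DP13_EFFECTIVE_GBITS * 1e9 / (w * h * bits_per_pixel)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160), (5120, 2880)]:
    print(f"{w}x{h}: ~{max_refresh_hz(w, h):.0f} Hz")
# ~521, ~293, ~130 and ~73 Hz respectively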

The above is great, but honestly what I feel is the biggest win for AMD is getting FreeSync to work over HDMI 2.0, since not all monitors support DisplayPort. This will definitely increase the number of monitors that are FreeSync certified. Granted, this is a low cost solution, since HDMI doesn't have the bandwidth DisplayPort does, so all serious gamers will still go for a DisplayPort monitor. But it does open up options in the lower price tier of monitors. So now low cost game systems can enjoy variable refresh rates on low cost monitors. AMD always kind of dominated on low cost GPUs, but now there is even more reason for people looking for a low cost GPU to invest in AMD instead of Nvidia.


As far as your clarification of FreeSync and AdaptiveSync goes, you should realize that the two, while based on the same standard, are not mutually inclusive. A FreeSync certified monitor is always based on AdaptiveSync but not every AdaptiveSync monitor is FreeSync certified. Your same paragraph makes it sound like FreeSync is completely plug-n-play with DisplayPort 1.3. Again, you need an AMD graphics card for FreeSync to work just like you need an Nvidia graphics card for G-Sync. See how we came full circle again? Now when Intel releases its first line of CPUs that are AdaptiveSync enabled, then your statement becomes true, but DisplayPort 1.3 does not magically enable FreeSync on its own.

Also you should not assume every monitor's min refresh rate is 30Hz because some are actually 40Hz and above. It's not hard to dip below 40 FPS on a recent title. If you are going to buy a FreeSync monitor this is one of the most important things you should look up.


That list of G-Sync monitors you posted is out of date. AMD still has the advantage but there are more than 9 G-Sync monitors :rolleyes:
- List of FreeSync monitors
- List of G-Sync monitors

Again, the important distinction between your POV and mine is that I don't think it is as clear cut as you do. As I said, AMD has the upper hand and Nvidia has a bad track record with the tech it likes to push. In fact, I think I provided more examples of how AMD has the upper hand. But the analogies in this thread are severely flawed. Your information is a bit off and you paint this picture that if someone invests in a G-Sync monitor they're screwed - not true.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
VESA created Adaptive Sync in 2009 but it was not implemented.
Only in eDP (think laptop monitors)...
VESA said:
http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

DisplayPort 1.2a was designed specifically for AMD to push FreeSync out the door. DisplayPort 1.3 is to put everyone (AMD, Intel, NVIDIA, and so on) on the same page with support for external adaptive sync. There are no DisplayPort 1.3 devices out yet--they're coming this year en masse.

A noteworthy change is Intel's integrated graphics will now support AdaptiveSync, but how many people running integrated graphics will be buying a gaming monitor?
Businesses, everywhere. When nothing changes on the screen (think your typical workstation screen), the GPU can literally shut off because the display already has everything it needs thanks to its onboard memory in the eDP. This tech goes far beyond high refresh rates.
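The logic is simple enough to sketch as a toy simulation (illustrative only, not any vendor's actual driver code): the link only carries frames that changed, while unchanged frames are served from the panel's own buffer.

# toy panel self refresh (PSR) simulation: the GPU-to-panel link only
# transmits when pixels actually changed; otherwise the panel refreshes
# itself from its own eDP buffer and the link can idle
frames = ["desktop", "desktop", "desktop", "cursor moved", "desktop"]
panel_buffer, link_transfers = None, 0
for f in frames:
    if f != panel_buffer:        # damage detected: wake link, send frame
        panel_buffer = f
        link_transfers += 1
    # else: no damage, GPU and link stay powered down this refresh
print(f"{link_transfers} transfers for {len(frames)} refreshes")  # 3 for 5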


The only reason why AMD bothered with FreeSync and HDMI is for consoles... likely beginning with the Nintendo NX. It's not really aimed at the computer market... other than the Nano, which is aimed at HTPCs. Remember, the Nano has 3 DisplayPorts, each capable of Multi-Stream Transport, where its sole HDMI port can only power one display. AMD is very committed to DisplayPort and the only reason why they put up with HDMI at all is because home theaters are sticking to it.


As far as your clarification of FreeSync and AdaptiveSync goes, you should realize that the two, while based on the same standard, are not mutually inclusive. A FreeSync certified monitor is always based on AdaptiveSync but not every AdaptiveSync monitor is FreeSync certified. Your same paragraph makes it sound like FreeSync is completely plug-n-play with DisplayPort 1.3. Again, you need an AMD graphics card for FreeSync to work just like you need an Nvidia graphics card for G-Sync. See how we came full circle again? Now when Intel releases its first line of CPUs that are AdaptiveSync enabled, then your statement becomes true, but DisplayPort 1.3 does not magically enable FreeSync on its own.
Negative. Adaptive sync is agnostic. If you have an adaptive sync graphics processor and an adaptive sync monitor, adaptive sync will be enabled by default. The purpose of the technology is to act without user input. Again, the goal is to reduce bandwidth requirements as well as reduce idle power consumption. The branding matters not.

Of course this isn't true of G-Sync, in its current state, because it is non-standard.


Also you should not assume every monitor's min refresh rate is 30Hz because some are actually 40Hz and above. It's not hard to dip below 40 FPS on a recent title. If you are going to buy a FreeSync monitor this is one of the most important things you should look up.
That's describing the panel, which inadvertently describes the minimum refresh rate the eDP will refresh at. The frame rate from the GPU can be lower--eDP will fill in the gaps to keep it at or above the minimum.
 
Joined
Dec 17, 2015
Messages
115 (0.04/day)
Ya I addressed that...

Nvidia took the opportunity to develop and release G-Sync. In response to that AMD announced FreeSync, which was VESA's Adaptive Sync. So all of a sudden a technology that wasn't pushed at all was used to combat Nvidia's release of G-Sync. Adaptive Sync was actually supported by DisplayPort 1.2a - so no... 1.3 is not when Adaptive Sync was first supported.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I never said 1.3 is when it was first supported. It will be the first version supported by NVIDIA, Intel, and the rest of the industry.



FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:

http://videocardz.com/57776/amd-launches-radeon-software-crimson-driver

G-sync lost its technical edge with a driver update. eDP/adaptive sync is just that awesome. :laugh:


Edit: Interesting caveat there: "greater than or equal to 2.5 times the minimum refresh rate."
30 Hz -> 75 Hz
35 Hz -> 87.5 Hz
40 Hz -> 100 Hz
42 Hz -> 105 Hz
47 Hz -> 117.5 Hz
48 Hz -> 120 Hz
56 Hz -> 140 Hz

That's definitely something buyers should be aware of.

Edit: Looks like LFC should work on all 144 Hz displays.
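The mechanics behind that 2.5x caveat are easy to sketch: when the frame rate drops below the panel's minimum, the driver repeats each frame enough times that the effective refresh lands back inside the supported window. A minimal sketch, assuming the driver simply picks the smallest whole-number multiplier that fits:

# low framerate compensation (LFC) sketch: multiply frames so the
# effective refresh stays inside the panel's [min_hz, max_hz] window
def lfc_refresh(fps, min_hz, max_hz):
    if fps >= min_hz:
        return fps                    # already inside the window
    m = -(-min_hz // fps)             # ceil(min_hz / fps): smallest multiplier
    return fps * m if fps * m <= max_hz else None

print(lfc_refresh(25, 40, 144))   # 50: each frame is shown twice
print(lfc_refresh(23, 48, 60))    # None: a 48-60 Hz window is too narrow

Mathematically a max of 2x the minimum already guarantees some multiple lands in the window; the extra headroom in the 2.5x requirement presumably absorbs frame time variance.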
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850w Gold (ATX3.0)
Software W10
Without sounding trollish - I'd only buy a FreeSync or G-Sync monitor if I were a brand loyalist OR had no problem buying a new monitor when I changed gfx cards.

I've owned Nvidia since I left my 7970s behind but I still wouldn't splash on a G-Sync. And likewise, I'm not buying an AMD card just to buy a FreeSync monitor.

Unless Nvidia support adaptive sync (in which case G-Sync dies), it's a lottery between cards and monitors. Without knowing what the next gen cards are doing, the F/G sync options are like technology prisons you pay to lock yourself into. No thanks. I'll stick with monitor agnostic powerful cards instead (be it Fury X or 980ti).
 
Joined
May 9, 2012
Messages
8,380 (1.93/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb Corsair Vengeance Pro 3600mhz DDR4/8gb DDR3 1600/2gb LPDDR3/8gb LPDDR5x 4200/16gb LPDDR5
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/RDNA3 768 core
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/2tb SN770M
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/back/back-front Gorilla Glass Victus 2+ UAG Monarch Carbon
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Edifier STAX Spirit S3 & SamsungxAKG beans
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/4Smart Voltplug PD 30W/Asus USB-C 65W
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75% <3/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 13/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
Edit: I myself am waiting for better monitor offerings (not happy with the current options) and Pascal + Polaris..
same here ... seems that my 980 and a 60Hz 1080p monitor are enough, and G-Sync/FreeSync are not really worth anything ... (especially G-Sync ... seriously, proprietary + added cost? no thanks nVidia) with all my excuses to those who think G-Sync is "tha bomb"

(i mean, what's the point of that tech when my framerate is already stable enough and there's no stutter no matter what game i play?)

if i decide one day to go 1440p 144Hz (the day the manufacturers are "less nuts" on pricing ...) maybe i will consider a G-Sync or FreeSync monitor depending on my GPU (or the 3rd tech that will be "open", replace both, and add no cost to an already expensive enough monitor)
well ... also any "ROG SWIFT" or "gaming" 144Hz 1440p monitor costs a little more than my GPU ... while my Philips 27E3LH cost 1/3 of it, so i know i am surely missing something ...

Without sounding trollish - I'd only buy a FreeSync or G-Sync monitor if I were a brand loyalist OR had no problem buying a new monitor when I changed gfx cards.
totally true

Unless Nvidia support adaptive sync (in which case G-Sync dies), it's a lottery between cards and monitors. Without knowing what the next gen cards are doing, the F/G sync options are like technology prisons you pay to lock yourself into. No thanks. I'll stick with monitor agnostic powerful cards instead (be it Fury X or 980ti).
and true (G-Sync needs to die anyway, but nVidia wants to make the most money out of the brand's loyalists)
 
Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 Evo nvme M.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Timespy 14000..
liking a certain brand doesn't always turn a person into a flag waving mindless zealot.. i like intel and i like nvidia.. i do so for what i think are good reasons..

in days long past i used to like amd and ati.. i did have good reasons back then.. reasons which sadly don't exist any more.. :)

having spent a fair bit of dosh on a good gaming IPS panel which happens to have g-sync.. i will be a bit pissed off if support for it does die in the near future..

i could live without it.. but i do find it useful..

trog
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX-6350 @ 4.2GHz / i5-4670K @ 4GHz
Video Card(s) HD 7850 / R9 290
the only chance of gsync surviving is if they support both and only make g-sync monitors the highest premium gaming experience, every monitor coming perfectly calibrated.. 1ms.. rimless.. curved.. and then there is the realm of 5k.
 
Joined
Dec 18, 2005
Messages
8,253 (1.24/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transcend sata.. 1T 970 Evo nvme M.2..
Display(s) 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestrike 29500.. Timespy 14000..
the only chance of gsync surviving is if they support both and only make g-sync monitors the highest premium gaming experience, every monitor coming perfectly calibrated.. 1ms.. rimless.. curved.. and then there is the realm of 5k.

it will definitely die off then.. he he

i reckon i am gonna turn mine off just to see what perceptible difference it actually makes.. i won't be entirely surprised if i don't see any difference.. but unlike most i can at least find out for real.. he he

trog

ps.. running mad max at 90 fps at 1440 resolution with the monitor refresh rate set at 120Hz and g-sync off i can't see any noticeable difference.. i will leave it off for longer and look f-cking harder.. he he he
 
Joined
Jan 31, 2005
Messages
2,050 (0.29/day)
Location
Denmark
System Name Commercial towing vehicle "Nostromo"
Processor 5800X3D
Motherboard X570 Unify
Cooling EK-AIO 360
Memory 32 GB Fury 3666 MHz
Video Card(s) 4070 Ti Eagle
Storage SN850 NVMe 1TB + Renegade NVMe 2TB + 870 EVO 4TB
Display(s) 25" Legion Y25g-30
Case Lian Li LanCool 216 v2
Audio Device(s) B & W PX7 S2e
Power Supply HX1500i
Mouse Harpe Ace Aim Lab Edition
Keyboard Scope II 96 Wireless
Software Windows 11 23H2
Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who doesn't switch monitors every second year?? I switch GPU maybe every second or third year,
but my monitor ..... well, the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.

All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,851 (3.08/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
Not at all, I keep mine as long as possible. My HDTV is about 4 years old now and I cannot see it being replaced any time soon unless it breaks down. I'm in no rush; G-Sync and FreeSync are not even matured yet, and on top of that home theater has not caught up yet.

I just wish AMD drivers would recognize that it's a 10bit panel lol.

So I am just waiting to see what happens.
 
Joined
Jan 26, 2016
Messages
407 (0.14/day)
Location
UK
System Name it needs a name?
Processor Xeon E3-1241 v3 @3.5GHz- standard clock
Motherboard Asus Z97A 3.1
Cooling Bequiet! Dark Rock 3 CPU cooler, 2 x 140mm intake and 1 x 120mm exhaust PWM fans in the case
Memory 16 GB Crucial DDR3 1600MHz CL9 Ballistix Sport 2 x 8 GB
Video Card(s) Palit 980ti Super Jetscream
Storage Sandisk X110 256GB SSD, Sandisk Ultra II 960GB SSD, 640GB WD Blue, 12TB Ultrastar
Display(s) Acer XB270HU
Case Lian Li PC 7H
Audio Device(s) Focusrite Scarlett 6i6 USB interface
Power Supply Seasonic P660
Mouse cheapo logitech wireless
Keyboard some keyboard from the 90's
Software Win10pro 64bit
I'd hope to get 5+ yrs out of a monitor
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who doesn't switch monitors every second year?? I switch GPU maybe every second or third year,
but my monitor ..... well, the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.

All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.
I haven't changed monitors in probably 7+ years. I change GPUs every 5 or fewer years. I'm going to keep using what I've got until it dies. At that point, I'm hoping for an affordable HDR, IPS, adaptive sync panel. I'm hoping mine doesn't die soon, because affordable HDR is only just happening. HDR and IPS are definitely more important to me than adaptive sync.
 
Joined
Sep 3, 2010
Messages
3,527 (0.71/day)
Location
Netherlands
System Name desktop | Odroid N2+ |
Processor AMD Ryzen 5 3600 | Amlogic S922X |
Motherboard Gigabyte B550M DS3H |Odroid N2+ |
Cooling Inter-Tech Argus SU-200, 3x Arctic P12 case fans | stock heatsink + fan |
Memory Gskill Aegis DDR4 32GB | 4 GB DDR4 |
Video Card(s) Sapphire Pulse RX 6600 (8GB) | Arm Mali G52 |
Storage SK Hynix SSD 240GB, Samsung 840 EVO 250 GB, Toshiba DT01ACA100 1T | Samsung 850 Evo 500GB |
Display(s) AOC G2260VWQ6 | LG 24MT57D |
Case Asus Prime 201 | Stock case (black version) |
Audio Device(s) integrated
Power Supply BeQuiet! Pure Power 11 400W | 12v barrel jack |
Mouse Logitech G500 |Steelseries Rival 300
Keyboard Qpad MK-50 (Cherry MX brown)| Blaze Keyboard
Software Windows 10, Various Linux distros | Gentoo Linux
But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment, IPS is not going to be (much) better than TN while still having the response time trade-off. Although I suspect you buy only $300+ monitors.
 