
How G-Sync and FreeSync differ right deep down inside

Which is better at doing the job, G-Sync or FreeSync?


Total voters: 55
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I think it falls on VESA. FreeSync exists because VESA didn't go far enough with DisplayPort 1.2a, and it may have stemmed from a lack of understanding of the issues in pushing an unfinished frame out to the monitor. To compensate, AMD had to create a proprietary standard (FreeSync) which requires monitors not only to support Adaptive Sync (to send the signal), but also to enforce a minimum refresh rate internally so that, should the GPU fail to deliver it, the panel doesn't destroy itself. VESA needs to require Adaptive Sync monitors to have a memory buffer that can hold the previous frame in full and pull from that buffer whenever the input falls below the panel's minimum.

Why should the buffer be in the monitor? Two reasons:
1) Reduces DisplayPort bandwidth usage which may be needed elsewhere.
2) It is extremely simple to calculate how large a frame buffer needs to be in a monitor based on its specs (height * width * bits per pixel), whereas a single GPU running six displays would have to maintain six separate buffers of potentially ridiculous size to satisfy the need (assuming it isn't pulled from its existing memory).

If VESA did this, FreeSync and G-Sync would go the way of the dodo and GPU design wouldn't get more complicated.

VESA has done this and better, just not in desktop form. The LVDS panels are the same; it's just a matter of adapting it to standalone monitors. I pointed it out more than a year ago here.



It's what Nvidia tried to copy, except Nvidia is using the AUX channel to the sink and thus disabling the standard's audio pass-through. They still don't have any kind of backlight control.

If sales of VRR panels take hold, I'd expect DP 1.4a to look more like the above.
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
See, you say "that at least 3 different sources I read have already done it", but then don't list any of them. Are you referring to "Mobile G-Sync"? It doesn't surprise me at all that they'd develop a DP 1.2a compliant offering to compete.

http://www.extremetech.com/extreme/198603-leaked-nvidia-driver-offers-taste-of-mobile-g-sync
http://wccftech.com/nvidia-gsync-mobility-confirmed-require-dedicated-module-raises-questions/

These sites... I'm not sure I'd consider them credible. I'll believe it when Wizzard validates their claims, or someone of similar chops does.

Yes, they were using laptops with mobile Maxwells, as at the time it was only laptops that had FreeSync-capable screens.

One video was on the Linus forums and another I think was from NeweggTV. The third was from the random guy who did it first on his blog. I found it hard to believe at first too, till I saw others doing it as well.

Sorry dude, my history gets cleaned all the time, so finding links from more than the past few days can be a nightmare :/
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
If sales of VRR panels take hold, I'd expect DP 1.4a to look more like the above.
But do any eDP 1.3 panels exist yet? eDP 1.3 was published in February 2011, so if none exist, it is simply because screen manufacturers aren't making them. It drives me nuts that they'll support G-Sync and FreeSync but not eDP 1.3, which makes both moot when coupled with a DisplayPort 1.2a GPU.
 
Joined
Nov 9, 2010
Messages
5,649 (1.16/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
I think it falls on VESA. FreeSync exists because VESA didn't go far enough with DisplayPort 1.2a, and it may have stemmed from a lack of understanding of the issues in pushing an unfinished frame out to the monitor. To compensate, AMD had to create a proprietary standard (FreeSync) which requires monitors not only to support Adaptive Sync (to send the signal), but also to enforce a minimum refresh rate internally so that, should the GPU fail to deliver it, the panel doesn't destroy itself. VESA needs to require Adaptive Sync monitors to have a memory buffer that can hold the previous frame in full and pull from that buffer whenever the input falls below the panel's minimum.

Why should the buffer be in the monitor? Two reasons:
1) Reduces DisplayPort bandwidth usage which may be needed elsewhere.
2) It is extremely simple to calculate how large a frame buffer needs to be in a monitor based on its specs (height * width * bits per pixel), whereas a single GPU running six displays would have to maintain six separate buffers of potentially ridiculous size to satisfy the need (assuming it isn't pulled from its existing memory).

If VESA did this, FreeSync and G-Sync would go the way of the dodo and GPU design wouldn't get more complicated.

Disagree completely; that would be taking a step in the direction of G-Sync by needlessly complicating and pricing up displays, which would cause a lot of would-be display manufacturers to bow out. I also think avid gamers are more eager to spend big money on a GPU than on a display.

AMD have also proven that it takes a fairly sophisticated GPU architecture and drivers to make it all work as it's used now, which is the most scalable way to serve consumers, versus putting all the expense in the displays. It makes more sense to put that level of sophistication in the GPU, because pretty much ALL avid gamers buy fairly advanced GPUs, but the same is not always true of the displays they buy. The latter is very much personal preference.

VESA researches things pretty carefully before deciding which direction to go on the standards they support. There are a lot of businesses that depend on them making those decisions well. Having supported FreeSync by evolving Adaptive Sync has left the display market open to a lot more price ranges of displays that offer a G-Sync alternative.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
The cost is going to be in the GPU or the display. I'd argue (VESA too) it makes more sense to be in the display because the display knows its own limits. VESA really just needs to require that DisplayPort ports include Embedded DisplayPort compliant internals and it all becomes a non-issue. Adaptive Sync, the power-saving (the GPU can shut off when nothing is changing on the display), and the bandwidth-saving (updating only the parts that need updating) features of DisplayPort should be propagated out to all GPUs and displays.

If the displays had the memory buffer, the GPU wouldn't have to change much, if at all, outside of the DisplayPort standards they are already compliant with. The problems stem from trying to mimic display technologies in the GPU. The advantages of putting a frame buffer in the display extend far, far, far beyond "gamers." The bandwidth and power savings would greatly benefit businesses and workstations, for example.
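To put rough numbers on that bandwidth-saving point, here's a quick illustrative sketch (Python; the 1080p/60 Hz panel and the 200x200 changed region are just assumed values, not a description of how the protocol actually packs partial updates):

Code:
# Back-of-envelope link bandwidth: resending full frames vs. updating only a small
# changed region, which is the kind of saving partial updates are after.
def mb_per_second(width, height, hz, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * hz / 1e6

full = mb_per_second(1920, 1080, 60)   # ~498 MB/s to push the whole 1080p frame 60 times a second
patch = mb_per_second(200, 200, 60)    # ~9.6 MB/s if only a 200x200 region actually changed
print(f"full frames: {full:.0f} MB/s, partial updates: {patch:.1f} MB/s")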
 
Joined
Nov 9, 2010
Messages
5,649 (1.16/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
The cost is going to be in the GPU or the display. I'd argue (VESA too) it makes more sense to be in the display because the display...

That depth of R&D and architecture already IS in the GPU. You're suggesting that be ignored, while adding unnecessary cost to the displays. All that's really needed at this point is lower-refresh displays, and TVs have for some time been able to produce 30Hz for 1080i. FreeSync as it now exists is not going to be hard for display manufacturers to facilitate. Give it some time.

There's no need to overcomplicate this architecturally or financially. That's what Nvidia is doing with G-Sync, at a high resulting price on displays, and how well is that working for them? They can't even get many display manufacturers interested, especially with the advent of FreeSync, and only elite gamers with expensive rigs looking for high-refresh displays seem interested.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,866 (3.00/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Our forum poll is looking interesting now.

During the first few days, G-Sync was clearly in the lead, but now FreeSync has a commanding lead with 20 votes to 13, along with a substantial Not Sure vote of 11. Finally, there are 4 Other votes.

Looking at some of the posts, I'm beginning to shift from G-Sync to Not Sure. I would really have to see the two in action to make a properly informed decision.

Anyway, I'm all for an open standard, where FreeSync has a clear edge.
 
Joined
Feb 18, 2011
Messages
1,401 (0.29/day)
Location
Romania
Processor Ryzen 5700x
Motherboard MSI B350 Gaming Pro Carbon
Cooling be quiet dark rock pro 3
Memory GSKill Aegis 32GB (4x8GB) DDR4 3200MHz CL16
Video Card(s) PowerColor Radeon RX 7800 XT Hellhound 16GB GDDR6 256-bit
Storage Seagate Barracuda SATA-II 1TB , HyperX Savage 240GB SATA 3
Display(s) Benq EX2780Q
Case Be Quiet! Dark Base Pro 900
Audio Device(s) Sound BlasterX G6
Power Supply Seasonic prime TX-650
Mouse Marvo Scorpion G981
Keyboard Razer Blackwidow Elite - Yellow Switch
Software Windows 10 Pro
I also find this interesting:
http://www.gamespot.com/articles/review-amd-freesync-is-a-credible-threat-to-g-sync/1100-6426125/

"In the long run, though, it's tough to see where Nvidia is going to go with G-Sync. Yes, it has the superior technology and performance, but propriety technologies, however well they work, generally tend to lose out to open standards. Plus, as AdaptiveSync matures and AMD's drivers improve, Freesync's little niggles like minimum refresh rates and ghosting could be solved. With monitor makers not having to pay a premium for propriety modules, and with laptops particularly likely to have the technology baked in for power-saving reasons (lowering the refresh rate for static images), Freesync is the more attractive proposition"
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
That depth of R&D and architecture already IS in the GPU. You're suggesting that be ignored, while adding unnecessary cost to the displays. All that's really needed at this point is lower-refresh displays, and TVs have for some time been able to produce 30Hz for 1080i. FreeSync as it now exists is not going to be hard for display manufacturers to facilitate. Give it some time.

More likely the displays run at 2x that: for a 30 fps movie, the display refreshes at 60, just like 24-25 fps PAL content would probably be around 48. It gives the same thing with less complication.

There's no need to overcomplicate this architecturally or financially. That's what Nvidia is doing with G-Sync, at a high resulting price on displays, and how well is that working for them? They can't even get many display manufacturers interested, especially with the advent of FreeSync, and only elite gamers with expensive rigs looking for high-refresh displays seem interested.

As you say, overcomplicated. Depends on how you view that. For example, for AMD to do the refresh doubling/tripling stuff G-Sync does, they would have to use GPU memory to store that data. As we've seen with the ghosting issue, not all panels are the same. It could make the drivers more complicated by having to tune everything in the driver for EVERY panel made, whereas the G-Sync way can leave all that up to the G-Sync module to handle with zero extra crap in the drivers. If you think using GPU RAM won't mean much for frame doubling, think about what happens as we get to higher resolutions, or even say 3-monitor setups; that uncompressed image that has to be updated constantly can get rather large.

Understanding why they do things the way they do should come before whining and complaining about the cost of it. Saying "FreeSync is a new tech" doesn't fly when AMD was touting it 6-8 months beforehand, yet it still has issues they could have worked on a long time ago. Nvidia has to be doing something right, or else they wouldn't have the massive market-share advantage.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
8K = 7680 * 4320 * 4 bytes of color data = 132,710,400 bytes = 126.5625 MiB (note: some 8K formats could go up to 48-bit, or 6 bytes per pixel: 189.84375 MiB)
1080p = 1920 * 1080 * 4 bytes of color data = 8,294,400 bytes = 7.91015625 MiB

Assuming GPU manufacturers draw the line at 8K for maximum resolution, that means every GPU would require 128 MiB of dedicated frame buffer memory for every display. GPUs support anywhere from two (256 MiB) to six displays (768 MiB). That's a lot of memory to be sitting around, especially if you only have 1080p displays and only require 8 MiB per display for the frame buffer. It makes a lot more sense to match the frame buffer to the monitor.
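For anyone who wants to sanity-check the arithmetic, here's a quick back-of-envelope sketch (Python; the helper name and the per-output worst case are just illustrative, using the same uncompressed-frame assumption as above):

Code:
# Rough frame buffer sizing, assuming uncompressed frames and one buffer per display.
def frame_mib(width, height, bytes_per_pixel=4):
    """Size of a single uncompressed frame in MiB."""
    return width * height * bytes_per_pixel / 2**20

print(frame_mib(1920, 1080))       # ~7.91 MiB  (1080p, 32-bit color)
print(frame_mib(7680, 4320))       # ~126.56 MiB (8K, 32-bit color)
print(frame_mib(7680, 4320, 6))    # ~189.84 MiB (8K, 48-bit color)

# Worst case if the GPU had to reserve a dedicated 8K-sized buffer per output:
for outputs in (2, 6):
    print(f"{outputs} displays -> {outputs * 128} MiB set aside on the GPU")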

With eDP implemented on both ends and a frame buffer in the display, the benefits go far beyond merely adaptive refresh rates. No matter how much more the monitor costs with eDP (10-20%, I suspect), it should save power and thus be cheaper in the long run.
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I think some are confusing the comments made by PCPerspective guessing at a solution and interpreting them as a definitive answer. It never made sense to begin with, because PCPerspective never bothered looking at the VESA standards or what VESA has done throughout the development of G-Sync. The articles they write are pretty much verbatim from Nvidia marketing guy Tom Petersen.

One of Tom Petersen's first statements when AMD demoed FreeSync on a Toshiba Satellite laptop was that it uses a different panel tech, LVDS, like it was alien tech. All modern panels, mobile to desktop, have been using it, and all G-Sync panels have been on LVDS.

Tom Petersen said:
However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS

I haven't read that AMD is trying to buffer in software anywhere other than from the guys over at PCPerspective. AMD has already had its SoCs and APUs compliant with VRR and PSR on mobile and other devices.

VESA just needs to standardize more eDP features into DP faster.
 
Last edited:
Joined
Oct 5, 2008
Messages
1,802 (0.32/day)
Location
ATL, GA
System Name My Rig
Processor AMD 3950X
Motherboard X570 TUFF GAMING PLUS
Cooling EKWB Custom Loop, Lian Li 011 G1 distroplate/DDC 3.1 combo
Memory 4x16GB Corsair DDR4-3466
Video Card(s) MSI Seahawk 2080 Ti EKWB block
Storage 2TB Auros NVMe Drive
Display(s) Asus P27UQ
Case Lian Li 011-Dynamic XL
Audio Device(s) JBL 30X
Power Supply Seasonic Titanium 1000W
Mouse Razer Lancehead
Keyboard Razer Widow Maker Keyboard
Software Window's 10 Pro
I think Nvidia will do both. It just depends on how widely the market adopts FreeSync over G-Sync, and whether G-Sync displays continue to sell at a profitable price point to maintain/grow the tech, or whether the cost-effective alternative is "good enough" and "cheap enough" to kick G-Sync out of the market. As with anything, the first-mover advantage gives them the ability to charge more, since they made it to market long before AMD's tech.
 
Last edited:
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
I think Nvidia will do both. It just depends on how widely the market adopts FreeSync over G-Sync, and whether G-Sync displays continue to sell at a profitable price point to maintain/grow the tech, or whether the cost-effective alternative is "good enough" and "cheap enough" to kick G-Sync out of the market. As with anything, the first-mover advantage gives them the ability to charge more, since they made it to market long before AMD's tech.

Again, the tech isn't new. It was developed long before it hit the eDP 1.3 standard back in 2010. Its adaptation to gaming in standalone monitors is what's new to the consumer.

As for Nvidia doing both: possible, but they would have to rework their sink method to be compatible with newer standards. The rumored new G-Sync v2 monitor is still on DP 1.2. We'll have to see if they are still using VESA DDM in the monitor or something different.

Tom Petersen said:
When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval.

If it's already supported, how can it be new? VESA standardizing it in DP 1.2a+ was so it wouldn't conflict with other standards, and new revisions of DP are backwards compatible. Not to mention DP 1.3 improves vblanking so LCDs aren't restricted to CRT-era vblanking.

Intel has supported PSR since the Ivy Bridge generation, and VRR+PSR in some of its Broadwell chips. I would say Intel and AMD have a better understanding of how to integrate it into their chips, since they've been doing it much longer with their SoCs and APUs.
 
Last edited:
Joined
Jan 13, 2009
Messages
424 (0.08/day)
Nvidia marketing is amazing. They've got everyone clamoring to play on their 144Hz monitors at less than 40Hz. Many of the people I've seen touting the superiority of G-Sync due to its handling of output below the VRR range are the same ones who claim they could never go back to gaming on a 60Hz monitor because of the lack of smoothness and the blur. 40Hz is where it's at now, though. o_O

A benefit of FreeSync is that it doesn't require the use of vsync to work, where G-Sync does. With FreeSync, these same people who require high-speed screens for competitive gaming don't have to put up with the additional latency associated with vsync. While I believe this isn't a big deal, before this latest below-the-VRR-range marketing push I've seen people argue until they're red in the face that it is. Why isn't anyone making a bigger deal of this aspect of the performance difference between the two? It seems just as important an outlier situation as the low-refresh handling.
 
Joined
Nov 9, 2010
Messages
5,649 (1.16/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
Understanding why they do things the way they do should come before whining and complaining about the cost of it.
LOL, I'm not "whining", just stating the reality. Nvidia's huge market share doesn't come from their oft-broken, seldom-used features. It comes from sheer GPU performance, brand loyalty, and higher profit margins, much of the latter coming from the business sector on high-priced products.

You act like I don't understand, yet Nvidia continues to put out features that are not, in fact, living up to the hype. And GPUs ARE being given more VRAM. That will also be less of a problem in the future with the stacked DRAM that both Nvidia and AMD have adopted.

The only thing FreeSync really needs is lower-refresh displays. It's already been proven in bench tests that, within the refresh range of the display, FreeSync performs equally well, at a much lower cost. It seems many are in denial of that and are trying to invent problems that don't exist.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Actually, it smells a lot like the relationship AMD and Intel had in 2004-2006. Intel was found guilty of offering rebates and paying OEMs to use Intel processors. AMD has been pretty consistently competitive with NVIDIA in terms of performance and price, just as they were with CPUs back in 2004-2006.


Refreshing the display below the display's minimum refresh rate is what we have been discussing. G-Sync is more expensive because it addresses it using a frame buffer, which is absolutely required to fix the problem. Instead of everyone running out and buying G-Sync, I'd argue AMD needs to push eDP displays to fix the problem.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Question.

When I see a 'FreeSync' monitor advertised, what does that mean? Does it hold, in its own hardware, nothing that Nvidia couldn't use if it wanted to? I.e., if Nvidia wrote some code in their drivers, their cards would instantly be compatible? Or are FreeSync monitors inherently only AMD compatible? Shouldn't the right term for a FreeSync monitor be Variable Refresh Tech, thus abandoning any proprietary leanings?

You know what I'm saying: is a FreeSync monitor AMD-only, or can a 'FreeSync' monitor (if Nvidia changed their minds) be compatible with Nvidia cards (Kepler upwards)?

Until the tech is universal, it's still a waste of a monitor IMO.
 
Joined
Nov 10, 2008
Messages
1,982 (0.35/day)
Processor Intel Core i9 9900k @ 5.1GHZ all core load (8c 16t)
Motherboard MSI MEG Z390 ACE
Cooling Corsair H100i v2 240mm
Memory 32GB Corsair 3200mhz C16 (2x16GB)
Video Card(s) Powercolor RX 6900 XT Red Devil Ultimate (XTXH) @ 2.6ghz core, 2.1ghz mem
Storage 256GB WD Black NVME drive, 4TB across various SSDs/NVMEs, 4TB HDD
Display(s) Asus 32" PG32QUX (4k 144hz mini-LED backlit IPS with freesync & gsync & 1400 nit HDR)
Case Corsair 760T
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed on powerplay mousemat
Keyboard Logitech G910
VR HMD Wireless Vive Pro & Valve knuckles
Software Windows 10 Pro
Question.

When I see a 'FreeSync' monitor advertised, what does that mean? Does it hold, in its own hardware, nothing that Nvidia couldn't use if it wanted to? I.e., if Nvidia wrote some code in their drivers, their cards would instantly be compatible? Or are FreeSync monitors inherently only AMD compatible? Shouldn't the right term for a FreeSync monitor be Variable Refresh Tech, thus abandoning any proprietary leanings?

You know what I'm saying: is a FreeSync monitor AMD-only, or can a 'FreeSync' monitor (if Nvidia changed their minds) be compatible with Nvidia cards (Kepler upwards)?

Until the tech is universal, it's still a waste of a monitor IMO.

The term "freesync" is an AMD marketing term for monitors that support the VESA variable refresh specification (there is no difference between a freesync monitor and a VESA variable refresh compatible monitor beyond the branding). Yes Nvidia could write an implementation for their GPUs tomorrow that support the VESA variable refresh spec (as long as there isn't a hardware limitation on their cards). That would then allow all variable refresh monitors (including all freesync branded monitors) to work on Nvidia cards at a variable refresh rate. Intel could also do the same thing for their integrated GPUs.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Refreshing the display below the display's minimum refresh rate is what we have been discussing. G-Sync is more expensive because it addresses it using a frame buffer, which is absolutely required to fix the problem. Instead of everyone running out and buying G-Sync, I'd argue AMD needs to push eDP displays to fix the problem.

Depends what you want to address: the VRR on/off collisions or the PSR feature. You need a frame buffer if you want to filter each frame for PSR or overdrive matching.

TheTechReport said:
Everything I've just explained may seem terribly complicated, but the bottom line is straightforward. FreeSync's logic for handling low-FPS situations isn't anywhere near as bad as some folks have suggested, and it isn't all that different from G-Sync's. Nvidia's method of avoiding collisions seems like it might be superior in some ways, but we're talking about small differences.

Testing the practical impact of these differences in real games is tough. Nothing good is happening when your average frame rate is below 40 FPS, with bottlenecks other than the display's behavior coming into play. Sorting out what's a GPU slowdown and what's a display collision or quantization isn't always easy.
Still, I made an attempt in several games intensive enough to push our R9 290X below 40 FPS. Far Cry 4 was just a stutter-fest, with obvious system-based bottlenecks, when I cranked up the image quality. Crysis 3, on the other hand, was reasonably playable at around 35 FPS.
In fact, playing it was generally a good experience on the XL2730Z. I've seen low-refresh quantization effects before (by playing games on one of those 30Hz-only 4K monitors), and there was simply no sign of it here. I also had no sense of a transition happening when the frame rate momentarily ranged above 40Hz and then dipped back below it. The experience was seamless and reasonably fluid, even with vsync enabled for "out of bounds" frame intervals, which is how I prefer to play. My sense is that, both in theory and in practice, FreeSync handles real-world gaming situations at lower refresh rates in perfectly acceptable fashion. In fact, my satisfaction with this experience is what led me to push harder to understand everything I've explained above.

Collision handling I think can be improved a smidge, but much like Nvidia's attempts, they haven't been able to fix theirs in almost 2 years, so I wouldn't keep my hopes up.

PCPerspective said:
  • This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) see this effect to a more or less degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
  • The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
  • The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).

Depending on the game engine, you're still getting some issues either way with each solution.

The real question is whether you plan on sustaining sub-40 fps, and I don't know why anyone would want to. Is the frame-multiplying feature of G-Sync worth the $200 premium?
 
Last edited:

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Depending on the game engine, you're still getting some issues either way with each solution.
But that's my point: VRR should be entirely handled in the hardware, with zero software beyond choosing the refresh rate (whatever the screen can handle + variable). What you see on the screen should always come from the frame buffer, and the GPU updates the frame buffer when possible. That way there is always a completed frame available for display, even at 60 Hz, even if the GPU failed to render a new frame since the last refresh. It smooths out all refresh rates regardless of the GPU attached. It doesn't matter what software ("game engine") is used either, because it is all handled below the software. It would even improve games made a decade ago (e.g. Crysis), whether VRR is enabled or not.

The only downsides are a little bit of cost relative to the resolution of the panel and a slight increase in latency.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
But that's my point: VRR should be entirely handled in the hardware, with zero software beyond choosing the refresh rate (whatever the screen can handle + variable). What you see on the screen should always come from the frame buffer, and the GPU updates the frame buffer when possible. That way there is always a completed frame available for display, even at 60 Hz, even if the GPU failed to render a new frame since the last refresh. It smooths out all refresh rates regardless of the GPU attached. It doesn't matter what software ("game engine") is used either, because it is all handled below the software. It would even improve games made a decade ago (e.g. Crysis), whether VRR is enabled or not.

The only downsides are a little bit of cost relative to the resolution of the panel and a slight increase in latency.

I see what you're saying and I agree.

The issue with the G-Sync method is that it's doing frame multiplication below the VRR minimum limit. It then goes into the realm of the MC/ME (motion compensation) processing we find in TVs.

G-Sync monitors:
Asus ROG Swift 50-144 Hz (G-Sync duplicates at 36 Hz and operates at 56 Hz)
BenQ 56-144 Hz (G-Sync duplication behavior should be similar to, or kick in slightly higher than, the ROG Swift)

PSR should never exceed the VRR minimum.
PSR should always sustain its maximum.

The ideal setup would be:
VRR 40-144
PSR at 39 and below

If you're multiplying frames above your VRR minimum, there really is no need to have the VRR floor that low, because those frames would be smooth either way.
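To make that split concrete, here's a tiny illustrative sketch (Python; all numbers hypothetical) of a display-side policy where everything under the VRR floor falls back to PSR from the panel's own buffer instead of GPU-side frame multiplication:

Code:
# Hypothetical 40-144 Hz panel: inside the window the refresh tracks the frame rate,
# below it the panel self-refreshes (PSR) from its own buffered copy of the last frame.
VRR_MIN, VRR_MAX = 40, 144

def display_mode(fps):
    if fps >= VRR_MIN:
        return ("VRR", min(fps, VRR_MAX))   # refresh locks to the frame rate, capped at the panel max
    return ("PSR", VRR_MIN)                 # panel re-scans its buffered frame at its floor rate

for fps in (10, 39, 40, 90, 200):
    print(fps, "fps ->", display_mode(fps))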
 
Last edited:
Joined
Nov 9, 2010
Messages
5,649 (1.16/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
Refreshing the display below the display's minimum refresh rate is what we have been discussing.
You know, Ford, I'm not an idiot. I know full well what you're implying is needed, and that's exactly what I was addressing as inventing problems that don't exist. Displays can easily be made with a 30Hz or even lower bottom end. If you're hovering below that, you're obviously using higher game settings than your hardware can handle.

Most games don't perform very well below 30 FPS anyway. It's as if you're saying Nvidia is prepared for very low frame rates because their graphics features aren't very well optimized. I think many here are in denial that FreeSync sounds more appealing to consumers. If VESA thought it needed special hardware on the displays, I doubt they'd have adopted it as is. Nor have they adopted G-Sync.

Odd that you're trying to lecture me on what a handful of you are discussing/forcing, when you don't even see the writing on the wall. Look at the poll results; they're a good example of how consumers and manufacturers feel, which is what won the HD format war between Blu-ray and HD DVD. This is a very similar situation.

With the advances that are coming, it shouldn't be too hard for avid gamers into gaming-specific displays to maintain 30 FPS or higher. Nvidia, like many somewhat suspect console-catering dev/pub teams, seems to be trying to convince gamers that low FPS results are acceptable.
 
Last edited by a moderator:

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I would call FreeSync's behavior below the monitor's minimum refresh rate a bug. There are games out there (like Crysis) that run perfectly fine at 40+ FPS until you get to one of the few situations where that potentially drops into the single digits. If you're saying the answer to the problem is QA testing every game ever made so they never drop below [insert arbitrary number here] FPS, then you have PC gaming confused with consoles. The bug needs to be fixed in FreeSync; eDP 1.3 + DP 1.2a is the best solution.
 
Joined
Nov 9, 2010
Messages
5,649 (1.16/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
First off, I'm not talking about absolute minimum frames, and I think you're confused by that. When they talk about minimum frames in these tests, they're referring to how low the AVERAGE frame rate dips.

What you're suggesting is building the tech around the likelihood of frame rates hovering so low that single-digit FPS readings become as common as those hovering at 30 FPS, and there I'm talking about momentary dips AVERAGING 30 FPS.

Technologies, GPU hardware and games have ALL been built around trying to hit certain FPS averages, so why would this be anything new? The only time it's going to be a problem is if a person using, say, a display with a 30Hz bottom end frequently hovers below that in average FPS, and that describes a person using too little GPU power and/or too-high game settings to realistically use such a tech.

Bottom line, I'd rather spend the $100 saved on the display on a more powerful GPU. Again, this comes back to expensive vs affordable displays, and again, how well is the former really working for Nvidia? They've had the tech long enough, yet it's not doing too well.

You seem to be implying you've got this figured out better than VESA does. VESA, if anything, are trying to keep this tech alive, versus seeing it stagnate like G-Sync.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
There's a monitor minimum refresh rate and a rendering minimum refresh rate. FreeSync monitors never drop below their minimum refresh rate even if the rendering does, which causes many graphical artifacts. G-Sync monitors never drop below their minimum refresh rate because the GPU sends the last completed frame as many times as necessary to satisfy the monitor's needs.

What I'm suggesting is always keeping the last frame in the monitor, so the monitor is never without a completed frame to display, be the GPU running at 1000 frames per second or 1. When the monitor is operating in VRR and it is a 40-144 Hz monitor, here are some examples of frame rate versus refresh rate:

1 fps -> 144 Hz (frame 1 refreshed ~144 times)
2 fps -> 144 Hz (frame 1 refreshed ~72 times, frame 2 refreshed ~72 times)
4 fps -> 144 Hz (frame 1 refreshed ~36 times, frame 2 refreshed ~36 times, frame 3 refreshed ~36 times, frame 4 refreshed ~36 times)
8 fps -> 144 Hz (you get the idea)
16 fps -> 144 Hz
32 fps -> 144 Hz
64 fps -> 64 Hz (it has exceeded the minimum so now VRR can lock to the frame rate)
128 fps -> 128 Hz
256 fps -> 144 Hz (exceeded what the monitor can handle, every refresh shows the last completed frame from buffer, 144 in total)
512 fps -> 144 Hz (same as above, 72% of rendered frames are not displayed).
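A tiny sketch of that mapping (Python, purely illustrative, with the 40-144 Hz window assumed above): unlike a PSR-style fallback where the panel just re-scans at its floor, here everything under the minimum pins the refresh to the panel maximum and simply repeats the buffered frame:

Code:
# Hypothetical 40-144 Hz panel that always holds the last completed frame in a local buffer:
# below the panel minimum, refresh at max and keep re-scanning the buffered frame;
# inside the window, lock refresh to the frame rate; above it, cap at the panel max.
VRR_MIN, VRR_MAX = 40, 144

def effective_refresh(fps):
    if fps < VRR_MIN:
        return VRR_MAX              # e.g. 1 fps -> 144 Hz, the single frame is shown ~144 times
    return min(fps, VRR_MAX)        # e.g. 64 fps -> 64 Hz, 512 fps -> capped at 144 Hz

for fps in (1, 2, 4, 8, 16, 32, 64, 128, 256, 512):
    print(f"{fps:>3} fps -> {effective_refresh(fps)} Hz")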

VESA did figure it out years ago, but display manufacturers aren't implementing it. AMD and NVIDIA both use DP 1.2a's Adaptive Sync capability. NVIDIA has a proprietary implementation of eDP. AMD uses no eDP in FreeSync (or a really cheap, RAM-free version of it), hence the problems. Neither is fully compliant with the standard VESA approved. I won't buy a VRR display until there is an implementation of Adaptive Sync that is fully compliant with the standard (cheaper than NVIDIA's proprietary implementation; better than AMD's partial implementation).
 
Last edited: