
AMD Responds to NVIDIA G-Sync with FreeSync

btarunr

Editor & Senior Moderator
At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display-fluidity technology that's an evolution of V-Sync, and one we've seen with our own eyes make a tangible difference. AMD, in the back room of its CES booth, demoed what various sources are calling "FreeSync," a technology competitive with G-Sync, but one that doesn't require specialized hardware or licenses for display makers. AMD didn't give out too many details on the finer workings of FreeSync, but here's what we make of it.

FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. On supported displays, the feature lets the GPU spool down the refresh rate to save power without triggering a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate). Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that do, AMD Catalyst drivers already run dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into their displays.
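To make that concrete, here's a minimal sketch of what driver-side dynamic refresh could look like, in Python. AMD hasn't published the actual logic, so the panel rates and the pick_refresh_rate heuristic below are assumptions for illustration, not how Catalyst actually works:

[code]
# Hypothetical sketch of driver-side dynamic refresh selection. AMD has not
# published the real logic; the supported rates and the heuristic here are
# assumptions for illustration only.

SUPPORTED_RATES_HZ = [60, 48, 40, 30]   # rates the panel advertises (assumed)

def pick_refresh_rate(frames_per_second: float) -> int:
    """Pick the lowest supported refresh rate that still covers the GPU's
    output, saving power without a mode-set (and so without the flicker)."""
    for rate in sorted(SUPPORTED_RATES_HZ):
        if rate >= frames_per_second:
            return rate
    return max(SUPPORTED_RATES_HZ)

# A mostly idle desktop putting out ~25 new frames per second could sit at
# 30 Hz, while a 55 fps game keeps the panel at 60 Hz.
print(pick_refresh_rate(25))   # -> 30
print(pick_refresh_rate(55))   # -> 60
[/code]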

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware. Although the results of FreeSync will be close to those of G-Sync, NVIDIA's technology should have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow chart. The goal of both technologies is the same: to make the display's refresh rate a slave to the GPU's frame rate, rather than the other way around (as with V-Sync).

In AMD's implementation, the VBLANK length (the interval between two refresh cycles during which the GPU isn't putting out a "new" frame) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds onto a VBLANK until the next frame is received. With NVIDIA, the GPU sends out whatever frame rate the hardware can manage, while the monitor handles the "sync" part. With AMD, the speculation involved in setting the right VBLANK length for the next frame could add some software overhead on the host system; in NVIDIA's implementation, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping implementation costs down: display makers simply have to implement something VESA is already deliberating over, and the Toshiba laptops AMD used in its FreeSync demo at CES already support it.
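Here's a toy illustration of where that speculation bites, in Python. The frame times and the predicted_vblank helper are made up for the example (a naive "reuse the last frame time" guess); neither vendor has published its actual heuristics:

[code]
# Toy comparison of the two approaches described above; purely illustrative,
# not based on any published implementation. Frame times in milliseconds.

frame_times = [16.1, 18.4, 22.0, 17.3, 30.5]    # what the GPU actually takes

def predicted_vblank(history, default=16.7):
    """FreeSync-style, as described above: the driver must guess the next
    frame's duration ahead of time; this guess just reuses the last one."""
    return history[-1] if history else default

mismatch = 0.0
history = []
for ft in frame_times:
    guess = predicted_vblank(history)
    mismatch += abs(ft - guess)   # a wrong guess means the frame flips early/late
    history.append(ft)

print(f"driver-side guessing: {mismatch:.1f} ms of cumulative mismatch")
print("display-held VBLANK (G-Sync-style): ~0 ms, the panel simply waits")
[/code]

The point isn't the numbers; it's where the work sits. The guessing, and the cost of guessing wrong, lives on the host in AMD's approach, and in the monitor's hardware in NVIDIA's.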

View at TechPowerUp Main Site
 
FreeSync taps into a lesser known feature that AMD Radeon GPUs have had for the past three generations (i.e. since Radeon HD 5000 series), called dynamic refresh rates.

Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.

...the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware.

Oh, you mean like the adaptive v-sync setting in the Nvidia control panel?

[Attached image: GTX-680-120.jpg]

NVIDIA said:
NVIDIA Adaptive VSync makes your gaming experience smoother and more responsive by eliminating frame rate stuttering and screen tearing
 
Past 3 generations eh? So they waited until now to tell us this?

Yes, because nobody cared about dynamic refresh rates until now (with G-Sync).
 
Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.

Why so surprised? AMD had something useful and they didn't market it, because their marketing skills are abysmal, and they assumed nobody would care about it. Kinda like that entire section they sold off that's now incredibly successful. They're only quietly marketing it now because NVidia is set to make a killing on GSync monitors.
 
BAM!!....and there they are again lol.
 
Yes, because nobody cared about dynamic refresh rates until now (with G-Sync).

What makes you think nobody cared about it? People have been complaining about vsync on/off tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu, anyone?
 
What makes you think nobody cared about it? People have been complaining about vsync tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu, anyone?
What did Virtu do for refresh rates? I thought they only worked on hybridizing GPUs.
 
What did Virtu do for refresh rates? I thought they only worked on hybridizing GPUs.

google "hyperformance" and "virtual vsync"

now we know that its mostly a dud, (as with most software optimizations regarding this area), but it was a hot selling point for their chipset back in the days.
 
What makes you think nobody cared about it? People have been complaining about vsync tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu, anyone?

Lucid Virtu makes almost all of my games stutter horribly, even on a single monitor. All it's good for is producing wonderful benchmark scores. I guess nobody uses Lucid Virtu while actually playing games.
 
What makes you think nobody cared about it? People have been complaining about vsync tearing/stuttering issues ever since its implementation.

Ummm... you do know vsync helps get rid of tearing/stuttering issues, not cause them, right?

I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.

Barely could get through Uncharted and a few other PS3 games because vsync wasn't enabled due to my HDfury adapter breaking it. No, I don't have an HDCP monitor. I bought the gen before everyone suddenly made it standard. Lousy DRM companies.
 
Lucid Virtu makes almost all of my games stutter horribly, even on a single monitor. All it's good for is producing wonderful benchmark scores. I guess nobody uses Lucid Virtu while actually playing games.

Exactly, which is why I said a software solution to a hardware problem generally doesn't work. Unless this is some massive breakthrough that we missed a few years ago and nobody knew about, I won't be convinced until I see some real results.
 
Ummm... you do know vsync helps get rid of tearing/stuttering issues, not cause them, right?

I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.

Barely could get through Uncharted and a few other PS3 games because vsync wasn't enabled due to my HDfury adapter breaking it. No, I don't have an HDCP monitor. I bought the gen before everyone suddenly made it standard. Lousy DRM companies.

I never said vsync was creating the tearing; it removes it, yes. I simply summarized the issue in one sentence. I thought people would get the point, but now I have to type this entire paragraph just to explain it again: vsync was traditionally known for stuttering/lag on some video cards/games; it was never the perfect solution, as gamers had to choose between tearing and lag.

Also, vsync does not get rid of stuttering; it creates it, because stuttering is caused by having vsync enabled when your frame rate drops under the monitor's refresh rate.
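Rough back-of-the-envelope illustration of that point, assuming a 60 Hz panel; the render times below are made up:

[code]
# With vsync on, a frame that misses a refresh window waits for the next one.
# Assumes a 60 Hz panel; render times are made up for illustration.
import math

REFRESH_MS = 1000 / 60              # ~16.7 ms per refresh at 60 Hz

def displayed_interval(render_ms):
    """With vsync, presentation snaps to the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (15.0, 17.0, 25.0, 34.0):
    shown = displayed_interval(render_ms)
    print(f"render {render_ms:4.1f} ms -> shown every {shown:4.1f} ms (~{1000 / shown:.0f} fps)")

# A 17 ms frame is only a hair under 60 fps, but vsync holds it to ~30 fps;
# bouncing between 16.7 ms and 33.3 ms frames is exactly the stutter I mean.
[/code]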
 
Hmm... NVidia is certainly better at capitalizing on their own ideas. If you want to take advantage of G-Sync, you'll have to buy multiple NV products. Somewhat reminiscent of SLI, no?

AMD comes out with a tech that they'll let everyone take advantage of.

It remains to be seen which one is superior. I'm guessing NV.
 
BTW, AMD/ATi already patented much the same sync tech back in 2002: [url]http://www.google.com/patents/US20080055318[/URL]

It's called dynamic frame rate adjustment.

Perhaps this is why AMD doesn't care too much about G-Sync: they've had it for 11 years.
 
...you'll have to buy multiple NV products. Somewhat reminiscent of SLI, no?

AMD comes out with a tech that they'll let everyone take advantage of.

like Mantle and crossfire?

...oh wait.
 
We need more info about this FreeSync, and also a comparison test sometime...
I love anything that is free, btw
 
I never said vsync was creating the tearing; it removes it, yes. I simply summarized the issue in one sentence. I thought people would get the point, but now I have to type this entire paragraph just to explain it again: vsync was traditionally known for stuttering/lag on some video cards/games; it was never the perfect solution, as gamers had to choose between tearing and lag.

Also, vsync does not get rid of stuttering; it creates it, because stuttering is caused by having vsync enabled when your frame rate drops under the monitor's refresh rate.


Having vsync off causes tearing.

Having vsync turned on with a badly coded game engine COULD cause stuttering. That's the game's fault, since developers assumed no one had hardware fast enough to run past 60 FPS, so they never looked into the issue.
 
Man, that is the worst analysis ever. It doesn't provide any concrete example or data, it just trashes AMD for free. Plus there are some really immature statements in that "article" :)

My thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better on a demo after seeing FreeSync in a back room on a tiny laptop NOT DESIGNED FOR GAMES".

AMD are just complete retards. NVidia demos their polished GSync, probably on some real nice hardware. AMD clearly rushed out a demo on a crappy little laptop, showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on some high-end hardware in an AAA title.
 
My thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better on a demo after seeing FreeSync in a back room on a tiny laptop NOT DESIGNED FOR GAMES".

AMD are just complete retards. NVidia demos their polished GSync, probably on some real nice hardware. AMD clearly rushed out a demo on a crappy little laptop, showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on some high-end hardware in an AAA title.


Way to miss the point: they wanted to show it CAN work on existing hardware, including where it matters most, crap low-end hardware that can't push 60 FPS.
 
Way to miss the point: they wanted to show it CAN work on existing hardware, including where it matters most, crap low-end hardware that can't push 60 FPS.

FreeSync is supposed to be competing against GSync, right? I'm fairly certain people buying GSync monitors have enough cash to splash on a good system, considering they're spending so much on a GSync monitor. If this isn't for the high-end market like GSync monitors, then it isn't competing at all, just another bit of free stuff for everyone.
 
Man, that is the worst analysis ever. It doesn't provide any concrete example or data, it just trashes AMD for free. Plus there are some really immature statements in that "article" :)

My thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better on a demo after seeing FreeSync in a back room on a tiny laptop NOT DESIGNED FOR GAMES".

Ending AMD hype - worst analysis ever.

Just a few examples:
False: [url]http://fudzilla.net/home/item/33570-kaveri-presentation-leaked-fails-to-impress[/URL]
True: http://fudzilla.net/home/item/33558-nvidia-heading-for-a-spanking-this-year
 
Patented 7-11 years ago by ATi and implemented 3 years ago in AMD GPUs, along with something they're pushing for in the VESA standard, but never improved or capitalized on because it conflicts with their marketing budget/open-standard mentality?

So very, very, very typical of ATi... history repeats itself for the nth time. It sounds like pretty much every baseline hardware-block/GPU-use implementation outside hardware T&L for the last decade: ATi takes an idea, implements it, and pushes for it to become a standard in DX/OGL while it goes unused (because, being a new invention, it's initially proprietary); NVIDIA makes a version much later that's based on that initial idea but developed further and pushed harder (because of marketing, or newer fab processes affording them the space to implement it), usually at that point in a needlessly proprietary manner; and then eventually it becomes a standard.

Another entry in the forward-thinking but badly capitalized-on technology of ATi. May they forever be the guinea pigs who break the ice and let NVIDIA popularize it so we all eventually benefit. Hopefully the open version of the tech, now that it's in fashion, is further realized and adopted.

I'mma file this right next to TRUFORM and CTM, and hope it turns out as well as those eventually did and will.
 