Tuesday, January 7th 2014

AMD Responds to NVIDIA G-Sync with FreeSync

At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display fluidity technology that's an evolution of V-Sync, and one we've seen with our own eyes to make a tangible difference. AMD, in a back room of its CES booth, demoed what various sources are calling "FreeSync," a competing technology to G-Sync, but one that doesn't require specialized hardware or licensing fees from display makers. AMD didn't reveal many details about the inner workings of FreeSync, but here's what we make of it.

FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. The feature lets the GPU spool down the refresh rate to save power, without triggering a display re-initialization (the flicker that occurs when a digital display is sent a signal with a new resolution and refresh rate), on supported displays. Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that support it, AMD Catalyst drivers already use dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into their displays.
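
To make the idea concrete, here is a minimal sketch of what driver-side dynamic refresh could look like, assuming a panel that advertises a supported refresh range. The structure and function names are our own illustration, not actual Catalyst or driver APIs.

```c
/* Hypothetical sketch of driver-side dynamic refresh; names are illustrative,
 * not real driver APIs. The key point is that the refresh rate changes
 * without a full mode-set, so there is no re-initialization flicker. */
#include <stdbool.h>
#include <stdint.h>

struct panel_caps {
    bool     dynamic_refresh;   /* panel accepts refresh changes without re-init */
    uint32_t min_refresh_hz;
    uint32_t max_refresh_hz;
};

/* Pick a lower refresh rate when the desktop is idle, using a timing tweak
 * only (no mode-set), falling back to a fixed refresh otherwise. */
static void update_refresh_for_idle(const struct panel_caps *caps,
                                    bool display_is_idle,
                                    void (*set_refresh_no_modeset)(uint32_t hz))
{
    if (!caps->dynamic_refresh)
        return;                              /* unsupported panel: do nothing */

    uint32_t target = display_is_idle ? caps->min_refresh_hz
                                      : caps->max_refresh_hz;
    set_refresh_no_modeset(target);          /* adjust timings in place */
}
```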

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, which is why NVIDIA had to deploy external hardware. Although FreeSync's results should come close to those of G-Sync, NVIDIA's technology will have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow-chart. The goal of both technologies is the same: to make the display's refresh rate a slave to the GPU's frame-rate, rather than the other way around (as with V-Sync).
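
A quick back-of-the-envelope example shows why that inversion matters. The numbers below are ours, purely for illustration: with traditional V-Sync a frame that misses the fixed refresh window has to wait for the next one, while a refresh rate slaved to the GPU starts scan-out as soon as the frame is ready.

```c
/* Minimal sketch of V-Sync quantization vs. a refresh rate that follows
 * the GPU. Illustrative numbers only. Build with -lm. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double refresh_period_ms = 1000.0 / 60.0;  /* fixed 60 Hz panel */
    const double frame_time_ms     = 20.0;           /* GPU needs 20 ms (50 fps) */

    /* V-Sync: the frame waits for the next whole refresh period. */
    double vsync_interval = ceil(frame_time_ms / refresh_period_ms) * refresh_period_ms;

    /* Refresh slaved to the GPU: the panel simply refreshes when the frame is done. */
    double vrr_interval = frame_time_ms;

    printf("V-Sync:           %.1f ms between frames (%.0f fps)\n",
           vsync_interval, 1000.0 / vsync_interval);  /* 33.3 ms -> 30 fps */
    printf("Variable refresh: %.1f ms between frames (%.0f fps)\n",
           vrr_interval, 1000.0 / vrr_interval);      /* 20.0 ms -> 50 fps */
    return 0;
}
```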

In AMD's implementation, the VBLANK length (the interval between two refresh cycles during which the GPU isn't putting out a "new" frame) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds the VBLANK until the next frame is received. With G-Sync, the GPU sends out frames at whatever rate the hardware can manage, while the monitor handles the "sync" part. With FreeSync, the speculation involved in setting the right VBLANK length for the next frame could add some software overhead on the host system; in NVIDIA's implementation, that overhead is shifted to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping implementation costs down: display makers simply have to implement something VESA is already deliberating over, and the Toshiba laptops AMD used in its FreeSync demo at CES already do.
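
To illustrate the speculation the article describes, here is a rough sketch of a driver guessing the next frame's render time from recent history and programming the VBLANK to match. The predictor (a plain moving average) and the numbers are our own assumptions, not AMD's actual algorithm.

```c
/* Rough sketch of VBLANK speculation: guess the next frame's render time
 * from recent frames and program the VBLANK length accordingly.
 * Illustration only; not AMD's real heuristic. */
#include <stdio.h>

#define HISTORY 4

static double predict_next_frame_ms(const double *recent, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += recent[i];
    return sum / n;                 /* naive predictor: average of last n frames */
}

int main(void)
{
    /* Recent render times in ms; the next real frame spikes to 30 ms. */
    double history[HISTORY] = { 16.0, 17.0, 16.5, 16.8 };
    double next_frame_ms    = 30.0;

    double predicted = predict_next_frame_ms(history, HISTORY);
    printf("Programmed VBLANK for ~%.1f ms, frame actually took %.1f ms\n",
           predicted, next_frame_ms);
    /* If the guess is short, the refresh fires before the frame is ready and
     * the driver has to repeat the old frame or re-adjust; a display that
     * holds VBLANK until the frame arrives (the G-Sync approach) avoids
     * this guesswork at the cost of extra hardware in the monitor. */
    return 0;
}
```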
Sources: The TechReport, AnandTech

53 Comments on AMD Responds to NVIDIA G-Sync with FreeSync

#51
Wanton
Has anyone seen any other demos of FreeSync than that wind turbine one?
I mean, that isn't very comparable to games, where frames close to each other can have a huge difference in rendering time.
If FreeSync works by predicting and dynamically setting the refresh interval, and not by (let's call it) frame holding like G-Sync, you will get stutter in games for sure, because you can't just predict frame rendering time accurately. Maybe you can buffer some frames, but then you get more latency.

Maybe VESA Direct Drive can help AMD achieve something like G-Sync in the future. But I think that needs more hardware on the display card side.
#52
Xzibit
Wanton said:
Has anyone seen any other demos of FreeSync than that wind turbine one?
I mean, that isn't very comparable to games, where frames close to each other can have a huge difference in rendering time.
If FreeSync works by predicting and dynamically setting the refresh interval, and not by (let's call it) frame holding like G-Sync, you will get stutter in games for sure, because you can't just predict frame rendering time accurately. Maybe you can buffer some frames, but then you get more latency.

Maybe VESA Direct Drive can help AMD achieve something like G-Sync in the future. But I think that needs more hardware on the display card side.
?

The G-Sync module is an add-on TCON that uses DD.

FreeSync uses the eDP TCON. Since it's standard, it won't need to be an add-on through DD or replace any others.
#53
Wanton
Xzibit said:
?

The G-Sync module is an add-on TCON that uses DD.

FreeSync uses the eDP TCON. Since it's standard, it won't need to be an add-on through DD or replace any others.
OK, but eDP is Embedded DisplayPort. What about normal displays? There are probably some people who game on laptops, but what about everyone else?