
AMD Responds to NVIDIA G-Sync with FreeSync

Discussion in 'News' started by btarunr, Jan 7, 2014.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,959 (11.02/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display-fluidity technology that's an evolution of V-sync, and one we've seen with our own eyes to make a tangible difference. AMD, in a back room of its CES booth, demoed what various sources are calling "FreeSync," a technology competitive with G-Sync, but one that doesn't require specialized hardware or licenses from display makers. AMD didn't give out many details on the finer workings of FreeSync, but here's what we make of it.

    FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. The feature allows GPUs to spool down refresh rates to save power, without entailing a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate), on supported displays. Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that support it, AMD Catalyst drivers already run dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into their displays.
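    As a rough illustration (a minimal sketch in Python; the rate list and function name are hypothetical, not AMD driver API), the driver's job under dynamic refresh boils down to picking the lowest panel-supported rate that still covers the content's update rate:

        # Toy model of dynamic refresh rates (illustrative only).
        # The driver drops the panel to a lower refresh rate when content is
        # mostly static, saving power without a mode-set, and thus without the
        # flicker of a display re-initialization.

        SUPPORTED_RATES_HZ = [30, 48, 60]  # hypothetical rates the panel accepts

        def pick_refresh_rate(updates_per_second: int) -> int:
            """Return the lowest supported rate that still covers the content."""
            for rate in SUPPORTED_RATES_HZ:
                if updates_per_second <= rate:
                    return rate
            return SUPPORTED_RATES_HZ[-1]

        print(pick_refresh_rate(2))   # idle desktop -> 30 Hz
        print(pick_refresh_rate(55))  # game/video   -> 60 Hz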

    According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware. Although the results of FreeSync should be close to those of G-Sync, NVIDIA's technology will have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow chart. The goal of both technologies is the same: to make the display's refresh rate a slave to the GPU's frame rate, rather than the other way around (as with V-Sync).

    In AMD's implementation, the VBLANK length (the interval between two refresh cycles, during which the GPU isn't putting out "new" frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds onto a VBLANK until the next frame is received. With NVIDIA, the GPU sends out whatever frame rate the hardware can manage, while the monitor handles the "sync" part. With AMD, the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead on the host system; in NVIDIA's implementation, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping implementation costs down: display makers simply have to implement something that VESA is already deliberating over. The Toshiba laptops AMD used in its FreeSync demo at CES already do.
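    To make the difference concrete, here is a minimal sketch (hypothetical numbers and function names, based only on the description above) of why mis-speculating the VBLANK length costs AMD's approach latency that a frame-driven refresh avoids:

        # Contrast of the two sync models as described above (illustrative only).
        # AMD-style: the driver predicts the next frame time and programs a
        # matching VBLANK length in advance; a wrong guess adds a wait.
        # NVIDIA-style: the panel simply holds VBLANK until the frame arrives.

        def amd_style_penalty_ms(predicted_ms: float, actual_ms: float) -> float:
            """Extra wait when a frame finishes later than the driver speculated."""
            return max(0.0, actual_ms - predicted_ms)

        def gsync_style_penalty_ms(actual_ms: float) -> float:
            """The display waits for the frame itself, so no speculation penalty."""
            return 0.0

        # A frame predicted at 16 ms that actually takes 20 ms to render:
        print(amd_style_penalty_ms(16.0, 20.0))  # 4.0 ms added by mis-speculation
        print(gsync_style_penalty_ms(20.0))      # 0.0 ms, the panel held VBLANK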

    Sources: The TechReport, AnandTech
     
    sttubs says thanks.
  2. arterius2

    Joined:
    May 21, 2011
    Messages:
    520 (0.40/day)
    Thanks Received:
    105
    Past 3 generations, eh? So they waited until now to tell us this? I call BS; typical AMD PR stunt.

    Oh, you mean like the adaptive v-sync setting in the Nvidia control panel?

     
    Last edited: Jan 7, 2014
  3. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,959 (11.02/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    Yes, because nobody cared about dynamic refresh rates until now (with G-Sync).
     
  4. RCoon

    RCoon Gaming Moderator Staff Member

    Joined:
    Apr 19, 2012
    Messages:
    7,966 (8.17/day)
    Thanks Received:
    4,082
    Location:
    Gypsyland, UK
    Why so surprised? AMD had something useful and they didn't market it, because their marketing skills are abysmal, and they assumed nobody would care about it. Kinda like that entire section they sold off that's now incredibly successful. They're only quietly marketing it now because NVidia is set to make a killing on GSync monitors.
     
  5. buggalugs

    buggalugs

    Joined:
    Jul 19, 2008
    Messages:
    957 (0.41/day)
    Thanks Received:
    154
    Location:
    Australia
    BAM!!....and there they are again lol.
     
  6. arterius2

    Joined:
    May 21, 2011
    Messages:
    520 (0.40/day)
    Thanks Received:
    105
    What makes you think nobody cared about it? People have been complaining about vsync on/off tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu, anyone?
     
    Last edited: Jan 7, 2014
  7. Solidstate89

    Solidstate89

    Joined:
    May 29, 2012
    Messages:
    209 (0.22/day)
    Thanks Received:
    40
    Location:
    Western New York
    What did Virtu do for refresh rates? I thought it only dealt with hybridizing GPUs.
     
  8. arterius2

    Joined:
    May 21, 2011
    Messages:
    520 (0.40/day)
    Thanks Received:
    105
    google "hyperformance" and "virtual vsync"

    now we know that its mostly a dud, (as with most software optimizations regarding this area), but it was a hot selling point for their chipset back in the days.
     
  9. SIGSEGV

    SIGSEGV

    Joined:
    Mar 31, 2012
    Messages:
    510 (0.51/day)
    Thanks Received:
    107
    Lucid Virtu makes almost all of my games stutter horribly, even on a single monitor. It only does well at creating wonderful benchmark scores. I guess no one uses Lucid Virtu while playing games.
     
  10. NC37

    NC37

    Joined:
    Oct 30, 2008
    Messages:
    1,205 (0.54/day)
    Thanks Received:
    270
    Ummm... you do know vsync helps get rid of tearing/stuttering issues, not cause them, right?

    I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.

    I could barely get through Uncharted and a few other PS3 games because vsync wasn't enabled, my HDfury adapter having broken it. No, I don't have an HDCP monitor; I bought the generation before everyone suddenly made it standard. Lousy DRM companies.
     
  11. arterius2

    Joined:
    May 21, 2011
    Messages:
    520 (0.40/day)
    Thanks Received:
    105
    Exactly, which is why I said a software solution to a hardware problem genuinely doesn't work. Unless this is some massive breakthrough that we missed a few years ago and nobody knew about, I won't be convinced until I see some real results.
     
    Last edited: Jan 7, 2014
  12. arterius2

    Joined:
    May 21, 2011
    Messages:
    520 (0.40/day)
    Thanks Received:
    105
    I never said vsync was creating the tearing; it removes it, yes. I simply summarized the issue in one sentence. I thought people would get the point, but now I have to type this entire paragraph just to explain it again. Vsync was traditionally known for stuttering/lag on some video cards/games; it was never the perfect solution, as gamers had to choose between tearing and lag.

    Also, vsync does not get rid of stuttering, it creates it: stuttering is caused by having vsync enabled when your frame rate drops under the monitor's refresh rate.
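    A quick worked example of that (a minimal sketch; the 60 Hz panel and render times are illustrative numbers): with vsync on, every frame has to land on a refresh boundary, so missing the deadline by even a couple of milliseconds halves your frame rate, which is exactly the stutter being described.

        import math

        REFRESH_HZ = 60
        REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh interval

        def vsynced_fps(render_time_ms: float) -> float:
            """Effective FPS when each frame must wait for a refresh boundary."""
            intervals = math.ceil(render_time_ms / REFRESH_MS)
            return REFRESH_HZ / intervals

        print(vsynced_fps(15.0))  # 60.0 -> fast enough, full 60 FPS
        print(vsynced_fps(17.0))  # 30.0 -> barely missed, snaps down to 30 FPS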
     
    Last edited: Jan 7, 2014
  13. Sasqui

    Sasqui

    Joined:
    Dec 6, 2005
    Messages:
    7,800 (2.36/day)
    Thanks Received:
    1,523
    Location:
    Manchester, NH
    Hmm... NVidia is certainly better at capitalizing on their own ideas. If you want to take advantage of G-Sync, you'll have to buy multiple NV products. Somewhat reminiscent of SLI, no?

    AMD comes out with a tech that they'll let everyone take advantage of.

    It remains to be seen which one is superior. I'm guessing NV.
     
  14. Rahmat Sofyan

    Rahmat Sofyan

    Joined:
    Sep 7, 2011
    Messages:
    99 (0.08/day)
    Thanks Received:
    20
    Location:
    Pekanbaru -Riau - Indonesia - Earth
    BTW, AMD/ATi already patented much the same sync tech back in 2002:

    http://www.google.com/patents/US20080055318

    It's called "dynamic frame rate adjustment."

    Perhaps this is why AMD doesn't care too much about G-Sync: they had it 11 years ago.
     
  15. Wittermark New Member

    Joined:
    Dec 4, 2012
    Messages:
    22 (0.03/day)
    Thanks Received:
    4
    Location:
    uk
    like Mantle and crossfire?

    ...oh wait.
     
  16. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,284 (1.92/day)
    Thanks Received:
    299
    We need more info about this FreeSync, and a comparison test at some point...
    I love anything that's free, btw.
     
  17. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,551 (11.42/day)
    Thanks Received:
    9,824

    Having vsync off causes tearing.

    Having vsync turned on with a badly coded game engine COULD cause stuttering. That's the game's fault, since the developers assumed no one had hardware fast enough to run past 60 FPS and never looked into the issue.
     
  18. Big_Vulture

    Big_Vulture New Member

    Joined:
    Jul 10, 2013
    Messages:
    27 (0.05/day)
    Thanks Received:
    6
    Does this also work in the 20-30 FPS range, where Nvidia G-Sync isn't good?
     
  19. Recus

    Recus

    Joined:
    Jul 10, 2011
    Messages:
    533 (0.42/day)
    Thanks Received:
    184
  20. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,284 (1.92/day)
    Thanks Received:
    299
  21. RCoon

    RCoon Gaming Moderator Staff Member

    Joined:
    Apr 19, 2012
    Messages:
    7,966 (8.17/day)
    Thanks Received:
    4,082
    Location:
    Gypsyland, UK
    My thoughts exactly. That guy seems like a douche, and provided no information other than "I think GSync looks better on a demo after seeing Freesync in a backroom on a tiny laptop NOT DESIGNED FOR GAMES".

    AMD are just completely clueless. NVidia demos their polished GSync, probably on some real nice hardware. AMD clearly rushed out a demo on a crappy little laptop, showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on some high-end hardware in an AAA title.
     
    Last edited: Jan 7, 2014
  22. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,551 (11.42/day)
    Thanks Received:
    9,824

    Way to miss the point: they wanted to show it CAN work on existing hardware, including where it matters most, crap low-end hardware that can't push 60 FPS.
     
  23. RCoon

    RCoon Gaming Moderator Staff Member

    Joined:
    Apr 19, 2012
    Messages:
    7,966 (8.17/day)
    Thanks Received:
    4,082
    Location:
    Gypsyland, UK
    Freesync is supposed to be competing against GSync right? I'm fairly certain people buying GSync monitors have enough cash to splash on a good system considering they're spending so much on a GSync monitor. If this isn't for the high end market like GSync monitors, then it isn't competing at all, just another bit of free stuff for everyone.
     
  24. Recus

    Recus

    Joined:
    Jul 10, 2011
    Messages:
    533 (0.42/day)
    Thanks Received:
    184
    Ending AMD hype - worst analysis ever.

    Just a few examples: False http://fudzilla.net/home/item/33570-kaveri-presentation-leaked-fails-to-impress
    True http://fudzilla.net/home/item/33558-nvidia-heading-for-a-spanking-this-year
     
  25. alwayssts

    alwayssts

    Joined:
    May 13, 2008
    Messages:
    387 (0.16/day)
    Thanks Received:
    87
    Patented 7-11 years ago by ATi, implemented 3 years ago in AMD GPUs, along with something they're pushing for in the VESA standard, but never improved or capitalized on because it conflicts with their marketing budget and open-standard mentality?

    So very, very, very typical of ATi... history repeats itself for the nth time. It sounds like pretty much every baseline hardware block/GPU-use implementation outside hardware T&L for the last decade: ATi takes an idea, implements it, and pushes for it to become a standard in DX/OGL while it goes unused (an invention is, by definition, initially proprietary). Much later, nvidia makes a version based on that initial idea but developed further and pushed harder (because of marketing, or newer fab processes affording them the space to implement it), usually in a needlessly proprietary manner at that point, and eventually it becomes a standard.

    Another entry in ATi's ledger of forward-thinking but badly capitalized technology. May they forever be the guinea pigs that break the ice and allow nvidia to publicize it, so we all eventually benefit. Hopefully the open version of the tech, now that it's in fashion, is further realized and adopted.

    I'mma file this right next to TRUFORM and CTM, and hope this turns out as well as those eventually did, and will.
     
    Last edited: Jan 7, 2014
