
Crossfire stutter fixed by freesync?

Discussion in 'AMD / ATI' started by X828, Feb 5, 2017.

  1. GhostRyder

    GhostRyder

    Joined:
    Apr 29, 2014
    Messages:
    3,617 (3.14/day)
    Thanks Received:
    2,088
    Location:
    Texas
    Well, coming from 3 R9 290X's and a 1440p FreeSync Acer monitor, I will say I did not experience stuttering in the games I played (though Rise of the Tomb Raider did not like the 3rd card). The few games where I did experience stuttering in the past were fixed via AMD software updates, but those were a long time ago. I mostly played the Battlefield series, Tomb Raider (both of them), League of Legends, and a random assortment of other games temporarily, and did not have the issue unless it was a game that specifically did not support CFX (mostly some indie games or early DayZ and ARK). To me, on the 1440p 144 Hz setup with FreeSync, games were great, especially my FPSs and League of Legends, since that is where I game the most, and I had zero issues with them.

    I would recommend getting the best single GPU you can before going CFX or SLI, though, as both techs are good only when they work (which, for the most part, AAA games are pretty quick to support these days). Personally speaking, I have tried both, and to my eyes and experience they play out exactly the same (G-Sync vs FreeSync, 144 Hz 1440p, in BF4 and BF1 with multiple cards) and there really is no stutter in either of them. I would wait for Vega if you are set on replacing everything, just to see what value comes with the new cards since it should be soon from the sound of things, and then decide. The downside to investing in G-Sync is the price, whereas FreeSync normally does not carry (at least much of) an upcharge.

    But to answer your question directly: no, if the game is going to stutter because it does not work in SLI/CFX, neither of the techs will fix that. It will, however, fix Vsync lag.
     
    londiste says thanks.
  2. londiste

    Joined:
    Feb 3, 2017
    Messages:
    98 (0.70/day)
    Thanks Received:
    30
    you mentioned dynamic contrast. hdr does not mean that dynamic contrast is not used; hdr has nothing to do with it one way or another.
    at least in hdr tvs the technology used is very similar to, if not the same as, dynamic contrast, since maximum brightness is limited in how long it can be displayed before it is brought back down to a lower level.

    and that same page says (somewhat) clearly - note the footnote at the exact statement you quoted: FreeSync 2 does not require HDR capable monitors;
    don't buy into the hype :)
     
    Last edited: Feb 15, 2017
  3. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    http://www.businesswire.com/news/ho...Defines-Premium-Home-Entertainment-Experience
    The former is a contrast ratio of 20,000:1 (for displays in direct sunlight), the latter is 1,080,000:1. Your average SDR monitor has a contrast ratio of ~1,000:1 and a max brightness of 350 cd/m2. HDR has a max brightness of at least 540 cd/m2.

    HDR has no reason for dynamic contrast (again, no "film").

    Whole quote:
    They're talking about the API. If you're playing a game that supports the FreeSync 2 API and you don't have an HDR monitor, the API will still be able to output SDR content in "native mode."
     
    Last edited: Feb 15, 2017
    Crunching for Team TPU
  4. londiste

    Joined:
    Feb 3, 2017
    Messages:
    98 (0.70/day)
    Thanks Received:
    30
    your reference to hdr definitions is right. the first one is aimed at lcds and the second at oleds.
    - starting with the second - "infinite" contrast with oled is quite easily achievable, since the black level is essentially 0 (pixel turned off).
    - for the first - 20,000:1 static contrast is simply not achievable with lcd. the best panels available are *va panels, which these days have reached 3,000:1, perhaps 4,000:1 contrast. that is awesome but far from the 20,000:1 figure. the larger figure has so far been achieved with, and will continue to be achieved with, dynamic contrast. there is just no way around this. the other lcd technologies are far worse; tn and ips/pls peak at around 1,000:1, exactly as you said.

    the important bit to understand here is that the actual panel technologies are the same. there have been no significant breakthroughs there. 10-bit panels, or realistically mostly 8-bit+frc panels, will be used (ruling out almost all current tn panels). these have existed for a long time in a different sector - professional graphics. brightness, contrast and their limitations will remain the same as they currently are.

    monitors can easily be made brighter than 350 cd/m2; there has just been no reason to do so, as it directly affects black levels and also color accuracy. plus, there has been no real use for extremely bright monitors - a high level of brightness is very tiring for the eyes. also worth noting is that calibrated monitors are usually set to around 100 nits.

    freesync2 is a whole pot of different things and amd is doing their darnedest to clump all of it together and make things as fuzzy as possible for good marketing.
    1. there is the monitor (or rather the panel) - freesync2 pr implies that hdr monitors will be used (high-brightness pixels, excellent black levels, wide color gamut, hdr etc.). this is bullshit. as the same page states, hdr is not a requirement, and if the monitor is not hdr, there is no hdr image.
    2. there is the interface - displayport is utilized in a standard way as part of freesync2. 10-bit signal, hdr metadata.
    3. there is some magic on the computer side. parts of it are in hardware (i would assume some rudimentary support for refresh rate syncing); a large part of it is in software (incorporated into drivers, mostly).
    the last part is the only part where the api comes into play. it looks like this is the part that will handle the color conversion, reduce the latency, do the lfc and other things.

    at this point, they seem to be aiming at windows' color api with the api part of freesync2. games themselves will not have to do anything directly with freesync2.

    i honestly do not have a clue as to what amd means with this quote. which is not a good thing considering that this is an official press release of their new killer feature.
     
    Last edited: Feb 16, 2017
  5. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    Imagine driving in a game at night. The headlights of cars will be 540+ nits while the darkness of night behind them will be <1 nit; also known as realistic. Panels today, with their dynamic contrast, have to tone down the headlights simply because they cannot show something that bright while the rest of the scene is dark.


    1. There will be FreeSync 2 certified monitors.
    2. FreeSync 2 requires DisplayPort 1.4.
    3. Lots of components here:
    3a. FreeSync 2 compatible graphics card (any AMD card with DisplayPort 1.4).
    3b. FreeSync 2 compatible drivers (I believe ReLive or newer).
    3c. The software needs to use the FreeSync 2 API (the card runs all of the output data through a tone map and adjusts brightness and bit rate so it matches the monitor); a rough sketch of that idea follows below.
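    Purely as an illustration of that tone-map step (my own sketch, not AMD's actual API, whose internals have not been published), here is what mapping scene luminance straight to the attached monitor's reported peak brightness could look like:

```python
# Illustrative sketch only -- not the real FreeSync 2 API. The idea: the engine
# works with scene luminance in nits, and a single mapping step compresses it to
# whatever peak brightness the attached monitor reports, instead of clipping.

def tone_map_to_display(scene_nits, display_peak_nits):
    """Reinhard-style curve scaled to the display's reported peak luminance."""
    x = scene_nits / display_peak_nits
    compressed = x / (1.0 + x)          # approaches 1.0 but never clips
    return compressed * display_peak_nits

# Example: 4000-nit headlights shown on a 600-nit HDR panel vs a 350-nit SDR one
# (both figures are placeholders, not real monitor specs).
for peak in (600.0, 350.0):
    print(peak, round(tone_map_to_display(4000.0, peak), 1))
```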

    I think AMD's long-term goal is like Mantle and FreeSync itself: they want VESA to adopt it as a standard, which Vulkan and DirectX will then embed, and HDR support in software becomes easy. AMD will keep the FreeSync 2 certification program going for years so people who want the best damn picture available and are willing to pay for it can get it.

    The brightness. HDR monitors are capable of blinding you. HDR features, therefore, need to be disabled when not running HDR compliant software. Basically it is a "do no harm" clause.



    Edit: Here's a good picture that demonstrates what HDR is all about:
    [image: HDR photo comparison - the merged HDR result on the left, the two original exposures on the right]
    On the right, it shows the two pictures that were actually taken: one is super bright because of direct sunlight and the other is super dark because of shadow. The photo on the left is the HDR combination of them both so you get the light of the bright photo but the detail of the dark photo. The left photo closely mimics what you would see standing there. The photos on the right show limitations of the camera.
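    For anyone who wants that photography analogy spelled out, here is a toy sketch of the merge idea (my own simplification; real HDR merge algorithms such as exposure fusion are considerably more involved): blend the bright and the dark shot per pixel, weighting whichever sample is better exposed.

```python
# Toy exposure-fusion sketch: blend a bright and a dark exposure of the same
# scene so that the better-exposed sample dominates each pixel. Real HDR merge
# algorithms (e.g. Mertens exposure fusion) are more involved; this is just the idea.

def well_exposedness(v):
    """Weight a pixel value in [0, 1] by its distance from pure black/white."""
    return max(0.0, 1.0 - abs(v - 0.5) * 2.0)

def fuse(bright_px, dark_px):
    wb, wd = well_exposedness(bright_px), well_exposedness(dark_px)
    if wb + wd == 0.0:                  # both samples clipped: just average them
        return (bright_px + dark_px) / 2.0
    return (bright_px * wb + dark_px * wd) / (wb + wd)

# A blown-out sky pixel in the bright shot takes most of its detail from the dark shot:
print(fuse(bright_px=0.98, dark_px=0.45))   # ~0.47
```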
     
    Last edited: Feb 16, 2017
    Crunching for Team TPU
  6. londiste

    Joined:
    Feb 3, 2017
    Messages:
    98 (0.70/day)
    Thanks Received:
    30
    i get the use case, and if we are talking about oled monitors, sure. cost is the only objection there. the models available right now cost 5k+ $/€/£ and there is no real hope of these becoming much cheaper soon. tvs are cheaper - 2k €/$/£ will get you an awesome oled tv with (mostly) minor problems for use as a monitor.

    for lcds, this is just not happening without dynamic contrast. <1 nit is not dark enough.
    at a 1,000:1 contrast ratio with 540-nit white, black will be 0.54 nits. that is far from dark. turn your monitor brightness to 100% and see what black looks like - that is likely below 0.54 :)
    at 20,000:1 it would be 0.027 nits, which is close to what the best lcd monitors do today... at around 100-nit brightness for white (a very good mva panel with a 4,000:1 contrast ratio).
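    for reference, those black-level figures fall straight out of dividing white luminance by the static contrast ratio; a quick check:

```python
# Black level (nits) = white luminance (nits) / static contrast ratio.
def black_level(white_nits, contrast_ratio):
    return white_nits / contrast_ratio

print(black_level(540, 1_000))    # 0.54 nits  -> clearly grey in a dark room
print(black_level(540, 20_000))   # 0.027 nits
print(black_level(100, 4_000))    # 0.025 nits -> a good *VA panel at ~100-nit white
```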

    the best indication to go by is the two nvidia g-sync hdr monitors they mentioned around ces, the asus pg27uq and acer xb272-hdr. both are based on an au optronics 27" uhd ips panel. wide gamut, 144 hz and 384 dynamically controlled led backlight zones. prices are expected to be 1200+ $/€/£. that is dynamic backlight control, with zones of around a square inch each.

    also, as per amd, hdr will not be a freesync2 requirement.
    we should probably split the entire hdr argument into a separate thread :)

    agreed.
    according to amd, the requirement for this certification will be lfc - maximum refresh rate at least 2.5x higher than the minimum.
    that is it.
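    as an illustration of what that 2.5x ratio buys (numbers and function names here are mine, not amd's): once the range is wide enough, frames that come in below the panel's minimum refresh can simply be repeated until the effective rate lands back inside the supported window.

```python
# LFC sketch: a VRR range qualifies when max_hz >= 2.5 * min_hz; frames that
# arrive below min_hz are repeated until the effective refresh fits the range.

def lfc_capable(min_hz, max_hz):
    return max_hz >= 2.5 * min_hz

def effective_refresh(fps, min_hz, max_hz):
    """Repeat each frame just enough times to land inside [min_hz, max_hz]."""
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1
    return min(fps * multiplier, max_hz), multiplier

print(lfc_capable(48, 144))            # True  (144 >= 120)
print(effective_refresh(25, 48, 144))  # (50, 2): each frame shown twice, panel runs at 50 Hz
```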

    i don't see why. color management is usually done at the operating system level. if i remember correctly, hdr support should come to windows 10 with creators' update.
     
    Last edited: Feb 16, 2017
  7. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    I never said it would be cheap, but there are economies of scale: the more production there is for OLED, the cheaper it will become.

    The FreeSync 2 monitor certification program has not been finalized yet. From what AMD has said, it's pretty clear that LFC and HDR are going to be pillars of it.
    That same article explains why the API needs to switch between SDR and HDR.


    There will be lots of FreeSync monitors but only some FreeSync 2 certified monitors.
     
    Crunching for Team TPU
  8. londiste

    Joined:
    Feb 3, 2017
    Messages:
    98 (0.70/day)
    Thanks Received:
    30
    oh, now i see where that srgb and brightness quote came from.
    so, this is the part that requires the api for some reason.
    practically all non-tn monitors today are capable of srgb. i am not sure what "2x the brightness" is compared to.

    edit:
    this is getting interesting. this time around nvidia seems to be going for a standard solution in the form of hdr10, while amd goes for a more complicated proprietary solution.

    edit2:
    the real problem here is that amd has not provided a good, detailed technical overview of what freesync2 is going to be. as counterintuitive as that sounds, considering that they have a nice presentation and a press release about it.
     
    Last edited: Feb 16, 2017
  9. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    Because G-Sync is already end-to-end proprietary. AMD right now uses VESA's standards except for where they can't (e.g. LFC). I wouldn't be surprised if DisplayPort 1.5 or 1.6 has the necessary amendments to it so that FreeSync 2 doesn't have to be proprietary. VESA is basically forcing AMD off-standard to compete with G-Sync.

    AMD will publish certification requirements when the technology is finalized. This tech is still in its infancy and they have to work with monitor manufacturers to figure out what is reasonable to require.
     
    Crunching for Team TPU
  10. londiste

    Joined:
    Feb 3, 2017
    Messages:
    98 (0.70/day)
    Thanks Received:
    30
    there really isn't anything in freesync2 that could be added to vesa standards.

    i guess our hopes should be on the next version of hdmi; 2.1 is said to require support for their version of a variable refresh rate standard (as opposed to displayport, where adaptive-sync is optional).
     
    Last edited: Feb 16, 2017
  11. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    LFC should be added to the eDP standard (requiring monitors to be fully buffered). HDR tone mapping could also be implemented in the eDP standard, offloading that processing from the graphics card. The advantage of the latter is that games would not have to handle the tone map in the game engine, eliminating the need for the FreeSync 2 API.
     
    Crunching for Team TPU
  12. kn00tcn

    kn00tcn

    Joined:
    Feb 9, 2009
    Messages:
    1,393 (0.46/day)
    Thanks Received:
    362
    Location:
    Toronto
    that was a lot of reading, not sure if i missed anything, but does this mean that every freesync2 monitor will also be an hdr monitor? i don't care about hdr, only lfc (& cheap... but i do care about 8-bit & VA or IPS, so not TN-cheap)

    i'm also fine with black being grey. i prefer having another dim light in the room rather than total darkness with a painfully bright screen & the side effect of grey being more obvious. as long as everything is consistent, my eyes can calibrate a bit and not care so much about total black

    OLED is disappointing me a lot: the chance of burn-in, higher power use, and I'm not impressed by the accuracy or battery drain on an S5 phone; minimal brightness causes flickering artifacts on the phone as well
     
  13. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    Yes, monitors must meet HDR, LFC, and latency standards to get FreeSync 2 certified. Mind you, manufacturers have to pay for certification, so FreeSync monitors can exist that support all of those things and are not certified. At the same time, FreeSync monitors will still be released with varying levels of support, not unlike now.

    I'm not sure what FreeSync 2 certified monitors will use, but AMD's goal with this is to bring HDR to PC gaming. I'm really hoping to see $400 2560x1440 144 Hz FreeSync 2 monitors but... that might be completely unreasonable. :(
     
    DRDNA says thanks.
    Crunching for Team TPU
  14. londiste

    Joined:
    Feb 3, 2017
    Messages:
    98 (0.70/day)
    Thanks Received:
    30
  15. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,026 (6.30/day)
    Thanks Received:
    9,275
    Location:
    IA, USA
    Already been over that: that's talking about the API. The FreeSync 2 API supports SDR and HDR monitors. If developers integrate the FreeSync 2 API in their code, they don't have to put the burden of having an HDR monitor on their users. HDR features will only be enabled through the FreeSync 2 API when there is an HDR monitor attached.
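    A hypothetical sketch of that pattern (invented names; the real FreeSync 2 API has not been published): the game queries the attached display once and only takes the HDR path when the display reports it.

```python
# Hypothetical sketch of the pattern described above: query the display once,
# enable the HDR path only if it reports HDR. Names are invented; the real
# FreeSync 2 API is not public.

class DisplayCaps:
    def __init__(self, supports_hdr, peak_nits):
        self.supports_hdr = supports_hdr
        self.peak_nits = peak_nits

def choose_output_mode(caps):
    if caps.supports_hdr:
        return ("hdr", caps.peak_nits)        # wide gamut, tone-mapped to the panel
    return ("sdr", min(caps.peak_nits, 350))  # plain SDR "native mode" fallback

print(choose_output_mode(DisplayCaps(True, 600)))    # ('hdr', 600)
print(choose_output_mode(DisplayCaps(False, 300)))   # ('sdr', 300)
```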

    FreeSync 2 certified monitors require HDR, LFC, and low latency.
     
    Crunching for Team TPU
