
Crossfire stutter fixed by freesync?

Joined
Apr 29, 2014
Messages
3,688 (2.78/day)
Likes
2,106
Location
Texas
System Name Alucard / The Reinforcer / Portable?
Processor i7 5930K @ 4.5ghz (24/7) / 2x Intel Xeon X5670 / Intel i7 3610QM
Motherboard MSI X99S Gaming 9 AC / Dell Dual Socket (R710) / MSI Stock Gaming Laptop
Cooling RX 360mm + 140mm Custom Loop in Push Pull Config. / Dell Stock / MSI Stock
Memory Corsair Vengeance DDR4 2666 16gb (4x4gb) CL 16 / 1333mhz DDR3 96gb 12 x 8gb / 12gb DDR3 3 x 4gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector) / GTX 880m
Storage Samsung 840/850 512gb Raid 0, WD Velociraptor 600gb x 5 Raid 5 / 300gb 15k RPM x 8 / 2x 240gb Adata
Display(s) Acer XG270HU 1440p 144hz Freesync, Acer B286HK 4K UHD Monitor, 1 Hanns-G 27inch 1920x1080p Monitor
Case Corsair Obsidian 800D / Dell Poweredge R710 Rack Mount Case / MSI Gaming 17inch
Audio Device(s) Realtek ALC1150 (onboard)
Power Supply Rosewill Lightning 1300Watt
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 10 Pro / Windows Server 2008 R2 / Windows 10 Pro
#26
I'm about to upgrade my aging 780 Ti along with my monitor, but G-Sync is so expensive that I'm thinking of going with two 480s and an Acer 27" 1440p monitor with a 144 Hz refresh rate. The monitor is just under 500 bucks, on top of the 500 I'd spend on the GPUs.

I'd love to go with a 1080, but the extra cost of G-Sync monitors just plain sucks.

As the title says... Will an adaptive sync monitor remedy the stutter associated with Crossfire? If not, then G-Sync and a 1080 it is.
Well, coming from three R9 290Xs and a 1440p FreeSync Acer monitor, I can say I did not experience stuttering in the games I played (though Rise of the Tomb Raider did not like the third card). The few games where I did see stuttering in the past were fixed by AMD software updates, but those were a long time ago. I mostly played the Battlefield series, Tomb Raider (both of them), League of Legends, and an assortment of other games, and did not have the issue unless a game specifically did not support CFX (mostly some indie games, or early DayZ and ARK). On the 1440p 144 Hz FreeSync setup, games were great, especially my FPSes and League of Legends, since that is where I game the most, and I had zero issues with them.

I would recommend getting the best single GPU you can before going CFX or SLI, as both techs are only good when they work (though most AAA games are pretty quick to get that support these days). Personally, I have tried both, and to my eyes they play out exactly the same (G-Sync vs. FreeSync at 144 Hz 1440p in BF4 and BF1 with multiple cards); there really is no stutter in either. If you are set on replacing everything, I would wait for Vega just to see what value the new cards bring, since it should be soon from the sound of things, and then decide. The downside to investing in G-Sync is the price, whereas FreeSync normally carries little or no upcharge.

But to answer your question directly: no. If a game stutters because it doesn't work in SLI/CFX, neither tech will fix that. They will, however, fix V-sync lag.
 
Joined
Feb 3, 2017
Messages
201 (0.63/day)
Likes
63
Processor i7-6700k
Motherboard asus z170i pro gaming
Cooling ekwb custom loop for cpu/gpu running on d5 and 480 rad
Memory 2*16gb ddr4-2400
Video Card(s) msi geforce gtx 1080 ti aero
Storage 250gb 950 pro, 2*500gb samsung 850 evo
Display(s) asus pg279q, eizo ev2736w
Case thermaltake core p5
Power Supply seasonic platinum 660
Mouse logitech g700
Keyboard corsair k60
#27
Sounds like you need to do more research on what HDR is. To meet HDR requirements, dynamic contrast simply doesn't work.
you mentioned dynamic contrast. hdr does not mean that dynamic contrast is not used; hdr has nothing to do with it one way or another.
at least in hdr tvs, the technology used is very similar to, if not the same as, dynamic contrast: maximum brightness can only be held for a limited time before it is brought down to a lower level.

and that same page says (somewhat) clearly - note the footnote at the exact statement you quoted: FreeSync 2 does not require HDR capable monitors;
don't buy into the hype :)
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
20,927 (6.24/day)
Likes
10,026
Location
IA, USA
System Name BY-2015
Processor Intel Core i7-6700K (4 x 4.00 GHz) w/ HT and Turbo on
Motherboard MSI Z170A GAMING M7
Cooling Scythe Kotetsu
Memory 2 x Kingston HyperX DDR4-2133 8 GiB
Video Card(s) PowerColor PCS+ 390 8 GiB DVI + HDMI
Storage Crucial MX300 275 GB, Seagate 6 TB 7200 RPM
Display(s) Samsung SyncMaster T240 24" LCD (1920x1200 HDMI) + Samsung SyncMaster 906BW 19" LCD (1440x900 DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay
Audio Device(s) Realtek Onboard, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse SteelSeries Sensei RAW
Keyboard Tesoro Excalibur
Software Windows 10 Pro 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
#28
you mentioned dynamic contrast. hdr does not mean that dynamic contrast is not used; hdr has nothing to do with it one way or another.
at least in hdr tvs, the technology used is very similar to, if not the same as, dynamic contrast: maximum brightness can only be held for a limited time before it is brought down to a lower level.
http://www.businesswire.com/news/ho...Defines-Premium-Home-Entertainment-Experience
A combination of peak brightness and black level either:
  • More than 1000 nits peak brightness and less than 0.05 nits black level
    OR
  • More than 540 nits peak brightness and less than 0.0005 nits black level
The former is a contrast ratio of 20,000:1 (displays in direct sunlight); the latter is 1,080,000:1. Your average SDR monitor has a contrast ratio of ~1000:1 and a max brightness of 350 cd/m². HDR requires a max brightness of at least 540 cd/m².
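The arithmetic behind those two tiers is worth spelling out; a quick sketch with the numbers taken straight from the UHD Alliance figures above:

```python
# Contrast ratio is just peak luminance divided by black level, both in nits.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

# The two UHD Alliance "Ultra HD Premium" tiers quoted above:
lcd_tier = contrast_ratio(1000, 0.05)      # LCD-oriented tier
oled_tier = contrast_ratio(540, 0.0005)    # OLED-oriented tier
sdr_typical = contrast_ratio(350, 0.35)    # a typical ~1000:1 SDR monitor

print(f"{lcd_tier:,.0f}:1, {oled_tier:,.0f}:1, {sdr_typical:,.0f}:1")
```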

HDR has no reason for dynamic contrast (again, no "film").

and that same page says (somewhat) clearly - note the footnote at the exact statement you quoted: FreeSync 2 does not require HDR capable monitors;
don't buy into the hype :)
Whole quote:
Qualifying FreeSync™ 2 monitors will harness low-latency, high-brightness pixels, excellent black levels, and a wide color gamut to display High Dynamic Range (HDR) content. *FreeSync 2 does not require HDR capable monitors; driver can set monitor in native mode when FreeSync 2 supported HDR content is detected.
They're talking about the API. If you're playing a game that supports the FreeSync 2 API and you don't have an HDR monitor, the API will still be able to output SDR content in "native mode."
 
Joined
Feb 3, 2017
#29
http://www.businesswire.com/news/ho...Defines-Premium-Home-Entertainment-Experience
The former is a contrast ratio of 20,000:1 (displays in direct sunlight); the latter is 1,080,000:1. Your average SDR monitor has a contrast ratio of ~1000:1 and a max brightness of 350 cd/m². HDR requires a max brightness of at least 540 cd/m².
HDR has no reason for dynamic contrast (again, no "film").
your reference to the hdr definitions is right. the first one is aimed at lcds and the second at oleds.
- starting with the second: "infinite" contrast is quite easily achievable with oled, since the black level is essentially 0 (pixel turned off).
- for the first: 20,000:1 static contrast is simply not achievable with lcd. the best panels available are *va panels, which these days have reached 3,000:1, perhaps 4,000:1 contrast. that is awesome but far from the 20,000:1 figure. the larger figure has so far been achieved, and will continue to be achieved, with dynamic contrast; there is just no way around this. the other lcd technologies are far worse: tn and ips/pls peak at around 1,000:1, exactly as you said.

the important bit to understand here is that the actual panel technologies are the same; there have been no significant breakthroughs there. 10-bit panels, or realistically mostly 8-bit+frc panels, will be used (ruling out almost all current tn panels). these have existed for a long time in a different sector, professional graphics. brightness, contrast and their limitations will remain as they currently are.

monitors can easily be made brighter than 350 cd/m², there has just been no reason to do so, since it directly affects black levels and color accuracy. plus, there has been no real use for extremely bright monitors; a high level of brightness is very tiring on the eyes. also worth noting is that calibrated monitors are usually set to around 100 nits.

Whole quote:
They're talking about the API. If you're playing a game that supports the FreeSync 2 API and you don't have an HDR monitor, the API will still be able to output SDR content in "native mode."
freesync2 is a whole pot of different things, and amd is doing their darnedest to clump it all together and make things as fuzzy as possible for good marketing.
1. there is the monitor (or rather the panel) - freesync2 pr implies that hdr monitors will be used (high-brightness pixels, excellent black levels, wide color gamut, hdr etc.). this is bullshit. as the same page states, hdr is not a requirement, and if the monitor is not hdr, there is no hdr image.
2. there is the interface - displayport is utilized in a standard way as part of freesync2: 10-bit signal, hdr metadata.
3. there is some magic on the computer side. parts of it are in hardware (i would assume some rudimentary support for refresh rate syncing), but a large part is in software (incorporated into drivers, mostly).
the last part is the only part where the api comes into play. it looks like this is the part that handles the color conversion, reduces the latency, does the lfc and other things.

at this point, they seem to aim at windows' color api with the api part of freesync2. games themselves will not have to do anything directly with freesync2.

driver can set monitor in native mode when FreeSync 2 supported HDR content is detected.
i honestly do not have a clue what amd means by this quote. which is not a good thing, considering this is an official press release for their new killer feature.
 

FordGT90Concept

#30
monitors can easily be made to be brighter than 350cm/m2, there just has been no reason to do so as it will directly affect black levels, also color accuracy. plus, there has been no real use for extremely bright monitors. High level of brightness is very tiring to eyes. Also worth noting is that calibrated monitors are usually set to around 100 nits.
Imagine driving in a game at night. The headlights of cars will be 540+ nits while the darkness of night behind them will be <1 nit; also known as realistic. Panels today, with their dynamic contrast, have to tone down the headlights simply because they cannot show something that bright while the rest of the scene is dark.


freesync2 is a whole pot of different things, and amd is doing their darnedest to clump it all together and make things as fuzzy as possible for good marketing.
1. there is the monitor (or rather the panel) - freesync2 pr implies that hdr monitors will be used (high-brightness pixels, excellent black levels, wide color gamut, hdr etc.). this is bullshit. as the same page states, hdr is not a requirement, and if the monitor is not hdr, there is no hdr image.
2. there is the interface - displayport is utilized in a standard way as part of freesync2: 10-bit signal, hdr metadata.
3. there is some magic on the computer side. parts of it are in hardware (i would assume some rudimentary support for refresh rate syncing), but a large part is in software (incorporated into drivers, mostly).
the last part is the only part where the api comes into play. it looks like this is the part that handles the color conversion, reduces the latency, does the lfc and other things.
1. There will be FreeSync 2 certified monitors.
2. FreeSync 2 requires DisplayPort 1.4.
3. Lots of components here:
3a. FreeSync 2 compatible graphics card (any AMD card with DisplayPort 1.4).
3b. FreeSync 2 compatible drivers (I believe ReLive or newer).
3c. The software needs to use the FreeSync 2 API (it runs all of the data past a tone map in the card and adjusts brightness and bit depth so the output matches the monitor).
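To make 3c concrete, here's a toy sketch of what "running the output through a tone map for the monitor" could mean. The function and the Reinhard-style curve are my own illustration, not AMD's actual API: the idea is that the renderer compresses scene luminance once, directly into the range the monitor reports, so no second pass is needed in the display.

```python
# Hypothetical illustration of display-targeted tone mapping (NOT AMD's API).

def tone_map(scene_nits: float, peak_nits: float, black_nits: float) -> float:
    """Reinhard-style curve rescaled to the display's [black, peak] range."""
    x = scene_nits / peak_nits
    compressed = x / (1.0 + x)          # smoothly rolls off highlights
    return black_nits + compressed * (peak_nits - black_nits)

# A 10,000-nit specular highlight still fits on a 1,000-nit panel:
highlight = tone_map(10_000, 1_000, 0.05)
shadow = tone_map(0.5, 1_000, 0.05)
print(highlight, shadow)
```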

at this point, they seem to aim at windows' color api with the api part of freesync2. games themselves will not have to do anything directly with freesync2.
I think AMD's long-term goal is like Mantle and FreeSync itself: they want VESA to adopt a standard based on it, which Vulkan and DirectX will embed, and then HDR support in software becomes easy. AMD will keep the FreeSync 2 certification program going for years so people who want the best damn picture available, and are willing to pay for it, can get it.

i honestly do not have a clue what amd means by this quote. which is not a good thing, considering this is an official press release for their new killer feature.
The brightness. HDR monitors are capable of blinding you, so HDR features need to be disabled when not running HDR-compliant software. Basically it is a "do no harm" clause.



Edit: Here's a good picture that demonstrates what HDR is all about:

On the right are the two photos that were actually taken: one is super bright because of direct sunlight, and the other is super dark because of shadow. The photo on the left is the HDR combination of both, so you get the light of the bright photo and the detail of the dark photo. The left photo closely mimics what you would see standing there; the photos on the right show the limitations of the camera.
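The merging step being described can be sketched in a few lines. This toy version just weights each pixel toward whichever exposure is closer to mid-gray; real HDR merging uses camera response curves and radiance estimates, so treat this purely as the idea:

```python
# Toy exposure fusion on grayscale pixel values in [0, 1]: favor whichever
# exposure is better exposed (closer to mid-gray 0.5) at each pixel.

def fuse(bright_exposure, dark_exposure):
    fused = []
    for b, d in zip(bright_exposure, dark_exposure):
        wb = 1.0 - abs(b - 0.5)   # weight for the bright shot
        wd = 1.0 - abs(d - 0.5)   # weight for the dark shot
        fused.append((wb * b + wd * d) / (wb + wd))
    return fused

bright = [0.98, 0.55]   # sky blown out, shadowed wall readable
dark   = [0.50, 0.05]   # sky recovered, wall crushed to black
print(fuse(bright, dark))   # sky detail comes mostly from the dark shot
```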
 
Joined
Feb 3, 2017
#31
Imagine driving in a game at night. The headlights of cars will be 540+ nits while the darkness of night behind them will be <1 nit; also known as realistic. Panels today, with their dynamic contrast, have to tone down the headlights simply because they cannot show something that bright while the rest of the scene is dark.
i get the use case, and if we are talking about oled monitors, sure; cost is the only objection there. the models available right now cost 5k+ $/€/£ and there is no real hope of them becoming much cheaper soon. tvs are cheaper: 2k $/€/£ will get you an awesome oled tv with (mostly) minor problems when used as a monitor.

for lcds, this is just not happening without dynamic contrast. <1 nit is not dark enough.
at a 1000:1 contrast ratio, black will be 0.54 nits (with 540-nit white); that is far from dark. turn your monitor brightness to 100% and see what black looks like - that is likely below 0.54 :)
at 20,000:1 it would be 0.027 nits, which is close to what the best lcd monitors do today... at around 100 nits of white (a very good mva panel with a 4,000:1 contrast ratio).
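to spell out the arithmetic above (black level is just white luminance divided by static contrast), a trivial sketch:

```python
# black level = white luminance / static contrast ratio, result in nits
def black_level(white_nits: float, contrast_ratio: float) -> float:
    return white_nits / contrast_ratio

print(black_level(540, 1000))     # hdr-brightness white on a 1000:1 panel
print(black_level(540, 20000))    # same white at the 20,000:1 hdr tier
print(black_level(100, 4000))     # calibrated 100-nit white on a good mva
```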

the best indication to go by is the two g-sync hdr monitors nvidia mentioned around ces, the asus pg27uq and acer xb272-hdr. both are based on an au optronics 27" uhd ips panel: wide gamut, 144hz, and 384 dynamically controlled led backlight zones. prices are expected to be 1200+ $/€/£. that is dynamic backlight control with zones around a square inch each.

also, as per amd, hdr will not be a freesync2 requirement.
we should probably split the entire hdr argument into a separate thread :)

1. There will be FreeSync 2 certified monitors.
agreed.
according to amd, the requirement for this certification will be lfc - a maximum refresh rate at least 2.5x higher than the minimum.
that is it.
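a rough sketch of why that ratio matters for lfc (function names are mine, not amd's): when the frame rate drops below the monitor's minimum refresh, the driver shows each frame multiple times, and a wide enough max/min ratio guarantees the multiplied rate lands back inside the supported window with margin.

```python
# illustration only -- names are made up, this is not amd's driver logic

def supports_lfc(min_hz: float, max_hz: float) -> bool:
    # the certification requirement quoted above: max >= 2.5x min
    return max_hz >= 2.5 * min_hz

def lfc_refresh(fps: float, min_hz: float) -> float:
    """repeat each frame until the effective rate is back above min_hz."""
    repeats = 1
    while fps * repeats < min_hz:
        repeats += 1
    return fps * repeats

print(supports_lfc(48, 144))   # a 48-144hz monitor qualifies
print(lfc_refresh(30, 48))     # 30 fps shown at 60hz (each frame twice)
```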

3c. The software needs to use the FreeSync 2 API (it runs all of the data past a tone map in the card and adjusts brightness and bit depth so the output matches the monitor).
i don't see why. color management is usually done at the operating system level. if i remember correctly, hdr support should come to windows 10 with the creators update.
 

FordGT90Concept

#32
I never said it would be cheap, but there's economy of scale: the more production there is for OLED, the cheaper it will become.

The FreeSync 2 monitor certification program has not been finalized yet. From what AMD has said, it's pretty clear that LFC and HDR are going to be pillars of it.
To begin, branded displays must deliver more than 2x the perceivable brightness and color volume of sRGB. While seemingly an arbitrary benchmark to set, remember that AMD wants this technology to launch in 2017. It has to work within the bounds of what will be available, and AMD’s Glen said that satisfying a 2x requirement will already require the very best monitors you’ll see this year.

FreeSync 2 certification is also going to require low latency. Glen stopped short of giving us numbers, but he did say more than a few milliseconds of input lag would be unacceptable. Finally, low frame rate compensation is a mandatory part of FreeSync 2, suggesting that we’ll see wide VRR ranges from those displays.
That same article explains why the API needs to switch between SDR and HDR.


There will be lots of FreeSync monitors but only some FreeSync 2 certified monitors.
 
Joined
Feb 3, 2017
#33
amd press release said:
FreeSync 2 does not require HDR capable monitors
oh, now i see where that srgb and brightness quote came from.
amd press release said:
Radeon FreeSync™ 2 technology offers over 2x the brightness and color volume over sRGB.
Only attainable when using a FreeSync 2 API enabled game or video player and content that uses at least 2x the perceivable brightness and color range of sRGB, and using a FreeSync 2 qualified monitor.
so, this is the part that requires the api for some reason.
practically all non-tn monitors today are capable of srgb. i am not sure what the 2x brightness is measured against.

edit:
this is getting interesting. this time around, nvidia seems to go for a standard solution in the form of hdr10, while amd goes for a more complicated proprietary solution.

edit2:
the real problem here is that amd has not provided a good, detailed technical overview of what freesync2 is going to be - as counterintuitive as that sounds, considering they have a nice presentation and a press release about it.
 

FordGT90Concept

#34
Because G-Sync is already end-to-end proprietary. AMD currently uses VESA's standards except where they can't (e.g. LFC). I wouldn't be surprised if DisplayPort 1.5 or 1.6 has the necessary amendments so that FreeSync 2 doesn't have to be proprietary. VESA is basically forcing AMD off-standard to compete with G-Sync.

AMD will publish certification requirements when the technology is finalized. This tech is still in its infancy and they have to work with monitor manufacturers to figure out what is reasonable to require.
 
Joined
Feb 3, 2017
#35
there really isn't anything in freesync2 that could be added to the vesa standards.

i guess our hopes should be on the next version of hdmi; 2.1 is said to require support for its version of a variable refresh rate standard (as opposed to displayport, where adaptive-sync is optional).
 

FordGT90Concept

#36
LFC should be added to the eDP standard (require monitors to be fully buffered). HDR tone mapping could also be implemented in the eDP standard, offloading that processing from the graphics card. The advantage of the latter is that games wouldn't have to handle the tone map in the game engine, eliminating the need for the FreeSync 2 API.
 
Joined
Feb 9, 2009
Messages
1,469 (0.45/day)
Likes
381
Location
Toronto
Processor i7-2670QM / Q9550 3.6ghz
Motherboard laptop / Asus P5Q-E
Cooling laptop / Cooler Master Hyper 212
Memory 2x4gb ddr3sd / 2x2gb ddr2
Video Card(s) 570m / MSI 660 Gaming OC
Storage ST9750420AS / ST1000DM003
Display(s) BenQ FP241VW / BenQ GW2265HM
Case MSI gx780 / Corsair 500r
Audio Device(s) onboard
Power Supply laptop / Corsair 750tx
Mouse Steelseries Kinzu V2 / Logitech M120
Keyboard Logitech Deluxe 250 / Logitech K120
Software Windows 7
#37
that was a lot of reading, not sure if i missed anything, but does this mean that every freesync2 monitor will also be an hdr monitor? i don't care about hdr, only lfc (& cheap... but i do care about 8-bit & VA or IPS, so not TN cheap)

i'm also fine with black being grey; i prefer having another dim light in the room rather than total darkness with a painfully bright screen, and with the side effect of grey being more obvious. as long as everything is consistent, my eyes can calibrate a bit and not care so much about total black

OLED is disappointing me a lot: chance of burn-in, higher power use, and i'm not impressed by the accuracy or battery drain on an S5 phone; minimal brightness causes flickering artifacts on the phone as well
 

FordGT90Concept

#38
Yes, monitors must meet HDR, LFC, and latency standards to get FreeSync 2 certified. Mind you, manufacturers have to pay for certification, so FreeSync monitors can exist that support all of those things without being certified. At the same time, FreeSync monitors will still be released with varying levels of support, not unlike now.

I'm not sure what panels FreeSync 2 certified monitors will use, but AMD's goal with this is to bring HDR to PC gaming. I'm really hoping to see $400 2560x1440 144 Hz FreeSync 2 monitors, but... that might be completely unreasonable. :(
 
Joined
Feb 3, 2017
#39

FordGT90Concept

#40
Already been over that: that's talking about the API. The FreeSync 2 API supports SDR and HDR monitors. If developers integrate the FreeSync 2 API in their code, they don't have to put the burden of owning an HDR monitor on their users. HDR features will only be enabled through the FreeSync 2 API when an HDR monitor is attached.

FreeSync 2 certified monitors require HDR, LFC, and low latency.