
Crossfire stutter fixed by freesync?

Joined
Nov 2, 2016
Messages
47 (0.02/day)
System Name Current Rig (updated)
Processor i7 4770k @ 4.3ghz ( 1.2v)
Motherboard MSI GAMING G45 Z87
Cooling Corsair H80i
Memory 16GB Gskill Ripjaws DDR3 1866
Video Card(s) Gigabyte G1 GAMING GTX 1080
Storage Samsung EVO 850 250GB SSD, Seagate barracuda HD 1TB, Mushkin Reactor 480GB SSD
Display(s) Dell 27" IPS 1440p 144hz Gsync
Case Thermaltake armor
Audio Device(s) Integrated
Power Supply Antec HCG 850
Mouse Razer Naga
Keyboard Razer DeathAdder
Software Win 10 64 bit
I'm about to upgrade my aging 780Ti along with my monitor but Gsync is so expensive that I'm thinking of going with two 480s and an Acer 27" 1440p monitor with a 144hz refresh rate. It's just under 500 bucks along with the 500 I'd spend on the GPUs.

I'd love to go with a 1080 but the extra cost of Gsync monitors just plain sucks.

As the title says... Will an adaptive sync monitor remedy the stutter associated with crossfire? If not then Gsync and a 1080 it is.
 
G-Sync or FreeSync makes the monitor's refresh rate variable, so tearing is gone and you get minimal input lag. That being said, it has nothing to do with in-game stutter.
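The difference is easy to picture with a small toy model. Here's a minimal Python sketch (made-up frame times and a hypothetical 144 Hz VRR ceiling, not a description of any particular hardware): with plain vsync a finished frame waits for the next fixed refresh tick, while with a variable refresh rate the panel scans out as soon as the frame is done.

```python
# Toy comparison of fixed-refresh vsync vs. variable refresh (G-Sync/FreeSync).
# With vsync a finished frame waits for the next fixed scan-out tick; with VRR
# the panel refreshes the moment the frame is ready, limited only by its
# fastest supported rate.  All numbers are made up for illustration.
import math

REFRESH_MS = 1000 / 60        # fixed 60 Hz panel: scan-out every ~16.7 ms
VRR_FASTEST_MS = 1000 / 144   # hypothetical 144 Hz ceiling: refreshes no closer than ~6.9 ms

frame_ready = [20.0, 38.0, 61.0, 77.0]   # times (ms) at which frames finish rendering

def vsync_display(t):
    # the frame is held until the next fixed refresh tick
    return math.ceil(t / REFRESH_MS) * REFRESH_MS

def vrr_display(t, prev_shown):
    # refresh as soon as the frame is ready, respecting the panel's max rate
    return max(t, prev_shown + VRR_FASTEST_MS)

if __name__ == "__main__":
    prev = 0.0
    for t in frame_ready:
        fixed = vsync_display(t)
        vrr = vrr_display(t, prev)
        print(f"frame ready {t:5.1f} ms | vsync shows {fixed:5.1f} ms | VRR shows {vrr:5.1f} ms")
        prev = vrr
```

With vsync the wait before scan-out varies from frame to frame (that variation is the added, uneven input lag); with VRR the frame goes out essentially as soon as it's rendered, which is why the tearing and most of the lag disappear.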
 
Well that sucks... I already knew it cured tearing but wasn't sure about the stuttering... Thanks for the reply.
 
I use FreeSync on a Crossfire setup; no stuttering here, though I'm rarely down in double-figure FPS except in older games, which also don't stutter. Benchmarks do sometimes stutter when switching loads.
But the killer is no dual-GPU support in DX12 or Vulkan at the moment, so if I were you (and not folding or running 4K) I'd go NV, but as for G-Sync, nahhhh.
There are many on here who reckon Nvidia's adaptive sync setting does a fine job.
 
Don't go for Crossfire unless you are willing to risk not having the best performance, plus problems such as stutter, waiting for Crossfire support, high power consumption, and heat. Rather go for the 1080 and no G-Sync; high FPS will make G-Sync or FreeSync unneeded anyway. Just see to it that you then have a stable FPS of over 100 if you use a 144 Hz monitor, which you still should.
 
Don't go for Crossfire unless you are willing to risk not having the best performance, plus problems such as stutter, waiting for Crossfire support, high power consumption, and heat. Rather go for the 1080 and no G-Sync; high FPS will make G-Sync or FreeSync unneeded anyway. Just see to it that you then have a stable FPS of over 100 if you use a 144 Hz monitor, which you still should.
^^ What he said. Damnit, you stole my thunder. :laugh:

If the card is rendering above the monitor's refresh rate and you use Fast Sync with the 1080, then you get the best of all worlds: low lag, no stutter or tearing, and liquid-smooth animation. In short, it's a killer feature that current-gen AMD cards don't support and don't have the performance for.

Vega might change all that, but it remains to be seen.
 
Well that sucks... I already knew it cured tearing but wasn't sure about the stuttering... Thanks for the reply.

If you are even remotely sensitive to stutter, avoid dual-GPU and stick to single. It is that simple. If not today, then tomorrow you will come across a game you want to play that doesn't handle crossfire well, and that means you're down to one, generally weaker, GPU. That's a lot of money to spend for half of it to be idle even in one or two games that you play.

With a single GPU, and enough performance in it, you really don't 'need' G-Sync or FreeSync. Nvidia's Adaptive VSync driver setting already covers 95% of use cases, and in the remaining 5% you can tweak your game to suit your system's performance and maintain a stable FPS.

The minor price/performance advantage gained from linking two GPUs really is not worth the additional hassle and risk of no support, even outside of the stutter/tearing argument.

Dual GPU on anything but the ultra high end should, in my opinion, always be avoided. You should 'double up' because there is no single GPU that meets the performance level you need, not to save 20-40 bucks. That way the worst-case scenario leaves you with one powerful GPU, which still isn't bad performance.
 
Well I'm leaning towards a 27 inch Gsync 1440p Dell or Acer with a refresh of 144+ Hz. My son has a Dell variant pushed by a 1060 6gb and it's pretty smooth at med-high settings. I figure a 1080 will rock high fps at ultra settings all day long @ 1440p.

I was looking at an Acer 35 inch ultrawide with FreeSync, so that is why I was curious about Crossfire stutter. I think 1440p would look better than 2560x1080 anyway.
 
Well I'm leaning towards a 27 inch Gsync 1440p Dell or Acer with a refresh of 144+ Hz. My son has a Dell variant pushed by a 1060 6gb and it's pretty smooth at med-high settings. I figure a 1080 will rock high fps at ultra settings all day long @ 1440p.

I was looking at an Acer 35 inch ultrawide with FreeSync, so that is why I was curious about Crossfire stutter. I think 1440p would look better than 2560x1080 anyway.
If you go for the 1080, then go for a G-Sync monitor, of course. Resolution, and ultrawide or not, is more a matter of taste I'd say, so you have to decide that yourself. But if you elaborate on what you're playing and plan to do with it, I can give you some more advice.
 
It may be possible that your CPU power settings are creating the stutter problem. When you're gaming, try setting the CPU to maximum performance; the CPU clock being raised and lowered has been known to cause stutter. Your current card is still very strong.

And welcome to TPU :lovetpu:
 
Your current card is still very strong.
Yes indeed. I've got that exact model and the main limitation is running out of memory before GPU performance at times.
 
CF stutter depends on the game & it's supposedly much improved compared to the past

i used to have a 4870x2, aka 'the past'. while certain things did stutter horribly, having enough power to perform above your refresh & then enabling vsync meant things were smooth

If the card is rendering above the monitor's refresh rate and you use Fast Sync with the 1080, then you get the best of all worlds: low lag, no stutter or tearing, and liquid-smooth animation. In short, it's a killer feature that current-gen AMD cards don't support and don't have the performance for.
that's just impossible, it's dropping frames so animations will stutter or judder

the engine needs to know the refresh to generate precisely timed frames
 
CF stutter depends on the game & it's supposedly much improved compared to the past

i used to have a 4870x2, aka 'the past'. while certain things did stutter horribly, having enough power to perform above your refresh & then enabling vsync meant things were smooth


that's just impossible, it's dropping frames so animations will stutter or judder

the engine needs to know the refresh to generate precisely timed frames
No it's not impossible and I've seen it with my own eyes. Google for how it works. In its most basic terms, it lets the card freewheel above the refresh rate and then it outputs only the frame due just before the refresh, discarding the extra frames.

Funny thing is, I bought my 1080 not even knowing about this killer feature. When I tried it out I was even more pleased I'd got it.
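Conceptually it's a "newest completed frame wins" buffer flip. A rough Python sketch of that idea (my own simplification with arbitrary render and refresh rates, not NVIDIA's actual implementation):

```python
# Rough model of a Fast Sync style presentation scheme: the GPU renders
# flat out, the display scans out the most recently *completed* frame at
# each refresh, and everything else is discarded.
# (My own simplification -- not NVIDIA's actual implementation.)

RENDER_MS = 1000 / 200    # GPU finishing a frame every 5 ms (~200 fps)
REFRESH_MS = 1000 / 60    # display refreshing at 60 Hz

def fast_sync(duration_ms=100.0):
    rendered = []                       # (frame_id, finish_time)
    t, frame_id = 0.0, 0
    while t + RENDER_MS <= duration_ms:
        t += RENDER_MS
        rendered.append((frame_id, t))
        frame_id += 1

    displayed = []
    scan = REFRESH_MS
    while scan <= duration_ms:
        ready = [fid for fid, done in rendered if done <= scan]
        if ready:
            displayed.append(ready[-1])  # newest finished frame wins
        scan += REFRESH_MS
    return rendered, displayed

if __name__ == "__main__":
    rendered, displayed = fast_sync()
    print(f"rendered {len(rendered)} frames, displayed {len(displayed)}")
    print("displayed frame ids:", displayed)
```

At 200 fps rendering onto a 60 Hz panel, most frames are rendered only to be thrown away; the one shown is always the freshest, which is where the low lag comes from.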
 
No it's not impossible and I've seen it with my own eyes. Google for how it works. In its most basic terms, it lets the card freewheel above the refresh rate and then it outputs only the frame due just before the refresh, discarding the extra frames.

Funny thing is, I bought my 1080 not even knowing about this killer feature. When I tried it out I was even more pleased I'd got it.
wish people were more specific than 'google how it works' in this age of fake news, but of course i've seen the nvidia slides about it

what a few sites had concerns about, & what i really want accurate testing of, is animation smoothness, not the output framerate

let's say you're at 60hz but your card can render 63fps (a near worst-case example). the game-world time of those 63fps isn't divisible by the refresh, so it can't display every frame & it can't drop every other frame

it's going to drop frames at an awkward interval, in the same way that leap years work: the earth's position in orbit is ahead of a fixed point every year on the same date, and eventually you add an extra day & now the earth's position is behind the fixed point

so i'm saying, if game time (physics/animation) advances every 14ms, the moments individual frames get dropped to match 16ms are going to be uneven: one displayed frame might be 15ms of game time after the previously displayed one, another frame later on (not successive) might be 3ms after the previously displayed one, get it?

should i make a thought-out diagram or test case?
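For whatever it's worth, that rollover is easy to put into a quick test case. A small Python sketch (hypothetical numbers: 15 ms game steps against a 60 Hz display, assuming each frame depicts the game time at which it was finished):

```python
# Simulate the "leap year" rollover: the game produces a frame every 15 ms,
# the display refreshes every ~16.7 ms and shows the newest finished frame.
# The game-time gap between successively displayed frames is what the eye
# reads as animation pacing.  (Hypothetical numbers, not a measurement.)

GAME_STEP_MS = 15.0
REFRESH_MS = 1000.0 / 60.0

def game_time_per_refresh(n_refreshes=12):
    shown = []
    for i in range(1, n_refreshes + 1):
        scan = i * REFRESH_MS
        latest_frame = int(scan // GAME_STEP_MS)   # newest frame finished before scan-out
        shown.append(latest_frame * GAME_STEP_MS)  # game time that frame depicts
    return shown

if __name__ == "__main__":
    shown = game_time_per_refresh()
    steps = [round(b - a, 2) for a, b in zip(shown, shown[1:])]
    print("game time on screen at each refresh:", shown)
    print("game-time advance per refresh:      ", steps)
    # output is mostly 15 ms steps with an occasional 30 ms jump where a
    # frame is skipped -- the displayed animation advances unevenly even
    # though the screen itself refreshes at a perfectly steady 60 Hz.
```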
 
Can't wait for Vega? I think it is supposed to be coming first half of 2017. Vega should be as powerful as GTX 1080 and you'll still get your savings with the adaptive sync monitor.

Correct me if I'm wrong but I'm pretty sure you can use any graphics card to power a FreeSync monitor, you just won't get the adaptive sync feature of it until the card matches the tech. For example, you can run a 144 Hz FreeSync monitor with a 780 Ti but it will just be pegged to 144 Hz (not adaptive) or whatever you set it to. If that monitor is a fantastic deal, you could just grab it then upgrade the graphics card later. Even if GTX 1080 does end up being better than Vega, the fact Vega launched should cause GTX 1080's price to come down a bit so win-win.
 
i'm unsure, but if the board supports ECC RAM, turn that function off
 
Correct me if I'm wrong but I'm pretty sure you can use any graphics card to power a FreeSync monitor, you just won't get the adaptive sync feature of it until the card matches the tech. For example, you can run a 144 Hz FreeSync monitor with a 780 Ti but it will just be pegged to 144 Hz (not adaptive) or whatever you set it to.
you are not wrong. a 144hz freesync monitor will work at 144hz with an nvidia card, but without freesync. same for the other side - a 144hz gsync monitor will work fine at 144hz with an amd card, but without gsync.

i recently got a 144hz gsync monitor and i must say that unless you play fast-paced shooters competitively (cs:go, maybe cod, there really are not that many of them), gsync/freesync are the real killer feature on these fast monitors. no vsync, no tearing is awesome, the fact that it works regardless of the framerate is even better. 40fps has never been so smooth :)

let's say you're at 60hz but your card can render 63fps (a near worst-case example). the game-world time of those 63fps isn't divisible by the refresh, so it can't display every frame & it can't drop every other frame

it's going to drop frames at an awkward interval, in the same way that leap years work: the earth's position in orbit is ahead of a fixed point every year on the same date, and eventually you add an extra day & now the earth's position is behind the fixed point

so i'm saying, if game time (physics/animation) advances every 14ms, the moments individual frames get dropped to match 16ms are going to be uneven: one displayed frame might be 15ms of game time after the previously displayed one, another frame later on (not successive) might be 3ms after the previously displayed one, get it?
physics/animation should not be tied to framerate in this day and age. in games where this matters (again, this is mostly a first-person-shooter thing), it isn't.

nvidia's fastsync is awesome in some ways but it has its limitations - especially the part where you need to have the framerate over the monitor's refresh rate, preferably a lot over, to avoid stutters when something on-screen drags the framerate down. at 144+hz, that is going to be tricky in many cases.
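For reference, the usual way that decoupling is done is a fixed simulation timestep with interpolated rendering, roughly like the sketch below (the generic textbook pattern, not taken from any specific engine).

```python
# Textbook fixed-timestep loop: the simulation always advances in constant
# steps regardless of how fast frames are rendered; rendering interpolates
# between the last two simulation states.  (Generic pattern, not any
# specific engine's code.)
import time

PHYSICS_DT = 1.0 / 120.0          # simulation steps at a fixed 120 Hz

def run(frames=240):
    prev_pos = curr_pos = 0.0      # stand-in for real game state
    accumulator = 0.0
    last = time.perf_counter()

    for _ in range(frames):
        now = time.perf_counter()
        accumulator += now - last
        last = now

        # physics advances in fixed increments, independent of render rate
        while accumulator >= PHYSICS_DT:
            prev_pos = curr_pos
            curr_pos += 1.0 * PHYSICS_DT       # object moving 1 unit/second
            accumulator -= PHYSICS_DT

        # render the fraction of a step we are currently into
        alpha = accumulator / PHYSICS_DT
        render_pos = prev_pos * (1.0 - alpha) + curr_pos * alpha
        # draw(render_pos) would happen here
        time.sleep(0.004)                      # pretend drawing takes ~4 ms

    return curr_pos

if __name__ == "__main__":
    print("simulated position after the run:", round(run(), 3))
```

The simulation always advances in 1/120 s steps no matter how fast or unevenly frames come out, so dropped or repeated display frames don't change what the game world does.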
 
I'm about to upgrade my aging 780Ti along with my monitor but Gsync is so expensive that I'm thinking of going with two 480s and an Acer 27" 1440p monitor with a 144hz refresh rate. It's just under 500 bucks along with the 500 I'd spend on the GPUs.

I'd love to go with a 1080 but the extra cost of Gsync monitors just plain sucks.

As the title says... Will an adaptive sync monitor remedy the stutter associated with crossfire? If not then Gsync and a 1080 it is.

You could, in theory, wait for the next AMD GPU coming up. https://videocardz.com/amd/radeon-rx-400/radeon-rx-490

Edit:
Better infosheet here https://videocardz.net/amd-radeon-rx-490/
 
physics/animation should not be tied to framerate in this day and age. in games where this matters (again, this is mostly a first-person-shooter thing), it isn't.
there can be both animation inconsistencies &, for sure, the time from your input to what you see on screen will keep changing. visualize the following grid of frames:

if monitor is 60hz aka 16ms
|       |       |       |       |

if game is 15ms
|      |      |      |      |

game vs monitor
|      |      |      |      |
|       |       |       |       |


the latency between the rendered frame & its time to display will be varying between 0 & the refresh time, 16ms in this example, & there will also be a moment where the timing rolls over, causing a microstutter similar to how leap years work

game    ...  21    22    23    24    25    26
monitor ...   |     |     |     |     |     |
visible ...  20    21    23    24    25    26


 
you are right. that latency difference is always there. whether it is noticeable, depends.
it gets a bit offtopic here but what nvidia's fastsync is meant to do is run the game at a considerably higher framerate, minimizing that latency.
still, that is what makes freesync/gsync the killer feature here. :)
 
I've been looking around at monitors and decided to wait for FreeSync 2 hardware certified monitors. FreeSync 2 was revealed January 3, 2017 (so no hardware available yet), and features everything I want from my next monitor:
-adaptive sync (no tearing)
-HDR-10 (wide contrast ratio so blacks are extremely dark and whites are extremely bright)
-Rec. 2020 / BT. 2020 (broad and accurate color reproduction)
-wide viewing angles (colors are accurate from almost every angle)
-Low Framerate Compensation is mandatory (gaps are filled in when the framerate falls below the monitor's minimum refresh rate)

This video is what sold me (4 minutes in):
 
freesync2 has 2 big points over freesync:
1. the lfc requirement. this, along with a reasonable range of freesync refresh rates (a max of more than 2.5x the min sounds good), should have been a requirement from the get-go. lfc is supported in freesync already as long as you pick a capable monitor, so freesync2 really won't help here.
2. hdr support - this is just that, support. do not get carried away by the marketing hype. this will do nothing by itself. for any benefit, the monitor itself will need to support hdr (much higher brightness, contrast and 10-bit support = expensive) and they will have to figure out how hdr will work for pc.

btw, hdr marketing is bullshit squared. disregard any picture or video shown about it - they are simply bullshit. i do not think i have seen a single instance of hdr marketing, whether for tvs or monitors, where the examples were not heavily doctored at the source.

don't get me wrong, hdr is awesome (although i much prefer the oled way with black blacks, not the lcd way of overblown brightness) and 10-bit is awesome, but hdr screens are (very) expensive and cheap hdr is crap. the standards are only now getting standardized for tvs (hdr-10, dolby vision); computers are lagging way behind on that front. yes, 10-bit monitors have existed for years but those are for specific applications, and end-to-end 10-bit color on a pc is not for the faint of heart... or nonexistent for consumer stuff, including games.
 
Not all FreeSync monitors support LFC. FreeSync 2 certification makes LFC mandatory.

The film he described in the video is real. It is caused by dynamic contrast ratio because the pixels let too much light from the backlighting bleed through. It's pretty obvious on virtually any movie: look at the letterboxing area at the top and bottom and note how it is gray, not black. That's a limitation of non-HDR panels.

The reason FreeSync 2 certification is great is that a panel accepting 10-bit color and HDR data doesn't mean it actually reproduces it visually. FreeSync 2 certifies that monitors not only accept the signal, they faithfully reproduce it.

I wouldn't be surprised if only OLED panels can pass certification. It's guaranteed that the first FreeSync 2 displays are going to be very expensive.

TVs only have a 30 Hz target. Monitors can go as high as 240 Hz.

And yeah, there are definitely standardization issues. The fact LFC isn't part of the VESA adaptive sync standard is a problem. DisplayPort should also have HDR capability flags so that the graphics card can signal whether SDR or HDR content is in the tube so the display can adjust accordingly. FreeSync 2 does all of those things, and AMD is known for pushing standards, so I wouldn't be surprised if VESA adopts FreeSync 2 (or something compliant with it) in DisplayPort 1.5. The next DisplayPort standard is expected soon.
 
Not all FreeSync monitors support LFC. FreeSync 2 certification makes LFC mandatory.
yes, but mandatory lfc alone is a rather flimsy bonus for a new version of freesync. as long as you keep an eye out for lfc support, you'd probably get much better deals on old freesync monitors once the new ones are out :)

The film he described in the video is real. It is caused by dynamic contrast ratio because the pixels let too much light from the backlighting bleed through. It's pretty obvious on virtually any movie: look at the letterboxing area at the top and bottom and note how it is gray, not black. That's a limitation of non-HDR panels.
non-black blacks like in that letterboxing are a limitation of lcd panels and have nothing to do with hdr.
not the point here but dynamic contrast ratio as such is evil and should burn in hell.

btw, hdr will make this even worse. the drive is towards larger contrast ratios and brightness is the easy way to go with lcd, black levels be damned.

The reason FreeSync 2 certification is great is that a panel accepting 10-bit color and HDR data doesn't mean it actually reproduces it visually. FreeSync 2 certifies that monitors not only accept the signal, they faithfully reproduce it.
amd has not said this.
well, in typical amd fashion, there are very few real details available. the wording in their presentation is very careful about hdr support. support, not requirement.

if the standard really will require 10-bit, hdr and 2.5x min/max refresh rate, the monitors are going to be VERY expensive. both models :p

And yeah, there are definitely standardization issues. The fact LFC isn't part of the VESA adapative sync standard is a problem. DisplayPort should also have HDR capability flags so that the graphics card can signal whether SDR or HDR content is in the tube so the display can adjust accordingly.
end-to-end 10-bit from pc to screen is problematic today, and that's before even looking at hdr.
latest displayport versions already have some hdr metadata in the standard and the data stream itself is just 10-bit data so nothing complicated about the data transfer.
processing the hdr data at both ends is going to be interesting for a while though.

i think lfc is beyond what the displayport standard is supposed to handle. displayport is about transferring the data, after all. doubling the frames is an implementation question for the data processors at either end. currently gsync is doing that in hardware in the monitor, while lfc seems to be doing it in software on the pc.
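As a rough picture of that software-side doubling (my own simplification with a made-up 48-144 Hz panel, not AMD's actual driver logic): when the game drops below the panel's minimum variable refresh rate, each frame is simply presented several times so the effective refresh stays inside the supported window.

```python
# Simplified idea of Low Framerate Compensation: if the game runs slower
# than the panel's minimum variable refresh rate, repeat each frame enough
# times that the panel still refreshes inside its supported window.
# (My own simplification with made-up numbers -- not AMD's driver logic.)

PANEL_MIN_HZ = 48      # hypothetical FreeSync range of a 48-144 Hz panel
PANEL_MAX_HZ = 144

def lfc_repeats(game_fps):
    """Number of times to present each frame so the effective refresh
    rate lands back inside the panel's variable refresh window."""
    if game_fps >= PANEL_MIN_HZ:
        return 1                              # already inside the window
    n = 2
    while game_fps * n < PANEL_MIN_HZ:
        n += 1
    return n

def lfc_capable(min_hz, max_hz):
    # rule of thumb mentioned earlier in the thread: the max refresh should
    # be at least ~2.5x the minimum for frame repetition to have headroom
    return max_hz >= 2.5 * min_hz

if __name__ == "__main__":
    for fps in (25, 35, 47, 60, 120):
        n = lfc_repeats(fps)
        print(f"{fps:3d} fps -> show each frame x{n}, panel runs at {fps * n} Hz")
    print("48-144 Hz window LFC-capable:", lfc_capable(PANEL_MIN_HZ, PANEL_MAX_HZ))
```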
 
Sounds like you need to do more research on what HDR is. To meet HDR requirements, dynamic contrast simply doesn't work.


http://www.amd.com/en-us/press-releases/Pages/freesync-2-2017jan04.aspx
Qualifying FreeSync™ 2 monitors will harness low-latency, high-brightness pixels, excellent black levels, and a wide color gamut to display High Dynamic Range (HDR) content.1 In addition, all FreeSync™ 2 monitors will have support for Low Framerate Compensation (LFC). With Radeon FreeSync™ 2 technology, gamers can enjoy a complete plug-and-play HDR gaming experience, avoiding the need to tweak settings in software or on the monitor.
AMD has said that few monitors will be able to pass FreeSync 2 qualification testing. Those that don't will have to be marketed as simply FreeSync monitors.


LFC is something FreeSync (and VESA adaptive sync at large) needs to compete with G-Sync. If VESA is serious about making the technology commonplace, it is something they'll have to add.
 