
Microstuttering masterclass

Poll: How much does microstuttering bother you? (111 total voters)
is that Mussels has a way of being right on anything technical, which adds weight to his argument. :)

i'm definitely wrong at times.


let me present this extremely simplified theory to you: if a single card stutters, a dual GPU solution will have twice as much stutter, or worse (due to delays in communication between the cards)
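To make that concrete, here is a toy Python sketch (my own simplified model, not anything from an article): each GPU renders alternate frames at a perfectly steady rate, but an assumed inter-GPU transfer delay pushes every second frame back, so the frame-to-frame intervals become uneven even though each GPU by itself is perfectly consistent.

```python
import statistics

def frame_intervals(display_times):
    """Frame-to-frame intervals (ms) from a sorted list of display timestamps."""
    return [b - a for a, b in zip(display_times, display_times[1:])]

# Toy model: each GPU renders every other frame at a steady 40 ms per frame.
# Ideally the two GPUs are staggered by 20 ms, giving a smooth 50 FPS output.
render_time_ms = 40.0
transfer_delay_ms = 12.0   # assumed inter-GPU communication delay (hypothetical)
num_frames = 20

# Single card: one steady frame every 40 ms.
single_gpu = [i * render_time_ms for i in range(num_frames)]

# AFR dual card: even frames from GPU 0 on time, odd frames from GPU 1 delayed.
afr = []
for i in range(num_frames):
    t = (i // 2) * render_time_ms + (i % 2) * (render_time_ms / 2)
    if i % 2 == 1:
        t += transfer_delay_ms   # the secondary GPU's frames arrive late
    afr.append(t)

for name, times in (("single GPU  ", single_gpu), ("AFR dual GPU", afr)):
    gaps = frame_intervals(times)
    preview = ", ".join(f"{g:.0f}" for g in gaps[:6])
    print(f"{name}: intervals (ms) = {preview} ...  stdev = {statistics.pstdev(gaps):.1f} ms")
```

The single card shows a constant 40 ms interval, while the AFR pair alternates between roughly 32 ms and 8 ms: more frames overall, but delivered unevenly.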
 
i'm definitely wrong at times.


let me present this extremely simplified theory to you: if a single card stutters, a dual GPU solution will have twice as much stutter, or worse (due to delays in communication between the cards)

Yes, it's the syncing of the two image renders, isn't it? And it totally disappears with vsync on and no dropped frames, doesn't it? This is why it will be interesting to see erocker's response. ;)

In my time with a 9800 GX2 a while back (eBay'd alas) I never noticed the card appear to have excessive stutter. In fact, it was very fast and smooth. It did get bloody hot though! :eek:
 
if a single card stutters, a dual GPU solution will have twice as much stutter, or worse

Unless, of course, the second card gets its frames to the main card between the stutters, effectively smoothing the framerate.

This key idea was the whole point behind multi-GPU usage for 3D: add more power to make up for crappy rendering caused by a lack of performance.

Then, with the 3870 X2 in CrossFire, with four GPUs total, they ran into the point where too many GPUs just made it worse: rather than dealing with just one delayed frame, there were now three, and they were delaying each other.
In fact, it was very fast and smooth

That's largely because of the NVIO, which acted as a buffer that all frames were rendered to. Nvidia, probably thanks to 3dfx's technology, realized that there might be issues syncing the GPUs and put a buffer in place to prevent the problem.
 
it doesn't always disappear with vsync, vsync just reduces it. the article explains how it can smooth things out if only one of the three triple-buffered frames is dropped, but that doesn't mean it solves it or removes it completely - and just one missing frame can still be visible.
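To put a number on the "one missing frame" point, here is a small Python sketch (my own toy double-buffered model, not from the article; triple buffering can absorb some of this, as the article describes): presentation is quantized to the 16.7 ms refresh, so a frame that misses its refresh shows up as a 33 ms gap, double the normal interval.

```python
import math

REFRESH_MS = 1000.0 / 60.0   # 60 Hz display

def vsync_present(render_times_ms):
    """Double-buffered vsync model: each frame starts when the previous one is
    shown, and is displayed on the first refresh after it finishes rendering."""
    shown = []
    last = 0.0
    for rt in render_times_ms:
        ready = last + rt
        slot = math.ceil(ready / REFRESH_MS) * REFRESH_MS
        shown.append(slot)
        last = slot
    return shown

# Hypothetical trace: steady 12 ms renders, except one frame that takes 20 ms
# and therefore misses its refresh window.
renders = [12.0] * 10
renders[5] = 20.0

shown = vsync_present(renders)
gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print("presented intervals (ms):", gaps)
# Mostly 16.7 ms, with a single 33.3 ms gap where the late frame missed its slot.
```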
 
Tbh... I've seen microstuttering with a single-GPU setup, though this is due to a lot of HDD activity on the drive the game is running on. Either way it can be annoying. But it really depends on the game too, as far as just how annoying it gets.
 
I have a 5870 2 GB Eyefinity edition and, just with my one GPU, I have had bad microstuttering spikes in some games (mostly older or less-supported games). I think it's just a driver problem for the game, and it makes me rage when I'm stuttering and getting shot at.
 
Somehow I missed this thread earlier. Two 6850s here at 1920x1200 60 Hz, and I've never noticed any stutter.
 
Tbh... I've seen microstuttering with a single-GPU setup, though this is due to a lot of HDD activity on the drive the game is running on. Either way it can be annoying. But it really depends on the game too, as far as just how annoying it gets.

sounds to me like the trouble lies in the HDD and not the graphics hardware; normally it's better to separate the OS drive from the games drive.
 
They have a follow-up on TechReport, now with micro-stuttering captured on video:

http://techreport.com/discussions.x/21625

This is a direct link to the video on YouTube, in case you prefer that:

http://www.youtube.com/watch?v=zOtre2f4qZs

It shows micro-stuttering very clearly, and I now understand why I have never perceived a real benefit from a second card despite getting much better frame rates. There is no benefit! Well, there is in some cases, but I've always seen micro-stutter in most games, which is why I stopped going with multiple cards years ago.

Now I'm even more convinced: what's the point of getting twice the frames if every odd one is virtually identical to the previous one?
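As a rough way to put a number on that, here's a toy Python sketch (my own illustration, not TechReport's methodology): it drops any frame that arrives within a few milliseconds of the previously shown one, on the theory that such a frame adds almost nothing visually, and compares the raw frame rate against the resulting "effective" frame rate.

```python
def fps_from_timestamps(ts):
    """Average frames per second over a list of display timestamps in ms."""
    return 1000.0 * (len(ts) - 1) / (ts[-1] - ts[0])

def effective_timestamps(ts, min_gap_ms=5.0):
    """Drop frames arriving within min_gap_ms of the previously kept frame,
    treating them as near-duplicates that add little new visual information."""
    kept = [ts[0]]
    for t in ts[1:]:
        if t - kept[-1] >= min_gap_ms:
            kept.append(t)
    return kept

# Hypothetical AFR-style trace: frames arrive in pairs 4 ms apart, then a 36 ms
# wait until the next pair -- about 50 FPS on paper.
timestamps = []
t = 0.0
for _ in range(25):
    timestamps.extend([t, t + 4.0])
    t += 40.0

print(f"raw FPS:       {fps_from_timestamps(timestamps):.1f}")
print(f"effective FPS: {fps_from_timestamps(effective_timestamps(timestamps)):.1f}")
# The raw number says ~50 FPS, but the effective rate is 25 FPS, which is
# closer to what the eye actually gets out of it.
```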
 
"Microstutter" is by popular definition a multi-GPU AFR issue, but the irregular frametimes which characterize the end user experience of it happen with single-GPU configurations as well. No GPU or number of GPUs renders perfectly smoothly and the deviation from standard FPS depends less on the number of GPUs than it does on the specific game, settings, drivers, and video card series and model(s). Depending on those factors, multi-GPU setups can easily generate more regular frametimes than single-GPU systems as often as not.
 
"Microstutter" is by popular definition a multi-GPU AFR issue, but the irregular frametimes which characterize the end user experience of it happen with single-GPU configurations as well. No GPU or number of GPUs renders perfectly smoothly and the deviation from standard FPS depends less on the number of GPUs than it does on the specific game, settings, drivers, and video card series and model(s). Depending on those factors, multi-GPU setups can easily generate more regular frametimes than single-GPU systems as often as not.

Have you read the articles? Because that's not what THG and TR's articles are showing at all.
 
Have you read the articles? Because that's not what THG and TR's articles are showing at all.

Yes, but more importantly for my business I've benched for years using min/ave/max and standard deviation on individual frame times to measure micro stutter and long since learned that the numbers can always be presented in ways that do not communicate the degree of the issue in a way that accurately parallels the end user's perception of it. Benchmark sites publish benchmarks. I benchmark so I can effectively design and build systems for people who actually play games. I believe that perceived smoothness can be concretely quantified and predicted given adequate data, but the data must be considered in context and rigorously analyzed with regard for other factors (game genre, other hardware, and variables mentioned in my previous post).
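For anyone who wants to try that kind of analysis themselves, here is a minimal Python sketch along those lines (my own example, not TIGR's actual tooling), assuming you already have a list of per-frame render times in milliseconds from whatever logger you use:

```python
import statistics

def frametime_stats(frametimes_ms):
    """Summarize a list of per-frame render times (ms) the min/ave/max +
    standard deviation way."""
    return {
        "frames": len(frametimes_ms),
        "avg_fps": 1000.0 / statistics.mean(frametimes_ms),
        "min_ms": min(frametimes_ms),
        "avg_ms": statistics.mean(frametimes_ms),
        "max_ms": max(frametimes_ms),
        "stdev_ms": statistics.stdev(frametimes_ms),
    }

# Hypothetical trace: mostly ~16-17 ms frames with a couple of spikes.
sample = [16.4, 16.9, 17.1, 16.6, 33.5, 16.8, 17.0, 16.5, 31.9, 16.7]

for key, value in frametime_stats(sample).items():
    if isinstance(value, float):
        print(f"{key}: {value:.1f}")
    else:
        print(f"{key}: {value}")
```

As the post says, the raw numbers are only half the story; the same standard deviation can feel very different depending on the game and the rest of the system.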
 
Have you read the articles? Because that's not what THG and TR's articles are showing at all.

They also don't claim that this is a problem happening for everyone running multiple cards. I've had microstuttering in the past with dual-GPU systems, and I have also not had it with dual-GPU systems. Claiming it happens with all multi-card systems is false.
 
Yes, but more importantly for my business I've benched for years using min/ave/max and standard deviation on individual frame times to measure micro stutter and long since learned that the numbers can always be presented in ways that do not communicate the degree of the issue in a way that accurately parallels the end user's perception of it. Benchmark sites publish benchmarks. I benchmark so I can effectively design and build systems for people who actually play games. I believe that perceived smoothness can be concretely quantified and predicted given adequate data, but the data must be considered in context and rigorously analyzed with regard for other factors (game genre, other hardware, and variables mentioned in my previous post).

I agree, but you're wading into scary territory here, where you are almost saying that a system needs to be customized, even down to the hardware level, depending on the game you are playing.

This effectively creates a bunch of "console designs" that we need to change the configuration of in order to have it adapt to new media, and that's a tough pill for people to swallow.

PC users don't want to hear they need a specific piece of hardware just to be able to enjoy a game, as that defeats the purpose of having a PC in the first place. It's very easy to blame the hardware for the problem, but software plays its role too, and when we have console ports... well, naturally the hardware design the software was initially developed for is most likely to deliver the best experience.
 
I agree, but you're wading into scary territory here, where you are almost saying that a system needs to be customized, even down to the hardware level, depending on the game you are playing.

This effectively creates a bunch of "console designs" that we need to change the configuration of in order to have it adapt to new media, and that's a tough pill for people to swallow.

PC users don't want to hear they need a specific piece of hardware just to be able to enjoy a game, as that defeats the purpose of having a PC in the first place. It's very easy to blame the hardware for the problem, but software plays its role too, and when we have console ports... well, naturally the hardware design the software was initially developed for is most likely to deliver the best experience.
Yes, it could turn into something like: buy an AMD setup for Fallout New Vegas & CoD: MW3, but Nvidia for BF3 & Bulletstorm.

Ugh, no, I don't think we want a situation like that. :shadedshu
 
Well, then you understand why this is possibly so controversial? :laugh:
 
They also don't claim that this is a problem happening for everyone running multiple cards. I've had microstuttering in the past with dual-GPU systems, and I have also not had it with dual-GPU systems. Claiming it happens with all multi-card systems is false.

I have seen it on every multi-GPU setup I've tried, and counting those of friends, it's a large enough sample to claim that I do see micro-stuttering in every game, on every setup.

Not everybody can see it even if it's happening, but it's almost invariably there. As TIGR said, it happens even on single-GPU setups; it's just that with two cards it's worse, enough to make a difference for many people.

Claiming it happens with all multi-card systems might be false (I never said that), but it's also false to claim it's not a problem. The reality is that micro-stuttering is real and happens, and it's worse on multi-GPU systems (which makes them kind of a gamble); denying that is delusional considering the facts and the opinions of so many people.
 
i say the time stuttering occurs is when an HDD is having to do a major write operation; that's why it's best to keep games on a separate drive and even move to SSDs.
 
I have seen it on every multi-GPU setup I've tried, and counting those of friends, it's a large enough sample to claim that I do see micro-stuttering in every game, on every setup.

Not everybody can see it even if it's happening, but it's almost invariably there. As TIGR said, it happens even on single-GPU setups; it's just that with two cards it's worse, enough to make a difference for many people.

Claiming it happens with all multi-card systems might be false (I never said that), but it's also false to claim it's not a problem. The reality is that micro-stuttering is real and happens, and it's worse on multi-GPU systems (which makes them kind of a gamble); denying that is delusional considering the facts and the opinions of so many people.

I can tell if it is present or not. I cannot tell a difference between using one card or two with my current setup in terms of microstuttering. Both run very smoothly. I've seen enough setups to claim that micro-stuttering is not seen in every game on every setup. So I guess we're both wrong/right?

... and I completely tuned out the last part of your post. Is it a problem in my opinion? Absolutely. I've seen it.
 
Microstuttering is a random issue; it can be caused by many things and can't really be pinned down to hardware or software alone.
 
I can tell if it is present or not. I cannot tell a difference between using one card or two with my current setup in terms of microstuttering. Both run very smoothly. I've seen enough setups to claim that micro-stuttering is not seen in every game on every setup. So I guess we're both wrong/right?

Why does anyone need to be right? Everything is subjective; what's OK for you might not be OK for me. I was just acknowledging the fact that I do see it a lot and I think it's annoying, so that's why I stopped using multi-GPU setups. For years I'd been seeing it, but at the same time I didn't really believe in it; I didn't know the jitter could be that big. I knew a second card did very little to improve MY gaming experience but didn't know why, and now, thanks to these articles, I know exactly why, so I shared my experience, is all.

Both the articles and the poll on this same thread say I'm not alone, so it clearly is something real. You don't see it, right, good for you, good for the ones who don't have this "problem", but I fail to see how that is a reason to deny the existence of micro-stutter, or a reason to downplay its consequences.

EDIT: In a sense this reminds me of how many console players do not see their games aliased or think it's not a problem at all.
 
if you take a look at 580 SLI in this article, you'll see that the frame variation is far smaller than on slower CrossFire and SLI setups.

in most of these articles they come to the same conclusion: faster cards have less frame variation, and thus in SLI/CrossFire they produce more consistent frames than slower cards.

I notice no stutter whatsoever, and no, it's not that I'm not looking. It's that my solution simply produces too little for the human eye to pick it up.

I don't care if you've got the vision of an eagle, you simply will not see any on my setup.

past CrossFire/SLI solutions will of course have a better chance of producing microstutter, but that doesn't mean jack for current-gen cards.

when testing current-gen cards the results show:

1. ATI in dual-GPU configs suffers more from it than Nvidia
2. ATI triple configs show less than dual
3. single GPUs suffer the same issues, and slower single GPUs will have more stutter than faster duals (i.e. a single GTX 560 will have more stutter than dual GTX 580s)
4. how you measure the data completely changes the results (see the sketch below)
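To illustrate point 4, here is a small Python sketch with a made-up trace (purely illustrative): two configurations with identical average FPS, where the choice of metric, average FPS versus 99th-percentile frame time or standard deviation, completely changes which one looks better.

```python
import statistics

def summarize(name, frametimes_ms):
    """Report the same trace through three different lenses."""
    ordered = sorted(frametimes_ms)
    avg_fps = 1000.0 / statistics.mean(ordered)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]   # 99th-percentile frame time
    print(f"{name}: avg {avg_fps:.0f} FPS, "
          f"99th percentile {p99:.0f} ms, "
          f"stdev {statistics.pstdev(ordered):.1f} ms")

# Two made-up traces with the same average frame rate:
smooth  = [20.0] * 200            # every frame takes 20 ms
jittery = [12.0, 28.0] * 100      # frames alternate between 12 ms and 28 ms

summarize("smooth ", smooth)
summarize("jittery", jittery)
# Both average 50 FPS, but the jittery trace has a much worse 99th-percentile
# frame time and standard deviation -- the metric you report decides the story.
```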
 