
Microstuttering masterclass

How much does microstuttering bother you?


  • Total voters: 111
I think "microstuttering" is a poorly chosen word here. "Subpar performance" would be better. So, if I were using a low-performance rig and playing a game that required a high-performance rig, I suppose I would be bothered. Isn't it pretty obvious that lower-spec computers simply aren't capable of running some demanding software? All in all, I suppose the "graph" is a good indicator of how lower-end systems struggle with demanding programs. Nothing new. Anyway, this poll needs an "I don't know what microstuttering is" option for those who do their homework and build systems that suit their needs.
 
I voted "It doesn't bother me at all" because I don't see any microstuttering in my games using the 5970 (dual GPU). True, I haven't yet tried Metro, but I get smooth gameplay in games like Fallout 3, Fallout NV etc., which are notorious for microstuttering. (Now we'll see how Skyrim responds within two months' time.)

If I was affected by microstuttering though, I would say it's "very annoying".
 
Does using FRAPS mean measuring the time it takes a frame to end up in the final buffer, or how the frames are output from said buffer? I see a potential fault in this method of testing...

Good question. FRAPS takes the time between Direct3D Present() calls.

http://tomsdxfaq.blogspot.com/ suggests that d3d internally delays the present call to sync it up with the gpu
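In other words, a FRAPS-style measurement reduces to timestamping successive Present() calls and taking the deltas. A minimal sketch of the idea (my own toy numbers, not real capture code):

```python
# Sketch of FRAPS-style frame-time capture: the per-frame time is just
# the delta between consecutive Present() call timestamps (fake data, ms).

def frame_times(present_timestamps_ms):
    """Deltas between consecutive Present() calls, in milliseconds."""
    return [b - a for a, b in zip(present_timestamps_ms, present_timestamps_ms[1:])]

# A card alternating 10 ms / 30 ms frames averages 50 FPS on paper,
# but every other frame arrives at 33 FPS pacing.
stamps = [0, 10, 40, 50, 80, 90, 120]
print(frame_times(stamps))  # [10, 30, 10, 30, 10, 30]
```

The average of those deltas is 20 ms (50 FPS), which is exactly why a plain FPS counter can look fine while the game still feels jerky.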
 
Anyway, this poll needs an "I don't know what microstuttering is" option for those who do their homework and build systems that suit their needs.
Yeah, I was looking for that option since I really don't have any idea what "microstuttering" is. Anyway, I'm using a single GPU so I voted that option.

If somebody could briefly explain microstuttering to me and why it apparently only affects multi-card setups, I'd appreciate that. The article seemed to assume I already knew these things.
 
http://tomsdxfaq.blogspot.com/ suggests that d3d internally delays the present call to sync it up with the gpu

That in turn means it's possible to "cope" with any potential stutter caused by unequal rendering times for specific configs, and that most remaining issues would probably be an artifact of differing clock speeds between test systems and user systems, i.e., overclocking?

I mean, I personally started talking about "microstutter" with the 3870X2, which AMD admitted was a problem caused by the limited bandwidth provided by the bridge chips when running high resolutions. This was a specific situation caused by a specific hardware config that software counters just couldn't fix.


That was three hardware generations ago. And although I am one of the few people who think PCIe 3.0 is really needed, and needed now, it strikes me as odd that so many people insist PCIe 2.0 isn't a limitation for current graphics cards, yet there are still many users who feel "microstutter" is an issue.

This puts it very eloquently:

The thing that is taking the time is that the CPU is waiting for the GPU to catch up.

Now, of course, we really need to identify what's going on here, and why frame render times differ (or, potentially, just seem to differ). I see a lot of people noticing the behavior now, unlike years ago, but we still aren't looking into the actual cause, or how it's dealt with by DirectX and the drivers.
 
Yeah, I was looking for that option since I really don't have any idea what "microstuttering" is. Anyway, I'm using a single GPU so I voted that option.

If somebody could briefly explain microstuttering to me and why it apparently only affects multi-card setups, I'd appreciate that. The article seemed to assume I already knew these things.

I think quoting my previous post will answer your question: :)

You won't notice microstutter if your rig can render without any dropped frames and vsync is on, regardless of the refresh rate of your monitor.

From what I can tell, you get microstuttering with single cards too, when the system can't maintain the framerate. Basically, you get an uneven, jerky mess, as each frame takes a different time to render as the scene changes. Perhaps the exact way it looks on screen is subtly different, but I'm not sure.
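To make the multi-GPU angle concrete: with alternate-frame rendering (AFR), the CPU can queue a frame for the second GPU almost immediately after the first, so finished frames come out in short-gap/long-gap pairs. A toy model (my own made-up numbers, not any vendor's actual scheduling):

```python
# Toy AFR model: two GPUs each need 33 ms per frame, but the CPU hands
# the second GPU its frame almost immediately, so completed frames
# arrive in bursts: short gap, long gap, short gap, long gap.

def afr_present_times(n_frames, gpu_time=33.0, submit_gap=3.0):
    """Completion time (ms) of each frame with two GPUs in AFR."""
    finishes = []
    gpu_free = [0.0, 0.0]            # when each GPU is next available
    submit = 0.0
    for i in range(n_frames):
        g = i % 2                    # even frames -> GPU 0, odd -> GPU 1
        start = max(submit, gpu_free[g])
        done = start + gpu_time
        gpu_free[g] = done
        finishes.append(done)
        submit += submit_gap         # CPU queues the next frame quickly
    return finishes

times = afr_present_times(6)
deltas = [round(b - a, 1) for a, b in zip(times, times[1:])]
print(deltas)  # [3.0, 30.0, 3.0, 30.0, 3.0]
```

Over a long run the deltas average 16.5 ms, a respectable ~60 FPS on paper, but perceptually it's closer to the 30 ms cadence, which is the classic AFR microstutter signature.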
 
Hmm, well that sounds like something that would annoy the hell out of anybody then lol
 
You won't notice microstutter if your rig can render without any dropped frames and vsync is on, regardless of the refresh rate of your monitor.

From what I can tell, you get microstuttering with single cards too, when the system can't maintain the framerate.

I never game without vsync on - screen tearing drives me insane more than anything else!

When I had the 8800GT (single GPU, same computer specs) and gamed at 1680x1050, I used to get a lot of microstutter: I'd be at 40 fps, then it'd drop to 10 or 15 momentarily and back up again.
 
I'd be at 40 fps, then it'd drop to 10 or 15 momentarily and back up again.

I think that's regular stutter, though I'm not sure of the increments between stutters needed to classify it as microstutter or just plain ol' stutter. :) We can all hold hands and agree: stutter is bad.

Anyways, I voted for single card and I don't care. Even though I'm running two GPUs, I don't get stutter, hence I don't care.
 
Thanks, your post led me to read about triple buffering. I had never thought about checking that box in CCC before. I'm playing Tom Clancy's Splinter Cell Conviction ATM and it's running much smoother now.

Cheers..:toast:

Glad it helped. Although, correct me if I'm wrong, isn't the triple-buffering option in CCC and the Nvidia Control Panel only for OpenGL applications? If I remember correctly, that's why I was hunting for a program that would force triple buffering in D3D apps, which led me to RivaTuner/D3DOverrider.
 
Glad it helped. Although, correct me if I'm wrong, isn't the triple-buffering option in CCC and the Nvidia Control Panel only for OpenGL applications? If I remember correctly, that's why I was hunting for a program that would force triple buffering in D3D apps, which led me to RivaTuner/D3DOverrider.

Hah! Finally found that article on triple buffering. It was actually AnandTech that did it, not Ars. :o

According to AnandTech:

UPDATE: There has been a lot of discussion in the comments of the differences between the page flipping method we are discussing in this article and implementations of a render ahead queue. In render ahead, frames cannot be dropped. This means that when the queue is full, what is displayed can have a lot more lag. Microsoft doesn't implement triple buffering in DirectX, they implement render ahead (from 0 to 8 frames with 3 being the default).

It's a really detailed article and will take time and concentration to read. Here's the link to it, starting from my quote above rather than the beginning of the article.

Interestingly, the Nvidia control panel on my system has a triple-buffering option that doesn't specify whether it's for DirectX or OpenGL, even though it does differentiate for the other options.
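The render-ahead vs. triple-buffering distinction from that quote can be illustrated with a toy model (my own simplification, not how DirectX actually implements it): a FIFO render-ahead queue serves frames oldest-first and can't drop them, so lag builds as the GPU outruns the display, while "real" triple buffering evicts stale frames so what's shown stays close to what was just rendered.

```python
# Toy contrast between a render-ahead queue (frames can't be dropped)
# and drop-stale triple buffering (oldest finished frame gets evicted).
from collections import deque

def displayed_frames(rendered, queue_len=3, drop_stale=False):
    """Which rendered frame id gets shown at each display refresh.

    'rendered' lists frame ids in completion order; two frames finish
    per refresh here, i.e. the GPU runs at twice the refresh rate."""
    q = deque(maxlen=queue_len if drop_stale else None)
    shown = []
    it = iter(rendered)
    for _ in range(len(rendered) // 2):
        for _ in range(2):                     # two frames finish per refresh
            frame = next(it)
            if drop_stale or len(q) < queue_len:
                q.append(frame)                # drop_stale: maxlen evicts the oldest
                                               # FIFO mode: a full queue stalls the GPU
        shown.append(q.popleft())              # display takes the oldest queued frame
    return shown

frames = list(range(8))
print(displayed_frames(frames, drop_stale=False))  # [0, 1, 2, 3] lag builds up
print(displayed_frames(frames, drop_stale=True))   # [0, 1, 3, 5] stays fresher
```

The FIFO path displays [0, 1, 2, 3], falling further behind each refresh, while the drop-stale path displays [0, 1, 3, 5], tracking the renderer more closely, which is the lower-lag behaviour the AnandTech piece attributes to true triple buffering.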
 
Hmm, would be nice if W1zzard could write a simple tool that allows us to force triple buffering in DirectX and just turn it on or off as needed. But he's a busy man, and I can't afford his skillz, they're too leet.
 
That's exactly what D3DOverrider does...

http://www.brighthub.com/computing/hardware/articles/27742/image/9610/

It's just a small overlay GUI; with two clicks you can enable V-sync and triple buffering in DirectX apps even if you have forced them off in-game or in CCC/NCP.

If you haven't tried it, crazy, I'd highly recommend firing up BC2 with it. It makes things much easier on the eyes than no V-sync. I used to skip V-sync because I could notice the slight input lag that comes with it, but this cleared it right up.
 
Last I checked it was built into RivaTuner, and that was discontinued. If you've got a download link for just D3DOverrider then post it; otherwise I don't feel like having outdated OC tools, period, since they tend to fuck with BC2 for me to the point I can't play.
 
Hmm. Well, if it gives you issues I can understand you wouldn't want it. It has produced no issues with my setup and BC2.

Also, it may be bundled with RivaTuner and you may have to install that, but you can run D3DOverrider without running RivaTuner.

Still, if you're already having problems with BC2, you could try it and see if it helps, and delete it if not.
 
not performance issues

just overclocking tools in general for the GPU cause BC2 to freak out on me

MSI afterburner
Sapphire TriXX

All of them cause BC2 and PunkBuster to freak out, so I stopped using them and the problems went away. I resorted to OCing via CCC and got the same clocks etc.; I just don't have on-the-fly fan profile control like before. If D3DOverrider were standalone it would be a useful app; till then, no thanks.

It seems only RadeonPro offers triple buffering in DirectX, but it forces V-sync as well, which kinda kills that idea too.
 
not performance issues

just overclocking tools in general for the GPU cause BC2 to freak out on me

MSI afterburner
Sapphire TriXX

All of them cause BC2 and PunkBuster to freak out, so I stopped using them and the problems went away. I resorted to OCing via CCC and got the same clocks etc.; I just don't have on-the-fly fan profile control like before. If D3DOverrider were standalone it would be a useful app; till then, no thanks.

It seems only RadeonPro offers triple buffering in DirectX, but it forces V-sync as well, which kinda kills that idea too.

Are you talking dual GPU? Because with a single GPU Afterburner is great.
 
And no, it does it to me in single and dual GPU, mailman, and it happens to many others in the BC2 clubhouse. It happens to Nvidia and AMD users alike; it's just a small percentage of users, so no company sees the need to bother fixing it.

I had the same problem with my 5850s and 6970s on both AMD- and Intel-based rigs, with fresh installs etc.
 
And no, it does it to me in single and dual GPU, mailman, and it happens to many others in the BC2 clubhouse. It happens to Nvidia and AMD users alike; it's just a small percentage of users, so no company sees the need to bother fixing it.

I really don't know WTF you guys do to your systems.
 
seems only Radeon Pro offers Triple buffering via Direct X but it forces Vsync as well. which kinda kills that idea as well

Wait...what?

Isn't that the whole idea? To allow Triple Buffering with V-sync enabled in DX apps?
 
Well, I can tell you right now: full stock CPU, everything at defaults, just a fresh install of Windows, and I can make BC2 crash within 5 minutes every time. All it takes is having MSI Afterburner running with Unofficial OC enabled. If I turn off the app, no issue; turn off Unofficial OC and it will still crash, it just takes longer. Editing the fan profile also causes the crash.

With Sapphire TriXX, the fan profile causes issues and so does overclocking, but the app can remain running.

Last I checked, triple buffering can be used without V-sync,

just as render-ahead can be used without V-sync.

OpenGL allows triple buffering without issue; DirectX uses a render-ahead limit instead. The problem is that the stuttering from microstutter isn't always fixed by V-sync: 4870X2 + Crysis was a stuttering mess the entire time I had it, V-sync on or off. Back then RivaTuner was still viable, and forcing V-sync off while keeping triple buffering on meant smoother and higher frame rates in most games, with no input lag from V-sync.

Capping my frame rate at 60 fps is not what I want. If I wanted to cap at 60, I'd have one 6970, use V-sync, and just stfu and gtfo :roll: I want high frame rates but smooth, fluid playback.

Triple buffering also eliminates tearing, so you can go ABOVE the refresh rate and get little to no tearing, which is what I want. Oblivion, FO3 and NV don't stutter for me, but the animations can seem off; higher than 60 fps the animations feel smooth, less wonky, but the texture tearing is insane. If you force triple buffering with V-sync off, you keep the high frame rate but remove the texture tearing. Triple buffering is more beneficial than just removing microstutter, etc.
 
I feel like the last question doesn't exactly fit the rest of the scenarios. I'd imagine there are people using single-GPU setups specifically because they want to avoid microstuttering.
 
I think that's regular stutter, though I'm not sure of the increments between stutters needed to classify it as microstutter or just plain ol' stutter. :) We can all hold hands and agree: stutter is bad.

What I used to get with the 8800GT @ 1680x1050 was a significant drop for a second or less. I realised what was happening when I gamed using FRAPS. Without FRAPS, I used to have those "wtf" moments which are too short to worry about, like that insidious doubt of: were my eyes playing games on me, or was the game playing games on my eyes? :laugh:
 
Does anyone have access to a high-speed camera? Maybe that could be used to investigate microstuttering.
 
Ah, I gotcha. I thought you wanted triple buffering with V-sync enabled to remove issues like input lag and microstutter, not just triple buffering alone.
 