Discussion in 'Graphics Cards' started by entropy13, Sep 10, 2011.
I want to know what W1zzard thinks of this.
Thanks, good read
Nice info, nice read, though I don't quite understand it...
Same here. I got to around page 5 before I started to get confused.
I read this yesterday too; it's a great read and definitely highlights the issues surrounding what appears as stutter in games and inconsistent FPS. They're dead right in saying that FPS alone doesn't tell the whole story.
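To see why average FPS alone hides stutter, here's a minimal sketch with made-up numbers: two runs with identical average FPS, one smooth and one alternating fast/slow frames (the classic microstutter pattern). The numbers are illustrative only, not from the article.

```python
# Hypothetical frame-time data: same average FPS, very different feel.

def avg_fps(frame_times_ms):
    """Average FPS over a run, from per-frame render times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """Longest single frame time -- what your eye notices as a hitch."""
    return max(frame_times_ms)

smooth = [20.0] * 100        # every frame takes 20 ms
jittery = [10.0, 30.0] * 50  # alternating 10 ms / 30 ms (microstutter)

print(avg_fps(smooth), avg_fps(jittery))          # 50.0 50.0 -- identical FPS
print(worst_frame(smooth), worst_frame(jittery))  # 20.0 30.0 -- very different
```

Both runs report 50 FPS, but the jittery one delivers frames at uneven intervals, which is exactly the information an FPS counter throws away.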
I particularly liked and agree with statements like this, even though it's partially beside the point:
Just imagine how fluid gameplay would become if they decide LCDs don't need to run at more than 25 frames, 'cause movies don't need more
I read somewhere that his work was flawed because he was capturing and recording the frames before they reached the rendering pipeline instead of after. Many things happen during rendering, and some of them involve throwing frames away. So, as always, take it with a grain of salt.
Are you talking about the multi-GPU setups, specifically the Nvidia ones? The "flaw", as you put it, is that Nvidia apparently implements things slightly differently in multi-GPU setups compared to AMD. It's a flaw in the sense that not all variables are held constant during the comparison, but it's not such a big flaw that everything else in the article is automatically void.
And there are other things holding it back, so to speak (like the use of IPS panels instead of TN panels, and relying on software instead of hardware, like the speed cameras he mentions), yet that doesn't mean the whole thing should be taken with "a grain of salt."
Scientific articles published in academic journals all have certain "reservations." By your logic, every one of those articles must also be taken with a grain of salt because they have "flaws."
Interesting article. They could also use a CRT for real-world testing of frame jitter.
What interested me was that monitors with slower response times make the issue harder to see, so a fast-response TN panel could actually be a bad thing for gaming as far as microstutter goes.
I'm a little confused on something. Maybe someone can clear this up for me.
On page 11 he states that Fraps records a frame timestamp when the game engine hands off a frame to the DX API.
Earlier he stated that the jitter on multi-GPU setups was due to the fact that a frame rendered on the second (non-output) GPU had to hand the frame back to the primary display GPU, thus adding a little delay.
Why would Fraps detect this if it timestamps the frames before the graphics card even receives the frame data?
The frame times would, of course, vary based on how long the engine took to hand each frame to the API, but why would that mirror the GPU-to-GPU transfer delay?
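One plausible answer (a toy model, not something the article spells out) is backpressure: the driver only buffers a few frames ahead, so once that queue is full, the submit call the game makes (which Fraps timestamps) blocks until the GPU drains a frame. The CPU-side submit intervals then settle to the GPU's pace, delays and all. A minimal simulation, with all timing constants assumed for illustration:

```python
# Toy model: CPU-side submit timestamps (what Fraps sees) end up mirroring
# GPU-side frame times once the driver's small pre-render queue fills up.

from collections import deque

QUEUE_DEPTH = 3   # assumed driver pre-render limit (frames in flight)
GPU_TIME = 20.0   # ms the GPU needs per frame (assumed)
CPU_TIME = 5.0    # ms the game needs to prepare a frame (assumed)

def simulate(n_frames):
    clock = 0.0
    gpu_free_at = 0.0        # when the GPU finishes its current backlog
    queue = deque()          # completion times of frames already submitted
    submit_times = []
    for _ in range(n_frames):
        clock += CPU_TIME                   # CPU prepares the next frame
        while queue and queue[0] <= clock:  # drop frames the GPU finished
            queue.popleft()
        if len(queue) >= QUEUE_DEPTH:       # queue full: submit blocks...
            clock = queue.popleft()         # ...until the GPU frees a slot
        gpu_free_at = max(gpu_free_at, clock) + GPU_TIME
        queue.append(gpu_free_at)
        submit_times.append(clock)          # what Fraps would timestamp
    return submit_times

times = simulate(20)
deltas = [b - a for a, b in zip(times, times[1:])]
print(deltas[-5:])  # steady state: intervals of GPU_TIME, not CPU_TIME
```

In steady state the submit-to-submit intervals equal the 20 ms GPU frame time even though the CPU only needs 5 ms per frame, which is why a timestamp taken before the card gets the frame can still reflect delays inside the card, including an extra GPU-to-GPU handoff.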