
[PCPER] Frame-rating: what it is, what it does, and why it's important.


Yes, I read it, and the parts you quoted confirm what I said. They aren't taking any readings at T_present; that is just where they insert the overlay on each frame the engine spits out. Their readings are taken at the end of the line, what the user actually sees. This is a far better method than FRAPS.

Maybe, but if the problem (in Eyefinity) is as bad as Ryan says, then I have an extremely hard time believing most people would just play it off after looking at their frame rates.

I don't. I remember reading an article where they took a bunch of people who said they couldn't stand playing games at anything less than 60 FPS, turned the framerate counter off, and limited the games to 30 FPS; 90% of the people who supposedly could tell when a game dropped below 60 FPS said the game felt totally smooth to them.

Some people obviously had to complain, otherwise we wouldn't have multiple websites testing the complaints, we wouldn't have NVIDIA developing a tool to test it (arguably just to make their competition look bad), and Dave's been complaining about it for a while now.
 
Last edited:
Dave's been complaining about it for a while now.

And at the same time, I can only "share war stories" with a couple of people on the same level. Most don't seem to be as sensitive to FPS as I am. Like I literally said to W1zz that it was like the secondary card was doing the work, but it never gets displayed (I only remember this because W1zz and I don't talk about random stuff too often, mostly TPU-work).


But ask any other Crossfire user here, bar one or two, and they all enjoy their systems.


We really need to see PC Per run their tests with HT on and off. I'm sure they've thought of this by now and people have suggested it in their forums.

HT is one of those things that can hurt performance in some cases and this might be one of them, due to the realtime nature of gaming graphics rendering.



HT does NOT play a big role in this. It might play a role in INPUT latency, but after dealing with this for YEARS, with people telling me it's not a real problem, blah, blah, I've done a tonne of testing and research into this.

I have both 3570K and 3770K to test that theory, actually. No reason for me to buy 3570K, at all, except for that. I did my testing, and now that $200 chip sits on the shelf, since I don't need it for any other reason.

I also bought i5 760 and i7 870.

I tend to test that whole "HT causes lag problems" thing with every generation, and I have found it to be not true. I also find disabling HT doesn't affect normal usage temps, either, but many people report that, too. So, whatever.
 
HT does NOT play a big role in this. It might play a role in INPUT latency, but after dealing with this for YEARS, with people telling me it's not a real problem, blah, blah, I've done a tonne of testing and research into this.

I have both 3570K and 3770K to test that theory, actually. No reason for me to buy 3570K, at all, except for that. I did my testing, and now that $200 chip sits on the shelf, since I don't need it for any other reason.

I also bought i5 760 and i7 870.

I tend to test that whole "HT causes lag problems" thing with every generation, and I have found it to be not true. I also find disabling HT doesn't affect normal usage temps, either, but many people report that, too. So, whatever.

Ok, it's good to know that HT isn't the culprit here and I trust your testing.

I did see something years ago on Ars or somewhere about HT reducing performance in certain situations, though it could have even been in the P4 era. It's so long ago that I don't remember any details or which applications they were talking about.

For the record, I keep HT on and have had zero problems with it.
 
For the record, I keep HT on and have had zero problems with it.

There have been numerous titles that on launch have had issues with HT, that cannot be denied, but that's a coding issue, not a hardware issue, and that's the software running badly on HT cores, not a driver issue.


That's part of the problem in dealing and explaining this issue, since there are many other things that can cause similar behavior. Separating one from the other can be rather difficult.

That's what makes this so hard for AMD to solve quickly...eliminating all the other issues that might be present may take some time, but since they've already confirmed this problem, and said they are already working to fix it, I'm not that worried about it, to be honest. I will, however, be selling off my extra AMD vgas, this week most likely. I need to get some NVidia GPUs.
 
Why exactly?

It seems to me that reading the framerate the user actually sees would be a far better method and far more informative to the user than reading the framerate the game engine generates long before it ever actually makes it to the user.

If the goal is to measure output frames, then FCAT should dump the overlay capture. The time interval after the data has gone through the GPU render would serve best, without imposing data prior to the queuing process.

Yes, I read it, and the parts you quoted confirm what I said. They aren't taking any readings at T_present; that is just where they insert the overlay on each frame the engine spits out. Their readings are taken at the end of the line, what the user actually sees. This is a far better method than FRAPS.

I think we're saying the same thing in two different ways.

The overlay frame tagging doesn't serve a purpose for the final count. Comparing it to the final frames output does.
 
I think we're saying the same thing in two different ways.

The overlay frame tagging doesn't serve a purpose for the final count. Comparing it to the final frames output does.

FCAT is required because of how video capture takes place. That overlay makes it very easy to quickly recognize what's going on, but at the same time, I do feel there is a better way to deal with this, and it's something that has to be done by the developer of every single title.

And again, since AMD admitted it is a problem, I have no lasting issues with this method of testing, and neither does AMD, it seems.
 
FCAT is required because of how video capture takes place. That overlay makes it very easy to quickly recognize what's going on, but at the same time, I do feel there is a better way to deal with this, and it's something that has to be done by the developer of every single title.

And again, since AMD admitted it is a problem, I have no lasting issues with this method of testing, and neither does AMD, it seems.

We aren't talking about FCAT as a whole; we are talking about output count.

I don't see overlay frame tagging as useful in that sense. You can take a count anywhere in the process; just doing it closest to the user would be best.
 
We aren't talking about FCAT as a whole; we are talking about output count.

I don't see overlay frame tagging as useful in that sense. You can take a count anywhere in the process; just doing it closest to the user would be best.

Closest to the user would be over the cable to the monitor, which is where the displayed frames are actually captured. This is what is being done.


However, not all rendered frames are displayed to the user, so you are right that capturing data about what is shown to the end user is most relevant. But you need to know what actually makes up what the end user sees, so you have to do something like what is done here with FCAT to find that out.


The overlay is required to identify what makes up the frames that are displayed to the user, which is obviously not a simple thing of "frameX was rendered, frame X gets displayed."

It's more like frames x, y, z, and t were rendered: 60% of frame x was used, frame y was 2%, frame z was dropped, and frame t was 38%. That's four frames making up one frame the end user sees, and if only the data passed over the cable were captured, you'd have no idea where it came from.


The only way to properly find that out is to produce the overlay either as the frame is rendered by the 3D engine, or immediately after it was 100% complete.
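The bookkeeping described above can be sketched in a few lines. This is a hypothetical illustration, not FCAT's actual data format: assume the per-scanline overlay color IDs have already been read from one captured output frame, and we want each rendered frame's share of it.

```python
from collections import Counter

def frame_composition(scanline_ids):
    """Return each source frame's share of one displayed (captured) frame."""
    total = len(scanline_ids)
    counts = Counter(scanline_ids)
    return {frame_id: n / total for frame_id, n in counts.items()}

# One 1080-line displayed frame stitched together from three rendered frames;
# frame "z" was dropped entirely, so its overlay color never appears on screen.
scanlines = ["x"] * 648 + ["y"] * 22 + ["t"] * 410
shares = frame_composition(scanlines)
print(shares)  # x: 0.6, y: roughly 0.02, t: roughly 0.38
```

A capture-side reading that only counted output frames would report one clean frame here; only the overlay shares reveal the runt (y) and the drop (z).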


Honestly, what's going on is a very complex subject, and armchair research isn't going to help much here. After literally dealing with this issue for years, and people saying it's not real, it's something else, blah blah blah, there's going to be very little that will sway me personally in any other direction.


You can literally go back through years of posts here on TPU and find me complaining and talking about this problem. Years.
 
So FRAPS tells you how smooth the engine spits out the action.

FCAT tells you how smooth frames are delivered after the engine is done with everything and time the frame output.

There are flaws in both methods. One tells you how smooth the game is and the other tells you how smooth the frames are. It's dumb to take one over the other.

FCAT's flaw shows when games and drivers do this:

engine output
x xxxx x x x xx

frame output
x x x x x x x x x x

As you can see, NVIDIA or AMD could "fix" this issue as FCAT sees it with a band-aid fix just to make themselves look better.
Smooth frames =/= smooth gaming experience. These guys just need to fix their freaking drivers to deliver both frames and action smoothly. Even at the cost of lower FPS, I'd take it. I can watch movies at 24 or 30 FPS, so I'm sure I can deal with a smooth gaming experience at only 30 or 60 FPS.
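To make the distinction above concrete, here is a toy sketch with made-up timestamps: an engine that spits frames out in bursts can still have its output delivered at perfectly even intervals, so a tool reading at the engine sees stutter while a tool reading at the output sees smoothness.

```python
import statistics

def frametime_jitter(timestamps):
    """Std deviation of frame-to-frame intervals (ms); 0 means perfectly even."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.pstdev(intervals)

# Bursty engine output ("x xxxx x x x xx") vs evenly paced display output.
engine_ts  = [0, 5, 8, 11, 14, 30, 46, 62, 78, 85]
display_ts = [0, 16, 32, 48, 64, 80, 96, 112, 128, 144]

print(frametime_jitter(engine_ts))   # large: a FRAPS-style reading sees stutter
print(frametime_jitter(display_ts))  # 0.0: an FCAT-style reading sees smoothness
```

Both streams contain ten frames, so a plain FPS counter reports the same number for each; only the interval jitter exposes the disagreement.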

Otherwise, their only option is to make it so ridiculously fast that the human eye can't possibly see it. But then erratic frames are not perceivable because my brain corrects for them, so it will give a lot of people, like me, plenty of headaches even though I can't consciously tell.

Hate to say it, but a closed platform like consoles seems to be the wave of the future for mainstream, trouble-free gaming. PC gaming is/has become the early-adopter platform.
 
Otherwise, their only option is to make it so ridiculously fast that the human eye can't possibly see it. But then erratic frames are not perceivable because my brain corrects for them, so it will give a lot of people, like me, plenty of headaches even though I can't consciously tell.

That's what FRAPS was telling us before. FCAT just confirms it and introduces variance.

A disruption coming out of the game engine itself, due to whatever issues, will cause a visual overlap in the frames being displayed.

That's a different issue than what FCAT is trying to convey.

AMD knew that was an issue and was working on it through Microsoft's GPUView.
 
Were arent talking about FCAT as a whole we are talking about output count.

I dont see overlay frame tagging useful in that sense. You can take count anywhere in the process just doing it closes to the user would be best.

How do you expect the capture card and software to detect runt and dropped frames without some kind of pattern overlaid on each frame?

Output count is not what we are talking about; we are talking about the perceived framerate that the user sees.
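A minimal sketch of that idea follows. The names and the 21-scanline runt cutoff are assumptions for illustration, not FCAT's official threshold: once the overlay tells you how many scanlines each rendered frame occupied across the capture, runts and drops fall out of simple bookkeeping.

```python
RUNT_THRESHOLD = 21  # assumed cutoff in scanlines, for illustration only

def classify_frames(rendered_ids, observed_scanlines):
    """observed_scanlines maps frame id -> scanlines it occupied on screen."""
    runts = [f for f in rendered_ids
             if 0 < observed_scanlines.get(f, 0) < RUNT_THRESHOLD]
    dropped = [f for f in rendered_ids if observed_scanlines.get(f, 0) == 0]
    return runts, dropped

rendered = [1, 2, 3, 4]
observed = {1: 700, 2: 10, 4: 370}  # frame 3 never reached the screen at all
runts, dropped = classify_frames(rendered, observed)
print(runts, dropped)  # [2] [3]
```

Without the per-frame pattern, `observed` cannot be built at all: a capture card would only see a stream of pixels with no way to attribute scanlines to rendered frames.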
 
Thats what FRAPS was telling us before. FCAT just comfirms it and introduces variance.

I can't tell if you're on the side of FCAT or not, so please do not take this post as a retort.

FRAPS is useless for setups that have latency issues. I speak from direct experience. There are also issues with benchmarks, though I don't know if they use the same methods that FRAPS relies on.

My 3DMark scores in both editions were higher on my Crossfire setup, yet there was obvious stutter. The initial scenes were all fine, but the final mix including CPU physics was a bitch to watch, especially in 3DMark 2013.
Despite a lower score, my single card was measurably smoother.

Now, I'm not in total agreement with certain aspects. I'm still adamant my BF3 experience on Crossfire was perfect. My single card offers no better a visual feast (or worse, for that matter).
Crossfired 7970s in Tomb Raider were more juddery than a single card despite higher FPS, and likewise in Crysis 3 (though that was much harder to call).

Anything that tells us what the end user sees is far more relevant than what FRAPS alone tells us. Like I say, from direct experience, high FPS numbers are irrelevant if the picture isn't butter smooth, so if FRAPS isn't telling me what I'm seeing, it's not very useful.

And to throw a cat in amongst the pigeons, FCAT confuses me.... :laugh:
 
How do you expect the capture card and software to detect runt and dropped frames without some kind of pattern overlaid on each frame?

Output count is not what we are talking about; we are talking about the perceived framerate that the user sees.

It seems to me that reading the framerate the user actually sees would be a far better method and far more informative to the user than reading the framerate the game engine generates long before it ever actually makes it to the user.

Output count was what I was talking about. I see you're talking about two different things. Overlay tags are inserted to be compared. I was saying they're useless if you're just counting output frames. Obviously you want the comparison.

Just take a frame count.
1.) As it leaves the game engine
2.) As it leaves the Video Card

Preferably you take it at several stages in the pipeline
1.) Game engine
2.) Direct X
3.) GPU
4.) Output


If perceived framerate were the issue, we'd all be playing at the required amount and anything more would be useless. The issue is consistency.

Right now FRAPS and FCAT can't even agree on that.
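The multi-stage counting proposed above could look something like this in outline. The stage names and timestamps here are made up; real tools only get hooks at a couple of points (FRAPS at Present(), FCAT at the display output), which is exactly why the two can disagree.

```python
from collections import defaultdict

class PipelineTrace:
    """Record a timestamp each time a frame passes a pipeline stage."""
    def __init__(self):
        self.stamps = defaultdict(list)  # stage name -> list of timestamps (ms)

    def mark(self, stage, t):
        self.stamps[stage].append(t)

    def counts(self):
        return {stage: len(ts) for stage, ts in self.stamps.items()}

trace = PipelineTrace()
# Frame A passes all four stages; frame B is discarded after the GPU stage.
for stage, t in [("engine", 0.0), ("directx", 0.5), ("gpu", 2.0), ("output", 2.5)]:
    trace.mark(stage, t)
for stage, t in [("engine", 5.0), ("directx", 5.5), ("gpu", 7.0)]:
    trace.mark(stage, t)

print(trace.counts())  # {'engine': 2, 'directx': 2, 'gpu': 2, 'output': 1}
```

Comparing counts and pacing stage-by-stage is what would pin down where frames are lost or bunched up; measuring only the first stage or only the last gives two different, equally partial, answers.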

And to throw a cat in amongst the pigeons, FCAT confuses me.... :laugh:

It is confusing.

I'm not against what FRAPS or FCAT does, but how it's being interpreted to mean other things than what it does. That's my point.
 
I'm just wondering if all this time he learned how to bench from NVIDIA. j/k


As you can tell I'm not good at drawing straight arrows :laugh:

PCPER 2013-04-01 21-56-36-98.jpg


Wondering if that's the reason he stopped posting results: because he realized he's not testing properly, or worse, that he was. In which case it opens the floodgates to more questions.
 
I have been saying this since the GF8 and HD 2900 series. Guess how many believed me then?

All the hatred from fanboys and the like that I have had to deal with while they blindly defend the company and tell me there is no problem, LA LA LAAA.
 
Wondering if that's the reason he stopped posting results: because he realized he's not testing properly, or worse, that he was. In which case it opens the floodgates to more questions.


Did you not know that both AMD and NVidia deliver completely different images and colors in games? They basically render in opposite ways, so you cannot directly compare NVidia quality vs AMD, although this has been done many times over through the years. Every time someone checks visual quality, it's different.


Has nothing to do with testing methods. You're the only one that is concerned about THAT.

I have been saying this since the GF8 and HD 2900 series. Guess how many believed me then?

All the hatred from fanboys and the like that I have had to deal with while they blindly defend the company and tell me there is no problem, LA LA LAAA.

Yeah, I know you and I have a similar story, and a similar history of systems built, too. We've even talked about it in the past. And yeah, it seems to have started with DX10; DX9 Crossfire was great, and still is.

The truth of the matter is that the rigs we build and what we spend on them are a minority, and most users haven't even got a second card, never mind a high-end system built a week after parts have been released to the public. That's the failure of buying into tech early...you get all the problems, too. The problems aren't that big of a deal, honestly...I expect that...it's the users without any experience with high-end hardware commenting on an issue they have no experience with that is the most entertaining. :p


All I see this resulting in is that a problem I have had for years is now identified, and that it can be fixed. This news isn't bad news, really...it's notice that the issue is being looked at. If you have any issues with Crossfire and 7-series cards right now, well, you gotta deal with it.


And that's why I started this thread. AMD has admitted Crossfire is problematic, and will be, for some time. If you run into issues while using Crossfire, you'll have to wait for better drivers. I really think AMD should make a public statement about this whole ordeal and quell the fanboys.
 
And yeah, it seems to have started with DX10; DX9 Crossfire was great, and still is.

So there are no runts at all in DX9? I never caught that part of the info.
 
So there are no runts at all in DX9? I never caught that part of the info.

Either performance is at the point where it doesn't matter and doesn't present a problem, or the issue is confined to DX10/DX11 only.


DX9 problems are per-app. DX10/DX11 will take a global fix to the driver as a base (should be here by July), and then, maybe, per-app fixes.


Assuming I understood what AMD said properly.

The only thing that bugs me is that the 7-series launched nearly 18 months ago and there's still no working Crossfire, and I've been told I've got to wait even longer yet. I am also not really comfortable with hearing, a few months ago, that a big driver update was coming to fix DX10/DX11 stutter, that it was memory management, and that the driver should be out by March, and now that it's March, I'm being told July.


Delay after delay after delay = me moving to NVidia. I'll keep a single 7970.
 
Delay after delay after delay = me moving to NVidia.

That's what I did and I didn't regret it.

Been supporting ATI for too long while getting the shaft.
 
The only thing that bugs me is that the 7-series launched nearly 18 months ago and there's still no working Crossfire, and I've been told I've got to wait even longer yet. I am also not really comfortable with hearing, a few months ago, that a big driver update was coming to fix DX10/DX11 stutter, that it was memory management, and that the driver should be out by March, and now that it's March, I'm being told July.


Delay after delay after delay = me moving to NVidia. I'll keep a single 7970.

Exactly.

I wouldn't like to be treated this way after spending lots of money on AMD products. Come to the green side and enjoy some awesome graphics cards! :rockout:

You'd be surprised how even something as old as an 8800 GTX still works. It will be kinda slow, obviously, but everything works just fine even after all this time.
 
That's what I did and I didn't regret it.

Been supporting ATI for too long while getting the shaft.

Either performance is at the point where it doesn't matter and doesn't present a problem, or the issue is confined to DX10/DX11 only.


DX9 problems are per-app. DX10/DX11 will take a global fix to the driver as a base (should be here by July), and then, maybe, per-app fixes.


Assuming I understood what AMD said properly.

The only thing that bugs me is that the 7-series launched nearly 18 months ago and there's still no working Crossfire, and I've been told I've got to wait even longer yet. I am also not really comfortable with hearing, a few months ago, that a big driver update was coming to fix DX10/DX11 stutter, that it was memory management, and that the driver should be out by March, and now that it's March, I'm being told July.


Delay after delay after delay = me moving to NVidia. I'll keep a single 7970.

Exactly.

I wouldn't like to be treated this way after spending lots of money on AMD products. Come to the green side and enjoy some awesome graphics cards! :rockout:

You'd be surprised how even something as old as an 8800 GTX still works. It will be kinda slow, obviously, but everything works just fine even after all this time.

Well you all do provide a compelling point to jump ship, but I will wait for the official release of the HD 7990 with the corrected CF and then make my decision. I think there is still a bit of life left in my cards.
 
Last edited:
Well you all do provide a compelling point to jump ship, but I will wait for the official release of the HD 7990 with the corrected CF and then make my decision.

Yeah, I think it's a bit early. At the least, AMD is being pretty open about this, is talking about it, and is seemingly working towards fixing it. As much as some people may want to suggest that there is no problem here, this is something that AMD has been talking about openly for months now. Just more and more info is being presented to the end user in a palatable way, IMHO. We can all go back to Anandtech's article about AMD's stutter, and the forthcoming fix that hasn't materialized yet. I think AMD is madly at work trying to fix this.


If they don't, I have $1225 worth of videocards I want a refund on.

And even if they don't fix it, I'll still keep a single AMD card. The thing is, for me, that I have 3x IPS monitors sitting on my desk here for Eyefinity, and AMD doesn't deliver acceptable performance for my needs for such a configuration. A configuration I bought because they said they could do it.
 
I already knew that AMD was the best choice for single GPU setup and the worst choice for multi-gpu, but thanks for scientifically confirming this PCPER!
 
You'd be surprised how even something as old as an 8800 GTX still works.

Yeah, I have GTX 260 in another machine that happily runs Skyrim on high @1280x1024 on my old 19 inch LCD panel ... even after all these years running hot, it just won't die :rockout:

On topic, I was wondering how it is even possible that alternate frame rendering produces runt frames.
Think about it: both cards have identical data in VRAM, the frames rendered are of similar complexity, and there are periods in the tests when runt frames do not exist.

It all stinks of heuristic predictions gone bad when rendering frames ahead ... no, I wasn't going for a rhyme.
If AMD is using some form of heuristics when determining AFR timings for rendering frames ahead, maybe runts happen when the predicted timings do not agree with what really happened.
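That hypothesis could be sketched like this. The numbers and scheduling rule are entirely illustrative and reflect nothing about AMD's actual driver logic: if AFR slots card B's frame halfway through card A's predicted frame time, and the prediction is wrong, B's frame can land just before the next A frame and occupy only a sliver of a refresh.

```python
REFRESH_MS = 16.7      # one 60 Hz refresh interval, in milliseconds
RUNT_FRACTION = 0.05   # shown for less than 5% of a refresh counts as a runt

def afr_schedule(t_a, predicted_frametime):
    """Slot card B's frame halfway through card A's *predicted* frame time."""
    return t_a + predicted_frametime / 2

def is_runt(t_b, t_next_a):
    """B is a runt if the next frame arrives almost immediately after it."""
    return (t_next_a - t_b) < RUNT_FRACTION * REFRESH_MS

# Prediction matches reality: frames land evenly spaced, no runt.
t_b = afr_schedule(0.0, predicted_frametime=20.0)   # B lands at 10.0 ms
print(is_runt(t_b, t_next_a=20.0))                  # False

# Prediction overshoots badly: B lands just before the next A frame.
t_b = afr_schedule(0.0, predicted_frametime=23.0)   # B lands at 11.5 ms
print(is_runt(t_b, t_next_a=12.0))                  # True
```

Which would also fit the observation above that runts come and go: whenever the prediction tracks reality, the same scheduling rule produces no runts at all.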

Now that I have solved half of the problem :laugh: please, AMD, fix the rest of it.
 
Did you not know that both AMD and NVidia deliver completely different images and colors in games? They basically render in opposite ways, so you cannot directly compare NVidia quality vs AMD, although this has been done many times over through the years. Every time someone checks visual quality, it's different.


Has nothing to do with testing methods. You're the only one that is concerned about THAT.

Might want to look again. ;)

If it were just a difference in colors, I wouldn't bother.
It's lighting and texture LOD issues.

In the Crysis 3 comparison it seems to be LOD issues as well.

Download the videos at 720p and look for yourself. That's why I include the video and a picture. You have to be BLIND not to notice the difference. At the 1:20 mark the 7950 CF turns on a light to the right, and at 1:25 another light turns on right in front. You really can't miss it unless you're BLIND.

If you think I'm the only one to notice, you're not following the discussion on other forums.


AMD has admitted Crossfire is problematic, and will be, for some time. If you run into issues while using Crossfire, you'll have to wait for better drivers. I really think AMD should make a public statement about this whole ordeal and quell the fanboys.

Did you not read AMD's statements?
 
Last edited: