
CPU/GPU review methods (time to include 1% lows please)

Include 1% lows in future GPU/CPU reviews?

  • Yes: 25 votes (78.1%)
  • No: 7 votes (21.9%)

Total voters: 32
Hi,

Reading the 3900X review, you might get the impression that the 3900X is only 2-3% faster than the 2700X at 1440p.
In reality, I see my minimum framerates improve by 20-30%. And I'm not the only one: Reddit is filled with posts about frame drops being a thing of the past with the Ryzen 3900X.
The 1% lows are much, much higher than on the Ryzen 2700X.

By not including the 1% lows, you are basically misinforming consumers in the gaming benchmarks, because you talk about "game performance".
If a game runs at 60fps half the time and at 120fps the other half, you would report 90fps as the average framerate.
If that same game instead ran at 80fps half the time and at 130fps the other half, you would report 105fps on average (only a 16% increase).
But the fact that the minimum framerate is now 80fps instead of 60fps is a HUGE difference (33%!). Not only will the game run smoother due to the higher framerates, the gap between the minimum and maximum framerate is also smaller, meaning the frametimes are more consistent, making the game feel smoother overall.
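To put that arithmetic in code, here's a quick Python sketch (the helper and the 30-second segments are just the hypothetical example above, time-weighted by seconds spent at each framerate):

```python
# Quick sketch of the arithmetic above: time-weighted average FPS for
# two hypothetical runs, each spending half its time at each framerate.
def time_weighted_avg_fps(segments):
    """segments: list of (seconds, fps) pairs."""
    total_time = sum(sec for sec, _ in segments)
    return sum(sec * fps for sec, fps in segments) / total_time

old = time_weighted_avg_fps([(30, 60), (30, 120)])  # 90.0
new = time_weighted_avg_fps([(30, 80), (30, 130)])  # 105.0

print(f"average: {old:.0f} -> {new:.0f} fps (+{(new / old - 1) * 100:.1f}%)")
print(f"minimum: 60 -> 80 fps (+{(80 / 60 - 1) * 100:.1f}%)")
# average: 90 -> 105 fps (+16.7%)
# minimum: 60 -> 80 fps (+33.3%)
```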

Back in 2009, reporting average framerates in reviews was fine. But maybe it's time to revisit this method in 2019?
 
Yeah, I look at TPU reviews for quick comparisons but use other sources for my main research because of this. And it's not like they need another bar that makes the chart unreadable like GamersNexus: they could easily add the 1% bar inside the average bar TPU currently shows. But I guess the next time TPU resets its benchmark suite might be a better moment than just randomly changing it now.
 
Totally in agreement. TPU is sadly lagging behind at this point.
 
100% agreed and this was suggested many times already. @W1zzard anything in the pipeline?
 
Perhaps it will happen in 2020???
 
So explain what the 1% lows are to begin with?
 
Very, very badly needed IMO to bring TPU reviews back to being more relevant, especially with the Vulkan & DX12 APIs. I'll take higher 1% lows over higher FPS every time...
 
W1zz knows of this. I think that many of us asked for it.
I think that he also understands very well why people want TPU to do this.

In terms of grabbing the information and presenting it, there's actually not much that goes into it. Those are statistics RTSS captures anyway and can export data from.
It may have to do with how reviews are built and his own personal time budget for doing them, as he has other things to do and must divide his time.
 
So explain what the 1% lows are to begin with?

Looking at the minimum framerate in addition to the average. The average alone doesn't always show the complete picture with low points and stutters.
 
1% lows aren't particularly informative either.

Best thing to do is show a frametime graph over time; from there you can easily see how things really perform.
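For example, a minimal sketch of such a graph, assuming per-frame times have already been captured in milliseconds (the data below is a placeholder, not a real capture):

```python
# Minimal sketch: plot frametime over elapsed time. frametimes_ms is
# placeholder data standing in for a real capture; the 45ms spike is
# exactly the kind of event an average hides.
import matplotlib.pyplot as plt

frametimes_ms = [16.7, 16.9, 16.5, 45.0, 16.8, 17.1, 16.6]

# x-axis: elapsed seconds, accumulated from the per-frame times
elapsed_s = [sum(frametimes_ms[:i + 1]) / 1000 for i in range(len(frametimes_ms))]

plt.plot(elapsed_s, frametimes_ms)
plt.xlabel("Elapsed time (s)")
plt.ylabel("Frametime (ms)")
plt.title("Frametime over time")
plt.show()
```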
 
I personally find 3% or 5% a better indicator since it's a larger sample, although it often (but not always) parallels the 1%. If I need an FPS app to show me stutter, then it's not an issue; if I can see it with my eyes during gameplay, then it is an issue.
 
So explain what the 1% lows are to begin with?

Let's say your average fps is 100 but your 1% low is 1fps; that would mean you get complete freezes at certain times.

[Attached chart: Borderlands benchmark, 3950X vs 2990WX, average and 1% low FPS]


So here the 3950X is only 10% faster on average vs the 2990WX, but its 1% lows are 25% higher, making a massive difference in the gaming experience.
 
So explain what the 1% lows are to begin with?

It tells you how smooth your gameplay is. Think of it as taking a long trip on a highway (exit ramp to exit ramp) where you average 65mph, but 1% of the time you were only doing 20mph: it means you hit some stop-and-go traffic and the ride was not always smooth. On your next trip you average 60mph, but your 1% low was 40mph. Although you averaged 5mph less on the latter trip, it was a smoother drive.
 
Let's say your average fps is 100 but your 1% low is 1fps; that would mean you get complete freezes at certain times.

[Attached chart: Borderlands benchmark, 3950X vs 2990WX, average and 1% low FPS]

So here the 3950X is only 10% faster on average vs the 2990WX, but its 1% lows are 25% higher, making a massive difference in the gaming experience.

Ok so it's minimum fps that counts.
 
Ok so it's minimum fps that counts.


To me it counts when looking at the industry standard for smooth play, 60fps (for FPS gaming, anyway). If you are comparing two cards/CPUs (regardless of resolution), it makes a difference when one never dips below 60fps while the other drops below that threshold. The card that never goes below 60 may not provide the highest overall framerate, but in theory it will provide a smoother experience.
 
Way back in the day, when I ran an AMD Phenom II X4 920 on an AMD 770 board with two HD 4830s in CrossFire, this is something I noticed, and I haven't fully trusted a review since.
Same with motherboard chipsets...
Reviews have been utterly useless except for seeing the maximum fps, which really doesn't mean shit without the bottom number and the amount of time spent bouncing along the bottom.

A better example was going from an HD 6950 to an R9 280... The 280 had better highs but really wasn't any better, because DX12 was a useless feature and they both had roughly the same lows...
 
W1zz knows of this. I think that many of us asked for it.
I think that he also understands very well why people want TPU to do this.

In terms of grabbing the information and presenting it, there's actually not much that goes into it. Those are statistics RTSS captures anyway and can export data from.
It may have to do with how reviews are built and his own personal time budget for doing them, as he has other things to do and must divide his time.
OCAT spits this data out... you just need to run it alongside the benchmarks! You can even script it. ;)
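For instance, a rough sketch assuming an OCAT/PresentMon-style CSV (the "MsBetweenPresents" column name comes from PresentMon's output, and the file name is hypothetical; adjust both to whatever your capture tool writes):

```python
# Rough sketch: pull per-frame times out of an OCAT/PresentMon-style
# CSV and compute average FPS plus the 1% low. "MsBetweenPresents" is
# PresentMon's column name for frame-to-frame time in milliseconds.
import csv

def fps_stats(path):
    with open(path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000 / (sum(frametimes) / len(frametimes))
    slowest = sorted(frametimes, reverse=True)[:max(1, len(frametimes) // 100)]
    low_fps = 1000 / (sum(slowest) / len(slowest))
    return avg_fps, low_fps

avg, low = fps_stats("capture.csv")  # hypothetical capture file
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```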

The bigger concern at this site, however, is that all of the motherboard reviews are benchmarked well below the platform's memory spec (which W1z now knows about).
 
To me it counts when looking at the industry standard for smooth play, 60fps (for FPS gaming, anyway). If you are comparing two cards/CPUs (regardless of resolution), it makes a difference when one never dips below 60fps while the other drops below that threshold. The card that never goes below 60 may not provide the highest overall framerate, but in theory it will provide a smoother experience.

Honestly, if your 1% lows are 60fps and your average is 100+, you will definitely feel the difference in frametimes, making it harder to train your "muscle memory" in shooters. Once your minimum is 90fps, it becomes pretty hard to notice the difference in frametimes.
 
1% lows aren't particularly informative either.

Best thing to do is show a frametime graph over time; from there you can easily see how things really perform.

This. Some engines are very prone to stutter and others entirely aren't; a graph can show that, but a single number cannot.

Why is this revived, though? Surely W1zz heard our plea the first half dozen times.

Honestly, if your 1% lows are 60fps and your average is 100+, you will definitely feel the difference in frametimes, making it harder to train your "muscle memory" in shooters. Once your minimum is 90fps, it becomes pretty hard to notice the difference in frametimes.

If you're serious about muscle memory you'll want locked FPS regardless, or it all goes to shit, and reviews bench at ultra settings, so this is a complete mismatch IMO. If you want pro-level performance in a specific game, you find the forums for it and dig deep into maximizing its performance. Definitely not info you want to take from a review, especially because games and drivers get updated too.

1% low on its own doesn't tell us a whole lot. It could be one fat stutter upon loading the map.
 
100% agreed. :cool:

1% low on its own doesn't tell us a whole lot. It could be one fat stutter upon loading the map.

The "1% low" is the average of 1% of the lowest frames. It's not just a stutter that occurs only once. it's much more relevant than testing the game at 720p (there are no even monitors with this resolution :rolleyes:).
 
100% agreed. :cool:



The "1% low" is the average of 1% of the lowest frames. It's not just a stutter that occurs only once. it's much more relevant than testing the game at 720p (there are no even monitors with this resolution :rolleyes:).

Yes, and in a game like Assassin's Creed, most of these stutters happen on area/block transitions and loads, and they can 'feel' entirely different from micro-stuttering throughout the whole run. A graph is required to say anything sensible about 1% lows. Min FPS, on the other hand, is a simple, clear indicator of the 'worst case scenario' and can do without a graph.
 
100% agreed. :cool:



The "1% low" is the average of 1% of the lowest frames. It's not just a stutter that occurs only once. it's much more relevant than testing the game at 720p (there are no even monitors with this resolution :rolleyes:).

...that test has nothing to do with monitor resolution and everything to do with headroom, by removing the GPU bottleneck as much as possible. All tests have their pros and cons; the only test that matters is your own eyeball test, as Vayra explained above.
 
Yes, and in a game like Assassin's Creed, most of these stutters happen on area/block transitions and loads, and they can 'feel' entirely different from micro-stuttering throughout the whole run. A graph is required to say anything sensible about 1% lows. Min FPS, on the other hand, is a simple, clear indicator of the 'worst case scenario' and can do without a graph.

But you can't use a single game to invalidate a testing methodology. If framerate drops occur with CPU X but not with CPU Y, I'll want to know.

...that test has nothing to do with monitor resolution and everything to do with headroom, by removing the GPU bottleneck as much as possible. All tests have their pros and cons; the only test that matters is your own eyeball test, as Vayra explained above.

This is practically a synthetic test, which doesn't help much in the end.
 
But you can't use a single game to invalidate a testing methodology. If framerate drops occur with CPU X but not with CPU Y, I'll want to know.

I never disagreed, did I? I'm just saying it needs to come with a graph, on a per-game basis, or the number isn't meaningful and may lead to bad choices and wrong conclusions. It's exactly the outliers that matter, and for those you want to know in what way they are problematic. We already know there is much more to smooth gaming than FPS or the minimums on their own.

This is practically a synthetic test, which doesn't help much in the end.

The 720p test is not synthetic. It runs real game logic, and by doing so it simulates the use case of competitive gaming and tweaking systems for top FPS numbers. This test shows a CPU's optimal capability in one of the most effective ways, across a range of real games. In addition, you tend to find nearly linear scaling as CPU load increases with resolution or game load, right up until the moment the GPU becomes the bottleneck. It's very useful info, an indicator of 'absolute' performance. The numbers make sense if you think ahead to your next GPU purchase: will the CPU become a limiting factor? 720p gives you a peek.
 
I wanted to add this to my original post, but for some reason I can no longer edit it.

Just to make it very clear for everyone:

[Attached chart: frametime graphs of system A and system B, identical average FPS but very different 1% lows]


So the majority on this forum agrees, and most trustworthy review sites have already added 1% lows. Why does TechPowerUp refuse to do so?

If both systems were compared in a TPU review, the conclusion would be that they are equal. In reality, you want system B, not system A. Without 1% lows, there's no way to know.
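To illustrate with made-up numbers rather than a real capture, a quick synthetic sketch: both traces below average 100fps, but only one of them is smooth:

```python
# Synthetic illustration (made-up frametime traces, not measurements):
# two "systems" with identical average FPS but very different 1% lows.
def fps_stats(frametimes_ms):
    avg = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
    slowest = sorted(frametimes_ms, reverse=True)[:max(1, len(frametimes_ms) // 100)]
    return avg, 1000 / (sum(slowest) / len(slowest))

system_a = [5.0, 15.0] * 500   # oscillates between 5ms and 15ms frames
system_b = [10.0] * 1000       # a flat 10ms every frame

for name, trace in (("A", system_a), ("B", system_b)):
    avg, low = fps_stats(trace)
    print(f"System {name}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
# System A: avg 100 fps, 1% low 67 fps
# System B: avg 100 fps, 1% low 100 fps
```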
 