
Why are reviewers so lazy? ( Not talking about TPU!!! )

I have a problem with the fact that so many review websites don't update their game collections and don't revisit older reviews a year later.
Here's a nice example:

What games does he use? Battlefield 1 (while the Battlefield V population is already 3 times bigger), F1 2017 (while the F1 2018 population is 5 times bigger) and an old Tomb Raider game. The numbers are there: the games tested are not getting played as much as the newer titles, which is normal of course. But another problem I see is that all the improvements AMD has made over the years are not being displayed when you keep using old benchmark results or keep benchmarking older titles.

In BF1 at release, the GTX 1080 beat the Vega 64. This is not the case in the latest version of BF1.
In BF V, the Vega 64 beats the GTX 1080. But you still see reviewers referring to their older BF1 benchmarks and simply not using Battlefield V for benchmarking.

In F1 2017 at release, the GTX 1070 beat the Vega 64. This is no longer the case. Now the Vega 64 is even faster than the GTX 1080, just like in F1 2018.

And in the latest Tomb Raider, the Vega 64 beats the GTX 1080 again, unlike the results from previous Tomb Raider games at their release.

I hope at some point Techpowerup can make a difference and start revisiting older reviews a year later, because in a year's time, game patches and driver updates can make a big difference in actual performance.
Also, when a new title in a franchise gets released and has a bigger player population, reviewers should simply stop benchmarking the older title, or at least benchmark both the older and the newer one.
And last but not least, if a title can be played in both DirectX 11 and 12, both should get tested. When Hardware Unboxed benchmarked Hitman only in DX11, it pissed me off, because this is the reality:
Hitman DX11
GTX 1080 8% faster than Vega 64
Hitman DX12
The exact same performance

And this feedback is aimed at Techpowerup directly: stop benchmarking at the highest quality settings and start benchmarking at medium. Gamers who play on Ultra represent 10% at most. There are settings like shadows/lighting or post-processing that almost every gamer turns down because of the big performance impact or the disadvantage they give in visibility. So why benchmark settings that 90% of gamers will not use?
 
There are some decent YouTube channels that test games and settings just like you want.

Most reviewers test games at absurd settings because they have to or they get no more review samples.
 
Actually, whatever criteria they use is fine as long as it's valid.
And I don't agree with testing at medium settings, because then you can't see the card's full power. If it performs well at high settings with high AA and AF, it should perform well at medium too.
 
If ultra settings are so useless, why do games even have them on your PC?
Someone should sue NVIDIA/AMD for offering ultra settings, then.
 
Actually, whatever criteria they use is fine as long as it's valid.
And I don't agree with testing at medium settings, because then you can't see the card's full power. If it performs well at high settings with high AA and AF, it should perform well at medium too.

Not true. In the latest Tomb Raider game on Ultra at 1080p, the RX 580 is 13% faster than the GTX 1060, and the GTX 1060 doesn't even reach 60 fps on Ultra.
In the same game on Medium at 1080p, the RX 580 is 21% faster than the GTX 1060, and both cards get more than 60 fps.
So which settings are RX 580 and GTX 1060 gamers going to play on? And how many gamers actually own hardware faster than that?
And how many gamers are actually gaming with the fastest processor available, like all those reviewers are using?
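For clarity, here is a minimal sketch of how those comparisons work out. The FPS figures are hypothetical, chosen only to reproduce the percentages quoted above, not measured data:

```python
# Hypothetical example: relative performance gap and a 60 fps playability check.

def relative_gap(fps_a: float, fps_b: float) -> float:
    """Return how much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical averages for the latest Tomb Raider at 1080p.
scenarios = {
    "Ultra":  {"RX 580": 62.0, "GTX 1060": 55.0},
    "Medium": {"RX 580": 98.0, "GTX 1060": 81.0},
}

for preset, fps in scenarios.items():
    gap = relative_gap(fps["RX 580"], fps["GTX 1060"])
    playable = all(v >= 60.0 for v in fps.values())
    print(f"{preset}: RX 580 is {gap:.0f}% faster, both above 60 fps: {playable}")
```

With these made-up numbers the gap is 13% on Ultra (with the GTX 1060 below 60 fps) and 21% on Medium (both cards above 60 fps), which is the shape of the argument being made here.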
 
Why don't you just run your own channel, "how to cherry-pick settings to make Radeons look better", unlike those lazy reviewers who don't even bother to find them and use stupid ultra presets? :rolleyes:
 
Why don't you just run your own channel, "how to cherry-pick settings to make Radeons look better", unlike those lazy reviewers who don't even bother to find them.

A reaction like yours doesn't surprise me coming from someone with a GTX 1080 Ti. Because I want honest reviews, you call them "cherry-picked settings"?

If ultra settings are so useless, why do games even have them on your PC?
Someone should sue NVIDIA/AMD for offering ultra settings, then.

Consoles are used as the baseline these days; look at that as the medium settings. Then developers reduce details for slower PCs (= low settings) and increase details and graphics for their trailers and all the marketing that comes with gameplay footage from reviewers and streamers (high/ultra settings). 10-15 years ago, consoles were not even close to PCs in performance and graphics, and games were designed with the highest settings in mind, not with medium settings like today. Isn't it obvious that all PC games are just console ports these days?
 
I hope at some point Techpowerup can make a difference and start revisiting older reviews a year later
We don't recycle scores through test system revisions. Every time the test system is updated, every single card is re-benched with the same hardware, driver, everything.
 
You should do reviews yourself. "If you want it done right do it yourself." (Or have it done how you want it)
 
I have a problem with the fact that so many review websites don't update their game collections and don't revisit older reviews a year later.
The reason no sane reviewer is going to jump on benching the latest AAA titles is that they are still getting patched and optimized on the go.
You may waste a lot of time juggling GPUs on a test bench and running benchmarks for a dozen cards, only to find out 2 months later that your results are irrelevant because [insert a developer name here] fixed a bug in their post-processing implementation, which doubled everyone's FPS. BF V is a good example of that (at least for RTX).
If a game or benchmark has been around for at least a year, it's safe to say it's already patched up to the point where you cannot expect any giant performance improvements.
There are some exceptions, though. For example, when a new API comes out, or GPUs with new API/tech support hit the shelves, you usually have no choice but to test with some half-baked shit due to the lack of alternatives: like Deus Ex: Mankind Divided and DX12, or BF V and RTX, or the highly experimental Vulkan renderer in The Talos Principle.

And this feedback is aimed at Techpowerup directly: stop benchmarking at the highest quality settings and start benchmarking at medium.
The reason benchmarks are done at the highest available settings is that it loads the GPU while minimizing the impact of CPU bottlenecking. Also, some games have a hard cap on max FPS, which makes matters even worse. You may get some wrong ideas if you see that at medium settings @ 1080p an RX 470 and a GTX 1080 Ti perform the same in Rocket League or whatever.
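A rough model of that point, with every FPS number made up purely for illustration: the delivered frame rate is limited by whichever of the CPU, the GPU, or an engine FPS cap is slowest, so at medium settings two very different cards can land on the same number.

```python
# Illustrative sketch only: delivered FPS is capped by the slowest stage.

def delivered_fps(gpu_fps: float, cpu_fps: float, engine_cap: float) -> float:
    """Frame rate is limited by the GPU, the CPU, and any hard engine FPS cap."""
    return min(gpu_fps, cpu_fps, engine_cap)

cpu_limit = 200.0    # hypothetical CPU-limited frame rate in this game
engine_cap = 250.0   # hypothetical hard FPS cap enforced by the engine

# Hypothetical GPU-limited frame rates for two very different cards.
cards = {
    "RX 470":      {"Medium": 210.0, "Ultra": 90.0},
    "GTX 1080 Ti": {"Medium": 480.0, "Ultra": 180.0},
}

for preset in ("Medium", "Ultra"):
    for card, limits in cards.items():
        fps = delivered_fps(limits[preset], cpu_limit, engine_cap)
        print(f"{preset:6} {card:12}: {fps:.0f} fps")

# At Medium both cards hit the ~200 fps CPU limit and look identical;
# at Ultra the GPU is the bottleneck and the real difference shows.
```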
 
You do have a point with benching brand new, just-released games. NVIDIA almost always has an advantage there because they tend to have a "Game Ready" driver out before the game launches (which benchmarkers often use), whereas AMD tends to be a month or two behind. First benchmarks are, therefore, going to have an NVIDIA bias.

There are three reasons for NVIDIA being quicker:
1) Most developers develop on NVIDIA hardware (NVIDIA likes to hand out graphics cards to developers who ask, AMD not so much). Problems are discovered sooner and NVIDIA is notified earlier because the developers themselves see the problem and debug it.
2) Games that use GameWorks get NVIDIA involved much sooner in the development cycle than they would otherwise. This mostly applies to AAA games.
3) NVIDIA's drivers are more modular than AMD's, which allows NVIDIA to have a much faster release cadence, but AMD's tend to be more polished because the whole package is QA tested.


I concur that benchmarks should be performed on Vulkan > D3D12 > D3D11 > OGL. D3D11 should never be used on a game that supports Vulkan and/or D3D12.
 
Why don't you just run your own channel, "how to cherry-pick settings to make Radeons look better", unlike those lazy reviewers who don't even bother to find them and use stupid ultra presets? :rolleyes:
Looks like they're trying, and advertising it through their signature. I'm sure this is all about drawing attention to themselves. Pathetic.
 
I agree that some game choices are downright bizarre and questionable.
 
We don't recycle scores through test system revisions. Every time the test system is updated, every single card is re-benched with the same hardware, driver, everything.
Don't worry W1z, he's just someone who hasn't bothered to read the volumes already written in these forums by you and others about your review methodology.
 
Some games use BS anti-piracy software that simply bans a user for 24 hours after too many hardware changes.
In those games, you simply cannot do this type of benchmarking (20+ cards) in a reasonable amount of time (i.e. before the next game patch/driver version).
 
We don't recycle scores through test system revisions. Every time the test system is updated, every single card is re-benched with the same hardware, driver, everything.

And you don't see that as a problem? For the AMD cards you are using driver 18.8.2 in all the latest reviews from the last couple of months. And for the Battlefield V performance review, you had to use a driver that was released before 18.11.1 came out. Now have a look at the latest driver release notes:

Up to 3% faster performance in F1 2018 using Radeon™ Software Adrenalin Edition 18.9.2 on the Radeon™ RX Vega 64 (8GB) graphics card than with Radeon™ Software Adrenalin Edition 18.8.2 at 2560x1440 (1440p). RS-249
Up to 4% faster performance in Shadow of the Tomb Raider using Radeon™ Software Adrenalin Edition 18.9.2 on the Radeon™ RX Vega 64 (8GB) graphics card than with Radeon™ Software Adrenalin Edition 18.8.2 at 2560x1440 (1440p). RS-251
Up to 8% faster performance in Battlefield V with Radeon™ Software Adrenalin Edition 18.11.1 on the Radeon™ RX Vega 64 (8GB) graphics card than with Radeon™ Software Adrenalin Edition 18.10.2 at 1920x1080. RS-231
Up to 9% faster performance in Battlefield V with Radeon™ Software Adrenalin Edition 18.11.1 on the Radeon™ RX 580 (8GB) graphics card than with Radeon™ Software Adrenalin Edition 18.10.2 at 1920x1080. RS-232
Up to 5% faster performance in Wolfenstein II: The New Colossus using Radeon Software Adrenalin 2019 Edition 18.12.2 on the Radeon RX Vega 64 graphics card than with Radeon Software Adrenalin Edition 18.12.1 at 3840x2160 (4K). RS-280

These are only the performance improvements clearly mentioned by AMD. There could be more. You can only know by benchmarking every card with the latest driver and every game with the latest patch every time a new piece of hardware gets released. That's the only way to do a fair comparison. You also review new NVIDIA cards with newer drivers and old NVIDIA cards with older drivers, and put them in the same performance chart. Is that a fair comparison? Do you really think NVIDIA gamers don't update their drivers when new games get released?

So basically, all your AMD benchmarks from the last couple of months are not valid, and the Battlefield V performance review was already outdated one week after it got published. I know there is no money to be made by revisiting an older review. But honestly, which game is actually finished at release? Performance reviews should be done 3-6 months after release, when the patches are done and the drivers are optimized. All performance reviews in the first months are just clickbait and provide no clear view of expected gaming performance.
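To illustrate the point, here is a minimal sketch. The baseline FPS figures are hypothetical; only the "up to X%" uplift percentages come from the release notes quoted above, and they are best-case numbers:

```python
# Hypothetical example: how results benched on an old driver can drift once
# AMD's claimed best-case driver uplifts are applied.

# Hypothetical Vega 64 averages measured months ago on driver 18.8.2.
old_results_fps = {
    "F1 2018 (1440p)": 95.0,
    "Shadow of the Tomb Raider (1440p)": 62.0,
}

# Upper-bound uplifts taken from the release notes quoted above ("up to X%").
claimed_uplift_pct = {
    "F1 2018 (1440p)": 3.0,
    "Shadow of the Tomb Raider (1440p)": 4.0,
}

for game, fps in old_results_fps.items():
    estimated = fps * (1.0 + claimed_uplift_pct[game] / 100.0)
    print(f"{game}: charted at {fps:.0f} fps, could be up to {estimated:.1f} fps on a current driver")
```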
 
I have a problem with the fact that so many review websites don't update their game collections and don't revisit older reviews a year later.
Here's a nice example:

What games does he use? Battlefield 1 (while the Battlefield V population is already 3 times bigger), F1 2017 (while the F1 2018 population is 5 times bigger) and an old Tomb Raider game. The numbers are there: the games tested are not getting played as much as the newer titles, which is normal of course. But another problem I see is that all the improvements AMD has made over the years are not being displayed when you keep using old benchmark results or keep benchmarking older titles.

In BF1 at release, the GTX 1080 beat the Vega 64. This is not the case in the latest version of BF1.
In BF V, the Vega 64 beats the GTX 1080. But you still see reviewers referring to their older BF1 benchmarks and simply not using Battlefield V for benchmarking.

In F1 2017 at release, the GTX 1070 beat the Vega 64. This is no longer the case. Now the Vega 64 is even faster than the GTX 1080, just like in F1 2018.

And in the latest Tomb Raider, the Vega 64 beats the GTX 1080 again, unlike the results from previous Tomb Raider games at their release.

I hope at some point Techpowerup can make a difference and start revisiting older reviews a year later, because in a year's time, game patches and driver updates can make a big difference in actual performance.
Also, when a new title in a franchise gets released and has a bigger player population, reviewers should simply stop benchmarking the older title, or at least benchmark both the older and the newer one.
And last but not least, if a title can be played in both DirectX 11 and 12, both should get tested. When Hardware Unboxed benchmarked Hitman only in DX11, it pissed me off, because this is the reality:
Hitman DX11
GTX 1080 8% faster than Vega 64
Hitman DX12
The exact same performance

And this feedback is aimed at Techpowerup directly: stop benchmarking at the highest quality settings and start benchmarking at medium. Gamers who play on Ultra represent 10% at most. There are settings like shadows/lighting or post-processing that almost every gamer turns down because of the big performance impact or the disadvantage they give in visibility. So why benchmark settings that 90% of gamers will not use?

Here you go. Five seconds on Google...

https://www.tomshardware.com/reviews/amd-nvidia-driver-updates-performance-tested,5707.html
https://www.phoronix.com/scan.php?page=article&item=nvidia-16gpu-jan2018&num=2
https://www.computerbase.de/2017-01/geforce-treiber-test/2/

And, as for TPU? Members do the work.
@Artas1984
https://www.techpowerup.com/forums/...-353-62-vs-376-33-vs-398-36-vs-411-70.248256/


Bottom line: stop whining and start improving your interwebs skills. I guess you're just another example of YouTubers being blundering fools? It's a real pattern.

PS: if you want to attract subs and viewers, this is how NOT to do it. You just invalidated everything you produce on your channel for the entire TPU community.
 
You can only know by benchmarking every card with the latest driver and every game with the latest patch every time a new piece of hardware gets released. That's the only way to do a fair comparison.
That's simply not humanly possible, unless you hire several people benching for you all the time on multiple identical rigs (which introduces a lot of variation).

You forgot that everything should be tested on: Intel + Intel HEDT + Ryzen + Threadripper.
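A rough back-of-the-envelope sketch of why that is, where every figure is an assumption rather than TPU's actual workflow:

```python
# Hypothetical estimate of one full re-bench pass of a review chart.

cards        = 20   # comparison cards in a typical review chart (assumed)
games        = 20   # titles in the test suite (assumed)
resolutions  = 3    # 1080p / 1440p / 2160p
minutes_each = 3    # per benchmark run, ignoring card swaps, installs, reboots (assumed)

hours = cards * games * resolutions * minutes_each / 60
print(f"One full re-bench pass: ~{hours:.0f} hours of pure benchmark runtime")

# ~60 hours before adding driver installs, game patches, retests of outliers,
# or a second CPU platform - and it all repeats with every new driver and patch.
```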
 
Michael at Phoronix is pretty good about doing benchmarks and looking at the differences between driver versions, but that's something completely different than doing a review of new cards. It's also a review for the Linux ecosystem which doesn't always translate into expected changes in Windows.

To be honest, I wouldn't mind seeing some driver benchmarks between older and newer versions. It doesn't have to happen all the time, but it could be enlightening much as the PCIe scaling reviews are.
That's simply not humanly possible, unless you hire several people benching for you all the time on multiple identical rigs (which introduces a lot of variation).

You forgot that everything should be tested on: Intel + Intel HEDT + Ryzen + Threadripper.
How about doing them at the same frequency as the PCIe scaling reviews, and focusing on just the drivers the way those focus on just PCIe?

Edit: Mind you, I'm just throwing ideas out there. I don't completely agree with the OP, but the observation is valid, just not the approach.
 
Michael at Phoronix is pretty good about doing benchmarks and looking at the differences between driver versions, but that's something completely different than doing a review of new cards. It's also a review for the Linux ecosystem which doesn't always translate into expected changes in Windows.

To be honest, I wouldn't mind seeing some driver benchmarks between older and newer versions. It doesn't have to happen all the time, but it could be enlightening much as the PCIe scaling reviews are.

How about doing them at the same frequency as the PCIe scaling reviews, and focusing on just the drivers the way those focus on just PCIe?

Those things do happen. There is a large table of bench results comparing Kepler and GCN across several drivers too, but I cba to look it up. This story is as old as graphics card benchmarks... and the overall tendency is that drivers show minimal performance gains over time in both camps, but it never really changes the hierarchy of a product stack - and as a buyer, that is really the takeaway.

The TPU link, however, is a good example, and the Phoronix link also has medium settings, which was mostly my point with that one. HWinfo, I believe, also tests medium at several resolutions... it's nothing special really. There are also regular articles across the web on big AMD driver releases where games are revisited. It's not like the OP has somehow found something substantial here, he just failed to look around.

The OP seems to focus mostly on the Vega 64 switching spots with a 1080, but even that has been the case since launch; it just depends on the game and resolution... and Vega hasn't become radically faster either. It gained a few percent across the board, but again, that doesn't change its place in the GPU hierarchy, and it also doesn't make unplayable games (or even settings) playable.
 
That's simply not humanly possible, unless you hire several people benching for you all the time on multiple identical rigs (which introduces a lot of variation).

You forgot that everything should be tested on: Intel + Intel HEDT + Ryzen + Threadripper.

You are correct. Doing that would cost 5-10x more hours, BUT the end result would be correct comparison charts.
What good is the latest KFA2 GeForce GTX 1060 6 GB GDDR5X review when the AMD results don't include the performance improvements from the last 4 months in F1 2018, Wolfenstein 2 and possibly other games, when you are using 3 different NVIDIA drivers (one for the RTX 2070, one for the RTX 2080 and 2080 Ti, and one for all slower cards), and when half of the games have already been replaced by new titles in the same franchise with a higher player count (sometimes 5-10 times greater)? And I know this looks like I'm attacking your business, but I'm just stating facts here. Just because all reviewers do it and it has become the standard today doesn't mean I have to like the way hardware reviews are being done.
 
It's not like the OP has somehow found something substantial here, he just failed to look around.
...but if you have these kinds of reviews here and you don't have to search around, that's more traffic for TPU which isn't a bad thing though. I'm not saying the OP's suggestion is sound, I'm just saying that there are a lot more things to capture than just reviewing new cards and W1zz occasionally does that with things like the PCIe scaling reviews.
 
...but if you have these kinds of reviews here and you don't have to search around, that's more traffic for TPU which isn't a bad thing though. I'm not saying the OP's suggestion is sound, I'm just saying that there are a lot more things to capture than just reviewing new cards and W1zz occasionally does that with things like the PCIe scaling reviews.

But... effectively these games ARE revisited with every GPU review; all you need to do is grab an old review and put the results side by side. It does require you to read up on the test setups.

As for AMD's long periods between releases, that is entirely their choice, and I think we also forget that this means you're stuck for 4 months with lacking performance - if you care so much about it, that seems as much of a problem to me as not having the numbers?

You are correct. Doing that would cost 5-10x more hours, BUT the end result would be correct comparison charts.
What good is the latest KFA2 GeForce GTX 1060 6 GB GDDR5X review when the AMD results don't include the performance improvements from the last 4 months in F1 2018, Wolfenstein 2 and possibly other games, when you are using 3 different NVIDIA drivers (one for the RTX 2070, one for the RTX 2080 and 2080 Ti, and one for all slower cards), and when half of the games have already been replaced by new titles in the same franchise with a higher player count (sometimes 5-10 times greater)?

The reason to stick to a benchmark suite for a while is exactly so you can compare cards across longer periods of time. You're completely missing the point here. A good selection of games doesn't depend on player counts; it depends on what engines are being used and how those engines push a GPU. We know that different architectures are more suited to specific engines, as much as they benefit from per-game optimizations. There are a lot more variables than the stale 'I want to know how game X plays on my GPU'. Reviews are an indicator, not an all-encompassing truth. Even if you benchmark 1,000 games, you can never capture the worst-case scenarios in each game, so you still don't really know what the actual in-game performance will be. Also, people have radically different PC configurations, which again defeats that idea.
 
You are correct. Doing that would cost 5-10x more hours, BUT the end result would be correct comparison charts.
5-10 hours per card. Normally multiple cards are released in the same time frame. For TPU reviews, every single comparison card is also RE-tested; results are never taken from a chart somewhere.

I’m just not sure you realize that the service you demand for free is handled by one man, who is darned well entitled to also have a life.
 