
Why are reviewers so lazy? ( Not talking about TPU!!! )

This is my final post, since people can't post arguments without getting personal with me or anyone else who tends to agree with or understand what I'm saying.
You called reviewers lazy and incompetent in the thread title and first post; what do you expect? You're holding people up to an almost impossible standard, in a way that's very impolite. You're making biting remarks at TPU, then you put "it's not TPU" in the title with three exclamation marks, then you question their reviews again, saying all it would take is five minutes to fix them. :rolleyes:
 
This thread just needs to be locked..... I don't see anything of any benefit happening from here, guys.
Out.
I think we all know where this is headed. Suggestions have been stated and we've made a full circle a couple of times already. I'm all for good suggestions, but at this point it feels more like a case of "I need to justify my position, even if it might not be feasible."

Honestly, I wouldn't mind seeing an occasional driver review, just not all the time because, as W1zz said, it's a lot of work and he can't review new cards while he's doing it.
 
I agree with this, but I think I understand where he's coming from, if you feel me, and with that, his idea is rather vague, yet specific.

Reviews, in general, give a particular perspective, and that perspective isn't always what the general public needs. Such as using Ultra settings.
It's not about what the public does or needs. It's about testing graphics performance.
Sane people, like you and me, can interpolate or interpret those results depending on our particular situation.
E.g. if a GTX 1060 scores 45 FPS in some latest title, then some simple things like disabling blur, DoF and chromatic aberration will get it up to speed.
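Just to put a number on that "interpolate" idea, here's a rough back-of-the-envelope sketch in Python. The uplift percentages are completely made-up assumptions for illustration, not measured data, so treat it as the reasoning, not the result.

Code:
# Rough, hypothetical estimate: translate an Ultra benchmark result into an
# expected framerate after turning off a few heavy post-processing effects.
# The uplift factors below are illustrative assumptions, NOT measured values.
ultra_fps = 45.0  # e.g. a GTX 1060 result at Ultra, taken from a review

assumed_uplift = {
    "motion blur off": 1.03,
    "depth of field off": 1.05,
    "chromatic aberration off": 1.02,
}

estimate = ultra_fps
for setting, factor in assumed_uplift.items():
    estimate *= factor
    print(f"{setting}: ~{estimate:.0f} FPS")

print(f"Ultra {ultra_fps:.0f} FPS -> tweaked estimate ~{estimate:.0f} FPS")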
Cause if we start going in this direction, then other things will pop up, like:
- most people use older gen i5. Let's retest everything w/ 4 threads
- most people don't use uber-fast DDR4. Let's retest w/ DDR4-2133/2400
- most people still store their games on mechanical hard drives. Let's do that too.
... and at the end we'll get a mess that won't even reflect the real graphics card capabilities.

For mainstream-friendly things there are tons of youtube channels that'd test anything with anything at any settings for ad revenue, cause people really like to see what they want to see, even if it's far from real: that their R7 iGPU can play Fallout4, or their old HD7570 can handle Quake Champions in FHD, or that GTX1060 can play games in 4K (which it totally does :p)
Thanks to those douchebags I occasionally get calls from buyers who aren't happy with their recently purchased used GPU. It's always something along the lines of "I saw it can play BF1 on youtube, and it stutters like hell, you filthy scammer", even though my ads always link to proper reviews and benchmarks. Or laptop buyers who disregard my five separate, explicit mentions that "this is not a gaming laptop" and then still ask whether it can play Dota/CS:GO/Skyrim etc., because some creep on youtube posted gameplay footage on an HD 4000. I think with the current abundance of information on the internet, people have lost the ability to search or think. They consume information like fast food, and if you have to cook it, it ain't worth the time.
 
I'm happy to see some people understand what I'm trying to say. Some of you think I'm asking for better reviews from a selfish standpoint. Let me tell you this: I personally think BF V sucks big time, and I prefer BF1 and like it 10x more. That doesn't change the fact that the BF1 population is now a fraction of the BF V population, and in order to address as many gamers as possible, a BF V benchmark makes more sense.

Some of you are defending flawed review methods out of respect for the reviewers here who work hard with limited time and budget. I understand that. But there are reviewers out there who make 10x the money Wizz is making, with lots of people working for them, and they still use the same flawed review methods. You have to be honest and admit that comparing two pieces of hardware on different software and putting them in the same chart is not a good way to compare them. That's like testing two cars on different roads and then concluding that one car needs more fuel for the same distance. Testing older games when newer titles have been released with a population 5-10x greater also doesn't make sense. You keep saying that I want to force my opinion onto others. But when a new title has 10x the players of an old title, you could be writing an article that is 10x more relevant for those gamers.

And when a game performance article gets written, and the author knows that those numbers are no longer correct due to a game patch or driver update, it only takes 5 minutes to edit the article and mention that there have been performance improvements in the latest patch/driver and that the article no longer represents current performance. You can defend Wizz or whatever reviewer you respect or like. But respecting someone does not mean that you have to admire them like a god or ignore what he or she does wrong or could do better.

This is my final post, since people can't post arguments without getting personal with me or anyone else who tends to agree with or understand what I'm saying. Asking me to start reviewing hardware, knowing full well that it's impossible for a normal consumer to do that, is just...

Nope. You're carefully avoiding the arguments posted here and then repeating your misguided ideas with weird sources. If you choose to ignore replies, this is going to go nowhere... and yes, it's annoying.

This isn't about defending reviewers, it's about trying to make you realize that your point is flawed in many ways, and that the way things are tested is more varied than you say or think (you even supplied your own HWI source that underlines that). Drivers ARE revisited over time, and player counts really don't matter when it comes to benchmarking a GPU, because it's a GPU benchmark, not a game performance test.

Again, try to separate the BS from the good arguments you ALSO have, and make a new topic, more focused on the problem you want to see tackled. Then, and only then, can you create a solid discussion on that one problem. One piece of advice: keep the AMD/Nvidia driver release cadence out of it. There shouldn't be a difference in how reviewers approach GPU vendors because of their choice of update schedule. That is not something the consumer can control either; the consumer can only control which GPU they purchase.
 
AMD had poor performance in DOOM back in 2016 on 16.5.2:

They got the main problem fixed in 16.5.2.1.

Then the game released a Vulkan renderer in July. Fury X got a near 40% FPS boost:
https://www.eurogamer.net/articles/...n-patch-shows-game-changing-performance-gains

And now 2.5 years later, AMD is still improving driver performance for the game. It adds credibility to the argument that AMD drivers are like fine wine: improve with age.
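For anyone who wants to sanity-check what a "near 40% boost" means, here is a trivial sketch; the FPS figures below are placeholders, the real per-card numbers are in the Eurogamer article linked above.

Code:
# Quick sanity check on a "near 40% FPS boost" claim.
# The numbers here are placeholders, not the actual Eurogamer measurements.
opengl_fps = 90.0   # hypothetical Fury X result on the OpenGL renderer
vulkan_fps = 125.0  # hypothetical Fury X result on the Vulkan renderer

gain_pct = (vulkan_fps - opengl_fps) / opengl_fps * 100
print(f"Vulkan uplift: {gain_pct:.1f}%")  # ~38.9% with these placeholder numbers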


Relevant to this thread: the game shouldn't be GPU benchmarked on 16.5.2. It should be benchmarked on 16.5.2.1, where the glaring problems are fixed. I'd argue it should have waited to be benchmarked until July, because id Software's intent to release a Vulkan patch was clear. Why benchmark OpenGL when it is about to be made moot by Vulkan? The rush to publish first... is like looking at the graphics card market through a pinhole. You only see the messy launch (which is the status quo these days), and that doesn't reflect the intended result.

To TechPowerUp's credit, there was no game launch day performance review of DOOM and it wasn't added to the list of GPU review games until September 6, 2016--plenty of time for the drivers to mature.


Game launch day performance reviews should be taken with the Dead Sea's portion of salt unless AMD and NVIDIA both confirm they have no further optimizations in the pipeline.
 
Witcher 3 and tessellation come to mind.

Yes, and somehow AMD fixed that themselves, and it's just another example of how they're always late to the party.

AMD had poor performance in DOOM back in 2016 on 16.5.2:

They got the main problem fixed in 16.5.2.1.

Then the game released a Vulkan renderer in July. Fury X got a near 40% FPS boost:
https://www.eurogamer.net/articles/...n-patch-shows-game-changing-performance-gains

And now 2.5 years later, AMD is still improving driver performance for the game. It adds credibility to the argument that AMD drivers are like fine wine: improve with age.


Relevant to this thread: the game shouldn't be GPU benchmarked on 16.5.2. It should be benchmarked on 16.5.2.1, where the glaring problems are fixed. I'd argue it should have waited to be benchmarked until July, because id Software's intent to release a Vulkan patch was clear. Why benchmark OpenGL when it is about to be made moot by Vulkan? The rush to publish first... is like looking at the graphics card market through a pinhole. You only see the messy launch (which is the status quo these days), and that doesn't reflect the intended result.


Game launch day performance reviews should be taken with the Dead Sea's portion of salt.

You say that, and yet somehow Nvidia does manage to be on time. And it also doesn't change the fact that AMD users have to make reddit topics like the one above to complain about poor performance. Consumers are left hanging for months, and launch day reviews underline that. It's not a bad thing. Does that mean revisiting things has its value? Absolutely, but you have to wonder how many are still going to have the patience to wait for the saving AMD update for that specific game. Who knows, it may take years. It may also never come.

Fine wine is only fine because you can drink it knowing it aged well. With AMD, who knows: your wine might become fine at some point, but maybe it never will. And at the same time we also saw major performance DROPS over time on, for example, the Fury X, because it is VRAM-limited. I can link you the reddit thread for that...
 
That is incredibly incorrect
Are you actually going to provide reasons for why it's incredibly incorrect? Because this is something that has come up before. I suspect that unless you're a reviewer who gets hardware samples, you probably don't know for sure.
 
That is incredibly incorrect

No way to know for sure, but it shouldn't surprise anyone if that was the case. Think of the FE 1080s that every reviewer got and how they all clocked up to 2100 MHz no problem, and the endless threads about how (mostly) everyone else's card throttled massively.
 
It's also incredibly off-topic, because even this is not going to make unplayable games playable.

I'm still missing the sources and reasoning behind doing all those extra tests. We won't learn anything new. The only thing I'd say is credible and worthwhile doing here is testing a game on Medium as well, because yes, there ARE situations and games where the performance gained is far from linear between quality settings. That is something not everyone can read out of an Ultra benchmark. But then comes the question: what games? Why those games and not others? It's a slippery slope that will get you awfully close to having to test everything, because there is always someone who would prefer seeing that specific setting/game/driver. It's not realistic, and reviewers figured that out a few decades ago...
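To show what "far from linear" means in practice, here is a minimal sketch with invented preset numbers; the only point is that you cannot derive a Medium result from an Ultra one by simple scaling.

Code:
# Illustrative only: invented numbers showing that preset-to-preset scaling
# differs per game, so Medium performance cannot be derived from Ultra.
game_a = {"Ultra": 60, "High": 68, "Medium": 74}    # barely scales with settings
game_b = {"Ultra": 60, "High": 85, "Medium": 115}   # scales dramatically

for name, presets in (("Game A", game_a), ("Game B", game_b)):
    ultra = presets["Ultra"]
    for preset, fps in presets.items():
        print(f"{name} {preset}: {fps} FPS ({fps / ultra:.2f}x vs Ultra)")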
 
Just for giggles, here are those magnificent AMD performance increases that happen after a game is released:

https://www.tomshardware.com/reviews/amd-nvidia-driver-updates-performance-tested,5707.html
Hitman and Ashes of the Singularity: Escalation make my point there. They were AMD-optimized games, so NVIDIA lagged behind at launch, and later the developers and/or NVIDIA got the issues fixed. Conversely, Rise of the Tomb Raider was an NVIDIA launch title, so AMD lagged behind and eventually got the issues fixed.

That is incredibly incorrect
AMD Radeon R9 290 Review: Fast And $400, But Is It Consistent?

AdoredTV used SiliconLottery to confirm Intel sends golden sample CPUs to reviewers:
 
Hitman and Ashes of the Singularity: Escalation make my point there. They were AMD-optimized games, so NVIDIA lagged behind at launch, and later the developers and/or NVIDIA got the issues fixed. Conversely, Rise of the Tomb Raider was an NVIDIA launch title, so AMD lagged behind and eventually got the issues fixed.


AMD Radeon R9 290 Review: Fast And $400, But Is It Consistent?

That's just it: exceptions. The vast majority of games aren't Nvidia- or AMD-sponsored and barely ever move; their launch performance is very much real. And even if you do include all those sponsored titles, they exist on both sides: make a performance summary and you're back at square one again. The GPU hierarchy never changes here, and not a single game goes from unplayable to playable.

Even TW3, which you served up as an example, was not unplayable; all you had to do was turn HairWorks off, something even Nvidia users did because the performance hit was, and still is, horrible for barely any gains. I also remember the CPU PhysX implementation in Project CARS... another case of AMD being late to the party and having to admit they had the fix on the shelf but not implemented. But again, none of these rare exceptions change a thing within a GPU review that covers a wide range of games.
 
You say that, and yet somehow Nvidia does manage to be on time. And it also doesn't change the fact that AMD users have to make reddit topics like the one above to complain about poor performance. Consumers are left hanging for months, and launch day reviews underline that. It's not a bad thing. Does that mean revisiting things has its value? Absolutely, but you have to wonder how many are still going to have the patience to wait for the saving AMD update for that specific game. Who knows, it may take years. It may also never come.
I already explained why AMD tends to lag behind:
https://www.techpowerup.com/forums/...zy-not-talking-about-tpu.251199/#post-3970561

Many of the factors are outside of AMD's control (developers overwhelmingly using NVIDIA GPUs, for example). The only factor AMD can control is choosing stability over performance. BSODs (which NVIDIA drivers suffer a lot of) suck far worse than low framerates (AMD's problem with new game launches).

That's just it: exceptions. The vast majority of games aren't Nvidia- or AMD-sponsored and barely ever move; their launch performance is very much real.
Because they're made on unifying engines like Unreal Engine or Unity. The major issues revolve around new or heavily modified engines.

Even TW3, which you served up as an example, was not unplayable; all you had to do was turn HairWorks off, something even Nvidia users did because the performance hit was, and still is, horrible for barely any gains. I also remember the CPU PhysX implementation in Project CARS... another case of AMD being late to the party and having to admit they had the fix on the shelf but not implemented. But again, none of these rare exceptions change a thing within a GPU review that covers a wide range of games.
And why do you think CD Projekt had it enabled by default? They weren't aware it was a huge problem for AMD cards because they assumed NVIDIA optimized their shit for AMD. No, they don't, so they patched the game to take the installed hardware into account when it selects defaults. The launch was rough on AMD cards, but it's a non-issue now. Funny how everyone works together to sort out the major issues, isn't it?

Back to the topic at hand: when benchmarkers choose "Ultra", it enables all of this AMD-breaking GameWorks trash. Suddenly AMD graphics cards look awful in performance reviews when it is just NVIDIA's unoptimized code rearing its ugly head again.
 
I already explained why AMD tends to lag behind:
https://www.techpowerup.com/forums/...zy-not-talking-about-tpu.251199/#post-3970561

Many of the factors are outside of AMD's control (developers overwhelmingly using NVIDIA GPUs, for example). The only factor AMD can control is choosing stability over performance. BSODs (which NVIDIA drivers suffer a lot of) suck far worse than low framerates (AMD's problem with new game launches).


Because they're made on unifying engines like Unreal Engine or Unity. The major issues revolve around new or heavily modified engines.

All fine and dandy, but to circle back to the main topic and problem: unfair reviews? I'm still not seeing it. Even Tom's summary shows an equal amount of gains on both sides of the fence. And when games are revisited on newer drivers later in time, we never see that the GPU hierarchy has changed in favor of either AMD or Nvidia. Launch performance is rather consistent and certainly not a reason to 'revisit' everything every few months. Even when there are multiple years between tests, we don't see that.

As for your post about API choices... D3D12 before 11? NO thanks.... 11 almost consistently runs better and more stable.
 
I have a problem with the fact that so many review websites don't update their game collection and don't revisit older reviews one year later.
Here's a nice example:

What games does he use? Battlefield 1 (while the Battlefield V population is already 3 times bigger), F1 2017 (while the F1 2018 population is 5 times bigger) and an old Tomb Raider game. The numbers are there; the games tested are not getting played as much as the newer titles, which is normal of course. But another problem I see is that all the improvements AMD has made over the years are not being displayed when using old benchmarks or benchmarking older titles.

In BF1, at release, the GTX 1080 beat the Vega 64. This is not the case in the latest version of BF1.
In BF V, the Vega 64 beats the GTX 1080. But you still see reviewers referring to their older BF1 benchmarks and simply not using Battlefield V for benchmarking.

In F1 2017, at release, the GTX 1070 beat the Vega 64. This is no longer the case. Now the Vega 64 is even faster than the GTX 1080, just like in F1 2018.

And in the latest Tomb Raider, the Vega 64 beats the GTX 1080 again, unlike the results from previous Tomb Raider games at release.

I hope at some point TechPowerUp can make a difference and start revisiting older reviews one year later, because in a year's time, game patches and driver updates can make a big difference in actual performance.
Also, when a new title gets released in a franchise and the new title has a bigger player population, they should simply stop benchmarking the older title, or at least benchmark both the older and the newer title.
And last but not least, if a title can be played in both DirectX 11 and 12, both should get tested. When Hardware Unboxed benchmarked Hitman only in DX11, it pissed me off, because this is the reality:
Hitman DX11: GTX 1080 8% faster than Vega 64
Hitman DX12: the exact same performance

And this feedback is aimed at TechPowerUp directly: stop benchmarking at the highest quality settings and start benchmarking at Medium. The share of gamers that play on Ultra is 10% at most. There are settings like shadows/lighting or post-processing that almost every gamer turns down because of the big performance impact or the disadvantage they give in visibility. So why benchmark settings that 90% of gamers will not use?
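Just to sketch what such a one-year revisit could flag, in case it helps: all the FPS numbers and match-ups below are invented for illustration, not real benchmark data.

Code:
# Hypothetical sketch of a "one year later" revisit: per-game launch vs.
# current FPS for two cards, flagging games where the hierarchy flipped.
# Every number below is invented for illustration purposes.
results = {
    # game: {card: (launch_fps, current_fps)}
    "Battlefield 1": {"GTX 1080": (118, 121), "Vega 64": (110, 124)},
    "F1 2017":       {"GTX 1080": (130, 132), "Vega 64": (119, 138)},
}

for game, cards in results.items():
    launch_leader = max(cards, key=lambda c: cards[c][0])
    current_leader = max(cards, key=lambda c: cards[c][1])
    note = " <- hierarchy flipped" if launch_leader != current_leader else ""
    print(f"{game}: launch leader {launch_leader}, now {current_leader}{note}")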
So let me get this straight: you cried about this on the TechSpot forum and everyone there told you it's no big deal. So you came to the TPU forum to cry about it since no one on TechSpot agreed with you, and you are getting the exact same response from the TPU forum. So maybe this is a you problem and not a tester problem.
 
Really? Care to provide any source for this BS?
One example: https://venturebeat.com/2016/03/07/...eath-crashes-following-geforce-driver-update/

NVIDIA has pushed some drivers out with notoriously bad QA. Granted, AMD has too (like that DX9-breaking update), but... a BSOD is worse than games not working.

I've used AMD drivers for decades now (including pre-release drivers) and I can't name one BSOD that was caused by AMD's drivers. I only used NVIDIA for the GeForce 7-9 series and can name at least a dozen BSODs across multiple systems.

All fine and dandy, but to circle back to the main topic and problem: unfair reviews? I'm still not seeing it. Even Tom's summary shows an equal amount of gains on both sides of the fence. And when games are revisited on newer drivers later in time, we never see that the GPU hierarchy has changed in favor of either AMD or Nvidia. Launch performance is rather consistent and certainly not a reason to 'revisit' everything every few months. Even when there are multiple years between tests, we don't see that.
Because they use an aggregate of many games, which minimizes the impact of a single game. Since game launches are naturally staggered, there isn't an overwhelming effect on performance summaries.
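A minimal sketch of why that aggregation damps single-game swings, assuming hypothetical per-game performance ratios between two cards (none of these are real measurements):

Code:
# Sketch: a geometric-mean summary over many games means one outlier title
# barely moves the overall result. The per-game ratios are invented.
from math import prod

ratios = [1.02, 0.98, 1.05, 0.97, 1.01, 1.00, 0.99, 1.03]  # "card A / card B" per game
outlier = [1.40]  # one heavily vendor-optimized title

def geomean(xs):
    return prod(xs) ** (1 / len(xs))

print(f"Summary without outlier: {geomean(ratios):.3f}")
print(f"Summary with outlier:    {geomean(ratios + outlier):.3f}")  # only a few percent higher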

As for your post about API choices... D3D12 before 11? NO thanks.... 11 almost consistently runs better and more stable.
There was only one game with a choice between the two that I had problems with: The Division, and that was because of the Steam overlay conflicting with the game in D3D12. Disable the Steam overlay and the game runs well in D3D12.

If the developer calls D3D12/Vulkan feature complete, then benchmarkers should use it. If they call it beta, then they should not.
 
One example: https://venturebeat.com/2016/03/07/...eath-crashes-following-geforce-driver-update/

NVIDIA has pushed some drivers out with notoriously bad QA. Granted, AMD has too (like that DX9-breaking update), but... a BSOD is worse than games not working.


There was only one game with a choice between the two that I had problems with: The Division, and that was because of the Steam overlay conflicting with the game in D3D12. Disable the Steam overlay and the game runs well in D3D12.

If the developer calls D3D12/Vulkan feature complete, then benchmarkers should use it. If they call it beta, then they should not.
Seriously? One post from 2016 about one driver BSODing in one game, and your conclusion is "NVIDIA is proven to BSOD much more frequently than AMD"? :rolleyes:
Some of the stuff that you insist on here is borderline ridiculous.


I have had Nvidia GPUs since summer 2015 and don't recall one BSOD during thousands of hours of gaming, other than overclocking-related ones. How many times has your card BSODed? Dozens, probably, judging by the number of topics on AMD driver updates causing BSODs you can find on Google.
 
Seriously? One post from 2016 about one driver BSODing in one game, and your conclusion is "NVIDIA is proven to BSOD much more frequently than AMD"? :rolleyes:
Some of the stuff that you insist on here is borderline ridiculous.
I'm disappointed in you. You asked for a source, he provided a source. Now you're saying it's not good enough because it was too old, which is not a constraint you provided when you asked for it. You didn't ask for a source from the last year, so you got what you asked for. So before calling someone's response ridiculous, maybe you should look back at what you asked for instead of saying it's not good enough. If you give Ford enough time, I'm sure he'd compile a list for you, but it's not like you can't do some research on your own instead of whining about how Ford's research isn't good enough.
 
Seriously? One post from 2016 about one driver BSODing in one game, and your conclusion is "NVIDIA is proven to BSOD much more frequently than AMD"? :rolleyes:
Some of the stuff that you insist on here is borderline ridiculous.
Didn't read it?
NVIDIA said:
Initial investigation suggests the issue is isolated to multiple-monitor configurations.
"Isolated." Who here doesn't use multiple monitors? It was widespread, which is why VentureBeat (and others) covered it.


Edit: Not a BSOD but equally serious: PSA: "NVIDIA Installer cannot continue" on Windows October 2018 Update and How To Fix It
 
I'm disappointed in you. You asked for a source, he provided a source. Now you're saying it's not good enough because it was too old, which is not a constraint you provided when you asked for it. You didn't ask for a source from the last year, so you got what you asked for. So before calling someone's response ridiculous, maybe you should look back at what you asked for instead of saying it's not good enough. If you give Ford enough time, I'm sure he'd compile a list for you, but it's not like you can't do some research on your own instead of whining about how Ford's research isn't good enough.
I'm disappointed in you both. Like you never learned that you can find examples of anything happening randomly if you just type it into Google. What he posted is his opinion; it's nowhere near a comprehensive study.
 
Are you actually going to provide reasons for why it's incredibly incorrect? Because this is something that has come up before. I suspect that unless you're a reviewer who gets hardware samples, you probably don't know for sure.
I do work for the media, and have been testing samples on a day-to-day basis for some years now. Review RNG is like consumer RNG. Some early CPUs are much worse than a typical consumer one due to rushed production or an immature node.

AdoredTV used SiliconLottery to confirm Intel sends golden sample CPUs to reviewers:
This is incorrect, and based on a fairly small sample of tests. I have received over 50 CPUs in recent years and mine were average at best. Some samples were straight-up horrible.

I know how samples get to reviewers. Sometimes a manufacturer asks a big chain of local stores to hand over a unit for review, and there's absolutely zero control over the quality of the sample sent. They just take a unit from stock and ship it.

I'm really sorry to burst this fantasy, but this is how things work.
Ask w1zz, ask @cadaveca
 
I'm disappointed in you both. Like you never learned that you can find examples of anything happening randomly if you just type it into Google.
Are you making an excuse for not doing your own research and then claiming someone else's isn't good enough? Research doesn't mean just googling it and looking at the first result. :slap:
I do work for the media, and have been testing samples on a day-to-day basis for some years now. Review RNG is like consumer RNG. Some early CPUs are much worse than a typical consumer one.
Interesting, that's a lot of review hardware. So where are the reviews you've done? Who is "media"?
 
Are you making an excuse for not doing your own research and then claiming someone else's isn't good enough? Research doesn't mean just googling it and looking at the first result. :slap:
Exactly. Random articles from years ago do not constitute a comprehensive study to me.

If he forms an opinion, then it's NOT up to me to prove it. You're asking me to prove it for myself. :laugh:
 