Right, so it's a subjective judgment based on a limited subset of games, yet you're spinning "sandy+quaddies are fine" as a general claim, and that doesn't hold up.
The stutter in R6 Siege on the 4790K comes from Haswell being 'suboptimal' for AVX, which that game uses heavily. There are quite a few other games where the same thing creates noticeable stuttering at lower framerates (some of the newer Assassin's Creed titles and BF1/BF5, for example). Surely that isn't "the media compelling people to buy something newer"...
A lot of people I see flaunting their newest hardware are just doing it for e-peen points... I know a guy playing League of Legends and that Genshin game on a 3090, and I can guarantee you that even purely theoretical performance was not a factor in his decision to buy that card.
Wow, that's ... choices! At least my upcoming GPU upgrade will be stretching its legs with a long backlog of games my Fury X hasn't been up to handling at 1440p. Plus some RT titles, which will be interesting.
You're right that there are some titles that utilize newer instruction sets or tons of threads, but they're still few and far between. Still, if those are your main thing, something more recent is obviously better. There are also background tasks stressing lower-core-count CPUs more, typically leading to not much worse average fps but much worse lows, and thus a more stuttery experience. But again, it depends on the types of games you play, your settings, your monitor, your tolerance for low frame rates, etc. I've had a really great time playing some games at 40-50fps (on a non-FreeSync display, no less) with my Fury X. But others (Ghostrunner stands out) have been a complete mess.
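As an aside, since "similar averages but much worse lows" comes up a lot: the 1% low figures reviewers quote are, roughly, the framerate over the slowest 1% of frames. Here's a minimal Python sketch of the idea, assuming the common "average the slowest 1% of frame times" methodology (exact methods vary between outlets, and the sample data is made up):

```python
# Rough sketch of "average fps" vs "1% lows", computed from
# per-frame render times in milliseconds.

def fps_metrics(frametimes_ms):
    """Return (average fps, 1% low fps) for a run of frame times."""
    n = len(frametimes_ms)
    # Average fps = total frames / total seconds, NOT the mean of
    # per-frame fps values (that would overweight fast frames).
    avg_fps = n / (sum(frametimes_ms) / 1000.0)

    # 1% lows: the framerate over the slowest 1% of frames.
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = len(worst) / (sum(worst) / 1000.0)
    return avg_fps, low_1pct_fps

# Hypothetical runs: one smooth, one with a few long stalls.
smooth = [16.7] * 990 + [17.0] * 10
stuttery = [16.0] * 990 + [120.0] * 10
print(fps_metrics(smooth))    # ~(59.9, 58.8)
print(fps_metrics(stuttery))  # ~(58.7, 8.3)
```

The point being: in the second run the average barely moves, but the lows crater, and those cratered lows are exactly the stutter you feel.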
I tend to call it "the reviewer's paradox":
1. When you're comparing a product with 20 others in the same category, you're bound to nitpick meaningless differences, because there usually isn't much else to talk about.
2. When you get the latest and greatest to review every year, it's easy to forget that upgrading isn't really necessary for most people. There's also the fact that you're meant to talk about the product itself, not about how pointless it is for most people to upgrade.
Then "the review reader's paradox" is when you start seeing things from the reviewer's perspective (instead of your own), get mesmerised by the extra performance, and forget about the fact that whatever you have suits your needs just fine. Modern consumer society is based on wants, not needs. We save up for luxury hardware, and look at benchmarks and FPS numbers instead of enjoying games. We grow e-penises that don't make us happy because the next generation with promises of even more performance always lurks around the corner.
Completely agree here. There's also the fact that companies only want to show off their "best" products, so high-end SKUs are the ones that get seeded for review most often. Heck, it's often difficult to find any reputable reviews of lower-end hardware at all. You'll find dozens upon dozens of reviews of every single Ryzen 7/9 and Core i7/i9 SKU and every different AIB version of the 3080, 3090, 6800 XT and 6900 XT. But lower-end CPUs or cards? i3s or cheap i5s? A tenth the number of reviews, at best, despite those selling 10-100x as much.
That's another reason I'm so positive about GamersNexus recently: they buy most of the hardware they review, so they're not dependent on manufacturer-seeded samples. But that's not feasible for 99% of reviewers out there, and so we end up with a culture fixated on high-end hardware that most of us will never use, let alone own; on performance differences that are utterly meaningless except on paper; and on statistics and "objective" numbers above actual experiences.
I can't help but connect that to the strong undercurrent of logical positivism that seems to dominate tech circles (probably largely because that branch of philosophy is still dominant in many STEM fields, despite being fundamentally flawed), where so-called "objective" measurements are treated as ultimate truths, as if what is measured and how it is measured (let alone what is *not* measured) didn't add a significant layer of subjectivity, and despite these numbers often being woefully poor representations of the experiences they are supposed to inform us about. FPS numbers, frametimes, etc. obviously all significantly impact play experiences, but those impacts are non-linear and interwoven with dozens of entirely separate factors. Yet the people shouting "SCIENCE!!!" want us to trust these numbers above our own perceptions, which, to me, is rather absurd: our perceptions are literally our only points of access to the world.

Of course a GPU reviewer can't account for the colour rendering of each viewer's monitor, its response time, the nuances of their eyesight, or the myriad other relevant factors, but that's exactly why it falls on reviewers to remind people that they're providing a baseline number for a single factor in a complex experience, not the be-all, end-all measurement of what makes for a good gaming experience. There are clearly context-dependent cases where these numbers play a significant part (while many might not notice a difference between a 144Hz display and a 360Hz one, there is reason to suspect the latter gives a slight competitive advantage), but you can't put that down to the numbers alone. Everything involved in play matters, from your body, senses and mental state to your equipment to the supporting infrastructure (internet, power, etc.).
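To make that non-linearity concrete: equal fps deltas translate into wildly different frame-time deltas depending on where on the curve you sit, which is one reason a bare fps difference says so little about the felt experience. A quick illustration (the framerate pairs are arbitrary):

```python
# Frame time gained per +30 fps, at different starting framerates.
# fps and frame time are reciprocals, so the same fps delta shrinks
# fast as the baseline framerate rises.
for lo, hi in [(30, 60), (60, 90), (144, 174), (240, 270)]:
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo} -> {hi} fps saves {saved_ms:.2f} ms per frame")

# 30 -> 60 fps saves 16.67 ms per frame
# 240 -> 270 fps saves 0.46 ms per frame
```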
That's of course yet another reviewer's paradox: using high-end hardware eliminates variables, but it also establishes a de facto baseline for everything, creating both the expectation and the desire for everything to be just so, and implicitly labeling anything not high-end as inferior. Which feeds mindless consumerism and that never-ending chase after 1-3% (and typically entirely imperceptible) differences.