Finally, a revisited comparison with modern drivers. Thank god!
Now if only he had benchmarked 1440p at medium settings, it would actually show what the card can do at that resolution...
Fact: The RX 570 clearly wins in terms of performance.
Fact: The cheapest 1050 Ti that can be found on Amazon costs $167.
Fact: The cheapest RX 570 (+ 2 full AAA games) that can be found on Amazon costs $170.
Fact: The cheapest RX 580 (+ 2 full AAA games) that can be found on Amazon costs $190.
So why can I find five 1050 Tis in the top 50 best-selling cards on amazon.com, including one in 2nd place, and only one RX 570, down at place 45?
And it's not just on Amazon: in my country and in the Netherlands, these cards are being sold for the same prices, and the 1050 Ti is still getting most of the sales.
Seriously, Nvidia fan or not, give me one good argument for buying a 1050 Ti and sacrificing that much performance.
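To make that value argument concrete, here is a minimal back-of-the-envelope sketch in Python. The prices are the Amazon listings above; the average-FPS figures are hypothetical placeholders standing in for whatever benchmark you trust, not measured results.

```python
# Rough value comparison: performance per dollar.
# Prices are the Amazon listings quoted above; the avg_fps numbers are
# HYPOTHETICAL placeholders, not benchmark results.
cards = {
    "GTX 1050 Ti": {"price_usd": 167, "avg_fps": 45},  # placeholder FPS
    "RX 570":      {"price_usd": 170, "avg_fps": 60},  # placeholder FPS
    "RX 580":      {"price_usd": 190, "avg_fps": 68},  # placeholder FPS
}

for name, c in cards.items():
    fps_per_dollar = c["avg_fps"] / c["price_usd"]
    print(f"{name}: ${c['price_usd']}, {c['avg_fps']} fps avg "
          f"-> {fps_per_dollar:.3f} fps per dollar")
```

With numbers anything like these, the two cards at near-identical prices end up far apart in fps per dollar, which is the whole point of the comparison.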
It's like consumers want a monopoly. It's like they want every GPU generation to cost 25% more, because that's the message they are sending to Nvidia this way.
Wanna bet the 3080 Ti will cost $1,600? Why not. People buy it anyway. Performance per dollar doesn't matter anymore...
And people are surprised AMD is no longer trying to compete? Why? It's as if PC gamers don't want competition and want the PC platform to die from prices that are too high compared to consoles...
I think the mistake in your thinking is to approach the PC gamer as a homogeneous group of people. They aren't. It's probably the most diverse target audience in any marketplace, with a vast majority having no clue about Nvidia/AMD market share and only a vague idea about performance, about what settings or framerate they want to target, etc. There are also very large groups that exclusively play one or two games - games like Warframe - that are almost never benched anywhere (and run on a toaster; there would be zero noticeable difference in picking a 570 over the 1050 Ti). And there are groups that want a thin/light/silent rig because they only play casual games on the living room TV. And... I could name another few dozen examples of very specific use cases. A large number of those groups are going to want the smallest, lowest-TDP, fewest-power-connector option with the best performance at that. But those groups still don't explain the entire (major) sales gap, I agree.
Look at the market share of display resolutions in the Steam Survey, for example. Higher resolutions, and even 1080p, aren't as dominant as you might expect. And that happens to be the exact segment these 1050 Tis and RX 570s target: casual gaming. Casual gamers. The part of the PC gaming audience that knows very little about all the things we discuss on this forum or in TPU reviews.
This is an enthusiast website. HWI is just about as casual as it gets for those who read tech sites. YouTube is another one of those entry-level sources of information. That's the reason you also see a lot of inaccuracies and bad conclusions in those circles. HWI is well known for straight-up bad testing and bench results that are WAY off the mark compared to many others doing the same test. It's fine if you look at HWI only as far as relative performance goes, but that is just about where it stops being relevant.
As for your statements on why to pick one over the other, I agree. I'd also pick the faster GPU for my money. But, again: not everyone is keen on, or even capable of, reading all those reviews and making sense of the numbers while doing it. That, however, has no bearing on whether or not to test on Medium. The HWI link you posted points that out perfectly: the relative performance of all cards in their line-up is perfectly maintained as resolutions and quality settings rise - bar a few exceptions of cards that were already very close together to begin with. Only when you hit unplayable or less comfortable (sub-~50 average) framerates do lower-end cards start falling off (hard).
I remember another recent topic you started where I also kept hammering on that specific point: the relative performance, the GPU hierarchy of these cards, never really changes in any test approach. It only does when you present light cards with overweight test scenarios - which isn't good info, it just throws you off balance as a reader.
This means that while you DO see a performance gap between Ultra and Medium, there is still no new information to be had. Fast cards be fast, slow ones be slow. And guess what: a year from now, those same cards will have fallen off a bit further as benchmarks become heavier at every resolution and quality level and faster cards are released. All of this has nothing to do with a preferred quality setting or resolution, and everything to do with time.
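A tiny sketch of what "the hierarchy is maintained" means in practice: normalise each card's average FPS to the slowest card at the same preset, and the ratios come out roughly the same at Medium and Ultra. The FPS numbers below are invented placeholders purely for illustration, not benchmark data.

```python
# Relative performance: each card's average FPS divided by the slowest card's
# FPS at the same preset. The numbers are HYPOTHETICAL placeholders.
results = {
    "Medium": {"GTX 1050 Ti": 60, "RX 570": 80, "RX 580": 92},
    "Ultra":  {"GTX 1050 Ti": 38, "RX 570": 51, "RX 580": 58},
}

for preset, fps in results.items():
    baseline = min(fps.values())
    print(preset)
    for card, value in sorted(fps.items(), key=lambda kv: kv[1]):
        print(f"  {card}: {value} fps ({value / baseline:.2f}x the slowest card)")
```

Run it and the relative multipliers barely move between the two presets, even though the absolute framerates drop; that is the sense in which a Medium run adds no new ranking information.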
I suppose it can be, sure. But since getting a 144 Hz panel and pumping 144 fps of synced goodness in most titles, it would be tough to go back to 60 Hz/fps or less. The difference between my PC and my kid's 60 Hz/fps is more than apparent... as is the difference between Medium and Ultra settings (Fortnite, for example).
The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed, it can be hard to distinguish the differences, but again, unless there are other constraints, users don't buy a PC to game on Medium out of the gate; they typically do so because of other limitations, like budget or a resolution that's too high for the card.
Not so sure about that. While I personally agree with you, I think the underlying goal for most is actually just to play - much like @sneekypeet says he does. Most people simply have a budget and get whatever is best for them (available, suited for their rig, known to them, and an upgrade) at a specific point in time. Whatever settings and resolution that enables is secondary. The people who do target a resolution often don't target a quality level, and those who target a specific framerate (higher than 60) also don't target a quality level. The quality level a game is played at is the one thing that is fluid here. Everyone is stuck with their monitor choice in terms of resolution and refresh rate, but they can play around with quality settings.
And, ehm... playing a game "as it was intended by the developer"... I don't know, that sounds to me like something out of the audiophile or movie-fanatic corner, and it has absolutely no place in gaming, where half of what you see is simply what the engine/API has available for you and the other half is determined/limited by hardware. I mean, how do those console gamers even cope with all the trickery used to keep their sacred 30 FPS intact? Is that as the developers intended it? I hardly think so... As you can also see in the many quality polls we've had lately, lots of comments say that people disable specific settings such as (motion) blur, vignetting, chromatic aberration, DoF, etc. And let's be honest... think back to all those DX9 games with excessive bloom effects that would burn your retinas out. Those went insta-disable in my gaming... It is exactly that flexibility that makes PC gaming what it is. And probably also what elevates it beyond console gaming.