
Why are people still buying 1050TI's and not RX 570's?

I thought people buy GT1030 DDR4 over RX570 instead. :laugh:
On a serious note: because people want GeForce even if it costs $2500, and many have no idea that AMD exists.
 
TPU users are so used to outdated reviews with old drivers that they even use them to prove a point. Here is a more recent review, showing almost 120 fps average at 1080p and about 88 fps average at 1440p. And let's not forget that the drivers have improved even further over the past 6 months.

https://be.hardware.info/reviews/83...chips-hertest-hardwareinfo-gpu-prestatiescore

We have to keep in mind, though, that these games were benchmarked on a very fast CPU; still, 1440p 60 fps is not unrealistic on a slower CPU.
In your cherry-picked reviews the gap between Vega 56 and Vega 64 is higher than the one between the 1080 and the 1080 Ti. Tell me more about good testing methodology.
 

Finally a revisited comparison with modern drivers. Thank god!
Now if only he had benchmarked 1440p at medium settings, it would actually show what the card could do at that resolution...

[Attached image: Table.png]


Fact:
The RX 570 clearly wins in terms of performance
Fact:
The cheapest 1050 Ti that can be found on Amazon costs $167
Fact:
The cheapest RX 570 (+ 2 full AAA games) that can be found on Amazon costs $170
Fact:
The cheapest RX 580 (+ 2 full AAA games) that can be found on Amazon costs $190

So why can I find five 1050 Tis in the top 50 best-selling cards on amazon.com, including one in 2nd place, and only one 570, at place 45?
And it's not just on Amazon; in my country and also in the Netherlands, these cards are being sold for the same price and the 1050 Ti still gets most of the sales.

Seriously, Nvidia fan or not, give me one good argument for buying a 1050 Ti and sacrificing that much performance.
It's like consumers want a monopoly. It's like they want every GPU generation to cost 25% more, because that's the message they send to Nvidia this way.
Wanna bet the 3080 Ti will cost $1,600? Why not. People buy it anyway. Performance per dollar doesn't matter anymore…

And people are surprised AMD is no longer trying to compete? Why? It's like PC gamers don't want competition, and want the PC platform to die from prices that are too high compared to consoles...
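
For anyone who wants to sanity-check the value argument, here is a minimal back-of-the-envelope sketch. The prices are the Amazon figures listed above; the average-FPS numbers are assumptions picked only to reflect the rough 1080p gap reviews tend to show, not measurements:

```python
# Rough fps-per-dollar comparison (illustrative only).
# Prices: the Amazon listings quoted above. FPS values: assumed, not measured.
cards = {
    "GTX 1050 Ti": {"price_usd": 167, "avg_fps_1080p": 55},  # assumed FPS
    "RX 570":      {"price_usd": 170, "avg_fps_1080p": 85},  # assumed FPS
    "RX 580":      {"price_usd": 190, "avg_fps_1080p": 95},  # assumed FPS
}

for name, c in cards.items():
    fps_per_dollar = c["avg_fps_1080p"] / c["price_usd"]
    print(f"{name:12s} ${c['price_usd']:3d}  ~{c['avg_fps_1080p']} fps  "
          f"{fps_per_dollar:.2f} fps/$")
```

Under those assumed numbers the 570 comes out roughly 50% ahead on fps per dollar, which is the whole point of the thread.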

I think the mistake in your thinking is to approach the PC gamer as a homogenous group of people. They aren't. It's probably the most diverse target audience in any marketplace, with a vast majority having no clue about Nvidia/AMD market share and only a vague idea about performance, about what settings or framerate they want to target, etc. etc. There are also very large groups that exclusively play one or two games - games like Warframe - that are almost never benched anywhere (and run on a toaster - there would be zero noticeable difference in picking a 570 over the 1050 Ti ;)). And there are groups that want a thin/light/silent rig because they only play casual games on the living room TV. And... I could name another few dozen examples of very specific use cases. A large number of those groups are going to want the smallest, lowest-TDP option with the fewest power connectors and the best performance within those limits. But those groups still don't explain the entire (major) sales gap, I agree.

Look at the market shares of display resolutions in the Steam Survey, for example. Higher resolutions, and even 1080p, aren't as dominant as you might expect. And that happens to be the exact segment these 1050 Tis and RX 570s target: casual gaming. Casual gamers. That part of the PC gaming audience that knows very little about all the things we discuss on this forum, or in TPU reviews.

This is an enthusiast website. HWI is just about as casual as it gets for those who read tech sites. YouTube is another one of those entry-level sources of information. It's the reason you also see a lot of inaccuracies and bad conclusions in those circles. HWI is well known for straight-up bad testing and bench results that are WAY off the mark compared to many others doing the same test. It's fine if you look at HWI only as far as relative performance goes. But that is just about where it stops being relevant.

As for your statements on why to pick one over the other, I agree. I'd also pick the faster GPU for my money. But, again: not everyone is keen on, or even capable of, reading all those reviews and making sense of the numbers. It has no bearing on whether or not to test on Medium, though. The HWI link you posted points that out perfectly: the relative performance of all cards in their line-up is perfectly maintained as resolutions and quality settings rise - bar a few exceptions of cards that were already very close together to begin with. Only when you hit unplayable or uncomfortable framerates (sub ~50 averages) do lower-end cards start falling off (hard).

I remember another topic you fired off just recently where I also kept hammering on that specific point: the relative performance, the GPU hierarchy of these cards, never really changes in any test approach. It only does when you present light cards with overweight test scenarios - which isn't good info, it just throws you off balance as a reader.

This means that while you DO see a performance gap between ultra and medium, there is still no new information to be had. Fast cards be fast, slow ones be slow. And guess what, a year from now, those same cards will have fallen off a bit further as benchmarks become heavier at every resolution and quality level and faster cards are released. All of this has nothing to do with a preferred quality setting or resolution and everything with time.
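
To make that hierarchy argument concrete, here is a toy sketch. Every FPS number in it is invented; the only point is what happens when you normalize each scenario against its fastest card:

```python
# Toy illustration: the relative GPU ranking barely moves across scenarios.
# All FPS numbers are made up for the sake of the example.
results = {
    #               1080p/Ultra  1440p/Ultra  1080p/Medium
    "GTX 1050 Ti": (34,          22,          55),
    "RX 570":      (52,          35,          82),
    "GTX 1060":    (60,          41,          95),
    "GTX 1080":    (105,         74,          160),
}

scenarios = ["1080p/Ultra", "1440p/Ultra", "1080p/Medium"]
for i, scenario in enumerate(scenarios):
    fastest = max(fps[i] for fps in results.values())
    # Sort cards by FPS in this scenario and express each as a fraction of the leader.
    ranking = sorted(results, key=lambda card: results[card][i], reverse=True)
    relative = {card: round(results[card][i] / fastest, 2) for card in ranking}
    print(scenario, relative)
```

The ordering (and roughly the gaps) comes out the same in every scenario; the only thing that changes is whether the absolute framerate is still comfortable, which is exactly the point about light cards in overweight test scenarios.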

I suppose it can be, sure. But since getting a 144 Hz panel and pumping 144 fps of synced goodness in most titles, it would be tough to go back to 60 Hz/fps or less. The difference between my PC and my kids' 60 Hz/fps is more than apparent... as are medium versus ultra settings (Fortnite, for example).

The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed it can be hard to distinguish differences but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate; they typically do so because of other limitations like budget or a resolution too high for the card.

Not so sure about that. While I personally agree with you, I think the underlying goal for most is actually just to play - much like @sneekypeet says he does. Most people simply have a budget and get whatever is best for them (available, suited for their rig, known to them, and an upgrade) at a specific point in time. Whatever settings and resolution that enables is secondary. The people who do target a resolution often don't target a quality level, and those who target a specific framerate (higher than 60) also don't target a quality level. The quality level a game is played on is the one thing that is fluid here. Everyone is stuck with their monitor choice in terms of resolution and refresh rate, but they can play around with quality settings.

And ehm... playing a game 'as it was intended by the developer'... I don't know, that sounds to me like something out of the audiophile or movie-fanatic corner, and it has absolutely no place in gaming, where half of what you see is simply what the engine/API has available for you and the other half is determined/limited by hardware. I mean, how do those console gamers even cope with all that trickery to keep their sacred 30 FPS intact? Is that as developers intended it? I hardly think so... As you can also see in the recent polls about quality, many comments say that people disable specific settings such as (motion) blur, vignetting, chromatic aberration, DoF, etc. etc. And let's be honest... think back to all those DX9 games with excessive bloom effects that would burn your retinas out. Those went insta-disable in my gaming... It is exactly that flexibility that makes PC gaming what it is. And probably also what elevates it beyond console gaming.
 
Is there any info showing that people buy the 1050 Ti over the RX 570? Because Steam Hardware Info is not valid according to the red team.

If you read the reviews from when the 1050 Ti came out, you clearly see it was compared with the RX 460. So why compare three-year-old sales numbers with today's prices?

You can find a used 980 Ti for RX 570 prices. Why do people buy slower GPUs when you can buy gold on the second-hand market?


So AMD is promoting 4 GB in 2019, but people bash Nvidia for the RTX 2060's 6 GB. OK.
 

The Amazon best-sellers lists are not based on three-year-old sales numbers but on current sales.
 
So AMD is promoting 4 GB in 2019, but people bash Nvidia for the RTX 2060's 6 GB. OK.

Sure thing buddy, except one is $350.
 
I'd want the 1050 Ti for the lack of a power connector, depending on the build, and for much less heat in an ITX.

I liked the lack of a power connector on my GT 430... until I tried to Hybrid PhysX it on a board where all I could use was a x16-to-x1 PCI-E adapter, and without the power connector, it wouldn't work.

But I get it. The power consumption is nice. But that's about all it has going for it.
 
Sure thing buddy, except one is $350.

So console class gaming then.

I have to agree with Solaris here. When I game on PC, I buy what it takes to run the resolution I have with the best settings available. If I wanted to play on medium settings, I would buy a console!
 
Got my 570 8GB for £150 + 2 free games. OC'd to 1410 on the core and 8440 on the memory, it comes close to 1060 6GB performance and nearly 2x the performance of the 1050 Ti for the same price, with twice the VRAM. It is capable of ultra settings at 1080p/60 fps in many titles and 40 fps+ at ultra 1080p in basically every game. This is value you simply can't get with Nvidia.

Answer to the OP question: Because people are stupid and GeForce mindshare is greater.
 
Not so sure about that. While I personally agree with you, I think the underlying goal for most is actually just to play - much like @sneekypeet says he does. [...]
I am pretty sure about that. You may want to look at what sneeky had to say...

I have to agree with Solaris here. When I game on PC, I buy what it takes to run the resolution I have with the best settings available. If I wanted to play on medium settings, I would buy a console!

...which is what I am saying. Generally default Ultra. The dual polls (still funny both were allowed to live, and they have the same damn information) also show that the majority of those participating shoot for the stars and lower only as needed, not for giggles. People target both IQ and FPS. It's part of the reason why we see min and rec specs on the box.

I just meant ultra. Ultra is a setting IN GAME determined by the developers, with Ultra being as close as possible to their vision while still being able to run on the hardware that is around. People nit-picking settings like that isn't what we are talking about here. :)
 
You can find a used 980 Ti for RX 570 prices. Why do people buy slower GPUs when you can buy gold on the second-hand market?
So AMD is promoting 4 GB in 2019, but people bash Nvidia for the RTX 2060's 6 GB. OK.

Because the 980 Ti is now a 3+ year old card and the RX 570 is new, while they are almost the same performance-wise. It's not like the 980 Ti makes games playable that the 570 can't run properly. The gap isn't that massive. As for your 2060 example... the card is twice the price of the 4GB alternative being promoted. And the RX 570 also comes in an 8GB flavor... which is still much cheaper than the 2060. So I think it's clear why Nvidia deserves a bit of bashing at that price point for such a meagre GPU, especially when compared to its previous-gen offerings, which had 8GB at that price point.

So console class gaming then.

Not sure I follow you here.
 
Until recently the 570 was twice the price. It's a faulty comparison. Just because they are the same price now does not mean they are in the same product category, nor have they always been the same price. Amazon sales rank is partially determined by historical sales data.

One is a low-power card for light gaming or workstations, the other is a mid-range gaming card. They aren't the same product category. This is evidenced by the 1050 Ti lacking a PCI-E power input; it is a low-power card that only needs a PCI-E slot to function. Imagine having a prebuilt machine with a proprietary power supply, such as a Dell. You could drop the 1050 Ti in and be able to run games.

In terms of value, the 1050 and 1050 Ti are a better proposition than the 1030; they are over twice as fast but only $20-30 more. If you are building a workstation, need a video accelerator to run your monitors, and want it to be able to run anything (non-gaming) years into the future, get a 1050-series SKU. New 1050 Tis are available for as low as $115 shipped on sites like eBay.

Consider the historic pricing of the RX 570 on camelcamelcamel.com. During the crypto boom it went for $300-400, and in some cases more, while the 1050 Ti was typically going for half that. Most people bought the 1050 Ti because it was the only card actually available under $200 that was suitable for any level of gaming, not because it was a good card for gaming. It has the same performance as the 2012-era GTX 680.

Instead of creating a forum thread that implies that 1050 Ti purchasers are fools, maybe it would have been better to create a public service announcement: attention, if you are considering buying the 1050 Ti, be aware that the 570 is currently priced equivalently.

@Vayra86 I agree, the Ultra preset is not something I spend money chasing. I want games to run smooth, and even if I can run on Ultra, I don't particularly care to use the Ultra preset, because it frequently has things I do not like such as motion blur, lens flare, and other "special effects". I run a video card for a few years until I have to start setting stuff on Low just to have a playable frame rate, and then it is upgrade time once more.
 
...which is what I am saying. Generally default Ultra. [...] People target both IQ and FPS. It's part of the reason why we see min and rec specs on the box.

I just meant ultra. Ultra is a setting IN GAME determined by the developers, with Ultra being as close as possible to their vision while still being able to run on the hardware that is around.

Then I must have misread sneeky, because I believe he also said that he doesn't care what FPS the game runs at and that it can easily be below 60, while your stance on Ultra involves not only quality but also an FPS target.

For the rest of it, the market share of GPUs completely contradicts your statement. Ultra is out of reach for the vast majority and they settle for less. When FPS is abysmal, usually the settings go down; it's the first thing you do when your GPU can't run game X properly. Polls on enthusiast tech sites are not representative of 'most people', not even in PC gaming. That is the whole premise of this topic, I believe - what do the lesser GPUs do, why would you not run a lower quality setting, and what do you really miss out on in that case.

Do you run Ultra 'when you can'? Of course. But that is a different question. There are many things people might prefer but won't have, won't have access to, or that simply won't result in a comfortable experience. And about nit-picking settings... most games just offer the same range of settings in the exact same way; it has zero to do with developers carefully crafting an artistic experience. It wasn't too long ago that an overall quality setting also determined many aspects of post-processing, draw distance, etc., and those games still exist. The Ultra setting has absolutely nothing to do with developers' intent and everything to do with hardware capabilities that games and quality settings are scaled around. It's not like higher or lower LOD has anything to do with 'how developers meant it', for example, yet those are inherent to a quality setting in many games. In fact, the exact opposite is true: developers always try to make the lower quality settings good enough to still get their 'intent' across, so that the overall experience is the same regardless of your hardware power. And on top of that, there are many games that become straight-up unplayable if you put everything ON and on Ultra, because there's too much crap obscuring your vision all the time (lens flares, glares, flashy lights, way overdone particles).

What does a 570 offer over that card, being 3 years newer?

Age, warranty, resale value. And in the case of AMD, probably also some driver love the 980ti isn't getting much of anymore.

shoot for the stars and lower only as needed and not for giggles. People target both IQ and FPS. It's part of the reason why we see min and rec specs on the box.

Yes, and now back to reality: you buy a card once every few years (much longer for most) and in the meantime you only really get your desired FPS + Ultra quality at the beginning of that upgrade cycle. The longer you stick with the card, the more you have to adjust to keep either a quality level or an FPS target. People may want both, but they can never have both at every point in time. And I believe that whenever they can't, the settings menu is the first thing they dive into. So, conclusion: Ultra is not the thing that makes or breaks gaming for most people, simply because budget and spending don't allow it to be.
 
The brand. Many people just think 'Nvidia' when talking about graphics cards, much like with Razer.
 
By fps targets I mean 60 Hz/fps... a 1060 and equivalents from previous gens can do that at 1080p ultra. He also said he doesn't seem to notice a difference / can play at lower fps.

Age, warranty, resale value. And in the case of AMD, probably also some driver love the 980ti isn't getting much of anymore.
I meant features- and performance-wise... so, nothing. 980 Tis still cost more than a 570 on the market, right? So the resale value may or may not be there.
 
The GTX 980 Ti is really holding up well in modern games at ultra, for sure. But I noticed that here at least, on eBay, a used RX 570 costs almost as much as a 4GB RX 580. I game at 1080p, so a 4GB model may just be fine enough, although paying an extra $50 for an 8GB model isn't bad either. I know my friend's GTX 970 4GB may not be sufficient for Resident Evil 2 Remake. But I hope it is. If not, then an RX 580 may be ordered soon.
 
By fps targets I mean 60 Hz/fps... a 1060 and equivalents from previous gens can do that at 1080p ultra. He also said he doesn't seem to notice a difference / can play at lower fps.

Up to a point ;) So what happens when you don't have a few hundred ready to spend on a GPU, yet your games do produce stutter on Ultra? Settings go down. So the real priority isn't Ultra, it's the FPS. Ultra is the luxury you only have if you can afford it. And when somewhat lower settings get very close to the Ultra 'experience'... guess what MOST people tend to default to?

We don't disagree a whole lot, only about your idea of 'most people'. We are a niche. And when you're in one, it's harder to see that.
 
Lol, I'm just going by the polls bud... consider taking a look at them again. :)

There are two threads with identical information in them. One started by Lexi, the dupe by cucker.
 
Lol, I'm just going by the polls bud... consider taking a look at them again. :)

There are two threads with identical information in them. One started by Lexi, the dupe by cucker.


I've seen the two threads here which are essentially the same poll showing that ultra/highest is a goal.
I accept there are tons of users...but in the end, we would all run ultra if we could get acceptable fps. There are few reasons not to if we could. :)

No, you just used the polls to support your argument. And I am saying those aren't representative of your idea of 'most people'.

And here you are saying that actually, we would only run Ultra if we can do so at acceptable FPS. So it's not Ultra that people default to. It's playable FPS, and then whatever quality settings you can squeeze out of it. All of this doesn't truly start with a quality setting; it starts with a budget, as per the example of your kids. The midrange is where budget is a crucial element, because otherwise you wouldn't be in this segment.

By fps targets I mean 60 Hz/fps... a 1060 and equivalents from previous gens can do that at 1080p ultra.

You go ahead and try that on a GTX 960 at games that were recent for its time of release. Go ahead. I dare you :D Or better yet, try it on a GTX 660 for games at its time of release, let's say Far Cry 3:

https://www.techspot.com/review/615-far-cry-3-performance/page5.html

Whoops. And lo and behold, the review even dedicates the majority of its testing to 'High'.
 
LOL, missing the point and sniping not so relevant ones... so I'm just going to step away. I really do not have time to clarify and spell things out.

Cheers. :)
 

You're not being awfully fair now are you?

I suppose it can be, sure. But since getting a 144 Hz panel and pumping 144 fps of synced goodness in most titles, it would be tough to go back to 60 Hz/fps or less. The difference between my PC and my kids' 60 Hz/fps is more than apparent... as are medium versus ultra settings (Fortnite, for example).

The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed it can be hard to distinguish differences but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate; they typically do so because of other limitations like budget or a resolution too high for the card.
Right, but we both know that really, ultra isn't what the dev intended either. That's what Nvidia/Intel/AMD want, to showcase their hardware features... And this used to be out in the open, and not a big deal. I mean, it's the land of things like PhysX and such...

I dunno man, I actually do kind of buy a PC to game at medium. I'm an average person, with an average budget. People with lots of cash play on high. I shouldn't be able to play "ultra"... ever, unless I have the top-end VGA of a specific brand. Wasn't that the point? When did that get lost?

You, you review hardware. You don't pay for this stuff. I don't know that your opinion is actually relevant about pricing and such, really. Not to dismiss, but you know... you're spoiled with hardware.


Well, back when we were discussing this was before microstutter became a term people used. So now I can say I'm sensitive to that, and people know what I'm talking about. That's not a myth, and it's not watching an FPS counter, and it's been proven to give other people motion sickness from games like it does to me, so at this point I'm pretty vindicated on all of that. Certain FPS intervals make me feel sick. It's not a big deal, but because of that, I seek a consistent FPS. I don't know what that FPS is... that changes depending on what monitor I am using, really.

There used to be a point in time where this price range that the OP is talking about, the 570/1050ti range, was one of the most common. All of a sudden, if you don't have the uber-leet 40GB GPU, then what you like isn't important? OK. See, I can't write for that over-indulging audience...

So me, it's funny. I've been looking at these two GPUs specifically to go with my TR 1950X. I kinda want the AMD card so I'd have an all-AMD build, but man, I like Nvidia too...?


I ended up pulling out my 7970 MATRIX and using that for now. That's the last time that AMD was relevant to me when it comes to GPUs. Now I'm looking at all these questions again...

Well, guess what? You're reviewing hardware too, and now I'm back to buying it. It's kind of opened my eyes again as to how important some of the things I started to take for granted while I was doing reviews really are, and how maybe, sometimes, what real people want gets missed completely. Oh well.
Dave, if I didn't review I'd still be shooting for ultra settings. I would likely be doing it at the same res and Hz as well. I wouldn't be using a 2080 Ti... likely still the 1080, which allowed the titles I play to stay above 100 fps anyway. Most of the gaming public is at 1080p. It doesn't take much to game at 60 Hz 1080p and ultra; a 1060 6GB will do there. Point is, the goal for everyone is ultra settings where possible. For some it isn't possible, or it's simply a choice for more fps.

But that is ultimately everyone's goal: to run ultra if possible. I'd be floored if anyone intentionally ran under ultra (glitches, etc. aside) if they could match/surpass their refresh rate. It's just a res/budget/game/setting limit. :)

Edit: I bought two PCs for my kids out of pocket for 1080p 60 Hz ultra gaming. So while my daily driver is clearly not a common system, I know what it is like to buy them and shoot for a 1080p 60 Hz ultra gaming goal with a budget more in line with the masses. ;)

I'm not missing anything, just simply not in agreement with your statement that most people 'target' or 'want' Ultra. Or with Ultra being 'how the dev intended it'... come on man. Wake up please - dave was spot on saying that Ultra is inflated, heavy settings to make you feel like your fat GPU is worth having. Nor with the idea that without Ultra you're effectively console gaming. You've even said so yourself, a number of times. Hell, I want a Ferrari too; it doesn't mean I'll ever drive one, so it's a target I can easily give up on in favor of many other things that are perhaps much more comfortable to have, such as a cup holder.
 
A good example, for gaming, of overbuying on resolution and being forced to lower settings to play at the native res.

I have a GTX 1080, and besides Far Cry 5 I can't remember the last game I played on max settings.
I truly believe that if your GPU is capable of handling higher resolutions with some sacrificed graphical detail, that's the way to go.
There are games in which even my overclocked GTX 1080 can't maintain a solid 60 FPS on max settings at 1080p. If you always want to go for max settings you'll always be stuck at 1080p, even with the fastest GPUs out there.
Having the crispness and sharpness of higher resolutions is more visually compelling than the sometimes invisible details that ultra settings offer, trust me.
There is also the matter of framerates: if you want a true 144 Hz gaming experience, even with a $500 GPU it's impossible to achieve unless you're willing to dial back some settings (in most games).
 
You're not being awfully fair now are you?

I'm not missing anything, just simply not in agreement with your statement that most people 'target' or 'want' Ultra. [...]

This I agree with

If people want ultra settings constantly, just get a 1080p or 2K monitor with a Vega 56.

Not everyone has deep pockets.

I got my card because the pricing was better than the X model; both come in at the same clocks (or the non-X is higher), despite a few CUs being hard-locked.
 