
Why are people still buying 1050 Tis and not RX 570s?

I have to agree with Solaris here. When I game on PC, I buy what it takes to run the resolution I have with the best settings available. If I wanted to play on medium settings, I would buy a console!

I had to do exactly this, and have had to for many years.... can you guess why?


You know why, because we talked on voice chat for hours, for years... so you know how true and real I'm being here, and how I can say that, although I can see your side of things, I don't think this guy is trolling about this...


I had a 2560x1600 monitor. Since 2007. When the X1950 was still a thing. Remember?


You bloody well believe I gamed on medium, and you heard me moan about it forever...


Then I ran Eyefinity, which you are also aware of. Also medium settings, in Battlefield 3, which you know. We talked about this lots. You and I, for hours and HOURS AND HOURS...


Now, we have 4K. People are getting 60 FPS @ 4K on ultra settings? Since when?


People that buy the latest and greatest monitors don't really have any choice but medium settings, and it's been that way for not just years... decades now. Medium graphical settings are a high-end user thing, because these monitors aren't cheap, they're cutting-edge stuff, and they kill VGAs like they always have... and they are WHY we needed multiple GPUs, and why SLI and Crossfire are real tech....


WHY has everyone forgotten the past?


Anyway, I'm out. Enjoy!

:lovetpu:
 
i bought a 1050ti a while back just to go in a small non-gaming machine.. it gave the machine basic low-resolution gaming abilities on a budget and didn't need any extra power connectors..

i don't think people buy a 1050ti for real gaming, more just to be able to play the odd game if they feel like it.. it's the perfect card for that kind of purpose..

trog
 
Now, we have 4K. People are getting 60 FPS @ 4K on ultra settings? Since when?
Since the 2080 Ti on many titles (and yes, I get the irony there, save your time :)).

The thing is, nobody wants to run less than ultra... for some, it's a need to do so, however. But people don't go into it looking to do so. It's guided by their resolution and their budget for the GPU.
 
Since the 2080 Ti on many titles (and yes, I get the irony there, save your time :)).

The thing is, nobody wants to run less than ultra... for some, it's a need to do so, however. But people don't go into it looking to do so. It's guided by their resolution and their budget for the GPU.


Well, I've moved on to 8K now, so yeah... still medium.

https://www.techpowerup.com/240668/viewsonic-introduces-new-professional-and-enterprise-monitors

I've talked with so many of you guys, at least once at some point in the past, about how medium isn't that much different visually...


But TPU has never been the home of true high-end computing, so I'm not really shocked by all of this, to be honest.
 
Call me strange, but I run 4K with a 1080 Ti, since SLI has limits, and I'm still using ultra settings. In most games I can get close to 60 FPS, but the game is still "playable" with fewer. Frames don't bother me until the game shudders or hitches.
 
A good example, for gaming, of overbuying on resolution and being forced to lower settings to play at the native res.

One could say that buying for ultra settings is the same, and maybe high is the sweet spot. I dunno. It's the EXACT same thing, but reversed, no? :D

Frames don't bother me until the game shudders or hitches.


But you know that I am personally a bit more sensitive to these things, so I seek to avoid any of that at all costs.
 
One could say that buying for ultra settings is the same, and maybe high is the sweet spot. I dunno.




But you know that I am personally a bit more sensitive to these things, so I seek to avoid any of that at all costs.

That's the thing.... to me, I don't need a set amount of FPS to play a game. I am busy playing rather than reading an FPS counter.
 
One could say that buying for ultra settings is the same, and maybe high is the sweet spot. I dunno. It's the EXACT same thing, but reversed, no? :D




But you know that I am personally a bit more sensitive to these things, so I seek to avoid any of that at all costs.
I suppose it can be, sure. But since getting a 144 Hz panel and pumping 144 FPS of synced goodness in most titles, it would be tough to go back to 60 Hz/FPS or less. The difference between my PC and my kids' 60 Hz/FPS machines is more than apparent... as is the difference between medium and ultra settings (Fortnite, for example).

The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed, it can be hard to distinguish the differences, but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate; they typically do so because of other limitations, like budget or a resolution that's too high for the card.
 
I doubt people still buy the 1050 Ti over the 570 these days. They are similarly priced where I live. The cheapest 3GB 1060 is +$35 and the cheapest 4GB 580 is +$50. Probably worth going for the 580 instead.
 
I suppose it can be, sure. But since getting a 144 Hz panel and pumping close to 144 FPS of synced goodness, it would be tough to go back to 60 Hz/FPS or less.

The underlying goal of many is simply to play the game as the dev intended... looking how it should. Indeed, it can be hard to distinguish the differences, but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate.

Right, but we both know that, really, ultra isn't what the dev intended either. It's what NVIDIA/Intel/AMD want, to showcase their hardware features... And this used to be out in the open, and not a big deal. I mean, it's the land of things like PhysX and such...

I dunno, man, I actually do kind of buy a PC to game at medium. I'm an average person, with an average budget. People with lots of cash play on high. I shouldn't be able to play "ultra"... ever, unless I have the top-end VGA of a specific brand. Wasn't that the point? When did that get lost?

You, you review hardware. You don't pay for this stuff. I don't know that your opinion is actually relevant when it comes to pricing and such, really. Not to dismiss you, but you know... you're spoiled with hardware.

That's the thing.... to me, I don't need a set amount of FPS to play a game. I am busy playing rather than reading an FPS counter.
Well, back when we were discussing this was before microstutter became a term people used. So now I can say I'm sensitive to that, and people know what I'm talking about. That's not a myth, and it's not watching an FPS counter, and it's been shown to give other people motion sickness from games like it does me, so at this point I'm pretty vindicated on all of that. Certain FPS intervals make me feel sick. It's not a big deal, but because of that, I seek a consistent FPS. I don't know what that FPS is... it changes depending on what monitor I am using, really.
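Here's the idea in plain numbers, since "consistent FPS" sounds hand-wavy until you see it. A minimal sketch in Python, with invented frame times: two runs with nearly identical average FPS, one of which hitches every tenth frame.

```python
# Minimal sketch: why average FPS hides microstutter.
# Frame times are in milliseconds and are invented for illustration.

def fps_stats(frame_times_ms):
    """Return (average FPS, '1% low' FPS) for a list of frame times."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)
    one_pct = slowest[:max(1, len(slowest) // 100)]  # worst 1% of frames
    return 1000 / avg_ms, 1000 / (sum(one_pct) / len(one_pct))

smooth = [16.7] * 100                                     # steady ~60 FPS
stutter = [13.0 if i % 10 else 50.0 for i in range(100)]  # hitch every 10th frame

print(fps_stats(smooth))   # ~ (59.9, 59.9): feels smooth
print(fps_stats(stutter))  # ~ (59.9, 20.0): same average, you feel every hitch
```

An FPS counter shows ~60 in both cases; only the frame times give the hitching away.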

There used to be a point in time when the price range the OP is talking about, the 570/1050 Ti range, was one of the most common. All of a sudden, if you don't have the uber-leet 40GB GPU, then what you like isn't important? OK. See, I can't write for that over-indulgent audience...

So, for me, it's funny. I've been looking at these two GPUs specifically to go with my TR 1950X. I kinda want the AMD card so I'd have an all-AMD build, but man, I like NVIDIA too...?


I ended up pulling out my 7970 MATRIX and using that for now. That's the last time that AMD was relevant to me when it comes to GPUs. Now I'm looking at all these questions again...

Well, guess what? You're reviewing hardware too, and now I'm back to buying it. It's kind of opened my eyes again as to how important some of the things I started to take for granted while I was doing reviews really are, and how maybe, sometimes, what real people want gets missed completely. Oh well.
 
I think you are stretching my point, @cadaveca. The point is that when I got a 4K screen, I knew I needed strong GPUs to push it. I don't get motion sick, and I don't see the need for huge FPS numbers when the screen tops out at 60 Hz. I am not saying others can't have an issue, but with the right equipment you can make the best of it rather than turning everything down.
 
I think you are stretching my point, @cadaveca. The point is that when I got a 4K screen, I knew I needed strong GPUs to push it. I don't get motion sick, and I don't see the need for huge FPS numbers when the screen tops out at 60 Hz. I am not saying others can't have an issue, but with the right equipment you can make the best of it rather than turning everything down.

Oh, I tried. I tried and tried and tried. But SLI sucks in so many games.... and we can agree that until the 2080 Ti, which is still new, 4K 60 FPS needed SLI, and that my need for it, well, is a rather stupid one, but still, it's medical.

And it's not like I play uncommon titles... some work. Most don't in SLI. Like... these damn monitor purchases... ROFL... heh. You know what I'm on about with that.

I mean, yeah, you're right, you can do it differently, but I would just interject that you, as a hardware reviewer, are actually the least common case, so just because something works for you doesn't mean it works for everyone... as much as we all might wish it did. We both know that stupid shit like you buying memory a week after I did means you and I have completely different PCs....

Anyway, the easy way to get that smooth FPS-to-refresh-rate ratio, with a single card, no matter the game... is to just run medium on a single GPU. I mean, like I said, games used to default to medium anyway.... We all know this. It seems we all forgot. I was stuck in the land of not being able to forget, with my bullshit $2500 monitor purchases.


Because you know, I'm not gaming on a $5k PC with a $500 monitor... ;)
 
And it's not like I play uncommon titles... some work. Most don't in SLI. Like... these damn monitor purchases... ROFL... heh. You know what I'm on about with that.

Which titles do you play where SLI doesn't work? I play most of the AAA games released in the last few years on my pair of 1080 Tis in SLI at 4K 60 FPS.
 
Dave, if I didn't review, I'd still be shooting for ultra settings. I would likely be doing it at the same res and Hz as well. I wouldn't be using a 2080 Ti... likely still the 1080, which kept the titles I play above 100 FPS anyway. Most of the gaming public is at 1080p. It doesn't take much to game at 1080p 60 Hz on ultra; a 1060 6GB will do there. The point is that the goal for everyone is ultra settings where possible. For some it isn't possible, or it's simply a choice for more FPS.

But that is ultimately everyone's goal: to run ultra if possible. I'd be floored if anyone intentionally ran under ultra (glitches, etc. aside) if they could match/surpass their refresh rate. It's just a res/budget/game/setting limit. :)

Edit: I bought two PCs for my kids, out of pocket, for 1080p 60 Hz ultra gaming. So while my daily driver is clearly not a common system, I know what it is like to buy them and shoot for a 1080p 60 Hz ultra gaming goal with a budget more in line with the masses. ;)
 
Which titles do you play where SLI doesn't work? I play most of the AAA games released in the last few years on my pair of 1080 Tis in SLI at 4K 60 FPS.
SLI has worked in PUBG since when? ;)

Lots of games. Everything. BFV, most recently, although it's working fine now. The F1 games... dude, there's more to gaming than just AAA games... you're proving my point.

I'm just saying that medium settings aren't all that uncommon... as a multi-GPU user since it was possible, due to monitor resolution, running medium settings on just one of those GPUs when SLI has issues is just the norm, really.

Dave, if I didn't review, I'd still be shooting for ultra settings. I would likely be doing it at the same res and Hz as well. I wouldn't be using a 2080 Ti... likely still the 1080, which kept the titles I play above 100 FPS anyway. Most of the gaming public is at 1080p. It doesn't take much to game at 1080p 60 Hz on ultra; a 1060 6GB will do there. The point is that the goal for everyone is ultra settings where possible. For some it isn't possible, or it's simply a choice for more FPS.

But that is ultimately everyone's goal: to run ultra if possible. I'd be floored if anyone intentionally ran under ultra (glitches, etc. aside) if they could match/surpass their refresh rate. It's just a res/budget/game/setting limit. :)

Yeah, budget. That's my only thing. You're right, though: most are at 1080p, and gaming at that res is pretty good.

Which brings me back to the OP, and why I looked at this thread, as the 1050 Ti is kind of at the limit of that, and the 570 is well within that territory. Because also, at 1080p, when I can't play ultra with these GPUs, you know for sure medium is going to work...

oh, you made some edits. :p


Dude, do you actually talk to anyone in the real world, outside of the PC industry, that plays games? Do you talk to anyone in the 16-21 age group? That's my kids. They don't think like dis... they woke up like dis. :p They turn on the game and play it, and don't fuck with settings, for the most part. They expect the PC to do it for them, and that's why things like GeForce Experience are still in use and NVIDIA still makes it, even though every single reviewer out there complains about it...

Like, ultimately, you're right, but also... so am I. Until you all can accept that there are many different types of users, and that we need not stick to one usage model... well, I've got some shit to say. Like, sorry... but you know... I got into doing reviews to be a different voice out there. Sneekypeet knows this full well, since it's really due to him that I ended up doing reviews here on TPU for nearly a decade...


I still feel the same way about much of it all, but I did get to talk to a lot of people over the years who agreed with my side of things. The average user... the average gamer... is in a far more average place than you guys want to think. You guys are high-end.
 
Like most have said: power. The 570 uses about as much power as a GTX 1080, so we're talking roughly three times the power of a 1050 Ti. AMD does great on price/performance, especially with the 570, but if someone is upgrading and has a weak PSU, then the 1050 Ti is the way to go unless they pay more for a new PSU.

I really don't understand why AMD needs so much more power to compete. Is it the extra FP64 cores AMD keeps in its consumer GPUs, or something else? Whatever it is, they need to fix it, because it's ridiculous that if you buy an AMD card, you will be paying more than double the power cost of an NVIDIA card with similar performance.
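To put a rough number on that, here's a quick back-of-the-envelope sketch. The draw figures and the electricity price are assumptions for illustration, not measurements; plug in your own numbers.

```python
# Back-of-the-envelope: yearly electricity cost of gaming on each card.
# All three constants below are assumptions, not measured values.

DRAW_W = {"GTX 1050 Ti": 65, "RX 570": 170}  # assumed average gaming draw (W)
PRICE_PER_KWH = 0.15                         # assumed electricity price ($/kWh)
HOURS_PER_DAY = 3                            # assumed daily gaming time

def yearly_cost(watts):
    """Yearly electricity cost in dollars for a given average draw."""
    return watts / 1000 * HOURS_PER_DAY * 365 * PRICE_PER_KWH

for card, watts in DRAW_W.items():
    print(f"{card}: ~${yearly_cost(watts):.2f}/year")
# GTX 1050 Ti: ~$10.68/year
# RX 570: ~$27.92/year -- about 2.6x, i.e. "more than double"
```

Scale the hours or the kWh price up and the gap grows proportionally.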
 
I've seen the two threads here, which are essentially the same poll, showing that ultra/highest is the goal.

My kids are unicorns... I get it. An 11- and a 7-year-old with a 970 and an R9 270. But both systems were built from scratch, used, for under $1k. Now we Fortnite together, my kids are building machines, and I stink!

The majority of those around that age are on Alienware, Omen, iBuyPower... canned systems.

I accept there are tons of users... but in the end, we would all run ultra if we could get acceptable FPS. There are few reasons not to if we could. :)
 
@cadaveca is making a lot of sense. Not everybody is building a computer with a graphics card that alone costs more than what I make working 60 hours in a single week. Those same people are also likely not playing on ultra settings either. I'm not running ultra settings on my 1070. It doesn't take ultra graphics to have a good time playing a game. As long as my games don't look like smeared dog shit in the mud, like they tend to on the lowest settings, I'm good. I don't play many of the latest games, but I do still play a lot of 7 Days to Die... which my 1070 could probably run at the highest settings, but I don't, because the game is a turd and your FPS will tank no matter what... so I stick to med/high settings to keep performance good. In fact, I play with med/high settings on just about every game, because I don't need photorealistic shadows and stuff dragging down my FPS...
 
I've seen the two threads here, which are essentially the same poll, showing that ultra/highest is the goal.

My kids are unicorns... I get it. An 11- and a 7-year-old with a 970 and an R9 270. But both systems were built from scratch, used, for under $1k. Now we Fortnite together, my kids are building machines, and I stink!

The majority of those around that age are on Alienware, Omen, iBuyPower... canned systems.

I accept there are tons of users... but in the end, we would all run ultra if we could get acceptable FPS. There are few reasons not to if we could. :)

Yeah, I mean, I went back to college for the HVAC stuff. These people are half my age. I made friends with some of them and still talk to them. My oldest is about to graduate high school, and my youngest is almost out of elementary. The youngest... all his friends are all about Fortnite. He's 11. My oldest, he's playing what? BFV, BO, Tarkov? My girls play RPGs. My kids have the same systems that yours do, and I have four of those kids, but ALL of their friends... their friends' PARENTS that play games too...

dude.... you know what I'm on about here if you interact with other people, I think. :P

The ONE commonality, I'd say, with all these people that play games is that they want to have fun, and that fun means the games WORKING. What "working" means, to some people, seems to be a fluid thing. That just makes me think back to the NVIDIA study about gamers being willing to accept artifacts in graphics... and here we are.

So I wanna turn this back to the OP... why do people buy what they do? DUDE. I really know why. Nobody wants to really admit it.
 
Every 570 I check has an 8-pin connector and a 500W PSU recommendation, and the cards don't ship with an adapter cable. That pretty much kills its chances in the OEM desktop world, where a buyer can drop in a 1050 Ti without any doubts. I think the 1050 Ti's low power requirement puts it in a different category than the 570. AMD needs something better than the RX 560 to address this, but without the power requirements of the 570. A 12nm 570 with reduced clocks, maybe?
 
SLI has worked in PUBG since when? ;)

Lots of games. Everything. BFV, most recently, although it's working fine now. The F1 games... dude, there's more to gaming than just AAA games... you're proving my point.

I'm just saying that medium settings aren't all that uncommon... as a multi-GPU user since it was possible, due to monitor resolution, running medium settings on just one of those GPUs when SLI has issues is just the norm, really.

Does PUBG need even one card to play 4K60? A cellphone can pretty much max that garbage out.

BFV has long since been fixed, so that point is moot.

The F1 games, I'll give you that one, so we have some non-AAA titles that don't work.

SLI is pretty ingrained in everything right now. Yes, there are games that don't work, but those are getting fewer and fewer.

But I guess that really proves your point that everyone should just deal with medium settings. I'll keep enjoying running almost every game maxed out, though. It looks prettier. With that pair of 1080 Tis, mind you, not even the latest cards.
 
Link, please. I'd love to see a 97% difference from ultra to medium. I bet you can see such improvements in VRAM-limited situations... other cards would likely see similar results as well.

Mostly I bet any improvements are half that or less.

EDIT: here's the link - https://us.hardware.info/reviews/74...polaris-update-hardwareinfo-performance-score

Anyway, the point is that it's a reach as a 2560x1440 card. Users are required to lower settings to reach 60 FPS in most titles. I saw the hardware.info review and see they average, on medium, 55 FPS for the highest-OC'd one at 2560x1440. Those titles aren't exactly GPU killers, either. ;)

Like cdawall said, it made sense a short time ago... it doesn't now. Anyone just needs to look at the pricing and performance to see that. This isn't a groundbreaking point you are making. ;)

TPU users are so used to outdated reviews with old drivers that they even use them to prove a point. Here is a more recent review, showing almost 120 FPS average at 1080p and about 88 FPS average at 1440p. And let's not forget that the drivers have improved even further in the past 6 months.

https://be.hardware.info/reviews/83...chips-hertest-hardwareinfo-gpu-prestatiescore

We have to keep in mind, though, that these games were benchmarked on a very fast CPU, but still, 1440p 60 FPS is not unrealistic on a slower CPU.
 
It is really easy to see why a huge number of people purchased the 1050 Ti over the 570: there was a time, less than 8 months ago, when the 570 cost 2-4 times as much.
At launch, the GTX 1050 Ti was priced against the 4GB RX 460, with an MSRP of $139. The RX 470 MSRP was $179 and the RX 570 MSRP was $169.
That was the case for a long while, and the GTX 1050 Ti is a better card than the RX 460/RX 560. I have no clue what is up with actual prices today, putting it against a card that is one class higher up.
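If you want to sanity-check the value side of that with simple math, a perf-per-dollar sketch is enough. The relative-performance figure here is an assumed ballpark (review indexes put the RX 570 somewhere around 50% ahead of the 1050 Ti), not a measurement.

```python
# Rough perf-per-dollar at the launch MSRPs quoted above. Performance is
# normalized to the GTX 1050 Ti = 1.0; the RX 570's ~1.5x is an assumed
# ballpark, not a measured number.

cards = {
    "GTX 1050 Ti": (1.0, 139),  # (relative performance, launch MSRP in $)
    "RX 570":      (1.5, 169),
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 100:.2f} perf per $100")
# GTX 1050 Ti: 0.72 perf per $100
# RX 570: 0.89 perf per $100
```

At MSRP the 570 clearly wins on value; at mining-era prices (2-4x MSRP) the same math flips hard in the 1050 Ti's favor, which is exactly why people kept buying it.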
 