
Why are some people still saying a 4-thread i5 is good enough with a beefy GPU...

Status
Not open for further replies.
get a life people, quad core is so 2012
 
get a life people, quad core is so 2012

Hey wait a darn minute.....

My kid's HTPC with a 1400 x4 runs games A-OK. (2017)

Quad is still a thing, just best with HT or SMT.
 
Thread Title: Why do strawmen say strawman stuff??

Who is saying what exactly?

Yes, given a CPU-limited situation, many will suggest turning up the graphics settings, as there is often some slack GPU performance being wasted. It lessens the perceived CPU limitation, but the limitation still exists. There are also some specific situations (like someone on a very tight budget running a 60Hz display paired with a lowish-end GPU) where upgrading the CPU alone might not be a worthwhile investment given their money situation. Are you sure you're not deliberately misinterpreting the context of what people are saying? I highly doubt anyone is suggesting to buy a $1200 GPU with an old 4/4 and saying it's A-okay.

For the record, today's AAA titles seem to prefer you have both of the following (or better):
6 cores
8 threads

Yes, I know 6/8 chips don't exist. What I mean is:

4/8 - Too few cores
6/6 - Too few threads
6/12 - Good
8/8 - Good
8/16 - Great

Thus far, only one game (RDR2) has done measurably better with more than 8 total threads, but this appears to be a bug in Rockstar's game engine, and it only occurs under a weirdly specific set of circumstances: a CPU with fewer than 12 threads that is fast enough to hit a max framerate cap (another bug) and bounce off of it. In practice, that means only the 9700K.
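Since people keep arguing past each other about cores vs. threads, here's the table above as a tiny Python sketch. To be clear, the tiers are just my rule of thumb from this post, not anything official:

```python
import os

def classify(cores: int, threads: int) -> str:
    """Rate a CPU for modern AAA gaming using the rough
    6-core / 8-thread rule of thumb from this post."""
    if cores >= 8 and threads >= 16:
        return "Great"
    if cores >= 6 and threads >= 8:
        return "Good"
    if cores < 6:
        return "Too few cores"
    return "Too few threads"

# The table from above, reproduced:
print(classify(4, 8))    # -> Too few cores
print(classify(6, 6))    # -> Too few threads
print(classify(6, 12))   # -> Good
print(classify(8, 8))    # -> Good
print(classify(8, 16))   # -> Great

# os.cpu_count() reports *logical* threads on the current machine,
# so an 8/8 and a 4/8 chip both report 8 here.
print(f"This machine reports {os.cpu_count()} logical threads")
```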
 
get a life people, quad core is so 2012
:(
It's because 'good enough' is now a goal instead of the minimum. This has permeated all through life. Exceptionalism is now frowned upon.
While there is some truth to this, it is really simplistic and potentially biased thinking.

For many, yes, "good enough" is the goal but that is because budgets, rent/mortgage payments, insurance premiums, family, work and other commitments have higher priorities - as they should!

Many here are talking about "gaming" machines as though they should be everyone's top priority. :kookoo: I agree, "get a life!" Understand that for the masses, a nice gaming "toy" is not their top priority. Food, shelter, education for their kids, quality time with their loved ones, even work and other forms of entertainment may take precedence.

I'm a computer nerd too, and have been long before many of you were even born. But if money was tight, I sure would spend it on a pair of Gator Hardshells first. Those are not the minimum. They are not just "good enough". Those are the "best" for me.

I don't agree either that exceptionalism is frowned upon - at least not as a general rule. For sure, there are many who hate and are jealous of those who are exceptional and successful. But IMO, those haters don't count or deserve my attention. Then there are some who are exceptional and like to brag that they are. Muhammad Ali comes to mind. But IMO, he earned those bragging rights. Then there are those who simply think they are exceptional. They are often frowned upon, and deservedly so IMO.
 
Thread Title: Why do strawmen say strawman stuff??
...
Why does my good Intel CPU beat every great AMD in gaming...
 
Have lurked at TPU for years, one of my favorite tech sites, especially the BIOS repository which has been gold since I love to BIOS mod low end cards. Finally decided to join.

This has been an interesting discussion. I am on an extreme budget due to health issues, and many of my friends/acquaintances are in the same boat, where money for high-end hardware is not happening. I'd rather spend my money on supporting the game developers (I watch for lots of sales) than buy expensive hardware and have no money left for games. To me, we are in the golden age of cheap computing and good gaming experiences if you have reasonable expectations. I've built many $150-$300 systems that have brought joy and entertainment to their owners.

My current system, in which I have around $150 total invested, is an old Dell OptiPlex 790. It came with an i3 2120, which I sold for $20 and replaced with a Xeon E3-1220 for $15. This is a Sandy Bridge quad-core, 3.1GHz with 8MB L3 cache. I added 2x4GB to the 2x2GB already in the machine to get it to 12GB.

Video card is an OEM R7 450 4GB GDDR5 I paid $24 for. It's basically the same Cape Verde 512/32/16 GCN 1.0 GPU as you'd find on an old HD7750. I spent time BIOS modding the card using the great VBE7 tool and undervolted the GPU from the stock 1.2V down to 1.050V with a slight overclock to 950MHz. Memory runs at the stock 1125MHz. On newer games the 4GB of memory is actually quite useful, even on such a low end card.
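For anyone curious roughly what an undervolt like that is worth: dynamic power scales approximately with frequency times voltage squared. This little Python sketch is a ballpark first-order estimate only (it ignores static leakage and any clock change), not a measurement of this card:

```python
def dynamic_power_ratio(v_new: float, v_old: float,
                        f_new: float = 1.0, f_old: float = 1.0) -> float:
    """First-order approximation: dynamic power P ~ f * V^2.
    Returns the new/old power ratio; leakage is ignored."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# 1.200 V stock down to 1.050 V, same clock:
ratio = dynamic_power_ratio(1.050, 1.200)
print(f"Estimated dynamic power vs stock: {ratio:.0%}")
# (1.05 / 1.2)^2 ≈ 0.77, i.e. roughly a 23% cut in dynamic power
```

That goes a long way toward explaining why a card like this stays cool under load after undervolting.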

My eyes suck, so I use a Samsung 24" TV/monitor with 1366x768 resolution. I love it; I do have to scroll more, yet with the low resolution the text is big and easy to read, and I can game on my potato card no problem. For me, as long as I get 30FPS or above I'm happy. In fact, I set the Frame Rate Target control to 30FPS in the driver for all games. GPU temps never go above 65C even in the most demanding games.
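Conceptually, a driver-level Frame Rate Target is doing something like this hypothetical Python sketch: measure how long the frame took, then sleep off the leftover frame budget so the GPU isn't rendering frames you'll never see:

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def limited_loop(render_one_frame, n_frames: int) -> None:
    """Cap a render loop at TARGET_FPS by sleeping off any
    frame budget left over after rendering finishes."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render_one_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)

# Demo: ten trivial "frames" should take about 10/30 s in total
t0 = time.perf_counter()
limited_loop(lambda: None, 10)
print(f"10 capped frames took {time.perf_counter() - t0:.2f} s")
```

Real drivers do this with far better timing precision than `time.sleep`, but the upside is the same: lower GPU load, lower temps, consistent pacing.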

I have recently completed Far Cry New Dawn (medium settings - uses 2.5GB VRAM), Wolfenstein New Colossus (medium settings - uses 3-3.5GB VRAM), Doom 2016 (medium settings - uses 2-2.5GB VRAM), Metro Exodus (medium settings - uses 2.5-3GB VRAM) and am working through Far Cry 5 (low/medium settings, HD textures on - uses 3.5-4GB VRAM). All of these games play great on my potato of a system using a low end 2011 quad and a low end 2012 GPU.

So while I and many others would love to have a high end 8+ thread CPU, I think for most average people a quad with decent IPC is still quite viable. I have yet to be limited by a quad for my gaming and usage patterns, and other people I've thrown budget systems together for say the same thing.
 
Why does my good Intel CPU beat every great AMD in gaming...
Because it meets both requirements and has a touch higher IPC. Nitpicking good/great wasn't the point.


Edit: Actually, make things equal. Compare the same architecture: your 9700K vs a 9900K.
 
Why does my good Intel CPU beat every great AMD in gaming...

Quite a few reasons:
Better memory performance.
As mentioned, better IPC.
Games or benchmarks built on/optimized for Intel.
Legacy benchmarks boasted Intel compatibility as far back as 3DMark99 Max.

Sadly, AMD is getting beaten at 7nm vs 14nm+++.

Of course we compare AMD to Intel. That's what competition is all about.

___________________

Back to topic......

Core i5 with 1660ti. OK example config.
Plays AAA title not at max detail and resolution.
Does it still play the game though??? Yes??

So that's- ... Umm... good enough.
i5 is good enough

Good enough, you played an AAA title not at max detail and resolution. It was playable.

Max settings. No.

This is not that hard of a concept here.
 
Viable and ... "I spend a huge portion of my life on the computer and want smooth and awesome FPS" are two different things.

For low end gaming, or for someone who is playing games like League of Legends, then yeah, for sure... a quad is fine... but if you want to fire up a modern game with beautiful settings and be immersed in all its stutter-free glory at high FPS, then a quad is not good.

The stutter/mins on a quad are quite bad these days in many titles, and 6/12T really is the sweet spot. I think the Ryzen 3600 is the ideal upper mid-range gamer's CPU. I would even be willing to say that a 3600 is great for high-end gaming rigs, as it allows you to splurge on a nice gfx card and still has enough power to feed it.
 
Have lurked at TPU for years, one of my favorite tech sites, especially the BIOS repository which has been gold since I love to BIOS mod low end cards. Finally decided to join.
...
Welcome to TPU, and thanks for a normal person’s perspective!
 
Thread Title: Why do strawmen say strawman stuff??
...

This pretty much sums it up. Add Far Cry 5 to this, as it behaves very similarly. Also, 8/8 stutters in it.
 
Welcome to TPU, and thanks for a normal person’s perspective!

This is an awesome reason why someone would still game on a quad core: pairing it with a GPU that makes sense and doing the most with whatever budget you have.

This thread is more about why people think a 6600K/7600K is still good enough for a $400+ GPU, which, especially at 1080p, is going to be a stutter fest in a lot of new games, assuming you're trying to get the maximum performance out of the GPU.

At the same time, if they want to pair the same 6600K with a 1660 Super, they're not overspending on GPU performance they're never gonna see, and there isn't anything wrong with that.

Also, if starting a new system, a Ryzen 2600 should be the minimum consideration if you're planning on using a 5700 XT or higher GPU, with the 3600 preferably being the real sweet spot right now for a high-end gaming PC.
 
OK in what sense?

OK as in something to sit on for another 6 months while playing at 60Hz? Maybe... (not that the CPU market is unattractive at the moment)

OK as in buying a 4c/4t CPU? Nah, forget it. Buy a second-hand Zen/Zen+ 6-core, add a cheap 400-series board, and you will get way more for what you put in (still well under $130 if you look in the right places)... Not to mention the second-hand market for Intel stuff is still really bloated price-wise.
 
This pretty much sums it up. Add Far Cry 5 to this, as it behaves very similarly. Also, 8/8 stutters in it.
Lol, since when? This is the entire problem: fanboys who have no clue what they are talking about, making hyperbolic comments simply to justify their part selection in order to mask their own personal ego or inferiority complex. This is a perfect example of someone who clearly has no clue.

Intel 8 core matching AMD 8/16 at 0.1% lows and running circles around it everywhere else.

[Image: intel-i7-9700k-fc5-1080p_1.png - Far Cry 5, 1080p CPU benchmark chart]
 

[Attachment: 1575243875920.png - benchmark chart]


Didn't you look at your own graph? Why don't you put up the frame time plot? Why would the 8C/8T be beaten by a slower, cheaper 6C/12T in the 1% lows??? Could it be... I don't know... stutters!?!?

Or you can edit your post to be misleading, in a game where AMD does poorly, to state that "It matches the AMD 8C/16T in the .1% lows"
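For what it's worth, "1% lows" are just a percentile cut of the frame time log. This hypothetical Python sketch shows one common interpretation (outlets differ on the exact definition, so treat it as illustrative, not as how any particular review computes it):

```python
def percentile_low_fps(frame_times_ms, pct=1.0):
    """Average FPS over the slowest `pct` percent of frames.
    One common reading of '1% lows'; definitions vary by reviewer."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, round(len(worst) * pct / 100))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 99 smooth 10 ms frames plus a single 50 ms stutter frame:
times = [10.0] * 99 + [50.0]
print(f"Average FPS : {1000 * len(times) / sum(times):.1f}")  # -> 96.2
print(f"1% low FPS  : {percentile_low_fps(times, 1.0):.1f}")  # -> 20.0
```

This is exactly why average FPS hides stutter: one bad frame barely moves the average but craters the 1% lows, which is what you actually feel in game.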
 
Lol since when? ... Intel 8 core matching AMD 8/16 at 0.1% lows and running circles around it everywhere else.
The real takeaway from that graph is that the 9700K needs to be overclocked to even have decent 1% lows; stock vs. stock, it loses to an over-a-year-older CPU by 15% and gets destroyed by a 9900K by 23% in that particular game. You also get slightly better frame times from a 2600, a ~$130 CPU, in Far Cry... In most games the 9700K is fine, but it was a mostly pointless CPU over the 8700K, and it doesn't really make sense over a 3600, which typically costs about 46% less, give or take sales.

I mean, anyone who really loves a 9700K, more power to them, but Intel cheaped out with it, just like they've been doing with the i5s for ages. Next year Intel will probably be back to a 6 core/12 thread i5 and an 8 core/16 thread i7, so I'm glad they seem to have learned their lesson.
 
If it has to be Intel, I'd rather have the 8700K...
 
.....comments to simply justify their part selection in order to mask their own personal ego or inferiority. This is the perfect example of someone who clearly has no clue
Do we have a little bit of projection going on here? You had nothing to say about my entire post aside from the part where I called one "good" and the other "great", even though 8/16 actually had nothing to do with the rest of my post and I almost didn't include it. You had to immediately respond about yours not being marked "great".
 
Didn't you look at your own graph? Why don't you put up the frame time plot? ...

lol, you chose the game, not me! My post simply proves you wrong, with the 9700K beating the Ryzen 2700, a CPU with twice the threads. If you have an inferiority complex about your 8700K, that is a you problem, not a problem with the 9700K's results beating it in most games...
 
I wanted to post my own experience. I just downgraded from a Ryzen 5 1600 to an older i5 4670K a couple of days ago (for financial reasons), and all I can say is that it depends on the game. A game can load all the cores, and when they reach 100% it will hitch and stutter; at that moment, even a variable refresh rate monitor can't help.

I only tested Borderlands 3, as it's the only game I play currently. But the hitching/stuttering is momentary, in heavy gunfights. Does it affect gameplay? Slightly. Is it playable? Of course; it isn't as bad as the low framerate of a low end GPU.
 
Do we have a little bit of projection going on here? You had nothing to say about my entire post aside from the part where I called one "good" and the other "great", even though 8/16 actually had nothing to do with the rest of my post and I almost didn't include it. You had to immediately respond about yours not being marked "great".

You were generalizing, and that never works. Frankly, I don't care what you mark; I just showed the obvious flaws in your post, and my response to your post was redundant (I knew the answer already).
 
lol, you chose the game, not me! My post simply proves you wrong, with the 9700K beating the Ryzen 2700, a CPU with twice the threads. If you have an inferiority complex about your 8700K, that is a you problem, not a problem with the 9700K's results beating it in most games...

I never said anything about a 2700 - I just know disabling Hyper-Threading in that game on an 8C CPU makes it stutter noticeably for me. You're the one beating the "my 9700k is the bestest" drum here and then posting graphs showing it being beaten in 1% lows by a 12-thread CPU that's a generation older.


^ You can go to the frame time plots there and see what I mean in action. Add Hitman to the list.
 
I never said anything about a 2700 - I just know disabling Hyper-Threading in that game on an 8C CPU makes it stutter noticeably for me. You're the one beating the "my 9700k is the bestest" drum here and then posting graphs showing it being beaten in 1% lows by a 12-thread CPU that's a generation older.

You chose the game, and you made the statement. Frankly, I find them all great CPUs, including the 2700, but making stuff up just to justify your own individual CPU is the definition of pathetic.
 
You chose the game, and you made the statement. Frankly, I find them all great CPUs, including the 2700, but making stuff up just to justify your own individual CPU is the definition of pathetic.

Quote me - where did I make stuff up, and where did I post anything about my own CPU? (BTW, I have 2 main CPUs, but I'm assuming you're referring to the 8700K)...

but go ahead - install farcry 5 on your system and be treated to:
 
You were generalizing, and that never works. Frankly, I don't care what you mark; I just showed the obvious flaws in your post, and my response to your post was redundant (I knew the answer already).
So..... why argue 8/8 versus 8/16, even though it was just an anecdotal mention having nothing to do with the rest of the post? Also interesting that you immediately went apples-to-oranges (different brands/architectures) in a discussion about core counts alone. Why not do the direct comparison to the 9900K? Can you show me a case where a 9900K performs worse?

Am I to assume that if you were offered a direct trade - your 8/8 9700K for a 9900K - you wouldn't take it?
 