
Why are some people still saying a 4-thread i5 is good enough with a beefy GPU...

Status
Not open for further replies.
I'm just going to say my girlfriend runs a 4690K and a 1070 with no issues in any game at all. I'll still have quads running for the next few years, since 99% of my use cases, and of the people I build rigs for, don't need anything more.
 
Apparently they don't care if their frames tank in modern games.

[attached screenshots: 138067, 138068, 138069]


This is the biggest reason. As much as I love TechPowerUp's reviews, they really need to incorporate 1% lows, because anyone is going to feel massive drops in frame rate.
At the same time, if a person wants to pair a 2070 with a 6600K now and upgrade the CPU later if it isn't up to par, I see no issue with it.
The quad CPUs tank when paired with a 2080 Ti and a monitor capable of 100 fps+. Figure that most people with a quad are using something like a GTX 1060 or an AMD 580 at 1080p/60 Hz, probably with settings at very high rather than ultra, so the FPS drops are not as severe.
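The 1% low stat asked for above is easy to illustrate. A minimal sketch with hypothetical frame-time numbers (not TPU's actual methodology): the average FPS can look fine while the slowest 1% of frames expose the stutter a player actually feels.

```python
# Sketch: how a "1% low" FPS figure is typically derived from per-frame
# render times. Frame-time data here is made up; tools like MSI Afterburner
# log the real thing during a benchmark run.

def one_percent_low(frame_times_ms):
    """Average FPS over the slowest 1% of frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # the worst 1% of frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 99 smooth frames at ~16.7 ms (~60 fps) plus a single 100 ms stutter:
times = [16.7] * 99 + [100.0]
avg_fps = 1000.0 / (sum(times) / len(times))
print(round(avg_fps, 1))                  # 57.0 -- the average looks fine
print(round(one_percent_low(times), 1))   # 10.0 -- the 1% low shows the hitch
```

That is why an average-FPS bar chart alone can make a quad look fine while the actual experience is full of hitches.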

I think enthusiast forums lose track of how the average gamer operates; for them the joy is in playing the game, and they don't upgrade until games don't run well at all. Buy a $1000 system, play it for 8 years, repeat. Sometimes these guys will buy a high-end GPU midway through.
Personally I am on something of a 4-5 year upgrade cycle.
You can go a long way before it barely runs. I know a guy running a Sandy Bridge i7 laptop with Intel integrated graphics playing GTA V; the same guy is hoping to run RDR2 on it. Maybe it's possible on the lowest settings, who knows.
A long time ago, when WoW first launched, I raided with a player who used Intel integrated graphics. His fps would dip down to 10 at times. For most people that's an awful gaming experience, but he was happy as a clam that he could play the game and not have to spend money on a dedicated graphics card...
 
While it may take a small hit in some games, 4 cores is still the standard, even with a 2080.
 
There are people pairing it with a beefy GPU; you don't buy such a GPU for 40-50 fps.
There is something that has been stuck in my mind for years, if not decades: people buying the best CPU, the best GPU, the best motherboard and the best RAM, only to buy last the worst, but cheapest, PSU. Another example is a cousin of mine. About 12 years ago he wanted a PC just so he could use a PCI satellite card to watch full HD content. I created a list of parts and added a simple HD 2400 graphics card. It wasn't meant to play games or anything, not even run 3D screensavers. The HD 2400 ended up not good enough for the task. I told him to go back to the shop and ask to replace it with an HD 2600. He bought an HD 3870, the most expensive AMD card at the time.

Most people don't know about hardware, or don't wish to invest the time to learn, so they aren't going to put much thought into their next PC. "What did my friend say about a gaming PC? That I need a strong graphics card. So I go out and buy the most expensive graphics card and then, with whatever is left, the rest of the hardware. If that means a cheap PSU, a cheap motherboard, the minimum amount of RAM, a hard disk and a 4-core CPU, then a 4-core CPU it is."

Also, most people don't care about frame rate. If it looks smooth, or even semi-smooth, it's more than enough. In most games, anything over 15 fps (yes, 15) will be OK in the eyes of many, and 30 fps will look ultra smooth. Is that 4-core CPU cutting the frame rate in half? Who cares. It's smooth. The end.
 
IMO it's not good enough anymore for gaming, especially for the latest AAA titles, and even some from years ago.
High fps AAA gaming? You can just forget about that with a 4-core i5.
Probably because the majority don’t game higher than 60fps. High fps is a minority. Heck, many people like me see 60fps as absolutely perfect with all details cranked up.

Now, will a 4 core still support that? Currently, yes, most of the time @1080p. A couple years from now? Probably not.
 
It's only enough if you use programs that don't use more than 4 threads.
 
R5 3600
RX580
1920x1200 ultra details
...and 3+hours of FarCry5 (2018)

(screenshot columns: current / min / max / avg)
FarCry5_3+hours_00.png

See the max and avg CPU usage


Clearly the GPU is the bottleneck here, as it's maxed out in almost every aspect.

FarCry5_3+hours_02.png

Avg FPS = 74
1% low = 54
by MSI afterburner

Seeing only ~30% usage out of a 12-thread CPU (12 × 0.3 = 3.6 threads), one might assume with confidence that a 4c/4t CPU is all that's needed to play this game smoothly.
But this is not the case...
Look at the whole truth below, this time about max and avg per-thread usage.

FarCry5_3+hours_01.png
8 threads in use on average, with peaks hitting all 12...

I'm not saying that a 4c/4t CPU is useless today... not by any means.
But it will suffer more or less depending on the game, and this will become more and more noticeable with modern AAA games, if it isn't already.
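The arithmetic above can be sketched in a few lines. These per-thread numbers are hypothetical (loosely modeled on the screenshots, not taken from them), but they show how an "overall" usage gauge averages a pegged main thread away:

```python
# Sketch: why total CPU % can hide a main-thread bottleneck.
# Hypothetical per-thread load on a 12-thread CPU (assumed numbers).

per_thread = [100, 85, 70, 55, 30, 20, 10, 5, 3, 2, 0, 0]  # % per thread

total = sum(per_thread) / len(per_thread)        # what the overall gauge shows
busy_threads = sum(u > 10 for u in per_thread)   # threads doing real work

print(round(total, 1))    # 31.7 -- "looks like" only ~3.8 threads needed
print(busy_threads)       # 6 threads are meaningfully loaded
print(max(per_thread))    # and one thread is pegged at 100%
```

So the same run reads as "30% CPU usage" on the summary gauge while one thread is already saturated, which is exactly where a 4c/4t chip starts to stutter.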
 
I'm not saying that a 4c/4t CPU is useless today... not by any means.

Well, it's not useless if the program you want to use is made for 4 cores. But take Resident Evil 2 Remake: minimum 6 cores; 4 cores at full load give 30 fps, 6 cores give 60 fps. As always, aim for midrange, never low end, and you should be okay; midrange as it stands is 6 cores / 12 threads. Just follow the midrange trend and 90% of people will be okay.
 
P4-630 said:
Why are some people still saying a 4-thread i5 is good enough with a beefy GPU...
Because for them, it is good enough. Because most people are not obsessed with bragging rights or with achieving the most FPS possible.

Instead, many gamers only care about the "escape" - being entertained with the game play.

With that in mind and most significantly IMO, game developers know most people don't have the budgets to spend several $100 on the graphics card alone, plus several $100 more on the CPU, $100s more on RAM, motherboards, PSUs, etc. Many gamers have just $500 (or less!) to spend on their whole computer. In some cases, that budget also includes the monitor, keyboard and mouse too. So developers code their games to provide good "game play" on lesser systems, thus increasing the entertainment value to the masses, instead of just the hardware enthusiasts with deeper pockets.

Therefore, a 4-thread i5 (and the AMD equivalent) is good enough for most people who use their computers to game too.
 
You're missing the point, "beefy GPU": the discussion is about a bottleneck holding back FPS with a relatively weak CPU.
 
I never have problems with my other machine's Core i5-3570K. It works like a charm in most games.
 
You're missing the point, "beefy GPU": the discussion is about a bottleneck holding back FPS with a relatively weak CPU.

Four cores is neither weak nor a bottleneck by any definition of the word.
Shit, I remember just a few years ago my OC'd G3258 was benchmarking the same as my i7.
 
4/4 definitely is. In the worst-case scenario a 2070 shows 40% load and 40 FPS at 1080p ultra/max settings; the best case is 75% CPU, 100% GPU. A 10-core is strongly recommended.
 
You're missing the point, "beefy GPU": the discussion is about a bottleneck holding back FPS with a relatively weak CPU.
No I didn't. You missed my point. The question was, "is a 4-thread i5 good enough?" My answer was clear - for many, if not most people, the answer is "yes". It is "good enough".

Just because a GPU outclasses the CPU, that does NOT suggest the gaming experience comes to a complete halt! In fact, in terms of the perception, it does not even automatically suggest the player will notice anything at all!

Contrary to what some here seem to think (or worse, what they want everyone else to think :() , bottlenecks don't always impact perceived performance. It is simply wrong to suggest a less capable component will always slow overall performance all the time, in every scenario, in every game.

If a 3-lane road feeds into a 2-lane tunnel, that does NOT automatically mean ALL traffic will slow down all the time. If the 3-lane traffic is not jammed up bumper to bumper and is moving smoothly, it can just easily merge into 2-lanes and keep moving at full speed.

If a 2-lane road feeds into a 3-lane tunnel, that does NOT mean the tunnel is trying to suck up the vehicles faster than the road can deliver them. It just patiently waits.

Are there scenarios where a bottleneck will impact performance? Of course! No one is denying that. But even in very demanding games, that is not happening at every point in the game. Again, the developers want to include as many potential buyers as possible - they don't want to exclude any buyers.

FTR, I would MUCH RATHER have a lesser CPU feeding a "beefy" graphics card than a beefy CPU trying to feed an entry-level graphics card. Or worse, a beefy CPU and a beefy graphics card and only a tiny amount of RAM.
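The tunnel analogy can be put as a rough throughput model: the delivered frame rate is approximately the minimum of what the CPU and GPU can each sustain, and the bottleneck only becomes noticeable when that minimum drops below the monitor's refresh rate. All figures below are made up for illustration.

```python
# Rough model of the road/tunnel analogy: frames pass through the CPU
# stage, then the GPU stage, so the slower stage sets the delivered rate.
# All fps caps here are hypothetical.

def delivered_fps(cpu_fps_cap, gpu_fps_cap):
    """The slower pipeline stage limits the frame rate."""
    return min(cpu_fps_cap, gpu_fps_cap)

def bottleneck_noticeable(cpu_fps_cap, gpu_fps_cap, refresh_hz=60):
    """It only matters if the limit falls below the monitor's refresh rate."""
    return delivered_fps(cpu_fps_cap, gpu_fps_cap) < refresh_hz

# A lesser CPU (90 fps cap) feeding a beefy GPU (200 fps cap):
print(delivered_fps(90, 200))                          # 90
print(bottleneck_noticeable(90, 200))                  # False on 60 Hz
# The same pair on a 144 Hz monitor, where the CPU cap starts to show:
print(bottleneck_noticeable(90, 200, refresh_hz=144))  # True
```

Which matches the argument: the "3-lane road into a 2-lane tunnel" only jams once traffic actually exceeds what the tunnel can carry.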
 
I don't have any problems with my i7 with hyper-threading off. It actually feels a bit quicker, tbh. My PCs, both of them, still kick the shit out of my PS4 and Switch in every possible way. And they're quieter than the PS4 as well.. though the Switch is pretty quiet.. :D
 
No I didn't. You missed my point. The question was, "is a 4-thread i5 good enough?" My answer was clear - for many, if not most people, the answer is "yes". It is "good enough".

Just because a GPU outclasses the CPU, that does NOT suggest the gaming experience comes to a complete halt! In fact, in terms of the perception, it does not even automatically suggest the player will notice anything at all!

Contrary to what some here seem to think (or worse, what they want everyone else to think :() , bottlenecks don't always impact perceived performance. It is simply wrong to suggest a less capable component will always slow overall performance all the time, in every scenario, in every game.

If a 3-lane road feeds into a 2-lane tunnel, that does NOT automatically mean ALL traffic will slow down all the time. If the 3-lane traffic is not jammed up bumper to bumper and is moving smoothly, it can just easily merge into 2-lanes and keep moving at full speed.

If a 2-lane road feeds into a 3-lane tunnel, that does NOT mean the tunnel is trying to suck up the vehicles faster than the road can deliver them. It just patiently waits.

Are there scenarios where a bottleneck will impact performance? Of course! No one is denying that. But even in very demanding games, that is not happening at every point in the game. Again, the developers want to include as many potential buyers as possible - they don't want to exclude any buyers.

FTR, I would MUCH RATHER have a lesser CPU feeding a "beefy" graphics card than a beefy CPU trying to feed an entry-level graphics card. Or worse, a beefy CPU and a beefy graphics card and only a tiny amount of RAM.
I argued this when I was using my FX-8350, for four years, so obviously I heartily agree.

The buyer of extreme GPUs who underspends (to some) on the CPU is just spending a bit too much money, but I would say that I personally expect four cores to start becoming less usable within a few years. Not now, though.
 
4/4 definitely is. In the worst-case scenario a 2070 shows 40% load and 40 FPS at 1080p ultra/max settings; the best case is 75% CPU, 100% GPU. A 10-core is strongly recommended.
Not everyone is gonna go out and get a 10-core or 10-thread chip. If they can afford only a quad, then that's what they get. They can lower settings for frames if they want, and that's that. I know plenty of people who go budget for their games, since it's all they need.
 
Because the majority of people still game with GPUs slower than a GTX 1070, game on a 1080p/60 monitor, and don't care if they are getting 40-50 fps.
To add to this
1. Most games are using modified engines that are 3-4 years old
and
2. A quad-core i5 is still like 3-4x faster than the consoles' CPU...

The CPU has been the most overrated component this gen. This is another case of an enthusiast making a thread asking why non-enthusiasts use it for non-enthusiast purposes... (lol)
 
I argued this when I was using my FX-8350, for four years, so obviously I heartily agree.

The buyer of extreme GPUs who underspends (to some) on the CPU is just spending a bit too much money, but I would say that I personally expect four cores to start becoming less usable within a few years. Not now, though.

Already are
 
4/4 definitely is. In the worst-case scenario a 2070 shows 40% load and 40 FPS at 1080p ultra/max settings; the best case is 75% CPU, 100% GPU. A 10-core is strongly recommended.
10-core? Anyone who recommended you a ten-core CPU has no clue what they are talking about.

To add to this
1. Most games are using modified engines that are 3-4 years old
and
2. A quad-core i5 is still like 3-4x faster than the consoles' CPU...

The CPU has been the most overrated component this gen. This is another case of an enthusiast making a thread asking why non-enthusiasts use it for non-enthusiast purposes... (lol)
It will be next gen too, as it is every generation. Console CPUs pull 18 W, compared to desktop CPUs that pull 100-150 W while gaming. Even with an SoC, consoles only have 175 W to play around with, and the graphics need the majority of that power, as they should.
 
Nothing wrong with a quad i5, as long as the game doesn't scale past the 4 cores.

I mean we are talking about good enough, not OMG overkill here.

So.... Who's got the list of games that scale past 4 cores? This would be a good enough thread to post it in!!
 
Most games are far from the most core-demanding applications, so older CPUs are fine; until the next-generation consoles launch with updated game engines, anyway. Even then, engines like UE4 aren't too demanding. Outside of massively multicore-aware game engines like EA's Frostbite and Stardock's Nitrous, it's not that advanced yet on the CPU side.
 
Seems like quads are still okay. Like, if you were scrounging together a "just get it working" machine or maybe you're sitting on a quad core machine that you don't want to upgrade, you still have time.

But somehow I really doubt that will hold so true a couple of years from now. To put a quad core in a new gaming machine that you wanna get some years out of just seems questionable to me. It's definitely fine for now. I understand if you're not an enthusiast or whatever and you're looking to spend the minimum you can on a budget gaming setup. But even so, it's going to cost you some good money. The savings difference, to me, isn't enough to offset the fact that you are buying into something that's kind of on its way out. You're jumping on a part that's already starting to lag behind the rest and will only be less suitable as time goes by, and as things are already, it's not going to offer consistent performance from game to game. I don't buy the "It's about the games you play." I just think if you're going to build a gaming rig, you want it to be able to run ALL games well. And besides, how are you gonna know what games you might want to play next year? What if a CPU-heavy AAA title comes out that you really want to play, but the experience suffers as it brings your "good enough" quad-core to its knees? Doesn't matter that you don't usually play those kinds of games. You never know. It's a sad day when you're excited to play this new game, you buy it, fire it up, and are greeted with terrible stutter no matter what you do with your settings.

That's the other thing I find generally true with CPU requirements for games. You can't always do much to offset them by changing graphical settings, as most of that stuff is now handled by the GPU. You can't always ride it out with settings tweaks the way you often can to keep an old GPU going with new games; the CPU demands are more static. Unless you have a setting that lowers polygon counts, AI actors, or whatever, your CPU is up to what it is up to. CPU bottlenecks are nasty business. If the game needs the threads, it has to have them.

Enthusiast or not, a gaming rig is an extravagant purchase. Hell, I'd argue that if you're spending money on a gaming PC instead of a console, you're already an enthusiast. You buy the PC instead of the console because you want to take things to the next level. Otherwise, you wouldn't even be considering it. You don't need any of it. So my mindset is that if you're gonna spend the money, get as much performance as you can for the money. Just spend the cash. Save up if you gotta. Better to have a machine that's sometimes a little overkill, but always capable, than one that in a couple of years is no longer cutting it. Now, if you want to upgrade, you're probably out more money than if you had just paid up for a little more than the bare-minimum standard. I don't see how that could be worth it.

I guess that's what it comes down to for me. Future-proofing is one thing... you can't win that game. But you can make some reasonable assumptions about the future value and viability of a part. My instinct with PC building is to never buy the bare minimum, because it doesn't take an expert to tell you that the bare minimum today is becoming obsolete tomorrow. If it's juuuust good enough today that's great, but where does it go from there? There's going overboard with insane parts, and then there's building a machine from tide-overs. If you like playing games and that's something you're gonna be doing for the foreseeable future, you don't want the tide-over.

I can't see myself ever recommending a cheap quad to someone looking to buy a high-end GPU. I guess if you wanna go cheap all around, that's fine. I still wouldn't recommend it, but you could pair a CPU like that with an entry-level or midrange card and that just makes sense. But if you're going to toss almost $1000 at a GPU and then ~$100 at a CPU, I really don't understand you at all. I understand money is what it is, and maybe that's all you can swing. But if you can save that kind of cash for a GPU and everything else on the rig, why wouldn't you save another hundred bucks and get a 6 or even an 8 core? It's not like you have to go way out there to have a 6-core CPU these days. People are talking about it like the quads are such a better deal, but a hexacore isn't exactly high-end. Good, modern ones can be had for under $200. It'd be one thing if they were really expensive, but they're midrange, you know? $200 actually buys you a solid 8-core! If you can buy the top-tier GPU, why skimp on the CPU?

I think if you have a quad and it's working out for you, no reason to feel like you need to upgrade. Clearly you don't. But if we're talking about a new machine, 6c/12t is a nice place to be for a good, consistent gaming and desktop experience that is much more likely to hold up long term. It's not like before where all we could say about high core counts was "they're going to be used more one day!" That day is already here, from what I can see.
 