
yet another game that fails to understand CPU requirements & performance

yet another OP that fails to understand what he's talking about
 
Why is he testing with a GTX 980 Ti and not with the GTX 670/GTX 970 or R9 280/290 range mentioned in the specs? Of course things look like this when the graphics card gives the CPU unlimited breathing room... It's different when parts of the system start to struggle...
 
The reality is, we have a distorted view of the playable range. I've seen people play a game on a subpar laptop at 17 fps and beat it. So game developers list minimum specs at what appears to us as unplayable, while we like to play in the 100+ fps range. It's always been this way. Get the best hardware you can afford; new games will come and go and requirements will keep getting tougher.

Years ago, I wrote a game that was largely "twitch"-combat based, in old-school DirectDraw. No one could beat it. Along comes my best friend with his 486DX Compaq Contura notebook running at all of 75 MHz. "Watch this," he says. He beats the game in a little under an hour, at what could generously be considered 1 FPS. It was boring, but he sure beat that game. It wasn't very "twitch" based when all the combat was being played at the rate of a board game.
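In hindsight, the culprit was presumably the classic frame-locked loop: the simulation advanced one step per rendered frame instead of by real elapsed time. A minimal Python sketch of the two styles (the update/render names here are hypothetical stand-ins, not from the original DirectDraw code):

import time

# Frame-locked: one fixed simulation step per rendered frame.
# At 1 FPS the world runs ~60x slower than intended, so "twitch"
# combat degrades into a turn-based board game.
def run_frame_locked(update, render):
    while True:
        update(dt=1.0 / 60)  # assumes 60 FPS that may never arrive
        render()

# Delta-time: the simulation advances by real elapsed time, so the
# game plays at the same speed no matter how slowly frames render.
def run_delta_time(update, render):
    last = time.perf_counter()
    while True:
        now = time.perf_counter()
        update(dt=now - last)  # scale movement/AI by elapsed seconds
        last = now
        render()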
 
Sorry, what am I missing?

The game is more than playable on CPUs far below the minimum requirement. So either Intel has paid them off, or they're just clueless about the hardware requirements of their own game.

While the listed requirements aren't very accurate, they do give an idea of what's required.

The rule of thumb is to go for the most powerful system you can afford, and that includes the CPU. One can't have too much power when it comes to games.

While dirtyferret has been somewhat enigmatic and contrarian in his contributions, I'm inclined to agree with his assessment. The most powerful CPU that you can afford is almost certainly wasteful, unless you're fairly poor. Many people could afford a ten-core i7, and for most of them it would be wasteful. Do not assume that people using older or weaker CPUs do so because they are poor or incompetent - on the contrary, they may simply be sensible.
 
Just give it up now. :rolleyes:

I see that even a warning from a moderator isn't enough to make you understand.
For the record, the warning was to you, not Qubit; just wanted to clear that misconception up, considering you wrote it in your report.

I vote both of you take a breather, stop being asshats and resorting to down-the-nose comments, and consider that we're talking about computer games here. Nobody is above anyone else here.
 
The 2500k is still such a beast. Which CPU is its successor in the Intel line-up?
 
The 3570k, 4670k, 6600k, all the unlocked i5s.
 
My personal conclusion...
The Xeon E3-1230 v2 that I bought back in 2012 was damn well worth its €200.
 
The 2500k is still such a beast.
Agreed. I've got the 2600K, which is very similar in gaming performance, and it still plays all the latest games very well. It's only starting to show its age a bit now, in that it can't hit the highest frame rates of 120 fps+ in some modern games, but it's still well over 60 fps.
 
The only game that made my PII 980 struggle (besides heavy console emulators) is Project CARS with over 30 bots in a race, where it tops out at 50 fps.
CPU performance is a little stagnant nowadays. Maybe that's why developers recommend such over-the-top CPUs as the minimum, just to help Intel and AMD sell them?
 
The 3570k, 4670k, 6600k, all the unlocked i5s.
You left out the 2550K and 4690K (not really, since they're included in the unlocked category), and the plethora of locked i5s that beat it stock for stock, such as the 5675C, which would pulverize even a decently OCed 2500K.
 
Apparently Doom runs very smoothly on a 4670 + Sapphire 290 maxed out at 1080p, or so says my eldest son.
But that's good in my book!
Because I don't want to be forced to upgrade every two or three years just to play the latest in all its glory.

But I understand the game devs. If they listed a low-end budget card as recommended, everybody would say the graphics must stink, or something like that.
So I don't see a fail, just some misinformation.
 
I checked:
Doom 2016 can run on a toaster (if paired with a "good enough" GPU ;)):
 
P4 NetBurst, hahaha, epic. BTW, is the tool on the right side with the CPU/GPU info from Afterburner?
 
P4 NetBurst, hahaha, epic. BTW, is the tool on the right side with the CPU/GPU info from Afterburner?
Nope ;)
RivaTuner/Afterburner is on the left (off-center); on the right you can see the built-in statistics tool for Doom 2016 :)
 
The OP would have been better if an FX-4*** + GTX 670 had been added as well.
 
Years ago, I wrote a game that was largely "twitch"-combat based, in old-school DirectDraw. No one could beat it. Along comes my best friend with his 486DX Compaq Contura notebook running at all of 75 MHz. "Watch this," he says. He beats the game in a little under an hour, at what could generously be considered 1 FPS. It was boring, but he sure beat that game. It wasn't very "twitch" based when all the combat was being played at the rate of a board game.

Back in 2004 I was still playing on el cheapo laptops connected to an HDTV, so that was 15 fps on a 50-70 ms delay panel. One adapts :) I didn't play that much worse than I do now, and I tend to be competitive.

From time to time, I put a 20 fps frame cap on random games just for nostalgia :D
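If anyone else wants to try that without driver tools, a crude cap is just a sleep at the end of the render loop. A rough Python sketch; render() is a hypothetical stand-in for one frame's worth of work:

import time

TARGET_FPS = 20
FRAME_BUDGET = 1.0 / TARGET_FPS  # 50 ms per frame at 20 fps

def capped_loop(render):
    while True:
        start = time.perf_counter()
        render()  # do one frame's worth of work
        # Sleep off whatever remains of the frame budget.
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)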
 
What the hell are you guys talking about? Doom ran great on my 180 MHz Pentium Pro, and it only had 2 MB of VRAM. lol
 
Dang, look at the 2500K... five years old and still only a few frames off the leaders. Who knew you'd basically be set for life with such an economical CPU back in the day? I get that the majority saw it as a steal back then, but even they have to be surprised that five years later it hasn't suffered at all, especially given that five years before that we had Athlon 64 X2s and Cedar Mill-based Pentium 4s (Conroe didn't release until July of that year). Those weren't exactly keeping up in 2011, yet in 2016 the 2500K does just fine.
 
Dang, look at the 2500K... five years old and still only a few frames off the leaders. Who knew you'd basically be set for life with such an economical CPU back in the day? I get that the majority saw it as a steal back then, but even they have to be surprised that five years later it hasn't suffered at all, especially given that five years before that we had Athlon 64 X2s and Cedar Mill-based Pentium 4s (Conroe didn't release until July of that year). Those weren't exactly keeping up in 2011, yet in 2016 the 2500K does just fine.

With so few advances in game CPU requirements, if gaming is one of the major uses of someone's rig, the only limiting factor will be how long it takes for the CPU, motherboard, or RAM to break down.

At that point, economics would likely dictate finally upgrading to the latest.
 
Dang, look at the 2500K... five years old and still only a few frames off the leaders. Who knew you'd basically be set for life with such an economical CPU back in the day? I get that the majority saw it as a steal back then, but even they have to be surprised that five years later it hasn't suffered at all, especially given that five years before that we had Athlon 64 X2s and Cedar Mill-based Pentium 4s (Conroe didn't release until July of that year). Those weren't exactly keeping up in 2011, yet in 2016 the 2500K does just fine.

Yeah, it all comes down to the incredible performance jump Intel made with Sandy Bridge. One arch to rule them all, to be honest. I think in hindsight, 20-30 years from now, we will still be reminiscing about that architecture as one of the greater leaps in CPU land. Intel has been feeding off it ever since, because let's face it, under the hood all they've done from that point is move different CPU parts between the chipset and the die and keep everything up to date (PCIe 3.0, SATA, M.2, etc.).

With that in mind, it makes sense for the Zen release to aim for 'Sandy Bridge IPC levels', because really, that's where the holy grail is. Once they've reached that point, they can eclipse Intel's meagre performance gains over the years with one or two architectural refinements.

In hindsight, that will probably also explain why Intel CPUs have progressed so little over the past five years. I am almost convinced that with current CPU tasks there is little to gain in terms of IPC or efficiency - contrary to the 'no competition = no performance gain' line many people like to take.

Only when a new kind of CPU task comes to light that really tanks performance will there be a new incentive to make CPUs a lot faster. Let's face it, the only real need for more performance is whatever has a good business case behind it. If the vast majority doesn't need more performance, the market won't deliver it. That is why CPUs haven't progressed much: they are capable at all the tasks we can throw at them.
 
I know it's logical to line up and bench CPUs by FPS. This is just my thought, but how about also measuring loading time, where the CPU calculates the geometry of the world? Or monitoring utilization and comparing each CPU's load (averaged across cores)? Wouldn't that supplement the benchmark, beyond just the time where the GPU dominates the work? Just a thought.
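Something like this could capture the utilization side of it: a rough Python sketch using psutil to sample per-core load during a play session and average it (the 30-second duration and 1-second interval are arbitrary choices):

import psutil  # pip install psutil

def average_core_load(duration_s=30, interval_s=1.0):
    """Sample per-core CPU utilization; return per-core and overall averages."""
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # percpu=True returns one utilization percentage per logical core
        samples.append(psutil.cpu_percent(interval=interval_s, percpu=True))
    per_core = [sum(core) / len(core) for core in zip(*samples)]
    overall = sum(per_core) / len(per_core)
    return per_core, overall

per_core, overall = average_core_load()
print("per-core average %:", [round(c, 1) for c in per_core])
print("overall average %:", round(overall, 1))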
 
Wait, what? Why would you read them like that?

What I don't believe is that the game just takes up 6 GB.

Dunno how he reads it, but AMD kicks ass in that chart: a <$170 CPU vs. a $300+ CPU.
 