
Intel Challenges AMD to Beat it in "Real World Gaming"

So, innovation means more cores (which already exist in HEDT) at a cheaper price than the competition, and (an assumption) it's just as fast.

I'm glad they caught up, but at a high level, I don't see bringing more cores to a market that doesn't need them as innovation. I call it moving the goalposts and watching the uninformed and ill-advised get giddy. ;)

If anyone from Intel shows you 1080p or 720p benchmarks, you can safely ignore their claims. I think 1080p is starting to be overtaken by 1440p for "real world gaming".
No.

Read the Steam stats. A full 78% are using 720p and 1080p; 4.6% are on 2560x1440.
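Just to put rough numbers on that (a back-of-envelope sketch; only the ~78% and 4.6% figures come from the stats quoted here, the per-bucket split and the 4K share are assumed for illustration):

[CODE]
# Back-of-envelope sketch of the resolution shares discussed above.
# Only the ~78% (720p+1080p combined) and 4.6% (1440p) figures come from
# the post; the 720p/1080p split and the 4K share are assumed.
shares = {
    "1280x720": 1.6,      # assumed split of the ~78% total
    "1920x1080": 76.4,    # assumed split of the ~78% total
    "2560x1440": 4.6,
    "3840x2160": 1.5,     # assumed, illustrative only
}

mainstream = shares["1280x720"] + shares["1920x1080"]
qhd = shares["2560x1440"]
print(f"720p + 1080p: {mainstream:.1f}% of surveyed users")
print(f"1440p: {qhd:.1f}% -- roughly {mainstream / qhd:.0f}x fewer than 720p/1080p")
[/CODE]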
 
If anyone from Intel shows you 1080p or 720p benchmarks, you can safely ignore their claims. I think 1080p is starting to be overtaken by 1440p for "real world gaming".

Hey, haven't you seen the comments? Everyone has a 2080 Ti and plays at 1080p medium for 200 FPS.

In reality, they bought the most expensive Intel CPU to play at 60 FPS on a GTX 1060 lol
 
Sad for Intel that the money is made in the enterprise industry.
Sad for gamers though, we get screwed.
 
If anyone from Intel shows you 1080p or 720p benchmarks, you can safely ignore their claims. I think 1080p is starting to be overtaken by 1440p for "real world gaming".
1080p is bad but showing Cinebench is okay.
 
So, innovation means more cores (which already exist in HEDT) at a cheaper price than the competition, and (an assumption) it's just as fast.

In reality, yes. Innovation is all about coming up with new methods to shift markets. They have come up with a method that allows them to sell CPUs at one half to one third of the price of their competitor. It's innovation in the true sense, not just innovation of features.
 
Nor for the common customer, in most cases. ;)

I mean, more cores for cheaper is a benefit; whether so many cores are needed in that same segment is the biggest concern.
 
Nor for the common customer, in most cases. ;)

I mean, more cores for cheaper is a benefit; whether so many cores are needed in that same segment is the biggest concern.

Hey, I'm all for the stupid getting fleeced. Ultimately, it means more R&D money after all the bonuses are paid out, cocaine runs out, and strippers go home. Those with at least a little common sense or who can take advice will come out just fine.
 
More cores means more & better VMs, and one of those VMs can be configured to run games, why not ;)
 
More cores means more & better VMs, and one of those VMs can be configured to run games, why not ;)

Because a 'gamer' isn't going to shell out what it would take for a VM system that can actually use a GPU.
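For what it's worth, the "can actually use a GPU" part usually means VFIO passthrough, and the first sanity check on a Linux host is whether the GPU sits in its own IOMMU group. A minimal sketch (assuming a Linux host with VT-d/AMD-Vi enabled; these are the standard sysfs paths, nothing vendor-specific):

[CODE]
#!/usr/bin/env python3
# Minimal sketch: list IOMMU groups and the PCI devices in each, as a first
# check of whether a GPU could be isolated for VFIO passthrough to a VM.
# Assumes a Linux host with the IOMMU enabled (intel_iommu=on / amd_iommu=on).
from pathlib import Path

groups_root = Path("/sys/kernel/iommu_groups")
if not groups_root.is_dir():
    raise SystemExit("No IOMMU groups found; enable VT-d/AMD-Vi in firmware and kernel.")

for group in sorted(groups_root.iterdir(), key=lambda p: int(p.name)):
    devices = sorted(dev.name for dev in (group / "devices").iterdir())
    print(f"IOMMU group {group.name}: {', '.join(devices)}")
[/CODE]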
 
Intel will remain King of the 8-core CPUs with the power-hungry, overpriced i9-9900KS.

The 9900KS is a 3800X killer!

The 3900X is a 9900KS killer, but it has 12 cores. That lasts until Q4 2019, when Intel's upcoming 10-core arrives on a new socket with PCIe 4.0 and 400-series boards.

AMD has been playing catch-up to Intel; Coffee Lake/Coffee Lake Refresh pretty much destroyed Zen/Zen+ in all PC gaming benchmarks, but now with Zen 2, is the third time the charm?
 
Can we challenge Intel to make a CPU architecture that isn't full of security vulnerabilities?
 
More cores and more CPU power are mandatory for better AI, simultaneous animations, decals, physics, minimum FPS, etc.
Some of the best examples are strategy games, e.g. Total War, Civilization, etc., where a strong CPU can provide better animations during fighting scenes and improved AI tactics.
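As a toy illustration of why that kind of per-unit work can spread across cores (purely a sketch; the function names and the fake workload are made up and no real engine works exactly like this), independent AI decisions can be farmed out to a process pool:

[CODE]
# Toy sketch: farming independent per-unit AI decisions out to worker
# processes. Purely illustrative; function names and the workload are
# made up, not taken from any actual game engine.
from concurrent.futures import ProcessPoolExecutor
import os

def decide_action(unit_id: int) -> tuple[int, str]:
    # Stand-in for an expensive decision (pathfinding, target scoring, ...).
    score = sum(i * i for i in range(50_000))
    return unit_id, "attack" if (unit_id + score) % 2 else "hold"

def ai_turn(unit_ids):
    # One worker per logical core; scaling flattens once there is less
    # independent work per turn than there are workers.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return dict(pool.map(decide_action, unit_ids))

if __name__ == "__main__":
    actions = ai_turn(range(200))
    print(f"{len(actions)} units resolved, e.g. unit 0 -> {actions[0]}")
[/CODE]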
 
PS: That shady move by Intel to drop most MDS patches just the day after Ryzen 3xxx is released is so damn obnoxious, it's despicable …

I've been running an MDS-patched microcode since like 2 weeks ago... what are you trying to say here?
 
Ah, so it's actually worse than thought; they had the patches available and chose not to use them.
 
More cores and more CPU power are mandatory for better AI, simultaneous animations, decals, physics, minimum FPS, etc.
Some of the best examples are strategy games, e.g. Total War, Civilization, etc., where a strong CPU can provide better animations during fighting scenes and improved AI tactics.
True, but these titles don't respond past 6c/6t. Anything over 8c/16t is overkill unless the user has a specific use case. I'd still buy nothing more than 8c/16t today for a CPU I expect to last 5 years. Maybe by then most software will be able to use more...
 
I care about who's the King of the Hill very much, because I have a liquid nitrogen chamber running at -190C with my system in it, and a cooling system using a loop of Bose-Einstein condensate to cool the CPU, GPU, sound card, floppy disk and RAM. Everything is overclocked to a value I'm not going to talk about, but in 8K I have >500 FPS in everything. I notice even the smallest FPS drops, because I see far more than the typical 25 FPS the human eye can. So, welcome to real gaming, let's see who's the only rightful King!

That system aside, Steam is a pretty representative sample of 'real gaming', and I have another rig which is within the Steam range... Whatever the difference is above 100 FPS doesn't concern me at all, because I have a 60 Hz display. I have my budget, and the only thing I'm actually interested in is having a satisfying experience in the relatively undemanding games that I play. I'm obviously outside 'real gaming', and so are 80% of the so-called 'players' on Steam.

To tell the truth, I've always bought AMD CPUs and hated the near-monopoly that Intel held. Same with NVIDIA. This call-out interests me only if it's in my price range and somehow proves that Intel has an offer that really makes a noticeable difference... For my System 2, that is. For System 1, to be honest, actual silicon area is of more importance, since everything is delidded and cooled directly by the condensate... Gaming near superconductivity rules, and it's the only thing worth living for...
 
I care about who's the King of the Hill very much, because I have a liquid nitrogen chamber running at -190C with my system in it, and a cooling system using a loop of Bose-Einstein condensate to cool the CPU, GPU, sound card, floppy disk and RAM. Everything is overclocked to a value I'm not going to talk about, but in 8K I have >500 FPS in everything. I notice even the smallest FPS drops, because I see far more than the typical 25 FPS the human eye can. So, welcome to real gaming, let's see who's the only rightful King!

That system aside, Steam is a pretty representative sample of 'real gaming', and I have another rig which is within the Steam range... Whatever the difference is above 100 FPS doesn't concern me at all, because I have a 60 Hz display. I have my budget, and the only thing I'm actually interested in is having a satisfying experience in the relatively undemanding games that I play. I'm obviously outside 'real gaming', and so are 80% of the so-called 'players' on Steam.

To tell the truth, I've always bought AMD CPUs and hated the near-monopoly that Intel held. Same with NVIDIA. This call-out interests me only if it's in my price range and somehow proves that Intel has an offer that really makes a noticeable difference... For my System 2, that is. For System 1, to be honest, actual silicon area is of more importance, since everything is delidded and cooled directly by the condensate... Gaming near superconductivity rules, and it's the only thing worth living for...

Peer edited 'cos I can, and I just stopped laugh.........ing :roll:o_O
 
I've been running an MDS-patched microcode since like 2 weeks ago
Gigabyte still hasn't released new UEFI versions for their Z370 boards. :(
 
Gigabyte still hasn't released new UEFI versions for their Z370 boards. :(

More on Gigabyte than Intel. But yeah, Gigabyte sucks for quick updates.
 
More on Gigabyte than Intel. But yeah, Gigabyte sucks for quick updates.
I'm still on UEFI version F14. Microsoft hasn't released new microcode updates for Windows 10 v1903 to be loaded at boot time either. Double :(

Wait, I found the microcode update package from Microsoft. Unfortunately they really don't make it easy to find. For those that want it, it's located here. Stupid Microsoft didn't list it on their microcode summary page as they should have. :mad:
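If anyone wants to check which microcode revision their Windows install is actually running (i.e. whether the UEFI-supplied revision or an OS-loaded update is in effect), here's a minimal sketch, assuming Windows and the usual registry location/value names it reports this under:

[CODE]
# Minimal sketch: read the microcode revision Windows reports for CPU 0.
# Assumes Windows and the usual registry location/value names; the revision
# values are REG_BINARY, so they're shown here as hex.
import winreg

KEY = r"HARDWARE\DESCRIPTION\System\CentralProcessor\0"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
    name, _ = winreg.QueryValueEx(k, "ProcessorNameString")
    running, _ = winreg.QueryValueEx(k, "Update Revision")            # revision in effect now
    firmware, _ = winreg.QueryValueEx(k, "Previous Update Revision")  # revision supplied by the UEFI

print(name)
print("Running microcode revision :", running.hex())
print("UEFI-supplied revision     :", firmware.hex())
[/CODE]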
 
True, but these titles don't respond past 6c/6t. Anything over 8c/16t is overkill unless the user has a specific use case. I'd still buy nothing more than 8c/16t today for a CPU I expect to last 5 years. Maybe by then most software will be able to use more...

That is not true. If it were, the 2950X would not be the fastest CPU that AMD has for Total War.
 
That is not true. If it were, the 2950X would not be the fastest CPU that AMD has for Total War.
There is always an exception (and I mentioned specific use cases already...)... but clearly that is not the rule. The rule will hold for a long while even as the exception list gets longer. There is seriously little reason to think more than that, outside of an outlier, will be needed even in 5 years.

Also... it looks like scaling takes a sharp dive going from 6 to 8 cores... https://www.overclock3d.net/reviews/software/total_war_three_kingdoms_pc_performance_review/7
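That flattening is roughly what Amdahl's law predicts once the serial part of each frame dominates; a quick sketch (the 60% parallel fraction is an assumed number, purely for illustration, not measured from Total War or any other title):

[CODE]
# Quick Amdahl's-law sketch of why core scaling flattens. The 0.6 parallel
# fraction is an assumed, purely illustrative number.
def speedup(cores: int, parallel_fraction: float) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 6, 8, 12, 16):
    print(f"{cores:2d} cores: {speedup(cores, 0.6):.2f}x over 1 core")
[/CODE]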

Can you kindly link the test that shows a 2950X in it? Many thanks! :)
 
There is always an exception... but clearly that is not the rule. The rule will hold for a long while even as the exception list gets longer. There is seriously little reason to think more than that, outside of an outlier, will be needed even in 5 years.

Agreed that it was not the rule until AMD started AM4, and since Sony and MS will be using 8-core/16-thread Ryzen 2 CPUs in their consoles, games will use more cores. Though there are not a ton of examples from previous games, the fact that CPUs with more than 4 cores are now commonplace means that games will be optimized to use every core in your machine. Just 5 years ago the argument was that you did not need more than 1 core for gaming.
 
Agreed that it was not the rule until AMD started AM4, and since Sony and MS will be using 8-core/16-thread Ryzen 2 CPUs in their consoles, games will use more cores. Though there are not a ton of examples from previous games, the fact that CPUs with more than 4 cores are now commonplace means that games will be optimized to use every core in your machine. Just 5 years ago the argument was that you did not need more than 1 core for gaming.
It still isn't the rule regardless of what AMD has now and what Sony will bring to the table (in mid-2020+).

Things will change... but people's timetables are off (IMO). Again, if buying today for a 5-year cycle, I wouldn't get more than 8c/16t... unless you can use the threads now.

5 years ago it was a dual-core or a dual with HT... I wouldn't call more than 4 CPU cores commonplace today either. You'll note that according to the Steam stats, a full 80% of users are on dual- or quad-core CPUs; 13% are hex-core, 2% octa-core. Surely we will see things change over the next few years, but I wouldn't hold my breath, as an average user, that more than 8c/16t will be useful for the majority. Again, AMD and Intel have had hex-cores out for almost 8 years already... it's going to take more time than most people think, regardless of whether it is accelerated now.

You got that link with the 2950X and Total War?
 