
Intel Core i5-10400F

I was actually agreeing with you, and I wanted to point out that the motherboard in your post has two M.2 slots, something basic B450 motherboards don't generally get.

One of those is for a Wi-Fi card, isn't it? So it only has one for an SSD. The slightly more expensive Z490 boards from ASRock have two.
 
Great, I'm deciding between the 10400F and the 10600K; both interest me for gaming, and my choice is not negotiable.
 
Did you even look at the charts? It lost <7% at 720p using a 2080 Ti, which is the absolute worst-case scenario.

CPU performance was down a whopping 2%.

Most intel fans will take that same 2% as a CRUSHING victory in gaming lol.......
 
Most intel fans will take that same 2% as a CRUSHING victory in gaming lol.......

They really do, it gets kind of funny. And those same people don't have 2080 Tis, so there is no difference for them.
 
They really do, it gets kind of funny. And those same people don't have 2080 Tis, so there is no difference for them.
Most intel fans will take that same 2% as a CRUSHING victory in gaming lol.......

It's just sad to see. In some games it's up to 10%, but what's the point of having 200 fps vs 220 fps...
For competitive gamers, yeah. Back when I still played SC2 in Korea, that would have mattered, but nowadays it's different.
In my gaming world, as long as it won't slow down below 60 fps, I consider it a pass for both CPU and GPU. I value the time I spend working more than a 5% FPS gain.
It's difficult to recommend an Intel CPU with the current lineup.
 
It's just sad to see. In some games it's up to 10%, but what's the point of having 200 fps vs 220 fps.
Don't look at it that way. 10% is the difference between reaching 60 fps or sitting at 55. It also can be the difference for turning up IQ settings and still reaching 60/144 fps.

10% is the difference between some card tiers.
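(A quick back-of-the-envelope illustration of the percentage math being argued here; the frame rates below are just the figures quoted in the thread, nothing measured.)

```python
# Back-of-the-envelope only: what a ~10% CPU-side gap looks like at the
# frame rates quoted in this thread.
for base_fps in (55, 100, 200):
    faster = base_fps * 1.10  # 10% higher frame rate
    print(f"{base_fps:>3} fps vs {faster:5.1f} fps  (gap: {faster - base_fps:.1f} fps)")

# Prints roughly:
#  55 fps vs  60.5 fps  (gap: 5.5 fps)   <- the 55-vs-60 case, where it can matter
# 100 fps vs 110.0 fps  (gap: 10.0 fps)
# 200 fps vs 220.0 fps  (gap: 20.0 fps)  <- the 200-vs-220 case from earlier posts
```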

Intel needs high-end hardware to OC, Ryzen needs low end.
?

You need a K chip and a "Z" based board. They have these up and down their lineup. Maybe I'm splitting hairs.

What I don't see is an overclockable i3 4c/8t part in 10th gen, though. That said, I wouldn't start 2020 with a 4c/8t part either...
 
Don't look at it that way. 10% is the difference between reaching 60 fps or sitting at 55. It also can be the difference for turning up IQ settings and still reaching 60/144 fps.

10% is the difference between card tiers.

the 10% we see is more often a gap like 5 FPS vs 6 FPS at high res, or 200 FPS vs 220 FPS at low res

seriously, the difference rarely ends up in that 'important' 60-120 FPS range, because the extremes are where the higher percentage gains show up
 
Intel needs high-end hardware to OC, Ryzen needs low end.
?

You need a K chip. They have these up and down their lineup. Granted, it isn't every single chip, or on low-end boards, like it is from AMD... I'm with you there. But it's not like it's only HEDT or flagship processors.
the 10% we see is more often a gap like 5 FPS vs 6 FPS at high res, or 200 FPS vs 220 FPS at low res

seriously, the difference rarely ends up in that 'important' 60-120 FPS range, because the extremes are where the higher percentage gains show up
10% is 10%. Again, it can be anywhere in the fps range. Don't fool yourself into thinking otherwise. It depends on the game, settings, and card used.

10% is the difference between turning IQ up or not... and in some cases, another tier of video card... There are also the 1% lows, too. While not the Gospel some make them out to be, those are significant in gaming.
 
?

You need a K chip. They have these up and down their lineup. Granted, it isn't every single chip, or on low-end boards, like it is from AMD... I'm with you there. But it's not like it's only HEDT or flagship processors.
10% is 10%. Again, it can be anywhere in the fps range. Don't fool yourself into thinking otherwise. It depends on the game, settings, and card used.

10% is the difference between turning IQ up or not... and in some cases, another tier of video card... There are also the 1% lows, too. While not the Gospel some make them out to be, those are significant in gaming.

That would be ideal, but your scenario (the difference between turning IQ up or not... and in some cases, another tier of video card) is like 1% of the total cases. The rest is 110 vs 120 and the like: no difference...
 
I OC'd my 9600K 41%, all cores, fully stable. It tickles all the right nostalgia spots and takes me back to the early 2000s and the overclocking golden age. Most people can't get 41% with any Ryzen, any generation, even if it's a 1% chip and they are pouring LN2 on it like syrup over pancakes! My Ryzen could do 10 to 15% (depending on voltage and the alignment of the planets), and that's with a 360 mm AIO and 6 external push-pull fans. And if a million Reddit threads are any indication, most Ryzen owners are tweaking for stability almost out of the box, or trying timing setting #74732 to see if they can get their RAM stable at a mediocre overclock to squeeze more performance out of the Infinity Fabric. Whoever said Ryzens can't OC for poop (Gmr_Chick?) was factually right. These things are maxed out at the plant. You are mostly flipping switches that are not connected to anything when playing with Ryzen.

...
..
.
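(For anyone sanity-checking the percentage claim above: overclock percentage is just the clock gain divided by the stock clock. The only spec number below is the i5-9600K's 3.7 GHz base clock; the achieved clock and the Ryzen figures are assumptions purely to show the arithmetic, not the poster's actual settings.)

```python
# Hypothetical figures for illustration; only the 3.7 GHz base clock of the
# i5-9600K is a known spec, the rest is assumed to show the arithmetic.
def oc_percent(stock_ghz: float, oc_ghz: float) -> float:
    """Overclock expressed as a percentage over the stock clock."""
    return (oc_ghz - stock_ghz) / stock_ghz * 100

print(f"{oc_percent(3.7, 5.2):.0f}%")  # ~41%: a 3.7 GHz base taken to an assumed ~5.2 GHz all-core
print(f"{oc_percent(3.6, 4.1):.0f}%")  # ~14%: roughly the 10-15% Ryzen range mentioned, with assumed clocks
```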
 
That would be ideal, but your scenario (the difference between turning IQ up or not... and in some cases, another tier of video card) is like 1% of the total cases. The rest is 110 vs 120 and the like: no difference...
lol, my guy... 10% is 10%. It will allow you to turn up the IQ, or it is the difference between tiers of some cards, in ALL cases. It could be the difference between 55 FPS and 61, or 130 vs ~144, or High to Ultra, or 4xAA instead of 2xAA, etc. It isn't much, I get that, but let's stop pretending that the difference only shows up in a select (1%) of situations. It doesn't.

It can also be the difference between 100 and 110 fps or 200 and 220 fps and not matter... but 10% is 10% all of the time. IQ settings don't know when it can and can't matter.
 
lol, my guy... 10% is 10%. It will allow you to turn up the IQ, or it is the difference between tiers of some cards, in ALL cases. It could be the difference between 55 FPS and 61, or 130 vs ~144, or High to Ultra, or 4xAA instead of 2xAA, etc. It isn't much, I get that, but let's stop pretending that the difference only shows up in select situations. It doesn't.

Yeah, I get it, it could be whatever you say, except in the majority of cases it isn't. I think the majority dictates the importance; I don't buy a CPU to play one game.
 
Yeah, I get it, it could be whatever you say, except in the majority of cases it isn't. I think the majority dictates the importance; I don't buy a CPU to play one game.
Who does (did I say such a thing)? I agree with that (unrelated) point! Don't move the goalposts! :)

I'm simply saying that a 10% difference can be significant... especially when we aren't dealing with a 2080 Ti but mid-range cards, which show similar performance drops... and where it DOES matter more than "1%" of the time when trying to reach 60/120/144/165 fps, or that next notch up in IQ without losing FPS (regardless of threshold).
 
Who does (did I say such a thing)? I agree with that (unrelated) point! Don't move the goalposts! :)

I'm simply saying that a 10% difference can be significant... especially when we aren't dealing with a 2080 Ti but mid-range cards, which show similar performance drops... and where it DOES matter more than "1%" of the time when trying to reach 60/120/144/165 fps, or that next notch up in IQ without losing FPS (regardless of threshold).

I agree it can be significant, but! in most cases it isn't :p, that's why for gaming I pay more attention to the GPU than the CPU. Right now I'm planning to upgrade my 2400G to a 3600 or a 4600 (that will really make a difference), on the same board I've had for two years. I'm also waiting for the next RTX 3060 / RX 6600 XT for decent ray-tracing gaming.
 
I agree it can be significant, but! in most cases it isn't :p, that's why for gaming I pay more attention to the GPU than the CPU. Right now I'm planning to upgrade my 2400G to a 3600 or a 4600 (that will really make a difference), on the same board I've had for two years. I'm also waiting for the next RTX 3060 / RX 6600 XT for decent ray-tracing gaming.
To each their own. I liken it to a glass ceiling. In many cases, you won't hit your head, but there is nothing wrong with a little more headroom. Obviously, if there are budget constraints, that $50-100 can go towards the next class up in GPU anyway. :p
 
Who does (did I say such a thing)? I agree with that (unrelated) point! Don't move the goalposts! :)

I'm simply saying that a 10% difference can be significant... especially when we aren't dealing with a 2080 Ti but mid-range cards, which show similar performance drops... and where it DOES matter more than "1%" of the time when trying to reach 60/120/144/165 fps, or that next notch up in IQ without losing FPS (regardless of threshold).

10% with a 2080 Ti at max FPS is 1% with any lesser GPU, or with higher graphics settings.

you're acting like that 10% applies to all situations when it doesn't
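(A toy way to see the point being argued in these two posts: the delivered frame rate is limited by whichever of the CPU or GPU takes longer per frame, so a 10% faster CPU only shows up when the CPU is the limiter. The frame rates below are invented purely to illustrate the shape of the argument, not measurements.)

```python
# Toy bottleneck model: delivered FPS is set by the slower of CPU and GPU.
# All frame rates here are invented for illustration.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# CPU A is 10% faster than CPU B on the CPU-limited side.
cpu_a, cpu_b = 220.0, 200.0

for gpu_fps, scenario in [(400.0, "fast GPU at 720p (CPU-bound)"),
                          (120.0, "lesser GPU / higher settings (GPU-bound)")]:
    a = delivered_fps(cpu_a, gpu_fps)
    b = delivered_fps(cpu_b, gpu_fps)
    gain = (a - b) / b * 100
    print(f"{scenario}: {b:.0f} vs {a:.0f} fps  (+{gain:.0f}%)")

# Prints roughly:
# fast GPU at 720p (CPU-bound): 200 vs 220 fps  (+10%)
# lesser GPU / higher settings (GPU-bound): 120 vs 120 fps  (+0%)
```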
 
It applies to a lot more than just a 2080 Ti.

No, it doesn't. The amount gets smaller and smaller and smaller until it's meaningless.
 
10% with a 2080 Ti at max FPS is 1% with any lesser GPU, or with higher graphics settings.
Most tests I have seen are done with graphics on Ultra at 1080p in the first place. Can't use TPU's, because Wiz lowers things to 720p and Medium to exacerbate the issue for the article.

No, it doesn't. The amount gets smaller and smaller and smaller until it's meaningless.
I'm sure it does in some capacity...but please provide links to support your assertion. :)
10% with a 2080 Ti at max FPS is 1% with any lesser GPU, or with higher graphics settings.


I'm not talking GTX 1650 and 5500 XT, but mid-range cards: 2070/2070S/2080/2080S, 5700 XT/5600 XT, etc. More common cards also show similar differences, though again, it does go down.
 
Oops... Ultra? Low? Not the point anyway. It's the artificial lowering of the resolution that throws things off. :)
I'm using the same Ultra settings for all resolutions; that's why it's 720p and not 1080p ultra-low or something, as having two 1080p results with wildly different FPS would just confuse readers
 
I'm using the same Ultra settings for all resolutions; that's why it's 720p and not 1080p ultra-low or something, as having two 1080p results with wildly different FPS would just confuse readers
I know you are using the same settings. I didn't say otherwise, just recalled incorrectly that you use medium. :)

I take exception to the artificially lowered resolution to magnify the issue.

Is there a new article where you moved past 720p and went to 1080p? If I missed it (link! would love to read it), apologies, and TYTY!!!!

Do you know of any modern tests like this that use more mid-range GPUs? All I know is that while Mussels is right at a high level (the difference goes down with lesser GPUs), it isn't "only" the 2080 Ti that shows notable differences, like he initially stated.

It would be interesting to see that article with low-end, mid-range, and high-end cards from both camps....... :)
 
Do you know of any modern tests like this that use more mid-range GPUs? All I know is that while Mussels is right at a high level (the difference goes down with lesser GPUs), it isn't "only" the 2080 Ti that shows notable differences, like he initially stated.

It would be interesting to see that article with low-end, mid-range, and high-end cards from both camps.......
Not aware of any such article; could be something interesting for a rainy week... quite a lot of testing.

Once I'm finished with Comet Lake I'll work on updates to the CPU bench; maybe such an article could introduce that new setup. It will be late summer at least.
 
Nice reviews here on the new Comet Lake chips. I was hoping to find some reviews on chips in the series besides the two Intel was (apparently) handing out, to see how they compared for a few different usage cases. I appreciate that you offered this, and that you spend a lot of time running all the different categories of software benchmarks, but I would like to ask you about possibly adding one or two more.

Any chance you might look at adding something like a Nero suite to these reviews, for a couple of tests?

A couple of my usage examples: using Nero Video to manually strip out commercials from 1080 captured video of cable programming (and yes, there is a built-in process for this which should be perfect for automated testing), which I then burn to movie or multi-episode/multi-hour Blu-ray discs. I am interested in the encoding times, etc., as I usually do this as an overnight thing, and I'm wondering how such processes perform on these newer chips. I have been watching and waiting, especially with all the emphasis on the Intel side about building optimized AVX-512 encoding/decoding for playback etc. into the new chips.

I'm looking at several usage cases for PCs right now: at the low end, some quiet living-room-type devices as possible set-top streaming boxes / video servers and for video capture (e.g. with a Hauppauge card); rising to workstation machines for video processing, encoding, and media authoring (the testing I mentioned); and then of course gaming, hopefully multi-screen with maybe 3 x 2K monitors, and possibly a VR headset once things move a little further along.

Anyway, thank you for all the effort.
 