
Gaming benchmarks: Core i7 6700K hyperthreading test

Greetings all!

According to my previous benchmark, the Intel Ivy Bridge Core i7 offers no hyper-threading (HT) performance gain in gaming. On the contrary, turning HT on on the Core i7 3770 slightly hurt performance compared to running just its 4 physical cores. This time I've put the Intel Skylake Core i7 6700K to the shooting wall for "execution". This test cannot be compared to any of the other tests in my profile, due to new NVIDIA drivers, updated game versions, a different PC, some different game settings and some different testing methods.

I've tested 21 games at 1920x1080 with all graphical settings set to maximum and turned on, with a few exceptions: no anti-aliasing was used, and some exclusive NVIDIA features in games like Far Cry 4 and Witcher 3 were turned off. Physics effects were turned on, except for NVIDIA's proprietary PhysX effects, which were turned off.

Some games have their own built-in benchmarks, while for the others I used custom 15-second Fraps benchmark runs.
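For anyone who wants to re-check the numbers: below is a minimal sketch (Python, with a hypothetical log file name) of how min/avg/max FPS can be recomputed from a Fraps "frametimes" log, assuming the usual Fraps CSV layout of a header row followed by frame number and cumulative time in milliseconds. Adjust the parsing if your log differs.

import csv

def fps_stats(path):
    # Read the cumulative timestamp (ms) of every rendered frame.
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader if row]

    # Instantaneous FPS of each frame = 1000 ms / time since the previous frame.
    deltas = [b - a for a, b in zip(times, times[1:])]
    fps = [1000.0 / d for d in deltas if d > 0]

    elapsed_s = (times[-1] - times[0]) / 1000.0
    avg = (len(times) - 1) / elapsed_s  # frames rendered over elapsed seconds
    return min(fps), avg, max(fps)

lo, avg, hi = fps_stats("crysis3 frametimes.csv")  # hypothetical log name
print(f"min {lo:.0f} / avg {avg:.0f} / max {hi:.0f} FPS")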

TEST SETUP

Intel Core i7 6700K 4.0-4.2 GHz
Asus Maximus VIII Ranger
Kingston HyperX Fury 2x8 GB DDR4 2133 MHz C14
Patriot Pyro 120 GB SATA3 Windows drive
WD Red 2 TB SATA3 game drive
Gigabyte GeForce GTX 780 Ti GHz Edition 3 GB

Windows 7 Pro 64 bit
NVIDIA Forceware 361.43



For those who prefer a video presentation:


Let's begin.

Alan Wake American Nightmare



There is no difference between i7 mode and i5 mode.

Arma 3



There is no difference between i7 mode and i5 mode in this game - the most demanding FPS I've ever tested.

Batman Arkham Origins



I've removed the maximum FPS bar, since it was pulling over 300 FPS and was irrelevant.

Battlefield 4



There is almost no difference between i7 mode and i5 mode.

Bioshock Infinite



There is no difference between i7 mode and i5 mode.

Call of Duty Advanced Warfare




HT slightly decreases performance.

Company of Heroes 2



HT clearly hurts performance in this very demanding RTS.

Crysis 3



There is quite a notable performance drop with HT on.

Dragon Age Inquisition



HT hurts minimum frame rate performance, while average and maximum remain the same.

F1 2015



HT slightly decreases performance.

Far Cry 4



There is no difference between i7 mode and i5 mode.

Hard Reset



Once again HT hurts minimum frame rate performance, while average and maximum remain comparable.

Hitman Absolution



HT only improves the maximum frame rate in this game - the same pattern was observed with the Core i7 3770.

Max Payne 3



HT slightly decreases performance.

Metro Last Light Redux



Once again HT hurts minimum frame rate performance the most in this demanding game - the same pattern was observed with the Core i7 3770.

Rainbow Six Siege



There is a very small decrease in performance with HT on.

Serious Sam 3



HT decreases performance slightly.

Starcraft 2 Legacy of the Void



HT decreases performance slightly, but consistently.

Tomb Raider



Like in Hitman Absolution, HT slightly increases maximum frame rates.

Watch Dogs



The performance drop with HT in this game is just too big to justify a Core i7 over a Core i5.

Witcher 3 Wild Hunt



Let's call it a draw.

-------------------------------------------------------------------

I ran each of these benchmarks 5 times in a row - the results are as real as you can get.

CONCLUSIONS

Ever since I got my first Nehalem Core i7 920, I've noticed no performance improvement in games with hyper-threading turned on. I cemented those observations by testing my Core i7 3770, and now I've done the same with a Core i7 6700K at my friend's place. It's a pattern that has continued for 6 years now... HT is not worthless in games, however - it delivers awesome performance in Core i3 processors, just not in Core i7. HT might also significantly improve online game performance, but that is not my domain.

1. Intel Core i7 HT offers no improvement in single-player games.

2. Intel Core i7 HT slightly hurts gaming performance in most of the tested single-player games.

I wish HT would improve gaming performance, but it actually hurts!!! It's a big disappointment, and my friend made a mistake by replacing his Core i5 3570K with a Core i7 6700K, because all he does is game, nothing more.
 
Interesting, I wonder what a 5820K or 5960X would get. I have a feeling the impact would be smaller.

If it were not for the fact that I am away from my main computer this weekend, I would check it myself.
 
It really doesn't do anything in gaming...This was a known thing.
 
HT is not worthless in games, however - it delivers awesome performance in Core i3 processors

It was nice to see this mentioned as a footnote, as these chips are the one area where I have observed a definite benefit from HT in gaming.
 
Tell your friend to buy 2 more monitors to match the HT. Then he can play a game, do homework and watch a movie at the same time.
 
It's cool that it was mentioned that an i3 benefits from HT in gaming. I still feel more likely to buy an i7 than an i5 in the future, due to better future-proofing.
 
Guys, I hope you won't ask me to run this test again at the lowest resolution and quality settings - I will not do it. I already did that in my previous Core i7 3770 test and it made no difference! Based on that conclusion, I see no point in doing the same thing here, especially now that I have upgraded my video card from a GTX 760 OC to both a GTX 980 G1 Gaming and a GTX 780 Ti GHz Edition, which are equally powerful and have no trouble maxing games out at 1080p. Besides, these tests take forever to set up and run...
 
It was nice to see this mentioned as a footnote, as these chips are the one area where I have observed a definite benefit from HT in gaming.

2 threads vs. 4 must be some magic threshold.

The i7 has more cache than the i5 (8 MB vs. 6 MB), so that may be the difference.

It'd be really cool to see the 3770 (4 threads) vs. the 3570. But hats off for taking the time to test and publish these results, nice job. :toast:
 
I wish HT would improve gaming performance, but it actually hurts!!! It's a big disappointment, and my friend made a mistake by replacing his Core i5 3570K with a Core i7 6700K, because all he does is game, nothing more.
Did you compare a 3570K with a 6700K? Perhaps I missed it, but it seems like you made an assumption there. Don't forget, there is a several-percent increase in per-clock performance. Perhaps that would be made up for by the higher stock clock of the 6700K, or even at the same clock speeds because of the cache? I just don't think that assumption should be made without actually testing it. It sounds logical, but... it needs to be tested.

I apologize if I missed this information.


It'd be really cool to see the 3770 (4 threads) vs. the 3570.
Maybe the 2 MB of extra cache makes a difference.. though I doubt it.





DEAR LORD is photobucket slow (for me) to load.................. Oy.
 
I thought it was known that HT doesn't really benefit games, with the exception of the i3.

But HT is a huge factor for me, because it makes at least a ~20% difference in rendering time.
 
But HT is a huge factor for me, because it makes at least a ~20% difference in rendering time.

This is (obviously) all application-specific. The ~10% extra throughput that HT provides vs. the 100% of a real additional core would surely make a difference when the additional core(s) are actually being hit.
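As a rough back-of-the-envelope illustration (the HT uplift varies a lot by workload; ~25% is just a common ballpark, not a measured figure): 4 cores + HT ≈ 4 x 1.25 = 5 core-equivalents of throughput, while 8 real cores = 8 core-equivalents. A renderer that scales well therefore gains far more from doubling the cores (8/4 = 2x) than from enabling HT (5/4 = 1.25x).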

More Cowbell!
 
End words:
A 6600K is always the top choice over a 6700K if you intend only to game (well, even in some other tasks the 6600K is close to the 6700K),
and HT is useless in gaming except on 2-core chips like the i3.

Well, nothing new, but glad to see it confirmed :D
 
so the games tested don't use more than four threads, as would be expected.. with four real cores HT is redundant.. interesting to see that it seems to cause a small performance hit though.. maybe I should turn mine off.. :)

turning it off would allow for higher clocks or lower temps when things that do use 8 threads are being run.. which, as is mostly the case, are stability and load testers..

I did try turning my HT off.. running stuff like prime95 does show much lower temps.. if all a person does is game.. maybe HT is better turned off..

trog
 
if all a person does is game.. maybe HT is better turned off..

trog
If all a person does is game, they'd be better off with an i5-XXXXK/non-K and save nearly 209 CHF (where I live, in the case of the 6600K vs. the 6700K) than with HT turned off :D that's a pretty expensive turn-off :laugh: ;)
 
If all a person does is game, they'd be better off with an i5-XXXXK/non-K and save nearly 209 CHF (where I live, in the case of the 6600K vs. the 6700K) than with HT turned off :D that's a pretty expensive turn-off :laugh: ;)

True, but I was thinking about folks that already have the i7 chip.. :)

Just thinking aloud, that was all.. having forked out all that extra dosh to get HT, it would take some extra large balls to turn it off.. he he

trog
 
@Artas1984 Interesting how HT helps when the CPU has fewer cores - it kinda makes sense, too.

I suggest running one of those games with the following CPU settings in the BIOS:

- 4 cores enabled, no HT

- 2 cores enabled, plus HT

Logic dictates that the 4 full cores should beat 2 cores + HT, but there might be an anomaly somewhere, and the difference may not be that big either. It would be interesting to see the result.

Finally, if you want to check the impact of the CPU alone on framerate, bench at something like 1024x768 or even 800x600 if the game will let you, and perhaps turn some of the quality settings down too, so the performance isn't being masked by GPU bottlenecks.
 
Running abnormally low resolutions just to get abnormally high frame rates, to imply that one CPU is better for gaming than another, has never made the slightest sense to me, even though it is common practise..

Nobody actually plays games like this, so what is the point.. :)

trog
 
There are some exceptions to this:
1. Crysis 3 uses more than 4 threads in the Jungle level (and only there, afaik), so it does benefit from an i7 there. In a PCGH review some time ago (after the patch that fixed the bug), the i7 was clearly faster than the i5 there. Before the patch, the FX 8350 was faster than any other CPU; after the patch, the i7 leads, with the FX 8350 and i5 on par with each other behind it.
2. Battlefield 4 needs much more CPU power in multiplayer mode. The SP mode is basically irrelevant anyway, and afaik all CPUs with more than 4 threads benefit in multiplayer mode. There was even a multi-core patch that fixed performance on 6-core Phenoms and 8-core FX chips. 4-core and especially 6-core i7 performance went through the roof after the patch.
3. All games need more resources in multiplayer mode; the OP already mentioned MMORPGs (or "online games").

So this is true, but there are exceptions. There are also some SP games that use more than 4 threads... I know for a fact that GTA 5 and Fallout 4 do. I also played GTA Online and it had pretty high CPU usage, and Fallout 4 in SP mode did too (both games used all threads).
 
Great work, that was a lot of time spent!
Makes me feel better that I decided on the 6600K - $164 with a gift card.
 
Why can HT boost performance on an i3, yet hurt performance on an i7?
Different HT tech, or what?
 
Running abnormally low resolutions just to get abnormally high frame rates, to imply that one CPU is better for gaming than another, has never made the slightest sense to me, even though it is common practise..

Nobody actually plays games like this, so what is the point.. :)

trog
To isolate the CPU from the graphics card and measure its true performance, as I explained. Let me illustrate this with a hypothetical example.

The graphics card is working at 1080p and max quality in a particular game and tops out at 70 fps. You're testing 4 CPUs, each of which easily achieves 70 fps in this game. Having the card bottleneck the performance therefore invalidates the test, as 70 fps will be measured for all 4 CPUs. Hence, one has to take the strain off the card so that we get the true performance of the CPUs under test. If we set that graphics card to 800x600 and medium details, it might well achieve 300 fps or more in the game, thus lifting that bottleneck and giving a true result for the CPUs:

CPU A - 100fps
CPU B - 120fps
CPU C - 155fps
CPU D - 180fps

On top of this, these framerates are not especially high nowadays, particularly with the new 165 Hz monitors on the market. And all it takes is a more demanding game, or a performance drop somewhere in the game, and suddenly CPU D becomes very desirable to keep that minimum framerate up. Remember, it's the minimum framerate that matters, not the maximum framerate.

It would be different if one were testing whole system performance. Then, sure, run the PC with everything maxed out, but that's not what's being tested here.
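To put the same reasoning in code: here is a minimal sketch (hypothetical numbers matching the example above, assuming the delivered framerate is simply capped by the slower of the two components) of why a GPU ceiling hides CPU differences.

# The benchmark effectively records min(CPU limit, GPU limit) for each CPU.
cpu_fps = {"CPU A": 100, "CPU B": 120, "CPU C": 155, "CPU D": 180}

for gpu_cap, label in [(70, "1080p, max settings"), (300, "800x600, medium")]:
    print(label)
    for cpu, fps in cpu_fps.items():
        print(f"  {cpu}: {min(fps, gpu_cap)} fps")

At the 70 fps GPU ceiling all four CPUs read 70 fps and the test tells you nothing; at a 300 fps ceiling the real 100/120/155/180 spread shows through.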
 
Why can HT boost performance on an i3, yet hurt performance on an i7?
Different HT tech, or what?
No, it's the same HTT. It's just that most games only need 4 threads, and a 4-core i7 exposes 8 threads from 4 cores, which means the per-core power gets divided if the game starts to use the HTT thread instead of the real thread from the core itself. The i3, on the other hand, does benefit from HTT, because some games need more than 2 threads to function properly; this is essentially what makes the i3 a viable gaming processor. Meanwhile, CPUs like the Pentium Anniversary Edition, with no HTT and only 2 cores, have problems in some games that won't even start because the chip has too few threads/virtual cores, even when overclocked to 4.5 GHz and theoretically fast enough.

@trog:
What qubit said. And on top of that, it's for future-proofing: maybe a Sandy Bridge PC is as fast as a Skylake PC in every game now, because the games are GPU-limited, but in a few years, when more CPU power is needed, it will start to bottleneck the GPU. That's why they benchmark at low resolution - so the test is CPU-limited and you can see the real power of the CPU, and what it can do if and when that power is needed.
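A toy model of this (my own illustration - the ~1.25x HT uplift is a made-up ballpark, and it deliberately packs two game threads onto each core to show the worst scheduling case described above):

# Total throughput in "core-equivalents" for a game running `game_threads`
# busy threads. Two threads packed onto one core via HT are assumed to
# extract ~1.25x of that core combined; leftover threads just time-share.
def throughput(cores, ht, game_threads, smt_uplift=1.25):
    total, left = 0.0, game_threads
    for _ in range(cores):
        if left <= 0:
            break
        if ht and left >= 2:
            total += smt_uplift  # two game threads share this core
            left -= 2
        else:
            total += 1.0         # one game thread owns this core
            left -= 1
    return total

# i3 (2 cores) with a 4-thread game: HT helps (2.5 vs. 2.0 core-equivalents).
print(throughput(2, True, 4), throughput(2, False, 4))
# i7 (4 cores) with a 4-thread game: packing threads onto HT siblings loses
# badly vs. 4 full cores (2.5 vs. 4.0). A real scheduler usually spreads
# threads across cores first, which is why the measured penalty is small.
print(throughput(4, True, 4), throughput(4, False, 4))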
 
Did you compare a 3570K with a 6700K? Perhaps I missed it, but it seems like you made an assumption there. Don't forget, there is a several-percent increase in per-clock performance. Perhaps that would be made up for by the higher stock clock of the 6700K, or even at the same clock speeds because of the cache? I just don't think that assumption should be made without actually testing it. It sounds logical, but... it needs to be tested.

I apologize if I missed this information.


Maybe the 2 MB of extra cache makes a difference.. though I doubt it.

I don't have the direct comparison results, but the 6700K was notably faster than the 3570K in some of the games, that is true. I was referring to the 6700K as being a disappointment for my friend, because it offered less performance than a Skylake Core i5 6600K would have. He wasted a lot of money and would have been better off with the cheaper Core i5 6600K. However, in most of the games the difference from the 3570K to the 6700K was not that obvious - his Core i5 3570K would have kept him happy at least for the rest of his video card's warranty days.

It sure is. So much effort for an already tested and proven point... :)


When I did the test with the Core i7 3770, people told me to do it again with a Core i7 6700K. This was not a case of "everybody knows Core i7 HT is worthless in games", because some folks, including me, believed that perhaps the Skylake architecture would make a difference in games when it came to HT.

I did try turning my HT off.. running stuff like prime95 does show much lower temps.. if all a person does is game.. maybe HT is better turned off..

trog

Thank you for this test. I was wondering about that myself.

@Artas1984 Interesting how HT helps when the CPU has fewer cores - it kinda makes sense, too.

I suggest running one of those games with the following CPU settings in the BIOS:

- 4 cores enabled, no HT

- 2 cores enabled, plus HT

Logic dictates that the 4 full cores should beat 2 cores + HT, but there might be an anomaly somewhere, and the difference may not be that big either. It would be interesting to see the result.

Finally, if you want to check the impact of the CPU alone on framerate, bench at something like 1024x768 or even 800x600 if the game will let you, and perhaps turn some of the quality settings down too, so the performance isn't being masked by GPU bottlenecks.

The first setting is what I did already: 4 cores, no HT.
The second setting would leave 2 physical cores and 4 threads - but that would prove nothing. We already saw that single-player games do not use more than 4 threads, whether it is 2 physical and 2 virtual (Core i3) or just 4 physical (Core i5). Do you agree?

Also, if I were to turn the graphical settings down to the lowest, I would be getting ridiculous amounts of FPS. Look at how much FPS I am already getting at maximum settings with the GTX 780 Ti GHz: 70-80 FPS in Crysis 3 and Far Cry 4, over 200 FPS in Call of Duty Advanced Warfare and Bioshock Infinite, over 150 FPS in Battlefield 4, Hitman Absolution, Max Payne 3 and Hard Reset.

If you would like to know how the situation would change with all settings set to minimum, just look at my Core i7 3770 HT benchmark - in that test I did disable all the settings later on:

http://www.techpowerup.com/forums/t...rthreading-test-20-games-tested.216466/page-2

Post 39. The GTX 760 was bottlenecking me, so I had to lower everything to see whether the benchmark would look any different, but it did not...
 