
Gaming benchmarks: Core i7 3770 hyperthreading test (20 games tested)

Status
Not open for further replies.
Should have added Total War: Rome II to the benchmarks. That game is a lot more CPU-heavy than any of the games listed.
 
Again, I can't see this review. The thread starts at post #2... what am I doing wrong?
 

Do you have Artas1984 on ignore? I've never put anyone on ignore, so I'm not sure how that works when the person starting a thread is on ignore but the members responding aren't.
 
Pretty much old news, but still thanks for all the effort and time.
 
Wait till November and I will redo these benchmarks with a GTX 970, OK? Same 1920x1080 resolution :toast: That should clear things up a bit more for those who have doubts..

I don't understand why you turned off AA in every game for your benchmarks.

I mostly use AA when I run video card benchmarks; it's a pixel fill-rate and memory-bandwidth-dependent thing. I would only be choking my video card more, and the HT results would be even more inaccurate.
 
I use a GTX 970 for 1440p with pretty much maxed-out games. I would agree if you said 4K. The GTX 970 is not a mid-range card, not even in 2015, nor does it deliver just 'good' 1080p performance. This card is complete overkill for 1080p.

Ummm, you may like its performance (I notice you didn't say it could completely max out 1440p), but that doesn't change the fact that within the Maxwell hierarchy, a GTX 970 is a mid-tier card.

What matters is where it is positioned in the product stack, not how much it costs in another country and whether it is affordable or not. It has 2 above and 2 below. It is actually the perfect 1080p card, because there are games that it cannot max out at 1080p.
Yeah, it's a firmly mid-tier card.
 
Do you have Artas1984 on ignore? I've never put anyone on ignore, so I'm not sure how that works when the person starting a thread is on ignore but the members responding aren't.
ROFLMAO.. hahaha, that's it...
 
I don't understand why you turned off AA in every game for your benchmarks.
Well, from my point of view it's plenty valid. I never use AA (pointless, especially in 1323 DSR), but even at 1080p I rarely find a game that looks better or worse with or without it o_O (yes, I have a case of "hyperopic astigmatism"; let's say I have a biological AA feature...)

The only people who buy i7's instead of i5's are usually those that need them for different reasons other than gaming.
Totally true ... or people who don't mind paying 100-120 bucks more for "just" having more cache, MHz, and HT

Ummm, you may like its performance (I notice you didn't say it could completely max out 1440p), but that doesn't change the fact that within the Maxwell hierarchy, a GTX 970 is a mid-tier card.

What matters is where it is positioned in the product stack, not how much it costs in another country and whether it is affordable or not. It has 2 above and 2 below. It is actually the perfect 1080p card, because there are games that it cannot max out at 1080p.
Yeah, it's a firmly mid-tier card.
nahhh it's a high low end ... (higher tier of the low end...) the mid tier is the 980 :D ;) (joke joke)
 
I can't use DSR because this stupid thing forces games to run at 60Hz and that looks absolutely horrible on a 144Hz monitor.

Frankly, it's easier to run 1080p with FXAA. Almost no performance hit and it still looks nice and can run at 144Hz.
 
While I definitely appreciate the time and effort (mostly the time) put into conducting the tests and posting the results, I don't see why this is even a debate. CPUs haven't been a bottleneck in gaming performance for a few years now. Even if you have an i5-2500K, which will turn 5 years old in January, you are still set for any game that is out now or coming along anytime soon. I didn't get a 6-core/12-thread CPU because I thought it would help my gaming. I KNEW that it would not. I needed it for other reasons.
 

Hell, even some i3's (i3-4160 for one) are perfectly adequate in most games! (see signature block)
 
Or my Alpha ... an i3-4130T, 2x 2.9 GHz + HT ... pretty much fine in any game I throw at it ... even World of Warships. It needs some tweaks, but I can run a constant 60 fps (though the network lag is a bit different ... but I am not at home ... rather far from it, atm :roll:)
 
On the i5 vs i7 issue, some buy the i7s for reasons other than just HT for heavily threaded apps. You also get a higher clock speed, and some prefer not to OC, or just want to be assured of that guaranteed speed. Not all chips OC well, and lately they're setting the clocks on i7s quite a bit higher. It's part of the selling point. If they only added HT, they wouldn't sell nearly as many, especially to gamers.
 
IMPORTANT UPDATE!!!

As requested by some members, I have redone all the benchmarks at the lowest settings at 1024x768 in order to eliminate any possible GPU bottleneck.



As you can see, the results show basically the same pattern as the highest-settings benchmarks. The only small difference might be Hitman Absolution benefiting slightly from HT.
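For anyone wanting to sanity-check a comparison like this, the per-game delta is just a percentage change between the two runs. A minimal sketch in Python, with made-up FPS numbers (the real figures are in the graphs above):

```python
# Made-up average-FPS pairs (HT off, HT on) purely to illustrate the math;
# substitute the measured values from the actual benchmark runs.
results = {
    "Hitman Absolution": (112.0, 116.5),
    "Far Cry 3":         (98.0, 95.5),
}

def ht_delta_percent(ht_off, ht_on):
    """Percentage change in FPS when enabling Hyper-Threading."""
    return (ht_on - ht_off) / ht_off * 100.0

for game, (off_fps, on_fps) in results.items():
    print(f"{game}: {ht_delta_percent(off_fps, on_fps):+.1f}% with HT on")
```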

However, notice that Batman: Arkham Origins, BioShock Infinite, Far Cry 3, and Metro: Last Light Redux suffered a penalty in minimum frame rates with HT on!

Now I have really proved that HT is worthless in gaming. If you did not believe me before, perhaps you do now? :)
 

All you've done is verify what most already know. What compelled you to even try to prove HT is worthless in gaming? Most who hang out on tech forums with intelligent, like-minded gamers are well aware of it.

I can think of better things to do with one's time.
 
I suspect hyper-threading is more useful for games on dual-core CPUs.
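That's a testable hunch. First you'd confirm whether SMT/HT is even active by comparing logical CPUs to physical cores. A Linux-only sketch (parsing /proc/cpuinfo is an assumption about the platform; on Windows you'd reach for a library instead):

```python
import os

def logical_cpus():
    """Logical processors the OS sees (includes HT siblings)."""
    return os.cpu_count()

def physical_cores():
    """Count unique (physical id, core id) pairs from /proc/cpuinfo.
    Linux-only; returns None if the file is missing or unparseable."""
    pairs, phys = set(), "0"
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("physical id"):
                    phys = line.split(":", 1)[1].strip()
                elif line.startswith("core id"):
                    pairs.add((phys, line.split(":", 1)[1].strip()))
    except OSError:
        return None
    return len(pairs) or None

cores = physical_cores()
if cores:
    print("SMT/HT active" if logical_cpus() > cores else "no SMT/HT")
```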
 
Now I have really proved that HT is worthless in gaming.

*Unless you're on a multi-GPU setup
*Unless you're streaming
*Unless you do more things than just gaming

Your graphs show small percentage increases in framerates in some titles. So while the results may not be worth it to you, there are still gains to be had for those that do care.

Sweeping statements man, don't do them.

You know what's handy for me? Being able to play games whilst rendering a video for two hours!

EDIT: May I ask if you used the 95th percentile for your results? Most of us do to eliminate spikes in results.
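For reference, a 95th-percentile cut over a frame-time log is a few lines. This sketch uses the simple nearest-rank method and invented frame times (real logs would come from a capture tool):

```python
def percentile(values, pct):
    """Nearest-rank percentile: the sample at rank ceil(pct/100 * n)."""
    ordered = sorted(values)
    k = max(0, -(-pct * len(ordered) // 100) - 1)  # ceil(pct/100 * n) - 1
    return ordered[k]

# Invented frame times in milliseconds, with two spikes. A plain
# minimum-FPS figure latches onto the worst spike (45.2 ms); the
# 95th percentile discards the worst 5% of frames instead.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 33.4, 16.6, 17.0, 16.7, 45.2, 16.8,
                  16.9, 17.2, 16.5, 16.8, 17.0, 16.6, 16.9, 17.1, 16.7, 16.8]

p95 = percentile(frame_times_ms, 95)
print(f"95th-percentile frame time: {p95:.1f} ms "
      f"(~{1000.0 / p95:.0f} fps floor)")
```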
 
When it comes to the i5 versus i7 and just for gaming, there is no reason whatsoever to get the i7.

That is the blanket statement you can actually make. Beyond that, any blanket statement is dangerous.

People who use an i3 with HT will find tangible gains, because the two main threads can be more fully utilized for the game itself: the CPU can offload the less intensive threads (background tasks) to the HT logical cores. Also, newer engines can make better use of HT; some engines run less intensive threads that can be scheduled onto HT.
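The "offload background tasks" pattern described above is easy to sketch. This toy Python version just shows the shape: a real engine does this with native threads, where the OS is free to park the worker on a spare HT logical core (Python's GIL makes this illustrative only):

```python
import threading
import queue

jobs = queue.Queue()
loaded = []

def background_worker():
    # Low-priority work (asset decompression, audio mixing, ...) that the
    # scheduler can place on a spare logical core.
    while True:
        job = jobs.get()
        if job is None:           # sentinel: shut the worker down
            break
        job()
        jobs.task_done()

worker = threading.Thread(target=background_worker, daemon=True)
worker.start()

# The "game loop" thread queues work instead of doing it inline.
for i in range(3):
    jobs.put(lambda i=i: loaded.append(f"asset_{i}"))

jobs.join()                       # a real frame loop would poll, not block
jobs.put(None)
worker.join()
print(loaded)
```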

Like others pointed out, there are lots of users who record while gaming. At that point the i7 is no luxury, although if you use ShadowPlay an i5 will still suffice; but that is proprietary to Nvidia cards. So again, no basis for saying the i5 is enough.

About the 970 and 'mid range': the primary advantage of the 970 is in the newest titles at 1080p; if you look at anything before 2014, you will see the 970 has no real merit at 1080p. Look at TW3: turn off HairWorks, and I run that 'heavy game' on my mid-range GTX 770 with relative ease at high settings. No urge to upgrade until you turn that performance hog on. Just a little bit of perspective right there on what is 'needed to run'...

Price-wise, the 970 is not mid range but borders on the high end. The fact that we now have big chips as part of the yearly refreshes, with astronomical price tags, does not change the playing field of mid range. Mid range, to me, is still 250 EUR maximum, and if ANYTHING happened to mid-range territory, it is actually the other way around: getting a mid-range card has become cheaper, because gaming needs have not advanced all that much, evidenced by the re-re-refresh of AMD's cards.

The 670 was also launched as a high-end card, and the 690 was the dual-GPU version of that high-end card. Let's keep calling apples apples, and not change them to oranges because Nvidia and AMD have upped their pricing game and because a couple percent of the gaming market likes to lead the 'master race' with a lot of fanfare about their newest acquisitions.

1440p is still not an argument here, because it simply is not mid range but high end. 1080p is mid range, and for that you only need a 970 for the newest games at the highest settings. People on this forum sometimes forget that Ultra settings have never been mid-range territory, and neither has 60 fps. Mid range is ultra @ 30-45+ fps or medium/high @ 60 fps.
 
If I recall correctly, back in the day Enemy Territory: Quake Wars had pretty good multi-threading support. Rage (another game using an engine from id, id Tech 5) makes heavy use of multi-threading (AFAIK) to deal with all the megatexture streaming and so on. I would love to see how Rage scales, if possible...
 
@Vayra86 overall I agree with you, except on the 970. As I said previously, how many euros it costs does not dictate whether it's mid range. But if you want to go there, it's 350 U.S. dollars and below, which is very affordable no matter WHAT year it is. It may be at the higher end of the mid tier, but it is definitely a mid-tier card.

It has the 980 and 980 Ti above it, and the 960 and 950 below it. And it struggles to max out every game at high settings at 1080p, though mostly it gets that part perfect. You also have to look at the chip used. OMG... it's not GM200! Knowing Nvidia's naming scheme, it's easy to see right away that it's mid-tier. Notice that nowhere did I say it sucks. It's a very good card, but that doesn't position it higher than the middle of the pack.
 
@Vayra86 overall I agree with you, except on the 970. As I said previously, how many euros it costs does not dictate whether it's mid range. But if you want to go there, it's 350 U.S. dollars and below, which is very affordable no matter WHAT year it is. It may be at the higher end of the mid tier, but it is definitely a mid-tier card.

It has the 980 and 980 Ti above it, and 960 and 950 below it. And it struggles to play all games at high settings at 1080p. Notice nowhere did I say it sucks. It's a very good card, but that doesn't position it higher than the middle of the pack.

Still, that is the wrong way to go about it. The 670 launched as high end, and the 970 launched as high end. Remember back when it launched, how people were saying 'this will be succeeded by a big chip'? It is an exact repeat of the Kepler release scheme, which touted the 670 as high end, and that is exactly what it was. And just like today, people frowned upon the 680/980 for its astronomical price difference with only a very small performance advantage. Affordability-wise, 350 dollars is a pretty serious investment for a GPU. You can buy a console for that. Our 'bottom line' has shifted, let's keep that in mind, but the market still works in exactly the same way it always has.
 
No, there were overpriced video cards way back... the 8800 GTX, anyone? No one touted the 670 as high end. Everyone who actually knows how Nvidia numbers their chips knows what the designations mean and which tier they fall into.
 

By that logic, the 7970 has also never been a high-end chip. ???

Let's get some facts. http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga

From the introduction to that review: "In a typical high-end GPU launch we’ll see the process take place in phases over a couple of months if not longer. The new GPU will be launched in the form of one or two single-GPU cards, with additional cards coming to market in the following months and culminating in the launch of a dual-GPU behemoth (read: GTX 690). This is the typical process as it allows manufacturers and board partners time to increase production, stockpile chips, and work on custom designs."

My point stands. The introduction of a bigger chip does not push the GK104 to a lower price tier. The big chip created its own price tier. When it comes to current mid range, that means we are talking about 960 or 770. Not 970/980.

Kepler introduced the GK106, aka the GTX 660, which was true mid range at the time, and the 660 Ti was also a mid-range card above that. That's where mid range stops. Back in the day there was nothing better than the 680, so early adopters paid high-end prices for a mid-range card. That makes sense! :p We all know the 680 was only leading the 670 by a very small margin, hence everyone bought a 670. And what happened today with the 970 and 980? Exactly.
 
However, just because something leads the naming scheme and the price list does not mean it is a high-end card. Everyone who knew anything about which CHIP was used knew that the 680 was not a true high-end card; it was merely the highest-performing card of that series. Same with the 970 and 980: when released they were, again, the top performers of the 9 series, but built on the mid-tier chip. So my point stands too.
 