Thursday, July 18th 2013

More Core i7-4960X "Ivy Bridge-E" Benchmarks Surface

More benchmarks of Intel's upcoming socket LGA2011 flagship client processor, the Core i7-4960X "Ivy Bridge-E," surfaced on the web. Tom's Hardware scored an engineering sample of the chip, and wasted no time in comparing it with contemporaries across three previous Intel generations, and AMD's current generation. These include chips such as the i7-3970X, i7-4770K, i7-3770K, i7-2700K, FX-8350, and A10-5800K.

In synthetic tests, the i7-4960X runs neck and neck with the i7-3970X, offering a 5 percent performance increment at best. It's significantly faster than the i7-3930K, Intel's $500-ish offering for over seven quarters running. Its six cores and twelve SMT threads give it a definite edge over quad-core Intel parts in multi-threaded synthetic tests. In single-threaded tests, the $350 i7-4770K is highly competitive with it. The only major surprise on offer is power draw. Despite its TDP being rated at 130 W, on par with the i7-3960X's, the i7-4960X "Ivy Bridge-E" offers significantly higher energy efficiency, which can be attributed to the 22 nm process it's built on, compared to its predecessor's 32 nm process. Find the complete preview at the source.
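The efficiency claim is worth unpacking: two chips with the same rated TDP can differ substantially in performance per watt, because TDP is a thermal design ceiling, not measured draw. A toy calculation (all numbers hypothetical, not from the review):

```python
# Hypothetical illustration of why identical TDPs can hide different efficiency:
# perf/watt depends on measured draw under load, not the rated TDP ceiling.
# All numbers below are made up for the sake of the example.

def perf_per_watt(score: float, measured_watts: float) -> float:
    """Performance-per-watt from a benchmark score and measured package power."""
    return score / measured_watts

# Both chips carry a 130 W TDP, but the 22 nm part draws less under load.
i7_3960x = perf_per_watt(score=1000, measured_watts=125)  # 32 nm, hypothetical
i7_4960x = perf_per_watt(score=1050, measured_watts=100)  # 22 nm, hypothetical

improvement = i7_4960x / i7_3960x - 1
print(f"Efficiency gain: {improvement:.0%}")  # a modest score bump plus lower draw compounds
```

A small performance gain plus a meaningful drop in load power compounds into a large perf/watt advantage, which matches the shape of the review's power results.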
Source: Tom's Hardware

72 Comments on More Core i7-4960X "Ivy Bridge-E" Benchmarks Surface

#26
drdeathx
cdawallThe one in Hardware Canucks was throttling. That's why it underperformed. Although you Intel folks don't seem to like any responses other than "Intel is god." This is maybe a 5% performance increase, what a waste of a release.
I agree. Intel is failing ATM, allowing AMD to race up their sphincter.
Posted on Reply
#27
TheHunter
They gained some extra space, might as well make it an 8-core. But nooo :laugh:
Posted on Reply
#28
ensabrenoir
haswrongAre you really sure no one can utilize more computational power? You are talking about the enthusiast platform. I don't care if Intel's focus shifted. If they shift it the wrong way, I'm not buying and Intel goes bankrupt. Intel clearly doesn't want new money from customers.
I actually have an ASRock X79 Extreme9 with a 3820 that I got only because I wanted to wait for Ivy-E (PC ATM). I plan on going to a six-core with Ivy-E. You're right that some do need more computational power, as some use their rigs for more than gaming. The Xeon users probably won't be complaining about anything, though. Intel wants more of the everydayer/upgrade-every-time-a-new-one's-out/Facebooker-tablet-Tweeter money.

Haswell-E, if I remember correctly, will shift the enthusiast line to 6- and 8-core CPUs with the next chipset.
Posted on Reply
#29
Am*
AquinusPower consumption numbers look pretty nice to me.
www.techpowerup.com/img/13-07-18/143e.jpg
Agreed, power consumption numbers look pretty impressive, especially for an engineering sample. If these chips are done right and use fluxless solder instead of that shitty thermal paste, they may well have a killer enthusiast chip on their hands that will clock past the 4.5GHz mark.
Posted on Reply
#30
Jstn7477
Am*Agreed, power consumption numbers look pretty impressive, especially for an engineering sample. If these chips are done right and use fluxless solder instead of that shitty thermal paste, they may well have a killer enthusiast chip on their hands that will clock past the 4.5GHz mark.
They're soldered. There was a report of one being delidded a couple weeks back and it destroyed the die. www.overclockers.com/forums/showthread.php?t=733766
Posted on Reply
#31
xorbe
that power consumption == where's my 8 core
Posted on Reply
#32
Lionheart
Jstn7477Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more unless you are comparing to 3970X, gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get, although you've already stated that you don't care if your chip takes almost 300w.

Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room and a ~100w processor that provides great, predictable performance across the board because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered. ;)
A simple comment really brings out the Intel fanboy in you :pimp:
Posted on Reply
#33
cdawall
where the hell are my stars
drdeathxI agree. Intel is failing ATM, allowing AMD to race up their sphincter.
That's what happens when you get left in every other market. They still can't beat AMD in cores per cluster, and the server market is moving that way. Oh well, let the performance hounds have their fun. Games still play the same between this and the 9590 at the same price.
Posted on Reply
#34
1c3d0g
xorbethat power consumption == where's my 8 core
Word on the street has it that Haswell-E will be the one to have 8 cores. ;)
Posted on Reply
#35
radrok
cdawallOh well let the performance hounds have their fun. Games still play the same between this and the 9590 at the same price.
On CPU heavy games? No way on earth, you are dreaming.

Higher IPC from Intel CPUs also helps a ton on minimum framerates, average doesn't tell the whole story.
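The point that "average doesn't tell the whole story" is easy to show with a toy calculation (hypothetical frame times, not measured data): two runs can have identical average FPS while one of them stutters badly.

```python
# Toy illustration: two runs with the same average FPS can feel very different.
# Frame times are in milliseconds; all numbers are hypothetical.

def avg_fps(frame_times_ms):
    """Average FPS over a run: total frames divided by total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def min_fps(frame_times_ms):
    """Instantaneous FPS of the single worst frame."""
    return 1000.0 / max(frame_times_ms)

smooth  = [16.7] * 6           # steady ~60 FPS pacing
stutter = [10.0] * 5 + [50.2]  # same total time, but one long hitch

print(avg_fps(smooth), min_fps(smooth))    # both ~60 FPS
print(avg_fps(stutter), min_fps(stutter))  # average ~60 FPS, minimum ~20 FPS
```

Both runs render six frames in 100.2 ms, so the averages are identical, but the second run's worst frame drops to roughly 20 FPS, which is exactly the kind of hitch a minimum-FPS number catches and an average hides.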
Posted on Reply
#36
cdawall
where the hell are my stars
radrokOn CPU heavy games? No way on earth, you are dreaming.

Higher IPC from Intel CPUs also helps a ton on minimum framerates, average doesn't tell the whole story.
Depends on the game. Higher IPC is cool until the game is heavily multithreaded, like most modern games, and then AMD starts to take the lead.
Posted on Reply
#37
MxPhenom 216
ASIC Engineer
radrokOn CPU heavy games? No way on earth, you are dreaming.

Higher IPC from Intel CPUs also helps a ton on minimum framerates, average doesn't tell the whole story.
Don't worry he defends anything AMD like it's his own child, it's okay.
Posted on Reply
#38
radrok
cdawallDepends on the game. Higher IPC is cool until the game is heavily multithreaded, like most modern games, and then AMD starts to take the lead.
You really are convinced of what you are saying :eek:

Are you completely sure that an AMD Octa can/starts to take the lead on a multithreaded game scenario against an Intel Hexa?

If so please show me proof because every single overclock review I've seen shows the AMD CPU way behind.
Posted on Reply
#39
cdawall
where the hell are my stars
radrokYou really are convinced of what you are saying :eek:

Are you completely sure that an AMD Octa can/starts to take the lead on a multithreaded game scenario against an Intel Hexa?

If so please show me proof because every single overclock review I've seen shows the AMD CPU way behind.
There are multiple times it actually leads in minimum FPS.

www.kitguru.net/wp-content/uploads/2013/07/AvP5.png www.kitguru.net/wp-content/uploads/2013/07/sleeping-dogs4.png

www.kitguru.net/wp-content/uploads/2013/07/tomb-raider3.png www.kitguru.net/wp-content/uploads/2013/07/max-payne-3.png

www.kitguru.net/wp-content/uploads/2013/07/dirt-showdown5.png www.kitguru.net/wp-content/uploads/2013/07/metro-last-light5.png

www.kitguru.net/wp-content/uploads/2013/07/GRID-21.png
Posted on Reply
#40
radrok
Heavy GPU bound situations do not reflect CPU performance, imho.

I'm betting the CPUs here don't even get properly stressed.

Throw in one or a couple more GPUs and you'll see the difference.
Posted on Reply
#41
cdawall
where the hell are my stars
radrokHeavy GPU bound situations do not reflect CPU performance, imho.

I'm betting the CPUs here don't even get properly stressed.

Throw in one or a couple more GPUs and you'll see the difference.
Are those not acceptable frame rates in multiple AAA titles at a resolution people play at?

Win, lose, or draw, it shows the minimum FPS on the AMD side for multiple games. That means there is something within the AMD setup allowing better performance.

You can argue GPU-bound vs CPU-bound all you want, but in the real world those graphs show it all: there will be no difference in actual performance while gaming. Mind you, that is a basically stock-clocked 9590 as well.
Posted on Reply
#43
cdawall
where the hell are my stars
radrokActually those minimum framerates are in the margin of error of GPU rendering performance.

Wanna bet the average/maximum and minimum framerate with a 5GHz 3930/3960 will always be favouring the Intel CPU?

Take a look here, how the game changes when you add more than one GPU :)

www.anandtech.com/show/6985/choosing-a-gaming-cpu-at-1440p-adding-in-haswell-/7
That game sucks.



Two GPUs, Dirt 3, and it is within 10%. I would honestly be curious how it did with one of the boards running PCI-E 3.0 vs 2.0. It appears to have made a huge difference with the Intel chips.
Posted on Reply
#44
radrok
That's better, for sure.

What I really don't like about the AMD CPU (for gaming) is the fact it blows on poorly threaded games.

It's a really good multithreaded CPU, but you can feel it lacks in the IPC department, and that's a deal killer for me.

They need to increase that to get back into the game and not get laughed at.
Posted on Reply
#45
MxPhenom 216
ASIC Engineer
cdawallThere are multiple times it actually leads in minimum FPS.

www.kitguru.net/wp-content/uploads/2013/07/AvP5.png www.kitguru.net/wp-content/uploads/2013/07/sleeping-dogs4.png

www.kitguru.net/wp-content/uploads/2013/07/tomb-raider3.png www.kitguru.net/wp-content/uploads/2013/07/max-payne-3.png

www.kitguru.net/wp-content/uploads/2013/07/dirt-showdown5.png www.kitguru.net/wp-content/uploads/2013/07/metro-last-light5.png

www.kitguru.net/wp-content/uploads/2013/07/GRID-21.png
Are you for real?
Posted on Reply
#46
cdawall
where the hell are my stars
radrokThat's better, for sure.

What I really don't like about the AMD CPU (for gaming) is the fact it blows on poorly threaded games.

It's a really good multithreaded CPU, but you can feel it lacks in the IPC department, and that's a deal killer for me.

They need to increase that to get back into the game and not get laughed at.
Hopefully Steamroller will fix that issue. For now AMD owners just have to deal with the added multithreading performance. This will be my first FX chip, so it will be interesting.
MxPhenom 216Are you for real?

Using those benchmarks to determine CPU performance in gaming is wrong. Set the resolution and settings the lowest it will go, and then you have relevant benchmarks to determine overall CPU performance in games.
I personally couldn't give two shits how it performs at 640x480; I care about actual real-world performance. If it can play my game in Eyefinity just as well as an Intel, why swap?
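The disagreement here comes down to a crude bottleneck model (purely illustrative numbers): delivered frame rate is roughly capped by whichever of the CPU or GPU is slower, which is why low-resolution tests expose the CPU gap and high-resolution or Eyefinity tests hide it.

```python
# Crude bottleneck model: delivered FPS is roughly the slower of the two limits.
# All numbers are hypothetical, for illustration only.

def delivered_fps(cpu_limit_fps, gpu_limit_fps):
    """The slower component caps the frame rate you actually see."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_a, cpu_b = 140, 90  # two CPUs, one with higher per-thread performance

# At high resolution / Eyefinity, the GPU caps both systems identically...
print(delivered_fps(cpu_a, gpu_limit_fps=60), delivered_fps(cpu_b, gpu_limit_fps=60))

# ...while at 640x480 the GPU limit vanishes and the CPU gap shows in full.
print(delivered_fps(cpu_a, gpu_limit_fps=500), delivered_fps(cpu_b, gpu_limit_fps=500))
```

Both posters are describing the same model from different ends: low-resolution tests measure the CPU limit in isolation, while real-world settings measure whether that limit ever becomes the bottleneck.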
Posted on Reply
#47
radrok
Anyway, let's get out of gaming for a second, where GPU is way more important than CPU.

From the same review

www.kitguru.net/wp-content/uploads/2013/07/3dstudio-max.png

Really shows you AMD has some serious catching up to do if they want to price their products where they have.

Doesn't even begin to match a STOCK i7 hexa.
Posted on Reply
#48
cdawall
where the hell are my stars
radrokAnyway, let's get out of gaming for a second, where GPU is way more important than CPU.

From the same review

www.kitguru.net/wp-content/uploads/2013/07/3dstudio-max.png

Really shows you AMD has some serious catching up to do if they want to price their products where they have.

Doesn't even begin to match a STOCK i7 hexa.
Use an encoder that utilizes AVX or FMA4 and the performance is very close.
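Whether an encoder can use those instruction sets depends on the CPU advertising the right feature flags. On Linux these appear in /proc/cpuinfo; a rough sketch of checking them (the sample flags line below is trimmed and hypothetical):

```python
# Rough sketch: check which SIMD/FMA extensions a CPU advertises on Linux.
# FX "Piledriver" chips expose avx, fma (FMA3), fma4 and xop; Sandy Bridge-E
# exposes avx but neither FMA variant.

def supported_extensions(cpuinfo_text, wanted=("avx", "fma", "fma4", "xop")):
    """Return the subset of `wanted` flags present in a /proc/cpuinfo dump."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            return sorted(f for f in wanted if f in flags)
    return []

# Example with a trimmed, hypothetical flags line from an FX-8350:
sample = "flags\t\t: fpu mmx sse sse2 sse4a avx xop fma fma4 bmi1"
print(supported_extensions(sample))  # ['avx', 'fma', 'fma4', 'xop']
```

In practice you would read the real file with `open("/proc/cpuinfo").read()`; the point is that an encoder built with an FMA4 code path can exploit flags the Intel chips of that era simply don't have.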
Posted on Reply
#49
radrok
That's the issue, to get the best performance out of it you need to use this and use that.

Why do you have to compromise when you can get a product that performs well everywhere?
Posted on Reply
#50
cdawall
where the hell are my stars
radrokThat's the issue, to get the best performance out of it you need to use this and use that.

Why do you have to compromise when you can get a product that performs well everywhere?
Same reason those CPUs are taking over the server market pretty quickly: they do perform well. I don't use any single application where the performance of an FX CPU is so awful I cannot happily/quickly complete my task, but there are applications I use where it works exceedingly well (3D CAD with proper rendering tools, and the games I play).
Posted on Reply