Thursday, July 18th 2013

More Core i7-4960X "Ivy Bridge-E" Benchmarks Surface

More benchmarks of Intel's upcoming socket LGA2011 flagship client processor, the Core i7-4960X "Ivy Bridge-E," surfaced on the web. Tom's Hardware scored an engineering sample of the chip and wasted no time comparing it with contemporaries spanning three previous Intel generations and AMD's current generation. These include chips such as the i7-3970X, i7-4770K, i7-3770K, i7-2700K, FX-8350, and A10-5800K.

In synthetic tests, the i7-4960X runs neck and neck with the i7-3970X, offering a 5 percent performance increment at best. It's significantly faster than the i7-3930K, which has occupied Intel's $500-ish slot for over seven quarters running. Its six cores and twelve SMT threads give it a definite edge over quad-core Intel parts in multi-threaded synthetic tests; in single-threaded tests, the $350 i7-4770K is highly competitive with it. The only major surprise on offer is power draw. Despite a rated TDP of 130W, on par with the i7-3960X, the i7-4960X "Ivy Bridge-E" is significantly more energy-efficient, which can be attributed to the 22 nm process it's built on, compared to its predecessor's 32 nm. Find the complete preview at the source.
Source: Tom's Hardware

72 Comments on More Core i7-4960X "Ivy Bridge-E" Benchmarks Surface

#51
radrok
So basically Intel is secretly awful and no one recognizes that?

Please.
Posted on Reply
#52
MxPhenom 216
ASIC Engineer
cdawallHopefully Steamroller will fix that issue. For now AMD owners just have to deal with the added multithreading performance. This will be my first FX chip, so it will be interesting.



I personally couldn't give two shits how it performs at 640x480; I care about actual real-world performance. If it can play my game just as well in Eyefinity as an Intel chip, why swap?
Uh yeah disregard what I said. I was at work when I said that, then on my way home I thought about it and facepalmed myself.

What I find interesting, though, is that the CPU performance difference in gaming becomes far smaller as game resolution increases. You are right that AMD and Intel are pretty close in gaming performance, but only close; Intel still has the performance crown, and they will continue to have it till AMD does a big overhaul on their sockets and chipsets.
Posted on Reply
#53
cdawall
where the hell are my stars
radrokSo basically Intel is secretly awful and no one recognizes that?

Please.
Well actually...link
MxPhenom 216Uh yeah disregard what I said. I was at work when I said that, then on my way home I thought about it and facepalmed myself.

What I find interesting, though, is that the CPU performance difference in gaming becomes far smaller as game resolution increases. You are right that AMD and Intel are pretty close in gaming performance, but only close; Intel still has the performance crown, and they will continue to have it till AMD does a big overhaul on their sockets and chipsets.
Technically they're only close with both reasonably overclocked... FX at 5GHz and SB-E at 4.4GHz, and neither is really low-wattage. I feel stock vs. stock the 9590 and 3960X would be close in all ways as far as games go.

The Haswell mainstream chips are a nice toss into the mix, but the half-assed heatspreader and heat issue would make me mad, not to mention the extra cost of an entire system change makes it not worth it for me. :laugh:
Posted on Reply
#54
ViperXTR
Well, those FX chips seem to be better in Crysis 3's grassy areas, where the grass is physically simulated on the CPU; the more threads, the better.
Posted on Reply
#55
ensabrenoir
.....don't know who you people are ...or how you got into my computer, but stop all this arguing!!! You're giving my HDD the willies!!!!:roll:

The AMD-or-Intel debate is just like the chicken-or-the-egg debate.... It don't matter.... people eat 'em both.
Posted on Reply
#56
Deadlyraver
Oh, a super expensive chip for my gaming? Why not! I am sure I will be hijacking NORAD soon.

Plus it ain't Haswell, why bother?
Posted on Reply
#57
HumanSmoke
cdawallSame reason those CPUs are taking over the server market pretty quickly.
Define "pretty quickly". From the shipping estimates I've seen, AMD's clawing back of server/HPC market share is predicated upon ARM-based solutions from SeaMicro, and a fairly leisurely ramp for conventional x86 based Opterons.
From Mercury Research'sown analysis (presumably "96%" is rounded up from 95.7%) :
Intel controlled 96 percent of the market for servers that run on PC processors in the fourth quarter, according to Cave Creek, Arizona-based Mercury Research Inc. Advanced Micro Devices Inc. (AMD), Intel’s only rival in that market, had 4.3 percent.
As for the hoohah about gaming and CPUs... the bulk of games are of course graphics-constrained, so a user would need a pretty narrow focus to argue gaming as a fundamental feature of a $1K processor, IMO. Personally, if it's a gaming benchmark comparison, I'd be more inclined to check out games that actually keep a CPU gainfully employed - strong AI routines, comprehensive CPU physics, etc. - usually the province of RTS games, e.g. Skyrim.

It's little wonder that games such as Tomb Raider and MLL show little differentiation between CPUs or clock speeds.
Posted on Reply
#58
cdawall
where the hell are my stars
I swear, if I see one more shitty Skyrim benchmark... It's poorly coded, AMD sucks with it, we fucking get it.
Posted on Reply
#59
radrok
cdawallI swear, if I see one more shitty Skyrim benchmark... It's poorly coded, AMD sucks with it, we fucking get it.
Have to agree with you here, the engine on which Skyrim runs is old as the universe itself and it should have been dropped with Oblivion.



On a side note I'd really want both AMD and Intel to increase core count for the Desktop platform.
Posted on Reply
#60
HumanSmoke
radrokHave to agree with you here, the engine on which Skyrim runs is old as the universe itself and it should have been dropped with Oblivion
Unfortunately, RTS titles used in gaming benchmark suites are a fairly small subset of a fairly small subset (games that are CPU-intensive) now that Civ5 has largely passed from benchmarking suites. Likewise, large maps and competent AI aren't synonymous with most games used in benchmarking, which tend towards the corridor-shooter FPS variety... and even when you find a game that is taxing the CPU, it may be the product of stalls due to coding and/or memory usage... Borderlands 2 likely fitting some of those parameters.


Feel free to stone the messenger, but until sites utilize a CPU-intensive game (preferably one that scales adequately with core count), you're stuck with what is benchmarked in comparative CPU game testing - and of course, the paucity of games that fit the bill says something in itself about the general requirement.
Posted on Reply
#61
cdawall
where the hell are my stars
Skyrim is still and always will be a worthless benchmark. Doesn't scale with cores, doesn't multithread...POS benchmark and you know it.
Posted on Reply
#62
MxPhenom 216
ASIC Engineer
cdawallSkyrim is still and always will be a worthless benchmark. Doesn't scale with cores, doesn't multithread...POS benchmark and you know it.
Agreed. You could disable all but one core, overclock it to the sky, and get the same performance as with all cores enabled and overclocked to the same speed lol.

The reason AMD gets bad performance in Skyrim is just that, too: AMD FX chips are not known for their single-threaded performance, whereas Intel's are.
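A quick Amdahl's-law sketch makes the point concrete; the 10% parallel fraction below is a purely hypothetical stand-in for a poorly threaded game, not a measured figure for Skyrim:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# parallel fraction of the workload and n is the core count.

def amdahl_speedup(p: float, cores: int) -> float:
    """Theoretical whole-program speedup on `cores` cores."""
    return 1.0 / ((1.0 - p) + p / cores)

p = 0.10  # hypothetical parallel fraction for a poorly threaded game
for n in (1, 2, 4, 6, 8):
    print(f"{n} cores: {amdahl_speedup(p, n):.2f}x")
# Prints 1.00x, 1.05x, 1.08x, 1.09x, 1.10x: eight cores buy ~10%,
# while a clock-speed bump accelerates the dominant serial part directly.
```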
radrokHave to agree with you here, the engine on which Skyrim runs is old as the universe itself and it should have been dropped with Oblivion.



On a side note I'd really want both AMD and Intel to increase core count for the Desktop platform.
There's not much of a reason to, though. Software that everyday consumers use has kind of hit a wall in terms of the number of cores it needs. Once software begins to demand more cores, we will begin to see more cores in our hardware.
Posted on Reply
#63
radrok
MxPhenom 216Software that everyday consumers use has kind of hit a wall in terms of the number of cores it needs.
This is also the reason why Intel is more appealing than AMD right now.

AMD seems to have kinda forgotten that single-threaded performance is also important.
Posted on Reply
#64
cdmikelis
Do not scorn truck if you need a car
Jstn7477Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more unless you are comparing to the 3970X, gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get, although you've already stated that you don't care if your chip takes almost 300W.

Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room and a ~100w processor that provides great, predictable performance across the board because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered. ;)
--
Why do people get so upset about SB/IB-E if all you need is a gaming-oriented i5 K chip? Intel gives the "E"s to those of us who need them, not to those who don't. It's like a truck: it consumes several times more fuel than a personal vehicle, it's not good at cornering, it's noisy, and it heats up a lot. But try to convince a cargo company that using a family car would be better for them. I do a lot of video rendering: seconds saved are multiplied many times over, so at the end of the day I save tens of minutes, at the end of the month hours, and at the end of the year days of my valuable working time. My ROI for an "only" 10% faster chip or a next-generation GPU is 3-6 months (the sketch below runs the numbers). But for an office computer, for making invoices, I still have an E6800 with 4GB of RAM (and will for a while).
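A minimal back-of-the-envelope version of that break-even calculation; every figure below (daily render hours, hourly rate, upgrade cost) is a hypothetical assumption for illustration, not data from the post:

```python
# Break-even time for a faster rendering CPU, following the
# truck-vs-car argument above. All numbers are hypothetical.

hours_rendering_per_day = 8.0   # assumed daily render workload
speedup = 0.10                  # the "only" 10% faster chip
hourly_rate = 40.0              # assumed value of working time, $/hour
upgrade_cost = 2000.0           # assumed CPU + board + RAM premium, $

# A 10% faster chip turns each hour of rendering into 1/1.10 hours.
hours_saved_per_day = hours_rendering_per_day * (1 - 1 / (1 + speedup))
value_saved_per_day = hours_saved_per_day * hourly_rate
break_even_days = upgrade_cost / value_saved_per_day

print(f"time saved: {hours_saved_per_day * 60:.0f} min/day")
print(f"break-even: {break_even_days:.0f} working days")
# ~44 min/day saved; ~69 working days, i.e. a bit over three months --
# within the 3-6 month ROI window described above.
```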

Second: Intel CANNOT give us Haswell-E until the server platform makes that move. Server owners don't care about computer games, but they do care about TCO and ROI, so changing platforms every year would instantly penalize Intel and reward the competition. That's it.
Posted on Reply
#65
TheHunter
cdawallWell actually...link



Technically they're only close with both reasonably overclocked... FX at 5GHz and SB-E at 4.4GHz, and neither is really low-wattage. I feel stock vs. stock the 9590 and 3960X would be close in all ways as far as games go.

The Haswell mainstream chips are a nice toss into the mix, but the half-assed heatspreader and heat issue would make me mad, not to mention the extra cost of an entire system change makes it not worth it for me. :laugh:
What heat issue? I have a 4770K OC'ed to 4.6GHz @ 1.238V and I've never seen a per-core temperature over 60°C; it's usually 47-55°C in any game so far.


In something like Cinebench 11.5, or any video converter using the x264 codec with 8 threads, it hits 62-74°C. It will be better in winter, though. And I'm only on an H90; atm it's 40°C outside and 29°C inside, which plays a part too xD
Posted on Reply
#66
fullinfusion
Vanguard Beta Tester
TheHunterWhat heat issue
The heat issue all the reviews report on with the IB and Haswell CPUs.

That H90 you have is one hell of a cooler at the claimed 29°C inside temperature.

:shadedshu

Try loading it up under OCCT or better yet Prime95!
I betcha you stop the program once you see the temp shoot to the moon ;)
Posted on Reply
#67
Aquinus
Resident Wat-man
TheHunterIn something like Cinebench 11.5, or any video converter using the x264 codec with 8 threads, it hits 62-74°C
Sounds a lot like my 3820; it consumes a lot of power, and I only have a Zalman CNPS 9900 cooling it to get similar temperatures with a similar OC. I wouldn't call that optimal. I read that IVB-E has the heatspreader soldered to the die, but I'm not sure if that's true.
Posted on Reply
#68
brandonwh64
Addicted to Bacon and StarCrunches!!!
Not that big of a performance increase to justify the price.
Posted on Reply
#69
TheHunter
fullinfusionThe heat issue all the reviews report on with the IB and Haswell CPUs.

That H90 you have is one hell of a cooler at the claimed 29°C inside temperature.

:shadedshu

Try loading it up under OCCT or better yet Prime95!
I betcha you stop the program once you see the temp shoot to the moon ;)
Yep, that's how it performs in a real-world scenario.


I ran IBT, LinX, and Prime95, and yeah, it was 80-90°C. So? I will never see that in any app or game, so I'm not bothered with those tools. Even Intel said they're useless and not a real stability indicator. And it's true: I passed all the torture tests and yet failed in BF3 lol :D
AquinusSounds a lot like my 3820; it consumes a lot of power, and I only have a Zalman CNPS 9900 cooling it to get similar temperatures with a similar OC. I wouldn't call that optimal. I read that IVB-E has the heatspreader soldered to the die, but I'm not sure if that's true.
I never said it's optimal, and that's the worst-case scenario in summer. When I get higher static-pressure fans (2.7mm H2O) it will change a lot (atm 1.5mm H2O) :)


About the soldering, someone leaked an IB-E delidding and it showed proper soldering.
Posted on Reply
#70
radrok
TheHunterYep, that's how it performs in a real-world scenario.


I ran IBT, LinX, and Prime95, and yeah, it was 80-90°C. So? I will never see that in any app or game, so I'm not bothered with those tools. Even Intel said they're useless and not a real stability indicator. And it's true: I passed all the torture tests and yet failed in BF3 lol :D



I never said it's optimal, and that's the worst-case scenario in summer. When I get higher static-pressure fans (2.7mm H2O) it will change a lot (atm 1.5mm H2O) :)


About the soldering, someone leaked an IB-E delidding and it showed proper soldering.
If you manage to pass 1 hour of LinX with AVX enabled, I'm pretty sure you won't find anything that makes your PC crash.

I use that as a stability test, and I've never had my PC hang after passing it.

Beware: it shoots your temps to the moon (unlike non-AVX LinX); it makes my 3930K go up to 80°C on a gigantic custom water loop.
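For context, LinX is a front-end for Intel's Linpack binary, i.e. dense-matrix math. A rough, much milder stand-in (a sketch, not the real tool) is a loop of large NumPy matrix products, which dispatch to a BLAS library that uses AVX on capable CPUs, which is the same reason the AVX builds run so much hotter:

```python
# Crude Linpack-style load generator: repeated large matrix products.
# NumPy's matmul calls an optimized BLAS (MKL/OpenBLAS), which uses
# SIMD (AVX where available) across all cores.
import time
import numpy as np

N = 4096                      # matrix size; raise for a heavier load
a = np.random.rand(N, N)
b = np.random.rand(N, N)

end = time.time() + 60        # run for roughly 60 seconds
while time.time() < end:
    c = a @ b                 # BLAS GEMM: the FLOP-dense hot loop
    checksum = float(c[0, 0]) # touch the result so work isn't skipped

print("done; watch core temperatures while this runs")
```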
Posted on Reply