
More Core i7-4960X "Ivy Bridge-E" Benchmarks Surface

That's the issue: to get the best performance out of it, you need to use this and use that.

Why do you have to compromise when you can get a product that performs well everywhere?

Same reason those CPUs are taking over the server market pretty quickly. They do perform well. I don't use any single application where the performance of an FX CPU is so awful that I can't happily/quickly complete my task, but there are applications I use where it works exceedingly well (3D CAD with proper rendering tools, and the games I play).
 
So basically Intel is secretly awful and no one recognizes that?

Please.
 
Hopefully Steamroller will fix that issue. For now, AMD owners just have to deal with the added multithreading performance. This will be my first FX chip, so it will be interesting.



I personally couldn't give two shits how it performs at 640x480; I care about actual real-world performance. If it can play my games just as well in Eyefinity as an Intel chip, why swap?

Uh yeah, disregard what I said. I was at work when I said that, then on my way home I thought about it and facepalmed.

What I find interesting, though, is that CPU performance differences in gaming become far smaller as game resolutions increase. You are right that AMD and Intel are pretty close in gaming performance, but it's only close; Intel still has the performance crown, and they will keep it until AMD does a big overhaul of their sockets and chipsets.
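That resolution-scaling point is easy to see with a toy model: treat each frame as costing roughly max(CPU time, GPU time), since the slower side sets the pace. All the per-frame numbers below are made up purely for illustration, not taken from any benchmark.

```python
# Toy frame-time model (illustrative numbers only):
# once the GPU side dominates, a faster CPU stops showing up in the FPS figure.

def fps(cpu_ms, gpu_ms):
    """Approximate FPS when CPU and GPU work mostly overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu_ms, slow_cpu_ms = 4.0, 6.0   # hypothetical per-frame CPU costs

# Low resolution (640x480): the GPU is cheap, so the CPU gap is visible.
print(fps(fast_cpu_ms, gpu_ms=2.0))   # 250.0
print(fps(slow_cpu_ms, gpu_ms=2.0))   # ~166.7

# Eyefinity-class resolution: the GPU dominates and both CPUs tie.
print(fps(fast_cpu_ms, gpu_ms=20.0))  # 50.0
print(fps(slow_cpu_ms, gpu_ms=20.0))  # 50.0
```

Which is why the 640x480 numbers exaggerate CPU differences that vanish at the resolutions people actually play at.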
 
So basically Intel is secretly awful and no one recognizes that?

Please.

Well actually...link

Uh yeah disregard what I said. I was at work when I said that, then on my way home I thought about it and facepalmed myself.

What I find interesting, though, is that CPU performance differences in gaming become far smaller as game resolutions increase. You are right that AMD and Intel are pretty close in gaming performance, but it's only close; Intel still has the performance crown, and they will keep it until AMD does a big overhaul of their sockets and chipsets.

Technically, it's only close with both reasonably overclocked... FX at 5GHz and SB-E at 4.4GHz, and neither is really low wattage. I feel stock vs. stock the 9590 and 3960X would be close in all ways as far as games go.

The Haswell mainstream chips are a nice toss into the mix, but the half-assed heatspreader and the heat issue would make me mad, not to mention the extra cost of an entire system change makes it not worth it for me. :laugh:
 
Well, those FX chips seem to do better in Crysis 3's grassy areas, where the grass is physically simulated on the CPU; the more threads, the better.
 
.....don't know who you people are or how you got into my computer, but stop all this arguing!!! You're giving my HDD the willies!!!! :roll:

The AMD or Intel debate is just like the chicken-or-the-egg debate... It don't matter... people eat 'em both.
 
Oh, a super expensive chip for my gaming? Why not! I am sure I will be hijacking NORAD soon.

Plus, it ain't Haswell, so why bother?
 
Same reason those CPUs are taking over the server market pretty quickly.
Define "pretty quickly". From the shipping estimates I've seen, AMD's clawing back of server/HPC market share is predicated upon ARM-based solutions from SeaMicro, and a fairly leisurely ramp for conventional x86 based Opterons.
From Mercury Research's own analysis (presumably "96%" is rounded up from 95.7%) :
Intel controlled 96 percent of the market for servers that run on PC processors in the fourth quarter, according to Cave Creek, Arizona-based Mercury Research Inc. Advanced Micro Devices Inc. (AMD), Intel’s only rival in that market, had 4.3 percent.
As for the hoohah about gaming and CPUs... the bulk of games are of course graphics constrained, so a user would need a pretty narrow focus to argue gaming as a fundamental feature of a $1K processor, IMO. Personally, if it's a gaming benchmark comparison, I'd be more inclined to check out games that actually keep a CPU gainfully employed - i.e. strong AI routines, comprehensive CPU physics etc. - usually the province of RTS games - i.e. Skyrim
[attached image: FX-9590-67.jpg]

It's little wonder that games such as Tomb Raider and MLL show little differentiation between CPUs or clock speeds.
 
I swear, if I see one more shitty Skyrim benchmark... It's poorly coded, AMD sucks with it, we fucking get it.
 
I swear, if I see one more shitty Skyrim benchmark... It's poorly coded, AMD sucks with it, we fucking get it.

Have to agree with you here; the engine Skyrim runs on is as old as the universe itself and should have been dropped with Oblivion.



On a side note, I'd really like both AMD and Intel to increase core counts on the desktop platform.
 
Have to agree with you here; the engine Skyrim runs on is as old as the universe itself and should have been dropped with Oblivion.
Unfortunately, RTS titles used in gaming benchmark suites are a fairly small subset of a fairly small subset (games that are CPU intensive), now that Civ5 has largely passed from benchmarking suites. Likewise, large maps and competent AI aren't synonymous with most games used in benchmarking, which tend towards the corridor-shooter FPS variety... and even when you find a game that is taxing the CPU, it may be the product of stalls due to coding and/or memory usage, Borderlands 2 likely fitting some of those parameters.
[attached image: CPU1.png]


Feel free to stone the messenger, but until sites use a CPU-intensive game (preferably one that scales adequately with core count), you're stuck with what is benchmarked in comparative CPU game testing - and of course, the paucity of games that fit the bill says something in itself about the general requirement.
 
Skyrim is still, and always will be, a worthless benchmark. It doesn't scale with cores, doesn't multithread... POS benchmark and you know it.
 
Skyrim is still, and always will be, a worthless benchmark. It doesn't scale with cores, doesn't multithread... POS benchmark and you know it.

Agreed. You could disable all but one core, overclock it to the sky, and get the same performance as with all cores enabled and overclocked to the same speed, lol.

The reason AMD gets bad performance in Skyrim is just that, too: AMD FX chips are not known for their single-thread performance, whereas Intel is.
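A rough way to see why extra cores don't save a mostly serial game is Amdahl's law; the parallel fractions below are invented for illustration, not measured from either game.

```python
# Amdahl's law sketch (illustrative fractions, not measurements):
# if a game's frame loop is almost entirely serial, extra cores do
# nothing and only single-thread speed matters.

def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup from spreading the parallel part over `cores`."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Skyrim-like workload: say only ~10% of frame work parallelizes.
print(round(speedup(0.10, 8), 2))   # ~1.1 -> eight cores barely help

# Crysis 3 grass-like workload: say ~80% parallelizes.
print(round(speedup(0.80, 8), 2))   # ~3.33 -> cores pay off
```

Under those assumptions, an eight-core FX gains almost nothing on a Skyrim-like loop, which is why single-thread speed decides that benchmark.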

Have to agree with you here; the engine Skyrim runs on is as old as the universe itself and should have been dropped with Oblivion.



On a side note, I'd really like both AMD and Intel to increase core counts on the desktop platform.

There's not much of a reason, though. Software that everyday consumers use has kind of hit a wall in terms of the number of cores it needs. Once software begins to demand more cores, we will begin to see more cores in our hardware.
 
Software that everyday consumers use has kind of hit a wall in terms of the number of cores it needs.

This is also the reason why Intel is more appealing than AMD right now.

AMD seems to have kind of forgotten that single-threaded performance is also important.
 
Do not scorn a truck if you need a car

Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more (unless you are comparing to the 3970X), gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get - although you've already stated that you don't care if your chip pulls almost 300W.

Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room, and a ~100W processor that provides great, predictable performance across the board, because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games, which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered. ;)

--
Why do people get so upset about SB/IB-E if all they need is a gaming-oriented i5 K chip? Intel gives the "E" parts to those of us who need them, not to those who don't. It's like a truck: it consumes several times more fuel than a personal vehicle, it's not good at cornering, it's noisy, and it runs hot. But try to convince a cargo company that a family car would serve them better. I do a lot of video rendering: seconds saved are multiplied many times over, so at the end of the day I save tens of minutes, at the end of the month I save hours, and at the end of the year I save days of my valuable working time. My ROI for an "only" 10% faster chip or a next-generation GPU is 3-6 months. But for an office computer, for making invoices, I still have an E6800 with 4GB of RAM (and will have it for a while).

Second: Intel CANNOT give us Haswell-E until the server platform makes that move. Server owners don't care about computer games, but they do care about TCO and ROI, so changing platforms every year would instantly penalize Intel and reward the competition. That's it.
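The render-time ROI argument above is just arithmetic; here is a quick sketch of it where every input (renders per day, render length, hourly rate, upgrade price) is a hypothetical number chosen for illustration.

```python
# Back-of-envelope ROI for a ~10% faster chip (all inputs hypothetical).

renders_per_day = 40        # assumed workload
seconds_per_render = 120    # assumed render length
gain = 0.10                 # the "only" 10% faster chip

saved_per_day_s = renders_per_day * seconds_per_render * gain
print(saved_per_day_s / 60)                     # 8.0 minutes saved per day

workdays_per_month = 22
hours_per_month = saved_per_day_s * workdays_per_month / 3600
print(round(hours_per_month, 1))                # ~2.9 hours saved per month

# Payback: assumed 250 upgrade cost against an assumed 30/hour work rate.
print(round(250 / (hours_per_month * 30), 1))   # ~2.8 months to break even
```

With numbers in that ballpark, a 3-6 month ROI for a 10% faster chip is entirely plausible, which is the whole truck-vs-car point.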
 
Well actually...link



Technically, it's only close with both reasonably overclocked... FX at 5GHz and SB-E at 4.4GHz, and neither is really low wattage. I feel stock vs. stock the 9590 and 3960X would be close in all ways as far as games go.

The Haswell mainstream chips are a nice toss into the mix, but the half-assed heatspreader and the heat issue would make me mad, not to mention the extra cost of an entire system change makes it not worth it for me. :laugh:

What heat issue? I have a 4770K OC'ed to 4.6GHz @ 1.238V and I've never seen a per-core temperature over 60C; it's usually 47-55C in any game so far.


In something like Cinebench 11.5, or Any Video Converter using the 8-thread x264 codec, it hits 62-74C. It will be better in winter, though; I'm only on an H90, and it being 40C outside and 29C inside atm plays a part too xD
 
What heat issue
The heat issue all the reviews report on the IB and Haswell CPUs.

That H90 you have is one hell of a cooler at the claimed 29C inside temperature.

:shadedshu

Try loading it up under OCCT or better yet Prime95!
I betcha you stop the program once you see the temp shoot to the moon ;)
 
In something like Cinebench 11.5 or Any Video Converter using the 8-thread x264 codec it hits 62-74C

Sounds a lot like my 3820; it consumes a lot of power, and I only have a Zalman CNPS 9900 cooling it, getting similar temperatures with a similar OC. I wouldn't call that optimal. I read that IVB-E has the heatspreader soldered to the die, but I'm not sure whether that's true.
 
Not that big of a performance increase to justify the price.
 
The heat issue all the reviews report on the IB and Haswell CPUs.

That H90 you have is one hell of a cooler at the claimed 29C inside temperature.

:shadedshu

Try loading it up under OCCT or better yet Prime95!
I betcha you stop the program once you see the temp shoot to the moon ;)

Yep, that's how it performs in a real-world scenario.


I ran IBT, LinX and Prime95, and yeah, it was 80-90C. So? I will never see that in any app or game, so I'm not bothered by those tools. Even Intel said they're useless and not a real stability indicator. And it's true: I passed all the torture tests and yet failed in BF3 lol :D

Sounds a lot like my 3820; it consumes a lot of power, and I only have a Zalman CNPS 9900 cooling it, getting similar temperatures with a similar OC. I wouldn't call that optimal. I read that IVB-E has the heatspreader soldered to the die, but I'm not sure whether that's true.

I never said it's optimal, and that's the worst-case scenario in summer. When I get higher static pressure fans (2.7mmH2O vs. the current 1.5mmH2O) it will change a lot :)


About soldering: someone leaked an IVB-E delidding and it showed proper soldering.
 
Yep, that's how it performs in a real-world scenario.


I ran IBT, LinX and Prime95, and yeah, it was 80-90C. So? I will never see that in any app or game, so I'm not bothered by those tools. Even Intel said they're useless and not a real stability indicator. And it's true: I passed all the torture tests and yet failed in BF3 lol :D



I never said it's optimal, and that's the worst-case scenario in summer. When I get higher static pressure fans (2.7mmH2O vs. the current 1.5mmH2O) it will change a lot :)


About soldering: someone leaked an IVB-E delidding and it showed proper soldering.

If you manage to pass one hour of LinX with AVX enabled, I'm pretty sure you won't find anything that makes your PC crash.

I use that as stability test and I've never had my pc hang after passing that.

Beware, it shoots your temps to the moon (unlike non-AVX LinX); it makes my 3930K go up to 80C on a gigantic custom water loop.
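For anyone who wants to keep an eye on temps while one of these torture tests runs, here is a minimal logger sketch. It assumes a Linux box exposing thermal zones under /sys/class/thermal (paths and zone availability vary by machine, and this is my own helper, not part of any of the tools mentioned above).

```python
# Minimal Linux CPU temperature logger to run alongside Prime95/LinX/OCCT.
# Assumes sysfs thermal zones; zones that are missing or unreadable are skipped.

import glob

def read_temp_c(path):
    """sysfs reports millidegrees Celsius as plain text, e.g. '54000\n'."""
    with open(path) as f:
        return int(f.read().strip()) / 1000.0

def log_once():
    """Print one reading per available thermal zone."""
    for zone in sorted(glob.glob("/sys/class/thermal/thermal_zone*/temp")):
        try:
            print(zone, read_temp_c(zone), "C")
        except (OSError, ValueError):
            pass  # zone unreadable on this machine; skip it

if __name__ == "__main__":
    # Start the stress test in another terminal, then run this repeatedly
    # (e.g. under `watch -n 5`) and kill the run if temps head toward Tjmax.
    log_once()
```

On machines without those sysfs zones (or on Windows) you'd need a different sensor source, so treat this strictly as a Linux-side sketch.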
 