Friday, February 1st 2013

Intel "Haswell" Quad-Core CPU Benchmarked, Compared Clock-for-Clock with "Ivy Bridge"

Russian tech publication OCLab.ru, which claims access to an Intel next-generation Core "Haswell" processor engineering sample (and an LGA1150 8-series motherboard!), wasted no time in running a quick clock-for-clock performance comparison against the current Core "Ivy Bridge" processor. For the comparison, it fixed both chips at 2.80 GHz (by disabling Turbo Boost, C1E, and EIST), which suggests the engineering sample in OCLab's possession doesn't go beyond that frequency.

The two chips were put through SuperPi 1M, PiFast, and wPrime 32M. The Core "Haswell" chip is only marginally faster than "Ivy Bridge," and is in fact slower in one test. In the next battery of tests, the reviewer stepped up the workload, putting the chips through single-threaded SuperPi 32M and multi-threaded wPrime 1024M. While wPrime performance is nearly identical between the two chips, "Haswell" crunched SuperPi 32M about 3 percent quicker than "Ivy Bridge." It is still too early to call the overall performance difference between the two architectures. Intel's Core "Haswell" processors launch in the first week of June.
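Percentage figures like the one above come straight from the ratio of completion times. A minimal sketch of the arithmetic; the times below are hypothetical and only chosen to illustrate a 3 percent gap, not OCLab's published numbers:

```python
def percent_faster(time_old: float, time_new: float) -> float:
    """Percent reduction in completion time going from the old chip to the new one."""
    return (time_old - time_new) / time_old * 100.0

# Hypothetical SuperPi 32M completion times in seconds (illustrative only,
# not OCLab's actual figures).
ivy_bridge_s = 620.0
haswell_s = 601.4
print(f"Haswell finishes {percent_faster(ivy_bridge_s, haswell_s):.1f}% quicker")
```

Lower is better in SuperPi and wPrime, which is why the gap is computed from time saved rather than from a raw score.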

Source: OCLab.ru via X-bit Labs

118 Comments on Intel "Haswell" Quad-Core CPU Benchmarked, Compared Clock-for-Clock with "Ivy Bridge"

#1
Aquinus
Resident Wat-man
by: Patriot
431s for wPrime 1024M... darn, that's slow.
I can do it in ~30s
Don't be cocky. Not everyone has a 4P server to fold with. Run SuperPi instead and that result will change pretty quickly because you won't be using 48 cores. :wtf:
My 3820 @ 4.3 GHz did it in 215s.
#2
Patriot
by: Aquinus
Don't be cocky. Not everyone has a 4P server to fold with. Run SuperPi instead and that result will change pretty quickly because you won't be using 48 cores. :wtf:
My 3820 @ 4.3 GHz did it in 215s.
It actually wasn't that bad... just didn't set any world records.
At 3.8 GHz, Magny-Cours is pretty potent. It was folding stable at 3.48 GHz... a 75% overclock. :)
And it's a personal rig... I built her from the ground up. It cost less than many an SB-E rig.

But yes, high-clocked uPs have their place, though I tend to drift toward more threads, being a folder. I do enjoy an SB-E as a daily driver.
#3
Totally
by: Dj-ElectriC
I don't believe in clock-for-clock benchmarks and find them pointless.
1. A CPU can be as fast as another clock-for-clock, but the other comes naturally clocked higher
2. A CPU can be overclocked much further than another
So, of course, a Bugatti Veyron at 60 km/h would be as fast as a Fiat Uno at 60 km/h

For those who want examples, benchmark an i7 920 against a 3770K clock-for-clock and see what I'm talking about.
Well, your car analogy is very wrong, and clock-for-clock tests do matter to some people. For example, current Ivy Bridge owners who aren't sold on the e-peen factor: this shows them that a simple trip into the BIOS and the tweak of a few parameters will net them similar performance without spending money on a new chip. It also matters to those who are in a position to upgrade now and are faced with the question "upgrade now or wait for Haswell?"

The correct analogy would be pitting a stock Gallardo (490 hp) against a Gallardo LP570-4 Superleggera (de-tuned to 490 hp) and seeing that they set roughly the same times around any given track.
#4
Redspeed93
by: NC37
Hey uhh Intel...yeah, AMD called, they want their mediocre speed bump back.
Yeah, if these graphs are true it's kinda sad. It's partially AMD's fault. If they were actually pushing innovation and making some decent chips, Intel wouldn't be able to produce chips like this and still make a profit. But they will as long as AMD fails to impress.
#5
FreedomEclipse
~Technological Technocrat~
by: Redspeed93
as long as AMD fails to impress.
AMD has been impressing in other areas, such as APUs. As a whole, AMD hasn't targeted the high-end market in a long while.
#6
Redspeed93
by: FreedomEclipse
AMD has been impressing in other areas, such as APUs. As a whole, AMD hasn't targeted the high-end market in a long while.
But that's not really the topic at hand here. The sad truth is that there is very little reason right now to put an AMD CPU in your desktop PC.

Success with APUs or not, that still means quite a substantial loss of potential profits.
#7
Thefumigator
by: Redspeed93
But that's not really the topic at hand here. The sad truth is that there is very little reason right now to put an AMD CPU in your desktop PC.

Success with APUs or not, that still means quite a substantial loss of potential profits.
The FX-8320 is only 179 bucks, and it's a better CPU than any Intel at the same price. Unless you are just a gamer, there are several reasons to choose AMD: if you treat a desktop as a desktop, and not as a gaming-only device, then AMD has some shine. I'm just saying that giving them zero credit is somewhat extreme.

On the other hand, I feel APUs are somewhat pointless on the desktop; I prefer them on laptops.
#9
LAN_deRf_HA
by: Thefumigator
The FX-8320 is only 179 bucks, and it's a better CPU than any Intel at the same price. Unless you are just a gamer, there are several reasons to choose AMD: if you treat a desktop as a desktop, and not as a gaming-only device, then AMD has some shine. I'm just saying that giving them zero credit is somewhat extreme.

On the other hand, I feel APUs are somewhat pointless on the desktop; I prefer them on laptops.
The latest FX chips seem nice for the price in certain apps. Then you get to the power consumption page, which kinda invalidates any sense of accomplishment. They're certainly not cost-effective for any large-scale business installation. The only AMD chip I'd grab is one of their 65 W APUs, and only under very specific circumstances.
#10
TheGuruStud
by: LAN_deRf_HA
The latest FX chips seem nice for the price in certain apps. Then you get to the power consumption page, which kinda invalidates any sense of accomplishment. They're certainly not cost-effective for any large-scale business installation. The only AMD chip I'd grab is one of their 65 W APUs, and only under very specific circumstances.
Since when did that stop businesses from buying junk-in-a-box Pentium 4s and Ds? ;)
They buy whatever has the best marketing. They don't know any better. And usually Dulls.
#11
Thefumigator
by: Aquinus
I don't know about that. With the right hardware you could build an ultra-portable desktop using something like these. ITX chassis tend to get pretty small and Trinity has a lot to offer.
ASRock FM2A85X-ITX FM2 AMD A85X (Hudson D4) HDMI S...
AMD A10-5800K Trinity 3.8GHz (4.2GHz Turbo) Socket...
If you consider an HTPC a desktop, then Trinity is fine, and of course I didn't think of small-form-factor desktops; it could be useful in some scenarios. But I still believe Trinity is a killer on laptops. Such a decent CPU and an impressive GPU in sub-$650 laptops. It's just... insane.

by: LAN_deRf_HA
The latest FX chips seem nice for the price in certain apps. Then you get to the power consumption page, which kinda invalidates any sense of accomplishment. They're certainly not cost-effective for any large-scale business installation. The only AMD chip I'd grab is one of their 65 W APUs, and only under very specific circumstances.
Yeah, I know the power consumption page, and the heat too. I had to replace the stock AMD cooler because it sounded like a jet engine on my FX-8320; I bought a TX3 and it's much quieter. Still, the price of the chip itself made this system possible, and I think I can trade those inefficiencies for a price cut sometimes. Besides, it's not a super-huge difference in power consumption, and 95 W Piledrivers will be out soon...

by: TheGuruStud
Since when did that stop businesses from buying junk-in-a-box Pentium 4s and Ds? ;)
They buy whatever has the best marketing. They don't know any better. And usually Dulls.
Amen to that. Except Intel did sell completely inefficient P4s and Ds for a huge price, while in this case AMD just brings prices down to make them attractive.
#12
Aquinus
Resident Wat-man
by: Thefumigator
95 W Piledrivers will be out soon...
That's TDP, not power consumption. I feel like I have to point this out because everyone seems to think that all of a CPU's power goes into making heat. It's a processor, not a space heater. :banghead:

The TDP is how much power it takes to keep the CPU's thermals within spec. Watts are the measurement of power (not electricity), and since heat is kinetic energy, the amount of heat that needs to be dissipated is also described in watts. Since all heat in a CPU is generated by ohmic heating through a non-infinite resistance/impedance, only a portion of the CPU's power usage actually gets turned into heat. AMD tends to release less energy as heat (hence higher power consumption without dramatically higher TDPs), whereas Intel tends to release more of that energy as heat, but the TDP will never truly match the consumption of the CPU. ...But as I've said before, I suspect the difference in how much heat is generated has to do with the manufacturing process (Intel's HKMG vs. AMD's SOI).
#13
sergionography
by: LAN_deRf_HA
The latest FX chips seem nice for the price in certain apps. Then you get to the power consumption page, which kinda invalidates any sense of accomplishment. They're certainly not cost-effective for any large-scale business installation. The only AMD chip I'd grab is one of their 65 W APUs, and only under very specific circumstances.
Measuring maximum power consumption doesn't indicate anything objective about the real world, especially when comparing an 8-core with a quad-core.
For typical desktop use, I bet one AMD core is as efficient, if not more efficient, than one Intel core (power-consumption-wise, not performance, don't get me wrong). For applications that barely stress the cores, which are pretty much 90% of apps out there, AMD surely isn't bad at all. Not to mention AMD has the multithreading advantage, and I don't mean the best benchmark results; I'm talking about running multiple things at once while still having consistent performance. With an Intel quad, on the other hand, when an app stresses three cores, only one core is left for everything else. To the end user, that's a big deal.
#14
Aquinus
Resident Wat-man
by: sergionography
Measuring maximum power consumption doesn't indicate anything objective about the real world, especially when comparing an 8-core with a quad-core.
For typical desktop use, I bet one AMD core is as efficient, if not more efficient, than one Intel core (power-consumption-wise, not performance, don't get me wrong). For applications that barely stress the cores, which are pretty much 90% of apps out there, AMD surely isn't bad at all. Not to mention AMD has the multithreading advantage, and I don't mean the best benchmark results; I'm talking about running multiple things at once while still having consistent performance. With an Intel quad, on the other hand, when an app stresses three cores, only one core is left for everything else. To the end user, that's a big deal.
Not that I disagree with anything you've said, but when he says "65w", I think he really means TDP and is confusing it with power draw, in which case he needs to see this post.

For a desktop in business, I can tell you TDP is one of the last things I think about. The only exception is a server that needs to run on low power during a power outage to maximize battery life; a good example would be a gateway server and/or a VoIP server.
#15
cdawall
where the hell are my stars
by: Redspeed93
Yeah, if these graphs are true it's kinda sad. It's partially AMD's fault. If they were actually pushing innovation and making some decent chips, Intel wouldn't be able to produce chips like this and still make a profit. But they will as long as AMD fails to impress.
They did push innovation. The FX-8350 is faster than the 3820/3770K/3570K in heavily multithreaded apps: Metro 2033, transcoding plus gaming, video editing, the list goes on. Just because Intel wins the single-threaded IPC race doesn't make its CPUs better; it just depends on the applications you are running. Want to be upset at someone over this marginal performance jump? Get upset at the developers for not bothering to push current systems. Skyrim is just as fast on an i3 as on an i7; that should say something. The AMD APU being in the PS4 should say something else. :laugh:

[media=youtube]4et7kDGSRfc[/media]
#16
Ikaruga
by: cdawall
They did push innovation. The FX-8350 is faster than the 3820/3770K/3570K in heavily multithreaded apps: Metro 2033, transcoding plus gaming, video editing, the list goes on. Just because Intel wins the single-threaded IPC race doesn't make its CPUs better; it just depends on the applications you are running. Want to be upset at someone over this marginal performance jump? Get upset at the developers for not bothering to push current systems. Skyrim is just as fast on an i3 as on an i7; that should say something. The AMD APU being in the PS4 should say something else. :laugh:

[media=youtube]4et7kDGSRfc[/media]
Because the guy in the video is not here and can't defend himself, I will greatly soften my opinion about him (below) to "favor" his side:

He is clearly a sad attention-whore who can't be taken seriously. The staged setup of precious "belongings" he put around himself is nothing but a terrible joke. That's not the desk of a computer enthusiast; it's more like the desk of an 8-year-old boy posting his first Facebook picture.
And after all that, he is trying to convince everybody on the Internet that AMD CPUs are just as good as Intel's, and that you gain nothing by going Intel. So basically everybody (all the professional and trusted people we know and listen to) who has spent days and weeks of hard work running tests and CPU benches and writing reviews was simply clueless, and he is the only one who finally holds the real truth.

Sure, no problem:toast:
#17
[H]@RD5TUFF
Seems a little underwhelming, but shady unsourced benchmarks are not to be trusted.
#18
Frick
Fishfaced Nincompoop
I didn't see the video, but AMD does do well in all reviews in certain situations; there is no denying that. The problem is that the performance is all over the place. Anand's final words sum it up pretty well, especially the highlighted part:
Ultimately Vishera is an easier AMD product to recommend than Zambezi before it. However the areas in which we'd recommend it are limited to those heavily threaded applications that show very little serialization. As our compiler benchmark shows, a good balance of single and multithreaded workloads within a single application can dramatically change the standings between AMD and Intel. You have to understand your workload very well to know whether or not Vishera is the right platform for it. Even if the fit is right, you have to be ok with the increased power consumption over Intel as well.
#19
Prima.Vera
by: cdawall
They did push innovation. The FX-8350 is faster than the 3820/3770K/3570K in heavily multithreaded apps: Metro 2033, transcoding plus gaming, video editing, the list goes on. Just because Intel wins the single-threaded IPC race doesn't make its CPUs better; it just depends on the applications you are running. Want to be upset at someone over this marginal performance jump? Get upset at the developers for not bothering to push current systems. Skyrim is just as fast on an i3 as on an i7; that should say something. The AMD APU being in the PS4 should say something else. :laugh:

[media=youtube]4et7kDGSRfc[/media]
You must be joking with that rigged video. That's the worst lie since Watergate. :shadedshu
#20
Thefumigator
by: Aquinus
Not that I disagree with anything you've said, but when he says "65w", I think he really means TDP and is confusing it with power draw, in which case he needs to see this post.

For a desktop in business, I can tell you TDP is one of the last things I think about. The only exception is a server that needs to run on low power during a power outage to maximize battery life; a good example would be a gateway server and/or a VoIP server.
You are discussing this with an electrical engineering student... not that I'm a good one :o.

C'mon, we all know a 65 W processor can't really make more heat and consume more power than a 125 W processor. Of course the relation is not linear, and it has to be measured at full throttle.

Also, if you put a 125 W processor into a 95 W motherboard, the mobo will die sooner or later. Not because of heat, but because the power draw overheats its regulators. It could even overheat the tracks.

I didn't say TDP is exactly heat dissipation, but the two go almost hand in hand.

I'm pretty sure those 95 W FX processors will be more competitive in power consumption and heat dissipation, where today's 125 W versions are struggling.
#21
Aquinus
Resident Wat-man
by: Thefumigator
You are discussing this with an electrical engineering student... not that I'm a good one :o.

C'mon, we all know a 65 W processor can't really make more heat and consume more power than a 125 W processor. Of course the relation is not linear, and it has to be measured at full throttle.

Also, if you put a 125 W processor into a 95 W motherboard, the mobo will die sooner or later. Not because of heat, but because the power draw overheats its regulators. It could even overheat the tracks.

I didn't say TDP is exactly heat dissipation, but the two go almost hand in hand.

I'm pretty sure those 95 W FX processors will be more competitive in power consumption and heat dissipation, where today's 125 W versions are struggling.
Right, but a 125 W TDP CPU like the 8150 can actually consume closer to 180 watts (depending on the leakage of that particular chip); it's just that 125 watts of it gets converted to heat (which is a lot of wasted energy, but that's a different topic). Of course this is all at stock. I'm just mentioning it because a lot of people use TDP and power draw interchangeably, and the amount of leakage the circuit generates can vary from CPU to CPU.
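The distinction being drawn in this exchange, that rated TDP and actual power draw are related but not the same number, can be sketched with the textbook CMOS power model (dynamic switching power plus static leakage). This is a toy illustration only; the capacitance, voltage, and leakage figures below are invented for the example, not measurements of any real chip:

```python
def cpu_power_watts(c_eff: float, v_core: float, f_hz: float, leakage_w: float) -> float:
    """Textbook CMOS model: P = C_eff * V^2 * f (dynamic) + static leakage."""
    return c_eff * v_core ** 2 * f_hz + leakage_w

# Invented parameters for illustration only (not any real CPU's specs).
stock = cpu_power_watts(c_eff=9e-9, v_core=1.30, f_hz=3.6e9, leakage_w=25.0)
overclock = cpu_power_watts(c_eff=9e-9, v_core=1.45, f_hz=4.3e9, leakage_w=25.0)
print(f"stock: {stock:.0f} W, overclocked: {overclock:.0f} W")
```

Because draw scales with the square of core voltage (and linearly with frequency), a chip's actual consumption moves around with settings and silicon leakage while the TDP on the box stays fixed, which is why the two figures shouldn't be used interchangeably.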
#22
NeoXF
Considering that power usage should be worse (at least under load) for Haswell... and if AMD can improve theirs (along with the obvious IPC/uarch improvements, and hopefully some platform improvements as well)... I'd say this would be a good time for AMD to catch up...

Also, cue 5- or 6-module consumer FX CPUs for an extra squeeze on Intel...
#23
Thefumigator
by: Aquinus
Right, but a 125 W TDP CPU like the 8150 can actually consume closer to 180 watts (depending on the leakage of that particular chip); it's just that 125 watts of it gets converted to heat (which is a lot of wasted energy, but that's a different topic). Of course this is all at stock. I'm just mentioning it because a lot of people use TDP and power draw interchangeably, and the amount of leakage the circuit generates can vary from CPU to CPU.
Surely, I get it now. The funny thing is, I'm sitting next to a Prescott desktop and it doesn't seem like such a disaster... at least at idle. It's also very snappy, from a subjective point of view. Oh well, it's running XP...
#24
pjl321
It's time we moved on!

I guess the days of significant performance increases are over. The only thing that would get my wallet out is a mid-range-priced 8-core (a real 8-core; 16-core in AMD terms). How long have we had quad-cores now? Since 2006, I think, and they have been mid-range for well over five years.

It's time we moved on!
#25
Frick
Fishfaced Nincompoop
by: pjl321
I guess the days of significant performance increases are over. The only thing that would get my wallet out is a mid-range-priced 8-core (a real 8-core; 16-core in AMD terms). How long have we had quad-cores now? Since 2006, I think, and they have been mid-range for well over five years.

It's time we moved on!
Naah, the increases just aren't here at this instant, and your average computer user has no need for more speed. Go back ten years and it was new and hot and cool; now all the action happens on tablets and phones. The jiggahurtz race ended with the Core 2 line.