Friday, February 1st 2013

Intel "Haswell" Quad-Core CPU Benchmarked, Compared Clock-for-Clock with "Ivy Bridge"

Russian tech publication OCLab.ru, which claims access to an Intel next-generation Core "Haswell" processor engineering sample (and an LGA1150 8-series motherboard!), wasted no time in running a quick clock-for-clock performance comparison against the current Core "Ivy Bridge" processor. In its comparison, it set both chips to run at a fixed 2.80 GHz clock speed (by disabling Turbo Boost, C1E, and EIST), suggesting that the engineering sample in OCLab's possession doesn't go beyond that frequency.

The two chips were put through SuperPi 1M, PiFast, and wPrime 32M. The Core "Haswell" chip is only marginally faster than "Ivy Bridge," and in fact slower in one test. In its next battery of tests, the reviewer stepped up the iterations (load), putting the chips through single-threaded SuperPi 32M and multi-threaded wPrime 1024M. While wPrime performance is nearly identical between the two chips, Haswell crunched SuperPi 32M about 3 percent quicker than Ivy Bridge. It's still too early to call the overall performance difference between the two architectures. Intel's Core "Haswell" processors launch in the first week of June.
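As a quick sketch of where a percentage figure like that comes from: it's just the ratio of completion times at the fixed clock. The times below are hypothetical, not OCLab.ru's actual results.

```python
# Hypothetical SuperPi 32M completion times (seconds); lower is better.
# These numbers are illustrative only, not OCLab.ru's measurements.
ivy_bridge_time = 500.0   # "Ivy Bridge" at a fixed 2.80 GHz
haswell_time = 485.0      # "Haswell" at the same fixed 2.80 GHz

# Percent speedup: how much quicker Haswell finished the same workload.
speedup_pct = (ivy_bridge_time - haswell_time) / ivy_bridge_time * 100
print(f"Haswell finished {speedup_pct:.1f}% quicker")  # prints 3.0% with these inputs
```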

Source: OCLab.ru via X-bit Labs

118 Comments on Intel "Haswell" Quad-Core CPU Benchmarked, Compared Clock-for-Clock with "Ivy Bridge"

#1
pjl321
by: Frick
Naah, the increases just aren't there at this point. And your average computer user has no need for more speed. Go back ten years and it was new and hot and cool; now all the action happens on tablets and phones. The jiggahurtz race ended with the Core 2 line.
You have to ask yourself, why do Intel (or any tech company) bring out new products?

Simple answer, to make money.

I agree most users don't even need the power of today's CPUs, so why would anyone buy something ever so slightly more powerful than what they have now? They won't, but they probably would buy it if the performance doubled. Yes, they still wouldn't need the extra power, but they would still like that it's 2x the performance of their current product, and would put their hands in their pockets to buy it.

Maybe some of them will find a use for the extra power, maybe they won't but Intel will be happy and so will I.
#2
Thefumigator
by: pjl321
I guess the days of significant performance increases are over. The only thing that would get my wallet out is a mid-range-priced 8-core (a real 8-core; 16-core in AMD world). How long have we had quad-cores now? Since 2006, I think, and they have been mid-range for well over 5 years.

It's time we moved on!
Well, you can still buy a dual-G34 setup at Newegg with two 16-core Opteron processors (or 8 real cores each, as you wish) and you'll have enough cores FTW. I was planning on that, but my budget was too tight and I just didn't make it; I got an FX 8320 instead. I still hope AMD brings more cores to the AM3+ platform...

by: pjl321
You have to ask yourself, why do Intel (or any tech company) bring out new products?

Simple answer, to make money.

I agree most users don't even need the power of today's CPUs, so why would anyone buy something ever so slightly more powerful than what they have now? They won't, but they probably would buy it if the performance doubled. Yes, they still wouldn't need the extra power, but they would still like that it's 2x the performance of their current product, and would put their hands in their pockets to buy it.

Maybe some of them will find a use for the extra power, maybe they won't but Intel will be happy and so will I.
I still feel computers today are slow. I mean, not long ago I borrowed a movie and converted it to DivX, and it took several minutes, not counting the DVD ripping. I think it could be faster. Faster! Faster! No waiting! I would pay for a 2x-performance computer if the price was OK. But talking about performance, we are ages behind what I would expect.
#3
cdawall
where the hell are my stars
by: Ikaruga
Because the guy in the video is not here and can't defend himself, I will greatly soften my opinion about him (below) to "favor" his side:

He is clearly a sad attention-whore who can't be taken seriously. That staged setup of his precious "belongings" he put around him is nothing but a terrible joke. That's not the table of a computer enthusiast; it's more like the table of an 8-year-old boy posting his first Facebook picture.
And after all of that, he is trying to convince everybody on the Internet that AMD CPUs are just as good as Intel ones and that you gain nothing if you go Intel. So basically everybody (all the professional and trusted people we know and listen to) who has spent days and weeks of hard work running tests and CPU benches and making reviews until now was simply clueless, and he is the only one who finally holds and gives us the real truth.

Sure, no problem :toast:
Plenty of places support what he said; just because most sites don't review the same way doesn't make him wrong. Teksyndicate isn't exactly known for being biased...

by: Prima.Vera
You must be joking with that rigged video. That's the worst lie since Watergate. :shadedshu
Rigged because Intel lost to a $179 CPU in applications Vishera performs well in?
#4
xenocide
by: cdawall
Rigged because Intel lost to a $179 CPU in applications Vishera performs well in?
Vishera does outperform in applications that support 8 threads (or more), but those are so few and far between that it's not exactly a deciding metric for most people. Unless you exclusively play Metro 2033 and encode videos daily (very few people do), it's not necessarily a higher-performing CPU.
#5
cdawall
where the hell are my stars
by: xenocide
Vishera does outperform in applications that support 8 threads (or more), but those are so few and far between that it's not exactly a deciding metric for most people. Unless you exclusively play Metro 2033 and encode videos daily (very few people do), it's not necessarily a higher-performing CPU.
There are a lot of people who transcode videos of what they're playing and upload them to YouTube. That is video encoding, and YouTube is full of gameplay videos as far as the eye can see. That's an application Vishera does substantially better in, and one that doesn't make it into many reviews, which is sad considering how many people do it.
#7
james888
by: Mr.EVIL
Haswell will use HyperThreading? Again?
Why would it not?
#8
pjl321
by: james888
Why would it not?
How else would Intel separate the Core i5 and the Core i7 and try to justify the massive price premium for such a small performance increase!
#9
james888
by: pjl321
How else would Intel separate the Core i5 and the Core i7 and try to justify the massive price premium for such a small performance increase!
I asked why they would not use hyperthreading, not why they do use it.
#10
pjl321
by: james888
I asked why they would not use hyperthreading, not why they do use it.
Whoops, meant to quote James888.
#11
Frick
Fishfaced Nincompoop
by: pjl321
You have to ask yourself, why do Intel (or any tech company) bring out new products?

Simple answer, to make money.

I agree most users don't even need the power of today's CPUs, so why would anyone buy something ever so slightly more powerful than what they have now? They won't, but they probably would buy it if the performance doubled. Yes, they still wouldn't need the extra power, but they would still like that it's 2x the performance of their current product, and would put their hands in their pockets to buy it.

Maybe some of them will find a use for the extra power, maybe they won't but Intel will be happy and so will I.
I'm not sure I agree they would. Would they do it if it happened on a tablet? In a heartbeat. To be honest, I don't think people care about PCs enough anymore. This is of course taken from my arse and is pure speculation. :P

And with cloud computing and all that, most personal computing devices will probably become more and more like the terminals of old. It's both good and bad imo.
#12
Aquinus
Resident Wat-man
by: Frick
And with cloud computing and all that, most personal computing devices will probably become more and more like the terminals of old. It's both good and bad imo.
It's good when you have a service like YouTube where you can upload a video from a mobile device where their cloud re-encodes it for you.
#13
Slizzo
by: Thefumigator
I still feel computers today are slow. I mean, not long ago I borrowed a movie and converted it to DivX, and it took several minutes, not counting the DVD ripping. I think it could be faster. Faster! Faster! No waiting! I would pay for a 2x-performance computer if the price was OK. But talking about performance, we are ages behind what I would expect.
I use Xilisoft to transcode videos I get online. It uses CUDA, so it transcodes quite quickly.
#14
niko084
by: ...PACMAN...
You're wrong, clock-for-clock is the best way to benchmark and show differences in the architecture of two different chips at the same speed. It's all about IPC.

I understand where you are coming from with regard to overclocking headroom, but that normally forms part of a FULL review also.

Needless to say, my FX 4100@3.6 is miles behind a 2600K@3.6 :D
Exactly!

It's a direct comparison of baseline architecture, meaning that unless these chips prove to clock to 6 GHz, a high-end Ivy Bridge owner has little to no reason to upgrade considering raw CPU power alone. From the looks of it here, anyway.
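A minimal sketch of the clock-for-clock idea the posts above describe: with both chips pinned to the same frequency on the same workload, the ratio of completion times is exactly the inverse of the ratio of IPCs. All figures below are hypothetical.

```python
# Both chips locked to the same clock and running the same workload
# (same retired-instruction count). All figures are hypothetical.
freq_hz = 2.80e9        # fixed clock for both chips
instructions = 1.4e12   # instructions retired by the workload

time_a = 550.0          # seconds to finish on chip A
time_b = 500.0          # seconds to finish on chip B

ipc_a = instructions / (freq_hz * time_a)   # ~0.909
ipc_b = instructions / (freq_hz * time_b)   # 1.000

# At equal clocks, the IPC ratio is just the inverse of the time ratio.
advantage_pct = (ipc_b / ipc_a - 1) * 100
print(f"Chip B retires {advantage_pct:.1f}% more instructions per clock")  # 10.0%
```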
#16
Kaynar
by: Thefumigator

C'mon, we all know a 65 W processor can't really make more heat and consume more power than a 125 W processor. Of course the relation is not linear, and it has to be measured at full throttle.
Then how can you explain that my i7 930 at 4 GHz, with 130+ W TDP at 1.3 V, needs a Corsair H100, 4 fans, and aftermarket thermal paste to stay under 80°C, while my flatmate's i5 3570K at 4.5 GHz, with 70+ W TDP and near 1.3 V, doesn't go over 55°C with a plain H60 and two fans... I'm just curious, because I can't figure out another reason than TDP...
#17
Aquinus
Resident Wat-man
by: Kaynar
Then how can you explain that my i7 930 at 4 GHz, with 130+ W TDP at 1.3 V, needs a Corsair H100, 4 fans, and aftermarket thermal paste to stay under 80°C, while my flatmate's i5 3570K at 4.5 GHz, with 70+ W TDP and near 1.3 V, doesn't go over 55°C with a plain H60 and two fans... I'm just curious, because I can't figure out another reason than TDP...
I bet your 920 is consuming at least twice as much power as the 3570K at the same voltage, so I suspect the amount of current your CPU is eating is that much more than a 3570K's, which is why your 920 is getting hotter. Heat increases exponentially as current draw increases, not voltage. Increasing the voltage only enables current to increase, since the impedance in any part of the CPU at any given time won't change (much). A good example is my i7 3820. I have it running at 4.5 GHz @ ~1.4 V and I don't see much higher than 65°C fully loaded with a Zalman CNPS9900. For me to get to 80°C I need to pump 1.5 V or more through my CPU, and at that point I'm too concerned about heat.

Also, the amount of leakage a chip generates varies from CPU to CPU, so two CPUs that are the same could produce different amounts of heat at the same voltages, but that is a different topic.

All in all, your 920 eats more power than an IVB chip. I honestly wouldn't be surprised if it consumed more power than mine at a high voltage as well.
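A minimal sketch of the scaling being argued over, using the standard first-order CMOS dynamic-power relation P ≈ C·V²·f. The capacitance values are hypothetical, chosen only to illustrate why an older, larger-node chip can pull roughly twice the power of a newer one at the same voltage:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f, where C is the
# effective switched capacitance. The capacitance values below are
# hypothetical, picked only to illustrate the scaling argument; they
# are not the real figures for either chip.
def dynamic_power(c_farads, volts, freq_hz):
    """Approximate dynamic power dissipation in watts."""
    return c_farads * volts ** 2 * freq_hz

p_old = dynamic_power(18e-9, 1.3, 4.0e9)   # older 45 nm chip, larger switched C
p_new = dynamic_power(8e-9, 1.3, 4.5e9)    # newer 22 nm chip, smaller switched C

print(f"45 nm-like chip: {p_old:.0f} W")   # prints 122 W
print(f"22 nm-like chip: {p_new:.0f} W")   # prints 61 W
```

With these made-up numbers the older chip dissipates about twice the power at the same voltage, which is consistent with the temperature gap the posters describe.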
#18
malyleo
You're right

by: RejZoR
I guess I'll be keeping my trusty Core i7 920 for another year or two. The chip was so good it's probably the longest-owned single CPU in my systems ever. It's been so long I don't even remember what year I bought it, which is unusual...
I bought the i7-960 back in 2010 and I still have no problems with it; everything runs fast and smooth. I've been thinking about an upgrade later in the year, but I guess I'll hold off for another year. Maybe just a graphics card and memory, I'll see. It was a great purchase as far as I'm concerned... :) :rockout: