Saturday, November 12th 2011

Sandy Bridge-E Benchmarks Leaked: Disappointing Gaming Performance?

Just a handful of days ahead of Sandy Bridge-E's launch, a Chinese tech website, www.inpai.com.cn (Google translation), has done what Chinese tech websites do best, and that's leak benchmarks and slides, Intel's NDA be damned. It pits the current i7-2600K quad-core CPU against the upcoming i7-3960X hexa-core CPU and compares them in several ways. The take-home message appears to be that gaming performance in BF3 & Crysis 2 is identical, while the i7-3960X uses considerably more power, as one might expect from an extra two cores. The only advantage appears to come from the x264 & Cinebench tests. If these benchmarks prove accurate, then gamers might as well stick with the current-generation Sandy Bridge CPUs, especially as they will drop in price before being end-of-life'd. While this is all rather disappointing, it's best to take leaked benchmarks like this with a (big) grain of salt and wait for the usual gang of reputable websites to publish their reviews on launch day, November 14th. Softpedia reckons that these results are the real deal, however. There are more benchmarks and pictures after the jump.

Source: wccftech.com

171 Comments on Sandy Bridge-E Benchmarks Leaked: Disappointing Gaming Performance?

#51
Unregistered
ramcozaBut I couldn't get the point. People who buy this series of CPUs will never play at such low resolutions (whoever can afford to buy this should already own at least a top-tier GPU & display). So there is no sense in testing it at low res, even if you have to test its processing power while gaming. Is it fair to test today's CPU with a decade-old configuration and come to a conclusion according to those results? I'm not standing up for SB-E or Intel. I just think it's unfair to come to a conclusion from these low-res benchies. I may be wrong. But anyway, we can find out the real story tomorrow.. ;)
You're thinking of it wrong. One can't test the true potential of a CPU at high res (hence the GPU). You have to lower the resolution to take the GPU out of the equation; only then can you see how fast a CPU can get. Look at this:

www.hardwarecanucks.com/charts/index.php?pid=70,76&tid=3

Then this;

www.legitreviews.com/images/reviews/1751/hawx.jpg

#52
r9
It is funny how they promote CPUs. When they market Atoms they say how they can do everything and ask why someone would need anything more, while for Sandy they try to convince us that with anything less you couldn't surf the web.
#53
Dent1
n-sterBesides, we know many games are GPU intensive
That rule was forgotten when Bulldozer was launched, and remembered now that Sandy Bridge-E is showing signs of flopping.
Wile Egaming performance does not represent a cpu's true potential.
I wish people remembered that saying when Bulldozer was received negatively on launch.
#54
Super XP
This sounds quite logical: how much faster can Intel make its already fast CPU? If these benchmarks reflect its true performance, then my question is answered.
I would also have to agree with Wile E that gaming performance does not represent a CPU's true potential, and IMO the blame needs to go to the lazy game developers.
#55
ramcoza
John DoeYou're thinking of it wrong. One can't test the true potential of a CPU at high res (hence the GPU). You have to lower the resolution to take the GPU out of the equation; only then can you see how fast a CPU can get. Look at this:

www.hardwarecanucks.com/charts/index.php?pid=70,76&tid=3

Then this;

www.legitreviews.com/images/reviews/1751/hawx.jpg
I understand, dude. Look at those charts again. A $120 CPU matches a $999 CPU and the $245 AMD flagship at higher resolutions. So what the past and present tell us is that a $120 CPU matches a flagship $999 CPU's performance (at high res) when it comes to games. So why is it weird when a $310 CPU matches a probable $999 CPU in gaming performance (at low res)?

What I said already is that the people who buy this CPU will already have a top GPU, if not a multi-card configuration. So they won't see any performance boost even when comparing SB-E with an i3. But that's the story we already know.

The people who buy this CPU will only buy it for bragging rights (having the world's fastest CPU) or to work in a heavily threaded environment like Photoshop work, encoding, encryption, etc. (like they did with the i7 980X/990X). But one thing is for sure, this thing is not for me. :D
#56
Disparia
I'm liking it, as one of those few who game and get stuff done.

And from a strictly gaming perspective, I'm not disappointed either. I should be able to get four users running at decent performance on a quad-GPU, 32GB RAM, 6-core SB-E box. ;)
#57
dicobalt
Just another case of "MOAR CORES!!!!!11"
#58
KieX
Disappointment? 6c/12t CPUs are intended for those that require performance for multi-threaded software (and those who can afford it for the sake of it).

These are chips for those who want to set new benchmark records, do video editing/encoding, run distributed computing projects, run virtual machines... which these benchmarks don't really show much of.

For gamers, the only realistic criterion is how well it runs multiple-GPU setups. Which, again, these benchmarks aren't showing.

The one thing this does show is that per-core performance is as expected. So when the NDA is lifted, we should start seeing more meaningful results. And those who have money to spend for fun, won't care either way. If anyone thinks this is disappointing they really need to reconsider their point of view and vent their frustration at game devs ;)
#59
qubit
Overclocked quantum bit
qubitOne wouldn't game at it, but one would certainly bench at it when comparing CPU performance, to remove the GPU from the equation.
For some reason, there seems to be quite a bit of confusion by various posters here over the above statement, starting with erocker:
erockerWhich is a useless way to test a CPU as the data provided shows no useful and/or real world benefit. Benchmarks/applications that actually use the CPU show it as being quite a bit better than SB.
People, I don't see how I can put it any more clearly. The idea is to isolate each individual CPU's true performance, so the last thing you want to do is give the graphics card any significant work to do. Heck, if the card could be switched off altogether (possible in Unreal Tournament 2003/4) then you'd have an even more accurate result.

And it doesn't matter if one CPU achieves 200fps and the other 1200fps (6 times faster); you're measuring the performance difference between them. This difference will become plenty obvious as time goes by and games become more demanding, giving the faster one a longer useful life. So, for example, when the slower one achieves only a useless 10fps, the faster one will still be achieving 60fps and be highly usable.

Of course, it's also a good idea to supplement these tests with real-world resolutions too, as there can be unexpected results sometimes.
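The isolate-then-extrapolate argument above can be sketched with a toy frame-time model (the `fps` helper and all the millisecond figures below are illustrative assumptions, not measurements from this thread): treat each frame as taking as long as the slower of the CPU and GPU stages, so shrinking the GPU's share exposes the CPU gap, and that gap persists as games demand more CPU work per frame.

```python
# Toy model (assumption for illustration): a frame takes as long as the
# slower of the CPU stage and the GPU stage.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower stage limits each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# High resolution: the GPU stage dominates, so both CPUs look identical.
print(fps(cpu_ms=5.0, gpu_ms=16.0))   # slower CPU -> 62.5 fps
print(fps(cpu_ms=0.8, gpu_ms=16.0))   # faster CPU -> 62.5 fps

# Low resolution: the GPU stage shrinks and the real CPU gap appears,
# roughly the 200 fps vs ~1200 fps scenario described above.
print(fps(cpu_ms=5.0, gpu_ms=0.5))    # 200.0 fps
print(fps(cpu_ms=0.8, gpu_ms=0.5))    # 1250.0 fps

# Extrapolation: a future game needing 20x the CPU work per frame keeps
# the same ratio -- the slow CPU drops to 10 fps, the fast one holds ~60.
print(fps(cpu_ms=5.0 * 20, gpu_ms=0.5))   # 10.0 fps
print(fps(cpu_ms=0.8 * 20, gpu_ms=0.5))   # 62.5 fps
```

The `max()` is the modelling assumption doing all the work here; real pipelines overlap CPU and GPU work less cleanly, which is part of the objection raised later in the thread.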

Thanks to John Doe for replying with some excellent answers to this misunderstanding. :toast:
entropy13Just go check its page on Wikipedia, "Touhou Project."
Will do. :)
entropy13Depends though. Comparing over 200 fps to another over 200fps is pretty stupid too. :laugh:
It isn't, as I've explained above in this post.
entropy13It should have been obvious what I was trying to point out. There are "benchmarks" where CPU power doesn't really matter at all. ;)

Touhou is just a very extreme and ridiculous example. :laugh:
Yes, of course, lol. Benchmark a bunch of older games with vsync on and they'll all peg at a solid 60fps.
#60
ensabrenoir
mustangs and bugattis

Um guys.... en·thu·si·ast CHIP. ENTHUSIAST, as in an SR-2 with quad-SLI 580s or tri-fired 6990s in a water-cooled Mountain Mods case... you get the idea. We have more money/credit than common sense and will spend an extra $300 for a 3% increase... we want the best, period. Not price/performance... just the highest high end. Yeah, we know the speed limit is 65 and the Mustang is an adequate sports car, but no... give me that Bugatti. That's the market for this thing. People who run six-screen Eyefinity. Our three-screen CrossFired 6870 logic doesn't apply.
#61
MilkyWay
What are people expecting? It's a tweaked Sandy Bridge with 2 extra cores. It's not going to be miles better, just a bit better in some tasks due to the 6 cores.
#62
qubit
Overclocked quantum bit
MilkyWayWhat are people expecting? It's a tweaked Sandy Bridge with 2 extra cores. It's not going to be miles better, just a bit better in some tasks due to the 6 cores.
I wish they'd hurry up and release an 8 core version. I'd totally nerd out to having one of those in my rig!
#63
MilkyWay
qubitI wish they'd hurry up and release an 8 core version. I'd totally nerd out to having one of those in my rig!
People would probably expect double the performance lol. Honestly I'd get a 2500K now with a decent motherboard and wait for Ivy Bridge (which I'm doing). I personally wouldn't even bother with Sandy Bridge-E.

An 8-core version would cost me a house mortgage.
#64
KieX
qubitI wish they'd hurry up and release an 8 core version. I'd totally nerd out to having one of those in my rig!
They are releasing a bunch of 8c/16t CPUs on the Xeon E5 platform. At this stage it's unknown whether they're compatible with the X79 chipset, and with up to 150W TDP it's also unknown what kind of OC can be expected.

www.cpu-world.com/news_2011/2011102701_Prices_of_Xeon_E5-2600-series_CPUs.html
#65
qubit
Overclocked quantum bit
MilkyWayPeople would probably expect double the performance lol. Honestly I'd get a 2500K now with a decent motherboard and wait for Ivy Bridge (which I'm doing). I personally wouldn't even bother with Sandy Bridge-E.

An 8-core version would cost me a house mortgage.
Ah, you're right about the price. :ohwell:

I'm waiting for the SB-E reviews tomorrow to decide. If it really doesn't provide a significant gaming boost and/or prices are sky high (they likely will be), then I'll just get a 2700K and be done with it.

Note that while Ivy Bridge is an improvement over SB and has improved integrated graphics, it's not considered an enthusiast platform by Intel.
#66
johnnyfiive
I highly doubt people who can afford a socket 2011 setup are even reading this with any inkling of giving a crap about what some random Chinese benchmark says.

Gaming is almost always GPU limited anyway, especially when you compare Sandy Bridge to Sandy Bridge-E.

Hey Chinese sites, leak something that actually matters.
#67
radrok
Can't wait to see some encoding and rendering results other than that (C4D) CB 11.5.
Sandy Bridge was already an awesome architecture performance-wise; 6-core Sandy Bridge-E will be more than awesome :)

I'm not talking gaming-wise though; can't really expect much improvement there unless games start using 6+ threads :laugh:
#68
Wile E
Power User
John DoeYou're thinking of it wrong. One can't test the true potential of a CPU at high res (hence the GPU). You have to lower the resolution to take the GPU out of the equation; only then can you see how fast a CPU can get. Look at this:

www.hardwarecanucks.com/charts/index.php?pid=70,76&tid=3

Then this;

www.legitreviews.com/images/reviews/1751/hawx.jpg
You're thinking of it wrong. One can't test the true potential of a CPU by using games, period.

Again, I don't care how many people here buy chips specifically for gaming. That has absolutely no bearing on the fact that gaming is still a TERRIBLE CPU benchmark. SB-E isn't the only thing I say this about.

All current chips will allow you to have essentially the same gaming experience, because they are not the main bottleneck. What I want to know is, who actually expected the 2 extra cores of SB-E to make a difference in gaming?

These results are not disappointing at all, they are expected. Games do not take advantage of this much cpu power. Anybody that expected a significant difference obviously hasn't been paying attention to the gaming industry.
Dent1That rule was forgotten when Bulldozer was launched, and remembered now that Sandy Bridge-E is showing signs of flopping.



I wish people remembered that saying when Bulldozer was received negatively on launch.
My negativity towards BD had nothing to do with gaming. I am disappointed in its per-core IPC.
#69
Makaveli
This is nothing new and the same thing was evident last gen.

Westmere vs Bloomfield/Lynnfield.

Unless you are doing something that requires the extra cores there usually isn't much of a benefit.

Why this is a surprise to anyone is beyond me.

Nothing to see here move along folks!
#70
cadaveca
My name is Dave
qubitOne wouldn't game at it, but one would certainly bench at it when comparing CPU performance, to remove the GPU from the equation.
Except most don't realize that, depending on the VGA, this doesn't move the bottleneck from the GPU to the CPU... it moves it from the GPU to the memory controller. It does NOT move the bottleneck to JUST the CPU proper.

Triangle and setup data is sent from the CPU to the GPU for every single frame, so really, by lowering resolution and increasing framerate, you are not exactly getting the effect portrayed by reviewers. You need to lower the resolution and NOT use a high-powered VGA, unless you are just testing CPU-GPU communication. It is NOT 100% a test of CPU performance, and this way of testing is SERIOUSLY flawed.

Falsely increasing the workload turns a real-world benchmark into a synthetic benchmark. Except this synthetic benchmarking practice has no correlation to real-world workloads, at all.


This is why I don't do CPU reviews. I WILL NOT review performance for a CPU in the manner it is currently done by nearly every reviewer out there.
#71
erocker
*
qubitFor some reason, there seems to be quite a bit of confusion by various posters here over the above statement, starting with erocker:



People, I don't see how I can put it any more clearly.
There's no confusion. The other, actual CPU benchmarks give a good indication of what this chip is capable of. You seem to be the one who is confused; after all, the title of your news post is a question. cadaveca and Wile E have the rest of my thoughts covered.
MilkyWayWhat are people expecting? It's a tweaked Sandy Bridge with 2 extra cores. It's not going to be miles better, just a bit better in some tasks due to the 6 cores.
Thank you, well said.
#72
HumanSmoke
Just a few observations...

The 3960X, for all intents and purposes, is a slightly slower version of the 2600K if the app uses up to 4C/8T (3.3GHz vs 3.4GHz base).

Gaming would depend heavily on how much of the coding load is handled by the CPU. Typically, compute-heavy games (i.e. RTS titles, especially once the maps start filling up) would likely show a small benefit for the hexcore.

Of the three commonly used methods to test CPU gaming performance, all could be said to be in some way flawed:
Testing at low res and/or 0xAA/0xAF takes the GPU out of the equation but isn't indicative of "real world" use, and isn't a scenario anyone would encounter in actual gaming.

Testing at common resolutions and default game IQ more often than not presents a GPU-bound result... in which case a Core i3 makes as much sense as BD, an i7 or a Xeon.

Testing at common res and standard game IQ with CFX/SLI removes some of the GPU limitation, but can also introduce new factors: drivers and PCI-E bandwidth constraints. The latter is probably minor, but would be an influence if you were testing quad GPU (say dual HD6990/GTX590) in a heads-up comparison (X79 PCI-E @ 2 x 16 versus Z68 PCI-E @ 2 x 8). Adding a lane multiplier such as the NF200 would possibly also open the system to increased latency, from what I understand.
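For scale, the dual-GPU lane comparison above works out as follows at PCIe 2.0 rates (roughly 500 MB/s usable per lane, per direction; the helper below is back-of-envelope arithmetic, not anything benchmarked in the thread):

```python
PCIE2_MBPS_PER_LANE = 500  # PCIe 2.0: ~500 MB/s usable per lane, per direction

def slot_bandwidth_gbs(lanes: int) -> float:
    """Approximate one-way slot bandwidth in GB/s for a given lane count."""
    return lanes * PCIE2_MBPS_PER_LANE / 1000.0

# X79 dual-GPU (2 x 16 lanes) vs Z68 dual-GPU (2 x 8 lanes), per card:
print(slot_bandwidth_gbs(16))  # 8.0 GB/s per card on X79
print(slot_bandwidth_gbs(8))   # 4.0 GB/s per card on Z68
```

So each card on the Z68 board has half the slot bandwidth, which is why the difference would mainly matter in quad-GPU, bandwidth-heavy scenarios.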

From a personal PoV, I would use my system as much for content creation (Handbrake, Photoshop, etc.) and productivity (Excel, PowerPoint, etc.) as I do for gaming. If the performance metric is in favour across all the apps I use, then I would definitely consider the platform. I would like a better understanding of where the launch stepping/revision stands before I'd commit; my C0 stepping for Bloomfield was noticeably inferior to the D0 I upgraded to.
#73
jblanc03
This processor is meant for SICK gamers.

You all forget that there is no Z68 motherboard that will support 4-way SLI!

So for those of you who only want to do 3-way, this processor is a waste.

Remember that this processor can handle 40 PCIe lanes.
#74
lashton
lol AMD vs Intel

It's funny how everyone gives positive praise when Intel releases early benchies, but when AMD does it EVERYONE is critical. Typical human population though: unintelligent.
#75
cadaveca
My name is Dave
lashtonIt's funny how everyone gives positive praise when Intel releases early benchies, but when AMD does it EVERYONE is critical. Typical human population though: unintelligent.
Haven't seen ANYONE "praising" Intel in this thread.

All I saw is everyone discussing the validity of how the benchmarks are presented, according to their own uses and needs. Nothing here is disappointing, as those that tend to know how hardware works realize that CPUs mean very little in day-to-day tasks, and most of us here game @ 1920x1080, not 1680x1050 or lower.