
Intel Haswell-EX an 18-core Leviathan

btarunr

Editor & Senior Moderator
Staff member
Intel's biggest enterprise CPU based on its "Haswell" micro-architecture, the Haswell-EX, is a silicon monstrosity going by its specs. Built on the 22 nm silicon fab process, the top-spec variant of the chip physically features 18 cores, 36 logical CPUs enabled by HyperThreading, 45 MB of L3 cache, a DDR4 IMC, and a TDP as high as 165 W. Intel will use this chip to build its next-gen Xeon E7 v3 family, which includes 8-core, 10-core, 12-core, 14-core, 16-core, and 18-core models, with 2P-only and 4P-capable variants spanning the E7-4000 and E7-8000 series. Clock speeds range between 1.90 GHz and 3.20 GHz.
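For a sense of how the operating system sees a part like this, here is a minimal C++ sketch (purely illustrative, not from the article) that queries the logical CPU count; on a single 18-core Haswell-EX with HyperThreading enabled it should report 36, and a fully populated 4P box would report 144.

```cpp
#include <cstdio>
#include <thread>

int main() {
    // std::thread::hardware_concurrency() reports the number of logical CPUs
    // the OS exposes: 36 on one 18-core HyperThreaded Haswell-EX socket,
    // and up to 144 in a fully populated 4P Xeon E7 v3 system.
    unsigned logical = std::thread::hardware_concurrency();
    std::printf("Logical CPUs visible to this process: %u\n", logical);
    return 0;
}
```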



View at TechPowerUp Main Site
 
I need the 18-core for Minecraft so freaking bad... someone get me one!
 
If I get this, will minesweeper stop lagging?

Jokes aside, this is pretty good. I wonder what idle power looks like, rather than just TDP. I'm still running a somewhat older 2x E5472. The idle consumption and heat are the spoilers. If the new Haswell-EX E7 v3s manage to noticeably cut the idle and average power consumption, then upgrades are merited for that reason alone.
 
If you don't want this CPU, something is wrong with you.
 
Now this I like... a lot! :)

If they optimised all game engines to use at least 90% (see how modest I am) of its power and computing capabilities, we would already enjoy a world even more beautiful than that of Crysis 3. :)
 
Now this I like... a lot! :)

If they optimised all game engines to use at least 90% (see how modest I am) of its power and computing capabilities, we would already enjoy a world even more beautiful than that of Crysis 3. :)

CPUs aren't a limiting factor in games.
 
Now this I like... a lot! :)

If they optimised all game engines to use at least 90% (see how modest I am) of its power and computing capabilities, we would already enjoy a world even more beautiful than that of Crysis 3. :)

Ermm, not really, that's the GPU's area.
 
I would really like to see what the 18-core/36-thread beast could produce while crunching for WCG.
 
A couple of these on a multi-socket mobo would. be. absolutely. MENTALLLLLL!!!!!!
 
Cut it from 165 W to 125 W, give me an extra 300-400 MHz, and it would be much sexier. This is what 14 nm should be.
 
If I get this, will minesweeper stop lagging?

Jokes aside, this is pretty good. I wonder what idle power looks like, rather than just TDP. I'm still running a somewhat older 2x E5472. The idle consumption and heat are the spoilers. If the new Haswell-EX E7 v3s manage to noticeably cut the idle and average power consumption, then upgrades are merited for that reason alone.

You know it's an E7 (Haswell-EX) and not an E5 (Haswell-EP), right? This 18-core E7 will cost ~$7K alone for one chip... :cool:
 
CPUs aren't a limiting factor in games.

Ermm, not really, that's the GPU's area.

I hope you both understand that we haven't reached, and probably never will reach, the maximum possible computing power that can be thrown at a problem.

You can take the "sleeping" threads of those processors and tell them to calculate something, even take some of the load off the GPU.

GPUs continue to grow in shader count, right?
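A rough sketch of that idea, assuming a game wants to push a CPU-side job onto an otherwise idle hardware thread while the GPU renders the frame; the workload and function names here are made up for illustration.

```cpp
#include <cstdio>
#include <future>
#include <numeric>
#include <vector>

// Hypothetical CPU-side job a game might run on spare cores while
// the GPU is busy rendering (e.g. particle or AI updates).
double simulate_particles(const std::vector<double>& state) {
    return std::accumulate(state.begin(), state.end(), 0.0);
}

int main() {
    std::vector<double> state(1 << 20, 0.5);

    // Launch the job asynchronously; the runtime schedules it on a spare
    // hardware thread instead of leaving that core "sleeping".
    auto pending = std::async(std::launch::async, simulate_particles, std::cref(state));

    // ... the main thread would issue GPU draw calls here ...

    double result = pending.get();  // join with the offloaded work
    std::printf("offloaded result: %f\n", result);
    return 0;
}
```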
 
And, as usual, more cores are in vain when the OS/apps can't fully utilize them.
 
^People are not meant to use those CPUs to play Left for Team Portal Episode 3.
They use well-written programs and push efficiency as high as they can.
 
^People are not meant to use those CPUs to play Left for Team Portal Episode 3.
They use well-written programs and push efficiency as high as they can.

Those CPUs are used to create virtual reality with maximum realism, which of course needs as many resources as possible.

Yeah, I agree that the state of the current gaming industry is bad, but I guess in 10 years it will be a different picture.
 
I would be happy with a 4-core/8-thread CPU that runs at 6 GHz; you would get more out of it for gaming.
 
Oh, this wouldn't be good for gaming at all... :rolleyes:

In recent years, Intel Labs has shown steady progress towards developing real-time ray tracing engines running on multicore Intel® processor-based platforms. In 2008, we were able to show Quake Wars: Ray Traced running at 15-20 frames per second (fps) on an Intel® Xeon® processor-based server using four quad-core CPUs. After several months of making optimizations, we showed the demo using the same four socket server, but with the newly updated six-core Intel® Xeon® X7460 processors at 20-35 fps. The following year the next Intel processor allowed us to go down from a server to a workstation system with two sockets achieving the same frame rates. It is clear that we will see continued performance increases with higher core counts and new architectures like Intel® microarchitecture code name Sandy Bridge.

That was written in 2011, and since then Daniel Pohl of Intel has demo'd it with even more powerful Xeons + Phi, achieving 1080p:

[Image: Intel Knights Ferry 8-way ray tracing demo at IDF]


... and since then Intel has produced vastly more powerful Phi models, as well as announcing the Haswell-EX Xeons above.

The short story is that there will be a point when this kind of power is almost in the hands of the consumer, and the problem will be tackled from the other end: software. Several prominent developers have expressed their desire to design engines for ray tracing. Many would like to be at the forefront of such an evolution in graphics.
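As a rough illustration of why ray tracing scales with core count the way those Intel demos did: each pixel's rays are independent, so a frame splits cleanly across however many hardware threads the chip offers. The sketch below is a toy stand-in (the shading function is fake busywork), not Intel's engine.

```cpp
#include <cstdio>
#include <thread>
#include <vector>

// Trivial per-pixel "shading" stand-in for a real ray tracer's work.
// Real engines cast rays per pixel; the point here is only that pixels
// are independent, so the frame divides cleanly across cores.
static float shade(int x, int y) {
    float v = 0.0f;
    for (int i = 0; i < 256; ++i)               // fake bounce/shadow work
        v += static_cast<float>((x * 31 + y * 17 + i) % 255) / 255.0f;
    return v / 256.0f;
}

int main() {
    const int width = 1920, height = 1080;
    std::vector<float> framebuffer(static_cast<size_t>(width) * height);

    // One worker per hardware thread: 36 on an 18-core HT Haswell-EX socket.
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            // Interleave scanlines so each worker gets a similar load.
            for (int y = static_cast<int>(w); y < height; y += static_cast<int>(workers))
                for (int x = 0; x < width; ++x)
                    framebuffer[static_cast<size_t>(y) * width + x] = shade(x, y);
        });
    }
    for (auto& t : pool) t.join();

    std::printf("rendered %dx%d with %u workers\n", width, height, workers);
    return 0;
}
```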
 
I would love to have a quad-socket 18-core Xeon setup for crunching.


I would be happy with a 4-core/8-thread CPU that runs at 6 GHz; you would get more out of it for gaming.
I want one of those for gaming.
 
Does this rub anyone else the wrong way seeing how Intel keeps improving the server market by leaps and bounds but only leaves the scraps to desktops? @Federal Trade Commission: it is time to start looking at anti-trust. The lack of competition is becoming obvious now.
 
I could probably hit 300 FPS on FSX :laugh:! Seriously, these CPUs would be massive overkill for me and just about anyone, but I'd most likely buy a 16-core CPU anyway.
 
Does this rub anyone else the wrong way seeing how Intel keeps improving the server market by leaps and bounds but only leaves the scraps to desktops? @Federal Trade Commission: it is time to start looking at anti-trust. The lack of competition is becoming obvious now.

True. I have my fingers crossed that Intel will toss me more than a bone with Skylake to upgrade from Ivy Bridge, but it may not happen. I heard some rumors a while back that Samsung was considering buying AMD, and if they did, they could pump enough cash into AMD's R&D to put them in a position to shake Intel out of their lazy stupor. I haven't heard anything more for a while, so I guess it's not going to happen.
 
Does this rub anyone else the wrong way seeing how Intel keeps improving the server market by leaps and bounds but only leaves the scraps to desktops? @Federal Trade Commission: it is time to start looking at anti-trust. The lack of competition is becoming obvious now.


What? This has nothing to do with anti-trust laws. Intel isn't monopolizing anything, and simply making a better product than your competition isn't against the law.

It's not Intel's fault that the vast majority of software only takes advantage of 1 to 2 cores/threads, and Intel already sells multiple chips specifically for the home market that can handle 8-16 threads. It's up to the software to take advantage of that.

Also, home users CAN purchase server/workstation CPUs if they think their home office/game PC needs two 16-core/32-thread CPUs (protip: it doesn't) and they have about $7,000 to throw away.

And it's not Intel's fault that AMD can't make competitive chips.
 
True. I have my fingers crossed that Intel will toss me more than a bone with Skylake to upgrade from Ivy Bridge, but it may not happen. I heard some rumors a while back that Samsung was considering buying AMD, and if they did, they could pump enough cash into AMD's R&D to put them in a position to shake Intel out of their lazy stupor. I haven't heard anything more for a while, so I guess it's not going to happen.

Not since Mrs. Su stepped in.

Petey plane, part of it deals with Intel twisting people's arms during the Athlon 64 days, so a lot of this is Intel's fault; not all of it, though.
 