Monday, May 20th 2024

Intel's Next-Gen Falcon Shores GPU to Consume 1500 W, No Air-Cooled Variant Planned

Intel's upcoming Falcon Shores GPU is shaping up to be a powerhouse for AI and high-performance computing (HPC) workloads, but it will also be an extreme power hog. The processor, which combines the successors to Gaudi and Ponte Vecchio into a single GPU, is expected to consume an astonishing 1500 W - more than even Nvidia's beefy B200 accelerator, which draws 1000 W. Such immense power consumption will require advanced cooling solutions to keep Falcon Shores running efficiently and safely. Intel's partners may turn to liquid cooling or even full immersion liquid cooling, a technology Intel has been promoting for power-hungry data center hardware. The high power draw is the cost of Falcon Shores' formidable performance promises: Intel claims it will deliver 5x higher performance per watt and 5x more memory capacity and bandwidth than its Ponte Vecchio products.

Intel may need to develop proprietary hardware modules or a new Open Accelerator Module (OAM) spec to support such extreme power levels, as the current OAM 2.0 tops out around 1000 W. Slated for release in 2025, Falcon Shores will be built on Intel's GPU IP and its next-gen Xe graphics architecture. It aims to be a major player in the AI accelerator market, backed by Intel's robust oneAPI software development ecosystem. While the 1500 W power consumption is sure to raise eyebrows, Intel is betting that Falcon Shores' supposedly impressive performance will make it an enticing option for AI and HPC customers willing to invest in robust cooling infrastructure. The ultra-high-end accelerator market is heating up, and the HPC segment needs a Ponte Vecchio successor.
Sources: ComputerBase.de, via Tom's Hardware

34 Comments on Intel's Next-Gen Falcon Shores GPU to Consume 1500 W, No Air-Cooled Variant Planned

#1
64K
What the actual hell !?!
Was not expecting that from Intel but I don't keep up with professional GPUs at all. I guess there is indeed a tremendous market for such a beast now that the AI craze is upon us.
#2
ExcuseMeWtf
64K: What the actual hell !?!
Was not expecting that from Intel but I don't keep up with professional GPUs at all. I guess there is indeed a tremendous market for such a beast now that the AI craze is upon us.
The HPC sector will gobble up all the computing power it can get. Where do you think cheap second-hand Xeons come from? Machines of this caliber, decommissioned due to lower power efficiency.

YES, power efficiency. Peak power draw may be insane, but performance in tasks might be even more so.
#4
ZoneDymo
sooo have Gaudi and Ponte Vecchio ever actually become products? I have heard these names so often it seems like vaporware.
#6
azrael
Seems like Intel is doing what Intel does best, use lots of power. :D
#7
kapone32
Intel swoops in to save EK lol.
#8
Timbaloo
We need to stop calling these things "GPUs".
#9
Sabotaged_Enigma
Target European market mainly, so that users can make it through winter lmao
#10
64K
Timbaloo: We need to stop calling these things "GPUs".
What you're saying makes a lot of sense to me. I come from a time when GPU stood for Graphics Processing Unit. That's not the main use for these chips, any more than GPUs for mining were about graphics.
#11
Daven
Timbaloo: We need to stop calling these things "GPUs".
CPU (central processing unit) also seems poorly named nowadays given the major role of these computing ‘GPU’ beasts.

Maybe BPU (basic processing unit) and MPU (massive processing unit) could replace CPU and GPU respectively?
#12
Assimilator
This isn't a GPU, it's a compute processor with no display outputs. Please stop with stupid clickbait headlines like these.
#13
ncrs
Assimilator: This isn't a GPU, it's a compute processor with no display outputs. Please stop with stupid clickbait headlines like these.
Intel calls it a GPU:
Building on the momentum of the Max Series GPU, our next product in the Max Series family will be the GPU architecture code-named Falcon Shores.
and again:
What’s Next: Intel Gaudi 3 accelerators' momentum will be foundational for Falcon Shores, Intel’s next-generation graphics processing unit (GPU) for AI and high-performance computing (HPC).
#14
Timbaloo
Daven: CPU (central processing unit) also seems poorly named nowadays given the major role of these computing ‘GPU’ beasts.

Maybe BPU (basic processing unit) and MPU (massive processing unit) could replace CPU and GPU respectively?
CPU still matches imho. It is still the central unit as in running the OS, as in being the master, whereas "GPUs" are slaves.

I'd probably call these things something like "compute accelerators", which seems to be the common denominator.
#16
Denver
Timbaloo: We need to stop calling these things "GPUs".
Although Compute/AI Accelerators sounds more accurate, AMD and Nvidia still call them GPUs, who are we to say otherwise?

The architecture itself is still very similar to GPUs.
#17
ncrs
Assimilator: I can call your mom a black hole. Doesn't necessarily mean I'm correct. Or in this case, incorrect.
Just because it can't output graphics directly doesn't mean it can't process them and output over the network. In the case of Ponte Vecchio it can. It even supports ray tracing.
Same with NVIDIA's data center offerings supporting MIG and/or GRID to remotely share virtual instances for 3D applications.
#18
Daven
Assimilator: This isn't a GPU, it's a compute processor with no display outputs. Please stop with stupid clickbait headlines like these.
I would agree with you except that Intel, Nvidia and AMD still use the term GPU sprinkled throughout all of their Gaudi/Ponte Vecchio/Instinct/Tesla webpages and press releases. You will need to send your complaint to them, not TPU.
Timbaloo: CPU still matches imho. It is still the central unit as in running the OS, as in being the master, whereas "GPUs" are slaves.

I'd probably call these things something like "compute accelerators", which seems to be the common denominator.
It's very hard for me to call the slice of CPU in each of the eight interconnected OAM MI300As the master.
#19
Courier 6
better start building some fusion cores, ZPMs etc...
#20
Darmok N Jalad
It's based on their GPU architecture. Maybe it could gain a prefix and be called HPC-GPU or something.
#21
TechLurker
ZoneDymo: sooo have Gaudi and Ponte Vecchio ever actually become products? I have heard these names so often it seems like vaporware.
There were reports that Ponte Vecchio was being cancelled/EoL'd by Intel to focus on newer hardware that has shown more promise, which leaves the DoE's supercomputer as its only major deployment.
#22
ncrs
Daven: It's very hard for me to call the slice of CPU in each of the eight interconnected OAM MI300As the master.
MI300A comes in Socket SH5 with up to 4 APUs per server, but that's nitpicking ;)
CPUs are still the master in the design. The OS runs on them in the address space managed by them. Being interconnected and cache-coherent changes nothing from a software perspective (apart from increased performance and decreased latency, of course). As in, it's not a completely new compute paradigm that would warrant changing the meaning of established terms.
#23
azrael
I remember a time when these specific units were called GPGPUs. General-Purpose Graphics Processing Units. But then again I'm old...
#24
MarquiseAke
FYI, this new Intel AI processor is a hybrid. It contains both x86 and GPU processing. This is why it requires 1500 W.
#25
ScaLibBDP
MarquiseAke: FYI, this new Intel AI processor is a hybrid. It contains both x86 and GPU processing. This is why it requires 1500 W.
NVIDIA's Grace Hopper GH200 superchip with 72 ARM cores and a GPU (132 SMs and 528 Tensor cores) consumes from 400 W to 1000 W.

With such a TDP, Intel's superchip (GPU + x86 CPU cores) doesn't have a good future.