News Posts matching #Supermicro

Supermicro Delivers Direct-Liquid-Optimized NVIDIA Blackwell Solutions

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is announcing the highest-performing SuperCluster, an end-to-end AI data center solution featuring the NVIDIA Blackwell platform for the era of trillion-parameter-scale generative AI. The new SuperCluster will significantly increase the number of NVIDIA HGX B200 8-GPU systems in a liquid-cooled rack, resulting in a large increase in GPU compute density compared to Supermicro's current industry-leading liquid-cooled NVIDIA HGX H100 and H200-based SuperClusters. In addition, Supermicro is enhancing the portfolio of its NVIDIA Hopper systems to address the rapid adoption of accelerated computing for HPC applications and mainstream enterprise AI.

"Supermicro has the expertise, delivery speed, and capacity to deploy the largest liquid-cooled AI data center projects in the world, containing 100,000 GPUs, which Supermicro and NVIDIA contributed to and recently deployed," said Charles Liang, president and CEO of Supermicro. "These Supermicro SuperClusters reduce power needs due to DLC efficiencies. We now have solutions that use the NVIDIA Blackwell platform. Using our Building Block approach allows us to quickly design servers with NVIDIA HGX B200 8-GPU, which can be either liquid-cooled or air-cooled. Our SuperClusters provide unprecedented density, performance, and efficiency, and pave the way toward even more dense AI computing solutions in the future. The Supermicro clusters use direct liquid cooling, resulting in higher performance, lower power consumption for the entire data center, and reduced operational expenses."

TinyGrad Showcases TinyBox Pro: 1.36 PetaFLOP Compute Monster at $40,000 Price Tag

TinyGrad, the company behind the popular TinyBox system, is aiming to commoditize the PetaFLOP. Its latest powerhouse, the TinyBox Pro, is on display on X. This high-performance system delivers an impressive 1.36 PetaFLOPS of FP16 compute and is built from commercial GPUs. The TinyBox Pro configuration features eight NVIDIA RTX 4090 GPUs, surpassing its predecessors with a combined 192 GB of GPU RAM and 8,064 GB/s of aggregate memory bandwidth. This substantial upgrade is complemented by dual AMD "Genoa" processors and 384 GB of system RAM delivering 921.6 GB/s of memory bandwidth. What sets the TinyBox Pro apart is its enterprise-grade architecture. The system utilizes four 2,000 W power supplies requiring 200 V+ input, housed in a 4U form factor that spans 31 inches in depth. The unit weighs 88 pounds and comes equipped with Supermicro rails for seamless rack integration.

Connectivity options are equally impressive, featuring two open PCIe 5.0 x16 slots for expansion. Storage is handled by a 1 TB boot drive, which may seem conservative compared to some competitors' offerings. The system runs Ubuntu 22.04 and is noted for its superior driver quality compared with consumer AMD GPUs, addressing a common pain point in high-performance computing on commercial hardware; on social media, TinyGrad has been very vocal about its struggles with AMD Radeon GPU drivers. However, potential buyers should be prepared for significant noise levels during operation, a trade-off for the computing power packed into the 4U chassis. With a pre-order price of $40,000, the TinyBox Pro positions itself as a serious contender in the professional AI computing market, where comparable GPU servers can cost hundreds of thousands of US dollars. This pricing reflects its enterprise-grade specifications and positions it as an accessible alternative to larger, more expensive computing clusters.
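
For readers who want to sanity-check the headline numbers, the aggregates follow directly from per-component specifications. The sketch below assumes the publicly listed per-card values for the RTX 4090 (24 GB of GDDR6X at roughly 1,008 GB/s) and twelve channels of DDR5-4800 per AMD "Genoa" socket; these per-part figures are assumptions taken from vendor spec sheets, not from the TinyGrad announcement itself.

```python
# Back-of-the-envelope check of the TinyBox Pro aggregate numbers.
# Per-part figures are assumed from public spec sheets, not from the announcement.
GPU_COUNT = 8
GPU_VRAM_GB = 24              # RTX 4090: 24 GB GDDR6X per card (assumed)
GPU_BW_GBS = 1008             # RTX 4090: ~1,008 GB/s per card (assumed)

CPU_SOCKETS = 2               # dual AMD "Genoa" (EPYC 9004)
DDR5_CHANNELS_PER_SOCKET = 12 # assumed Genoa channel count
DDR5_MTS = 4800               # DDR5-4800 (assumed)
BYTES_PER_TRANSFER = 8        # 64-bit channel

total_vram = GPU_COUNT * GPU_VRAM_GB                   # 192 GB
total_gpu_bw = GPU_COUNT * GPU_BW_GBS                  # 8,064 GB/s
cpu_bw = CPU_SOCKETS * DDR5_CHANNELS_PER_SOCKET * DDR5_MTS * BYTES_PER_TRANSFER / 1000

print(f"GPU RAM:       {total_vram} GB")     # matches the quoted 192 GB
print(f"GPU bandwidth: {total_gpu_bw} GB/s") # matches the quoted 8,064 GB/s
print(f"CPU bandwidth: {cpu_bw} GB/s")       # 921.6 GB/s, matching the article
```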

Supermicro Shares Plunge 33% as Auditor Quits, Citing Previous Warnings

Supermicro shares fell more than 30% today after Ernst & Young resigned as the company's auditor. EY's departure follows concerns it raised in July about Supermicro's governance, transparency, and internal controls over financial reporting. In August, Supermicro delayed its annual report while it reviewed those internal controls in the wake of Hindenburg Research's allegations of accounting manipulation. In its resignation letter to the Securities and Exchange Commission (SEC), Ernst & Young said it can no longer rely on representations made by the company's management and Audit Committee, and that it is unwilling to be associated with financial statements prepared by management after new information came to its attention during its review: "We are resigning due to information that has recently come to our attention which has led us to no longer be able to rely on management's and the Audit Committee's representations and to be unwilling to be associated with the financial statements prepared by management."

Supermicro disagrees with the accounting firm's decision and says it does not expect the resolution of these matters to require restating any of its financial reports for fiscal 2024 or earlier years. Commenting on the subject, Nathan Anderson, the founder of Hindenburg, said in a post on X, "As far as auditor statements go, E&Y's SMCI resignation letter is about as strongly worded as I have seen." According to The Wall Street Journal, the Department of Justice is currently investigating the company. Supermicro will present its first quarter fiscal 2025 business update on Tuesday, November 5, 2024, at 5:00 p.m. ET / 2:00 p.m. PT.

Supermicro's Liquid-Cooled SuperClusters for AI Data Centers Powered by NVIDIA GB200 NVL72 and NVIDIA HGX B200 Systems

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is accelerating the industry's transition to liquid-cooled data centers with the NVIDIA Blackwell platform, delivering a new paradigm of energy efficiency for the rapidly growing power demands of new AI infrastructure. Supermicro's industry-leading end-to-end liquid-cooling solutions are powered by the NVIDIA GB200 NVL72 platform for exascale computing in a single rack and have started sampling to select customers, with full-scale production in late Q4. In addition, the recently announced Supermicro X14 and H14 4U liquid-cooled systems and 10U air-cooled systems are production-ready for the NVIDIA HGX B200 8-GPU system.

"We're driving the future of sustainable AI computing, and our liquid-cooled AI solutions are rapidly being adopted by some of the most ambitious AI Infrastructure projects in the world with over 2000 liquid-cooled racks shipped since June 2024," said Charles Liang, president and CEO of Supermicro. "Supermicro's end-to-end liquid-cooling solution, with the NVIDIA Blackwell platform, unlocks the computational power, cost-effectiveness, and energy-efficiency of the next generation of GPUs, such as those that are part of the NVIDIA GB200 NVL72, an exascale computer contained in a single rack. Supermicro's extensive experience in deploying liquid-cooled AI infrastructure, along with comprehensive on-site services, management software, and global manufacturing capacity, provides customers a distinct advantage in transforming data centers with the most powerful and sustainable AI solutions."

Supermicro Adds New Petascale JBOF All-Flash Storage Solution Integrating NVIDIA BlueField-3 DPU for AI Data Pipeline Acceleration

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is launching a new storage system optimized for high-performance AI training, inference, and HPC workloads. This JBOF (Just a Bunch of Flash) system utilizes up to four NVIDIA BlueField-3 data processing units (DPUs) in a 2U form factor to run software-defined storage workloads. Each BlueField-3 DPU features 400 Gb Ethernet or InfiniBand networking and hardware acceleration for compute-intensive storage and networking tasks such as encryption, compression, and erasure coding, as well as AI storage expansion. The state-of-the-art, dual-port JBOF architecture enables active-active clustering, ensuring high availability for scale-up mission-critical storage applications as well as scale-out storage such as object storage and parallel file systems.

"Supermicro's new high performance JBOF Storage System is designed using our Building Block approach which enables support for either E3.S or U.2 form-factor SSDs and the latest PCIe Gen 5 connectivity for the SSDs and the DPU networking and storage platform," said Charles Liang, president and CEO of Supermicro. "Supermicro's system design supports 24 or 36 SSD's enabling up to 1.105PB of raw capacity using 30.71 TB SSDs. Our balanced network and storage I/O design can saturate the full 400 Gb/s BlueField-3 line-rate realizing more than 250 GB/s bandwidth of the Gen 5 SSDs."

Supermicro Introduces New Servers and GPU Accelerated Systems with AMD EPYC 9005 Series CPUs and AMD Instinct MI325X GPUs

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, announces the launch of a new series of servers, GPU-accelerated systems, and storage servers featuring the AMD EPYC 9005 Series processors and AMD Instinct MI325X GPUs. The new H14 product line represents one of the most extensive server families in the industry, including Supermicro's Hyper systems, the Twin multi-node servers, and AI inferencing GPU systems, all available with air- or liquid-cooling options. The new "Zen 5" processor core architecture implements a full-width data path for AVX-512 vector instructions for CPU-based AI inference and provides 17% better instructions per cycle (IPC) than the previous 4th Gen EPYC processors, enabling more performance per core.

Supermicro's new H14 family uses the latest 5th Gen AMD EPYC processors, which offer up to 192 cores per CPU at up to 500 W TDP (thermal design power). Supermicro has designed new H14 systems, including the Hyper and FlexTwin systems, that can accommodate these higher thermal requirements. The H14 family also includes three systems for AI training and inference workloads supporting up to 10 GPUs, all featuring the AMD EPYC 9005 Series CPU as the host processor, two of which support the AMD Instinct MI325X GPU.

Supermicro Introduces New Versatile System Design for AI Delivering Optimization and Flexibility at the Edge

Super Micro Computer, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, announces the launch of a new, versatile, high-density infrastructure platform optimized for AI inferencing at the network edge. As companies seek to embrace complex large language models (LLMs) in their daily operations, there is a need for new hardware capable of running inference on high volumes of data at edge locations with minimal latency. Supermicro's new system combines versatility, performance, and thermal efficiency to deliver up to 10 double-width GPUs in a single system capable of running in traditional air-cooled environments.

"Owing to the system's optimized thermal design, Supermicro can deliver all this performance in a high-density 3U 20 PCIe system with 256 cores that can be deployed in edge data centers," said Charles Liang, president and CEO of Supermicro. "As the AI market is growing exponentially, customers need a powerful, versatile solution to inference data to run LLM-based applications on-premises, close to where the data is generated. Our new 3U Edge AI system enables them to run innovative solutions with minimal latency."

Supermicro Currently Shipping Over 100,000 GPUs Per Quarter in its Complete Rack Scale Liquid Cooled Servers

Supermicro, Inc., a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is announcing a complete liquid-cooling solution that includes powerful Coolant Distribution Units (CDUs), cold plates, Coolant Distribution Manifolds (CDMs), cooling towers, and end-to-end management software. This complete solution reduces ongoing power costs as well as Day 0 hardware acquisition and data center cooling infrastructure costs. The entire end-to-end, data-center-scale liquid-cooling solution is available directly from Supermicro.

"Supermicro continues to innovate, delivering full data center plug-and-play rack scale liquid cooling solutions," said Charles Liang, CEO and president of Supermicro. "Our complete liquid cooling solutions, including SuperCloud Composer for the entire life-cycle management of all components, are now cooling massive, state-of-the-art AI factories, reducing costs and improving performance. The combination of Supermicro deployment experience and delivering innovative technology is resulting in data center operators coming to Supermicro to meet their technical and financial goals for both the construction of greenfield sites and the modernization of existing data centers. Since Supermicro supplies all the components, the time to deployment and online are measured in weeks, not months."

Fujitsu and Supermicro Collaborate to Develop Green Arm-Based AI Computing Technology and Liquid-cooled Datacenter Solutions

Fujitsu Limited and Supermicro, Inc. (NASDAQ: SMCI) today announced a long-term strategic engagement in technology and business to develop and market a platform based on Fujitsu's future Arm-based "FUJITSU-MONAKA" processor, which is designed for high performance and energy efficiency and targeted for release in 2027. In addition, the two companies will collaborate on developing liquid-cooled systems for HPC, generative AI, and next-generation green data centers.

"Supermicro is excited to collaborate with Fujitsu to deliver state-of-the-art servers and solutions that are high performance, power efficient, and cost-optimized," said Charles Liang, president and CEO of Supermicro. "These systems will be optimized to support a broad range of workloads in AI, HPC, cloud and edge environments. The two companies will focus on green IT designs with energy-saving architectures, such as liquid cooling rack scale PnP, to minimize technology's environmental impact."

Supermicro Adds New Max-Performance Intel-Based X14 Servers

Supermicro, Inc., a Total IT Solution Provider for AI/ML, HPC, Cloud, Storage, and 5G/Edge, today adds new maximum-performance GPU, multi-node, and rackmount systems to the X14 portfolio, based on the Intel Xeon 6900 Series processors with P-cores (formerly codenamed Granite Rapids-AP). The new industry-leading selection of workload-optimized servers addresses the needs of modern data centers, enterprises, and service providers. Joining the efficiency-optimized X14 servers that launched in June 2024 with the Xeon 6700 Series processors with E-cores, today's additions bring maximum compute density and power to the Supermicro X14 lineup, creating the industry's broadest range of optimized servers for workloads spanning demanding AI, HPC, media, and virtualization to energy-efficient edge, scale-out cloud-native, and microservices applications.

"Supermicro X14 systems have been completely re-engineered to support the latest technologies including next-generation CPUs, GPUs, highest bandwidth and lowest latency with MRDIMMs, PCIe 5.0, and EDSFF E1.S and E3.S storage," said Charles Liang, president and CEO of Supermicro. "Not only can we now offer more than 15 families, but we can also use these designs to create customized solutions with complete rack integration services and our in-house developed liquid cooling solutions."

Supermicro Announces FlexTwin Multi-Node Liquid Cooled Servers

Supermicro, Inc., a Total IT Solution Provider for AI/ML, HPC, Cloud, Storage, and 5G/Edge, is announcing the all-new FlexTwin family of systems, designed to address the needs of scientists, researchers, governments, and enterprises undertaking the world's most complex and demanding computing tasks. Featuring flexible support for the latest CPU, memory, storage, power, and cooling technologies, FlexTwin is purpose-built for demanding HPC workloads including financial services, scientific research, and complex modeling. These systems are cost-optimized for performance per dollar and can be customized for specific HPC applications and customer requirements thanks to Supermicro's modular Building Block Solutions design.

"Supermicro's FlexTwin servers set a new standard of performance density for rack-scale deployments with up to 96 dual processor compute nodes in a standard 48U rack," said Charles Liang, president and CEO of Supermicro. "At Supermicro, we're able to offer a complete one-stop solution that includes servers, racks, networking, liquid cooling components, and liquid cooling towers, speeding up the time to deployment and resulting in higher quality and reliability across the entire infrastructure, enabling customers faster time to results. Up to 90% of the server generated heat is removed with the liquid cooling solution, saving significant amounts of energy and enabling higher compute performance."

Supermicro Previews New Max Performance Intel-based X14 Servers

Supermicro, Inc., a Total IT Solution Provider for AI/ML, HPC, Cloud, Storage, and 5G/Edge, is previewing new, completely re-designed X14 server platforms that leverage next-generation technologies to maximize performance for compute-intensive workloads and applications. Building on the success of Supermicro's efficiency-optimized X14 servers that launched in June 2024, the new systems feature significant upgrades across the board, supporting an unprecedented 256 performance cores (P-cores) in a single node, memory support for MRDIMMs at up to 8,800 MT/s, and compatibility with next-generation SXM, OAM, and PCIe GPUs. This combination can drastically accelerate AI and compute workloads while significantly reducing the time and cost of large-scale AI training, high-performance computing, and complex data analytics tasks. Approved customers can secure early access to complete, full-production systems via Supermicro's Early Ship Program, or test them remotely with Supermicro JumpStart.
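
The 256-core and MRDIMM figures imply straightforward per-socket math. The sketch below assumes a dual-socket node using the 128-P-core top-end Xeon 6900 Series part with 12 memory channels per socket; those per-socket values are assumptions drawn from Intel's published Xeon 6900 specifications, not from Supermicro's preview.

```python
# Illustrative per-node math for the previewed X14 max-performance platforms.
SOCKETS_PER_NODE = 2
P_CORES_PER_SOCKET = 128       # assumed: top-end Xeon 6900 Series (Granite Rapids-AP) SKU
MEM_CHANNELS_PER_SOCKET = 12   # assumed: Xeon 6900 Series memory-channel count
MRDIMM_MTS = 8800              # MRDIMM speed cited in the preview
BYTES_PER_TRANSFER = 8

cores_per_node = SOCKETS_PER_NODE * P_CORES_PER_SOCKET             # 256 P-cores
bw_per_socket = MEM_CHANNELS_PER_SOCKET * MRDIMM_MTS * BYTES_PER_TRANSFER / 1000
bw_per_node = SOCKETS_PER_NODE * bw_per_socket

print(cores_per_node)              # 256, matching the claimed per-node core count
print(round(bw_per_socket, 1))     # 844.8 GB/s theoretical peak per socket
print(round(bw_per_node, 1))       # 1689.6 GB/s per dual-socket node
```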

"We continue to add to our already comprehensive Data Center Building Block solutions with these new platforms, which will offer unprecedented performance, and new advanced features," said Charles Liang, president and CEO of Supermicro. "Supermicro is ready to deliver these high-performance solutions at rack-scale with the industry's most comprehensive direct-to-chip liquid cooled, total rack integration services, and a global manufacturing capacity of up to 5,000 racks per month including 1,350 liquid cooled racks. With our worldwide manufacturing capabilities, we can deliver fully optimized solutions which accelerate our time-to-delivery like never before, while also reducing TCO."

Supermicro Launches Plug-and-Play SuperCluster for NVIDIA Omniverse

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is announcing a new addition to its SuperCluster portfolio of plug-and-play AI infrastructure solutions, built for the NVIDIA Omniverse platform to deliver high-performance, generative-AI-enhanced 3D workflows at enterprise scale. The new SuperCluster features the latest Supermicro NVIDIA OVX systems and allows enterprises to easily scale as workloads increase.

"Supermicro has led the industry in developing GPU-optimized products, traditionally for 3D graphics and application acceleration, and now for AI," said Charles Liang, president and CEO of Supermicro. "With the rise of AI, enterprises are seeking computing infrastructure that combines all these capabilities into a single package. Supermicro's SuperCluster features fully interconnected 4U PCIe GPU NVIDIA-Certified Systems for NVIDIA Omniverse, with up to 256 NVIDIA L40S PCIe GPUs per scalable unit. The system helps deliver high performance across the Omniverse platform, including generative AI integrations. By developing this SuperCluster for Omniverse, we're not just offering a product; we're providing a gateway to the future of application development and innovation."

Micron First to Achieve Qualification Sample Milestone to Accelerate Ecosystem Adoption of CXL 2.0 Memory

Micron Technology, a leader in innovative data center solutions, today announced it has achieved its qualification sample milestone for the Micron CZ120 memory expansion modules using Compute Express Link (CXL). Micron is the first in the industry to achieve this milestone, which accelerates the adoption of CXL solutions within the data center to tackle the growing memory challenges stemming from existing data-intensive workloads and emerging artificial intelligence (AI) and machine learning (ML) workloads.

Using a new and emerging CXL standard, the CZ120 required substantial hardware testing for reliability, quality and performance across CPU providers and OEMs, along with comprehensive software testing for compatibility and compliance with OS and hypervisor vendors. This achievement reflects the collaboration and commitment across the data center ecosystem to validate the advantages of CXL memory. By testing the combined products for interoperability and compatibility across hardware and software, the Micron CZ120 memory expansion modules satisfy the rigorous standards for reliability, quality and performance required by customers' data centers.

Supermicro Launches MicroCloud Nodes, Mainstream Racks, and Towers Based on AMD EPYC 4004

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, is announcing additions to the AMD-based H13 generation of CPU servers, optimized to deliver an outstanding balance of performance and efficiency and powered by the AMD EPYC 4004 Series processors. Supermicro will feature its new MicroCloud multi-node solution, which supports up to ten nodes in a 3U form factor. This very high-density option is designed for cloud-native workloads.

"Supermicro continues to offer innovative solutions for a wide range of applications, and with this new entry, based on the AMD EPYC 4004 processor, we can address the needs of on-premises or cloud service providers who need a cost-effective solution in a compact form factor," said Charles Liang, president and CEO of Supermicro. "In a single rack, 160 individual nodes can be made available for cloud-native applications, which reduces real estate need and decreases a data center TCO."

Samsung Introduces "Petabyte SSD as a Service" at GTC 2024, "Petascale" Servers Showcased

Leaked Samsung PBSSD presentation material popped up online a couple of days prior to the kick-off day of NVIDIA's GTC 2024 conference (March 18)—reports (at the time) jumped on the potential introduction of a "petabyte (PB)-level SSD solution," alongside an enterprise subscription service for the US market. Tom's Hardware took the time to investigate this matter—in-person—on the showroom floor up in San Jose, California. It turns out that interpretations of pre-event information were slightly off—according to on-site investigations: "despite the name, PBSSD is not a petabyte-scale solid-state drive (Samsung's highest-capacity drive can store circa 240 TB), but rather a 'petascale' storage system that can scale-out all-flash storage capacity to petabytes."

Samsung showcased a Supermicro Petascale server design, but a lone unit is nowhere near capable of providing a petabyte of storage—the Tom's Hardware reporter found out that the demonstration model housed: "sixteen 15.36 TB SSDs, so for now the whole 1U unit can only pack up to 245.76 TB of 3D NAND storage (which is pretty far from a petabyte), so four of such units will be needed to store a petabyte of data." Company representatives also had another Supermicro product at their booth: "(an) H13 all-flash petascale system with CXL support that can house eight E3.S SSDs (with) four front-loading E3.S CXL bays for memory expansion."
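
Tom's Hardware's per-unit figure is easy to reproduce, and it also shows how close four units come to a decimal petabyte. The sketch below simply multiplies out the drive counts quoted above.

```python
# Arithmetic behind the demonstrated Supermicro Petascale unit.
SSDS_PER_UNIT = 16
SSD_CAPACITY_TB = 15.36

per_unit_tb = SSDS_PER_UNIT * SSD_CAPACITY_TB   # 245.76 TB per 1U unit
four_units_tb = 4 * per_unit_tb                 # 983.04 TB, roughly a petabyte

print(per_unit_tb, four_units_tb)  # 245.76 983.04
```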

Supermicro Unveils New Edge AI Systems

Supermicro, Inc., a Total IT Solution Manufacturer for AI, Cloud, Storage, and 5G/Edge, is expanding its portfolio of AI solutions, allowing customers to leverage the power and capability of AI in edge locations such as public spaces, retail stores, or industrial infrastructure. Using Supermicro application-optimized servers with NVIDIA GPUs makes it easier to fine-tune pre-trained models and to deploy AI inference solutions at the edge where the data is generated, improving response times and decision-making.

"Supermicro has the broadest portfolio of Edge AI solutions, capable of supporting pre-trained models for our customers' edge environments," said Charles Liang, president and CEO of Supermicro. "The Supermicro Hyper-E server, based on the dual 5th Gen Intel Xeon processors, can support up to three NVIDIA H100 Tensor Core GPUs, delivering unparalleled performance for Edge AI. With up to 8 TB of memory in these servers, we are bringing data center AI processing power to edge locations. Supermicro continues to provide the industry with optimized solutions as enterprises build a competitive advantage by processing AI data at their edge locations."

Samsung & Vodafone "Open RAN Ecosystem" Bolstered by AMD EPYC 8004 Series

Samsung Electronics and Vodafone, in collaboration with AMD, today announced that the three companies have successfully demonstrated an end-to-end call with the latest AMD processors enabling Open RAN technology, a first for the industry. This joint achievement represents the companies' technical leadership in enriching the Open RAN ecosystem throughout the industry. Conducted in Samsung's R&D lab in Korea, the first call was completed using Samsung's versatile, O-RAN-compliant, virtualized RAN (vRAN) software, powered by AMD EPYC 8004 Series processors on Supermicro's Telco/Edge servers, supported by Wind River Studio Container-as-a-Service (CaaS) platform. This demonstration aimed to verify optimized performance, energy efficiency and interoperability among partners' solutions.

The joint demonstration represents Samsung and Vodafone's ongoing commitment to reinforce their position in the Open RAN market and expand their ecosystem with industry-leading partners. This broader and growing Open RAN ecosystem helps operators to build and modernize mobile networks with greater flexibility, faster time-to-market (TTM), and unmatched performance. "Open RAN represents the forthcoming major transformation in advancing mobile networks for the future. Reaching this milestone with top industry partners like Samsung and AMD shows Vodafone's dedication to delivering on the promise of Open RAN innovation," said Nadia Benabdallah, Network Strategy and Engineering Director at Vodafone Group. "Vodafone is continually looking to innovate its network by exploring the potential and diversity of the ecosystem."

Supermicro Extends AI and GPU Rack Scale Solutions with Support for AMD Instinct MI300 Series Accelerators

Supermicro, Inc., a Total IT Solution Manufacturer for AI, Cloud, Storage, and 5G/Edge, is announcing three new additions to its AMD-based H13 generation of GPU servers, optimized to deliver leading-edge performance and efficiency and powered by the new AMD Instinct MI300 Series accelerators. Supermicro's powerful rack-scale solutions, featuring 8-GPU servers in the AMD Instinct MI300X OAM configuration, are ideal for large-model training.

The new 2U liquid-cooled and 4U air-cooled servers with AMD Instinct MI300A Accelerated Processing Units (APUs) are now available; they improve data center efficiency and address the fast-growing, complex demands of AI, LLM, and HPC workloads. The new systems contain quad APUs for scalable applications. Supermicro can deliver complete liquid-cooled racks for large-scale environments with up to 1,728 TFLOPS of FP64 performance per rack. Supermicro's worldwide manufacturing facilities streamline the delivery of these new servers for AI and HPC convergence.

Supermicro Starts Shipments of NVIDIA GH200 Grace Hopper Superchip-Based Servers

Supermicro, Inc., a Total IT Solution manufacturer for AI, Cloud, Storage, and 5G/Edge, is announcing one of the industry's broadest portfolios of new GPU systems based on the NVIDIA reference architecture, featuring the latest NVIDIA GH200 Grace Hopper and NVIDIA Grace CPU Superchip. The new modular architecture is designed to standardize AI infrastructure and accelerated computing in compact 1U and 2U form factors while providing ultimate flexibility and expansion ability for current and future GPUs, DPUs, and CPUs. Supermicro's advanced liquid-cooling technology enables very high-density configurations, such as a 1U 2-node configuration with 2 NVIDIA GH200 Grace Hopper Superchips integrated with a high-speed interconnect. Supermicro can deliver thousands of rack-scale AI servers per month from facilities worldwide and ensures Plug-and-Play compatibility.

"Supermicro is a recognized leader in driving today's AI revolution, transforming data centers to deliver the promise of AI to many workloads," said Charles Liang, president and CEO of Supermicro. "It is crucial for us to bring systems that are highly modular, scalable, and universal for rapidly evolving AI technologies. Supermicro's NVIDIA MGX-based solutions show that our building-block strategy enables us to bring the latest systems to market quickly and are the most workload-optimized in the industry. By collaborating with NVIDIA, we are helping accelerate time to market for enterprises to develop new AI-enabled applications, simplifying deployment and reducing environmental impact. The range of new servers incorporates the latest industry technology optimized for AI, including NVIDIA GH200 Grace Hopper Superchips, BlueField, and PCIe 5.0 EDSFF slots."

NVIDIA Announces NVIDIA OVX servers Featuring New NVIDIA L40S GPU for Generative AI and Industrial Digitalization

NVIDIA today announced NVIDIA OVX servers featuring the new NVIDIA L40S GPU, a powerful, universal data center processor designed to accelerate the most compute-intensive, complex applications, including AI training and inference, 3D design and visualization, video processing and industrial digitalization with the NVIDIA Omniverse platform. The new GPU powers accelerated computing workloads for generative AI, which is transforming workflows and services across industries, including text, image and video generation, chatbots, game development, product design and healthcare.

"As generative AI transforms every industry, enterprises are increasingly seeking large-scale compute resources in the data center," said Bob Pette, vice president of professional visualization at NVIDIA. "OVX systems with NVIDIA L40S GPUs accelerate AI, graphics and video processing workloads, and meet the demanding performance requirements of an ever-increasing set of complex and diverse applications."

Supermicro Adds 192-Core ARM CPU Based Low Power Servers to Its Broad Range of Workload Optimized Servers and Storage Systems

Supermicro, Inc., a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is announcing several new servers in its already broad application-optimized product line. These new servers incorporate the new AmpereOne CPU, with up to 192 single-threaded cores and up to 4 TB of memory capacity. Applications such as databases, telco edge, web servers, caching services, media encoding, and video game streaming will benefit from the increased core counts, faster memory access, higher performance per watt, scalable power management, and new cloud security features. Additionally, cloud-native, microservice-based applications will benefit from the lower latencies and power usage.

"Supermicro is expanding our customer choices by introducing these new systems that incorporate the latest high core count CPUs from Ampere Computing," said Michael McNerney, vice president of Marketing and Security, Supermicro. "With high core counts, predictable latencies, and up to 4 TB of memory, users will experience increased performance for a range of workloads and lower energy use. We continue to design and deliver a range of environmentally friendly servers that give customers a competitive advantage for various applications."

Supermicro Unveils MicroCloud, High-Density 3U 8 Node System Utilizing AMD Ryzen Zen 4 7000 Series Processors

Supermicro, Inc., a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is introducing a new server that gives IT and data center owners a high-performance, scalable solution to meet the needs of e-commerce, cloud gaming, code development, content creation, and virtual private servers. The new systems use AMD Ryzen 7000 Series processors optimized for server usage, based on the latest "Zen 4" core architecture, with a max boost speed of up to 5.7 GHz, PCIe 5.0 support, DDR5-5200 memory, and up to 16 cores (32 threads) per CPU. The new Supermicro MicroCloud is designed to use the latest system technology for a wide range of applications, including web hosting, cloud gaming, and virtual desktop applications.

"We are expanding our application optimized server product lines to include the latest AMD Ryzen 7000 Series processors," said Michael McNerney, VP of Marketing and Security, Supermicro. "These new servers from Supermicro will give IT administrators a compact and high-performance option in order to offer more services with lower latencies to their internal or external customers. By working closely with AMD to optimize the Ryzen 7000 Series firmware for server usage, we can bring a range of solutions with new technologies with PCIe 5.0, DDR5 memory, and very high clock rates to market faster, which allows organizations to reduce costs and offer advanced solutions to their clients."

Micron Announces High-Capacity 96 GB DDR5-4800 RDIMMs

Micron Technology, Inc. (Nasdaq: MU) today announced volume production availability of high-capacity 96 GB DDR5 RDIMMs at speeds up to 4,800 MT/s, which deliver double the bandwidth of DDR4 memory. By unlocking the next level of monolithic die density, Micron's high-density memory solutions empower artificial intelligence (AI) and in-memory database workloads while eliminating the need for costly die stacking that also adds latency. Micron's 96 GB DDR5 RDIMM modules are qualified with 4th Gen AMD EPYC processors. Additionally, the Supermicro 8125GS, an AMD-based system, includes the Micron 96 GB DDR5 modules and is an excellent platform for high-performance computing, artificial intelligence and deep learning training, and industrial server workloads.

"Delivering high-capacity memory solutions that enable the right performance for compute-intensive workloads is essential to Micron's role as a leading memory innovator and manufacturer. Micron's 96 GB DDR5 DRAM module establishes a new optimized total cost of ownership solution for our customers," stated Praveen Vaidyanathan, vice president and general manager of Micron's Compute Products Group. "Our collaboration with a flexible system provider like Supermicro leverages each of our strengths to provide customers with the latest memory technology to address their most challenging data center needs."
"Supermicro's time-to-market collaboration with Micron benefits a wide variety of key customers," said Don Clegg, senior vice president, Worldwide Sales, Supermicro. "Micron's portfolio of advanced memory and storage products, aligned with Supermicro's broad server and storage innovations deliver validated, tested, and proven solutions for data center deployments and advanced workloads."

Supermicro Announces New Eight- and Four-Socket 4th Gen Intel Xeon Servers

Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is introducing the most powerful server in its lineup for large-scale database and enterprise applications. The Multi-Processor product line includes the 8-socket server, ideal for in-memory databases requiring up to 480 cores and 32 TB of DDR5 memory for maximum performance. In addition, the product line includes a 4-socket server, which is ideal for applications that require a single system image of up to 240 cores and 16 TB of high-speed memory.

These powerful systems all use 4th Gen Intel Xeon Scalable processors. Compared with the previous generation of 8-socket and 4-socket servers, the systems have 2X the core count, 1.33X the memory capacity, and 2X the memory bandwidth. They also deliver up to 4X the I/O bandwidth of previous-generation systems for connectivity to peripherals. The Supermicro 8-socket system has attained the highest performance ratings ever for a single system on the SPECcpu2017 FP Rate benchmarks, for both base and peak results. In addition, the Supermicro 8-socket and 4-socket servers demonstrate performance leadership on a wide range of SPEC benchmarks.
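
The headline core and memory totals resolve to clean per-socket values. The sketch below assumes the top-end 60-core 4th Gen Intel Xeon Scalable SKU and 4 TB of DDR5 per socket; those per-socket maxima are assumptions based on Intel's published platform limits, shown here only because they are consistent with the totals quoted above.

```python
# Illustrative per-socket breakdown of the 8-socket and 4-socket totals.
CORES_PER_SOCKET = 60        # assumed: top-end 4th Gen Intel Xeon Scalable SKU
MEMORY_PER_SOCKET_TB = 4     # assumed per-socket DDR5 capacity limit

for sockets in (8, 4):
    cores = sockets * CORES_PER_SOCKET
    memory_tb = sockets * MEMORY_PER_SOCKET_TB
    print(f"{sockets}-socket: {cores} cores, {memory_tb} TB DDR5")
# 8-socket: 480 cores, 32 TB DDR5  -> matches the quoted maximums
# 4-socket: 240 cores, 16 TB DDR5
```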