News Posts matching #HBM3E


HBM Industry Revenue Could Double by 2025 - Next-gen AI GPUs Cited as Growth Driver

Samsung, SK hynix, and Micron are considered the top manufacturers of High Bandwidth Memory (HBM)—the HBM3 and HBM3E standards are in increasingly high demand due to widespread deployment of GPUs and accelerators by generative AI companies. Taiwan's Commercial Times reports an ongoing shortage of HBM components—but this presents a growth opportunity for smaller manufacturers in the region. Naturally, the big-name producers are expected to dive head first into the development of next-generation models. The financial news article cites research conducted by Gartner, which predicts that the HBM market will hit an all-time high of $4.976 billion (USD) by 2025.

This estimate is more than double the projected revenue (just over $2 billion) generated by the HBM market in 2023—the explosive growth of generative AI applications has "boosted" demand for the most performant memory standards. The Commercial Times report states that SK hynix is the current HBM3E leader, with Micron and Samsung trailing behind—industry experts believe that the stragglers will need to "expand HBM production capacity" in order to stay competitive. SK hynix has partnered with NVIDIA—the GH200 Grace Hopper platform was unveiled last summer, outfitted with the South Korean firm's HBM3E parts. In a similar timeframe, Samsung was named as AMD's preferred supplier of HBM3 packages—as featured in the recently launched Instinct MI300X accelerator. NVIDIA's HBM3E deal with SK hynix is believed to extend to the internal makeup of Blackwell GB100 data-center GPUs. The HBM4 memory standard is expected to be the next major battleground for the industry's hardest hitters.
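
As a rough sanity check on the figures above, the short Python sketch below divides Gartner's 2025 projection by the reported 2023 revenue, both rounded as quoted in the article, to get the implied growth factor and annualized growth rate.

# Back-of-envelope check of the HBM market figures cited above (rounded, as quoted).
revenue_2023_busd = 2.0      # 2023 revenue: "just over $2 billion"
revenue_2025_busd = 4.976    # Gartner's 2025 projection

growth_factor = revenue_2025_busd / revenue_2023_busd
annual_rate = growth_factor ** 0.5 - 1   # two-year span, 2023 -> 2025

print(f"Implied growth: {growth_factor:.2f}x")           # ~2.49x on these rounded inputs
print(f"Implied annual growth rate: {annual_rate:.0%}")  # ~58% per year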

SK hynix to Exhibit AI Memory Leadership at CES 2024

SK hynix Inc. announced today that it will showcase the technology behind its ultra-high performance memory products, the core of future AI infrastructure, at CES 2024, the most influential tech event in the world, taking place from January 9 through 12 in Las Vegas. SK hynix said that it will highlight its "Memory Centric" vision at the show, promoting the importance of memory products in accelerating technological innovation in the AI era and underscoring its competitiveness in the global memory market.

The company will run a space titled SK Wonderland jointly with other major SK Group affiliates, including SK Inc., SK Innovation and SK Telecom, and showcase its major AI memory products, including HBM3E. SK hynix plans to provide HBM3E, the world's best-performing memory product, which it successfully developed in August, to the world's largest AI technology companies, with mass production starting in the first half of 2024.

SK hynix Showcases Next-Gen AI and HPC Solutions at SC23

SK hynix presented its leading AI and high-performance computing (HPC) solutions at Supercomputing 2023 (SC23), held in Denver, Colorado from November 12-17. Organized by the Association for Computing Machinery and the IEEE Computer Society since 1988, the annual SC conference showcases the latest advancements in HPC, networking, storage, and data analysis. SK hynix marked its first appearance at the conference by introducing its groundbreaking memory solutions to the HPC community. During the six-day event, several SK hynix employees also made presentations revealing the impact of the company's memory solutions on AI and HPC.

Displaying Advanced HPC & AI Products
At SC23, SK hynix showcased its products tailored for AI and HPC to underline its leadership in the AI memory field. Among these next-generation products, HBM3E attracted attention as it meets the industry's highest standards of speed, capacity, heat dissipation, and power efficiency. These capabilities make it particularly suitable for data-intensive AI server systems. HBM3E was presented alongside NVIDIA's H100, a high-performance GPU for AI that uses HBM3 for its memory.

ASUS Demonstrates AI and Immersion-Cooling Solutions at SC23

ASUS today announced a showcase of its latest AI solutions to empower innovation and push the boundaries of supercomputing at Supercomputing 2023 (SC23) in Denver, Colorado, from November 12-17, 2023. ASUS will demonstrate its latest AI advances, including generative-AI solutions and sustainability breakthroughs with Intel, and deliver its latest hybrid immersion-cooling solutions, plus lots more - all at booth number 257.

At SC23, ASUS will showcase its latest NVIDIA-qualified ESC N8A-E12 HGX H100 eight-GPU server, powered by dual-socket AMD EPYC 9004 processors and designed for enterprise-level generative AI with market-leading integrated capabilities. Following NVIDIA's announcement of the latest H200 Tensor Core GPU at SC23, which is the first GPU to offer HBM3E for faster, larger memory to fuel the acceleration of generative AI and large language models, ASUS will offer an update to its H100-based systems with an H200-based drop-in replacement in 2024.

NVIDIA Supercharges Hopper, the World's Leading AI Computing Platform

NVIDIA today announced it has supercharged the world's leading AI computing platform with the introduction of the NVIDIA HGX H200. Based on NVIDIA Hopper architecture, the platform features the NVIDIA H200 Tensor Core GPU with advanced memory to handle massive amounts of data for generative AI and high performance computing workloads.

The NVIDIA H200 is the first GPU to offer HBM3e - faster, larger memory to fuel the acceleration of generative AI and large language models, while advancing scientific computing for HPC workloads. With HBM3e, the NVIDIA H200 delivers 141 GB of memory at 4.8 terabytes per second, nearly double the capacity and 2.4x more bandwidth compared with its predecessor, the NVIDIA A100. H200-powered systems from the world's leading server manufacturers and cloud service providers are expected to begin shipping in the second quarter of 2024.
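
The comparison in the press release can be checked with a quick calculation; note that the A100 baseline used below (80 GB at roughly 2 TB/s for the 80 GB SXM part) is NVIDIA's widely published spec rather than a figure from this article, so treat it as an assumption.

# Ratio check of the H200 figures quoted above against the A100 80 GB baseline
# (the baseline values are NVIDIA's published specs, not taken from this article).
h200_capacity_gb, h200_bandwidth_tbs = 141, 4.8
a100_capacity_gb, a100_bandwidth_tbs = 80, 2.0

print(f"Capacity:  {h200_capacity_gb / a100_capacity_gb:.2f}x")     # ~1.76x, i.e. "nearly double"
print(f"Bandwidth: {h200_bandwidth_tbs / a100_bandwidth_tbs:.1f}x")  # 2.4x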

Samsung Electronics Announces Third Quarter 2023 Results

Samsung Electronics today reported financial results for the third quarter ended September 30, 2023. Total consolidated revenue was KRW 67.40 trillion, a 12% increase from the previous quarter, mainly due to new smartphone releases and higher sales of premium display products. Operating profit rose sequentially to KRW 2.43 trillion based on strong sales of flagship models in mobile and strong demand for displays, as losses at the Device Solutions (DS) Division narrowed.

The Memory Business reduced losses sequentially as sales of high value-added products and average selling prices somewhat increased. Earnings in system semiconductors were impacted by a delay in demand recovery for major applications, but the Foundry Business posted a new quarterly high for new backlog from design wins. The mobile panel business reported a significant increase in earnings on the back of new flagship model releases by major customers, while the large panel business narrowed losses in the quarter. The Device eXperience (DX) Division achieved solid results due to robust sales of premium smartphones and TVs. Revenue at the Networks Business declined in major overseas markets as mobile operators scaled back investments.

Samsung Electronics Holds Memory Tech Day 2023 Unveiling New Innovations To Lead the Hyperscale AI Era

Samsung Electronics Co., Ltd., a world leader in advanced memory technology, today held its annual Memory Tech Day, showcasing industry-first innovations and new memory products to accelerate technological advancements across future applications—including the cloud, edge devices and vehicles.

Attended by about 600 customers, partners and industry experts, the event served as a platform for Samsung executives to expand on the company's vision for "Memory Reimagined," covering long-term plans to continue its memory technology leadership, outlook on market trends and sustainability goals. The company also presented new product innovations such as the HBM3E Shinebolt, LPDDR5X CAMM2 and Detachable AutoSSD.

SK hynix Displays Next-Gen Solutions Set to Unlock AI and More at OCP Global Summit 2023

SK hynix showcased its next-generation memory semiconductor technologies and solutions at the OCP Global Summit 2023 held in San Jose, California from October 17-19. The OCP Global Summit is an annual event hosted by the world's largest data center technology community, the Open Compute Project (OCP), where industry experts gather to share various technologies and visions. This year, SK hynix and its subsidiary Solidigm showcased advanced semiconductor memory products that will lead the AI era under the slogan "United Through Technology".

SK hynix presented a broad range of its solutions at the summit, including its leading HBM (HBM3/HBM3E), CXL, and AiM products for generative AI. The company also unveiled some of the latest additions to its product portfolio, including its DDR5 RDIMM, MCR DIMM, enterprise SSD (eSSD), and LPDDR CAMM devices. Visitors to the HBM exhibit could see HBM3, which is utilized in NVIDIA's H100, a high-performance GPU for AI, and also check out the next-generation HBM3E. Due to their low power consumption and ultra-high performance, these HBM solutions are more eco-friendly and are particularly suitable for power-hungry AI server systems.

Samsung Notes: HBM4 Memory is Coming in 2025 with New Assembly and Bonding Technology

According to an editorial blog post published on the Samsung blog by SangJoon Hwang, Executive Vice President and Head of the DRAM Product & Technology Team at Samsung Electronics, High Bandwidth Memory 4 (HBM4) is coming in 2025. In the recent timeline of HBM development, the first generation of HBM appeared in 2015 with the AMD Radeon R9 Fury X. The second-generation HBM2 appeared with the NVIDIA Tesla P100 in 2016, and the third-generation HBM3 saw the light of day with the NVIDIA Hopper GH100 GPU in 2022. Currently, Samsung has developed 9.8 Gbps HBM3E memory, which will start sampling to customers soon.
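
For context, a per-pin data rate can be converted to approximate per-stack bandwidth. The sketch below assumes the 1,024-bit stack interface that HBM generations have used to date; the interface width is an assumption, as it is not stated in the blog post itself.

# Convert the quoted per-pin rate to approximate per-stack bandwidth.
# Assumes a 1,024-bit interface per stack (standard for HBM to date, not stated above).
pin_rate_gbps = 9.8          # Samsung's HBM3E per-pin figure
interface_width_bits = 1024

stack_bandwidth_gbs = pin_rate_gbps * interface_width_bits / 8
print(f"~{stack_bandwidth_gbs:.0f} GB/s per stack")   # ~1254 GB/s, roughly 1.25 TB/s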

However, Samsung is more ambitious with its development timeline this time, and the company expects to announce HBM4 in 2025, possibly with commercial products in the same calendar year. Interestingly, HBM4 will use technologies optimized for thermal performance, such as non-conductive film (NCF) assembly and hybrid copper bonding (HCB). NCF is a polymer layer that enhances the stability of the micro bumps and TSVs in the stack, protecting the dies and their solder bumps from shock. Hybrid copper bonding is an advanced semiconductor packaging method that creates direct copper-to-copper connections between semiconductor components, enabling high-density, 3D-like packaging. It offers high I/O density, enhanced bandwidth, and improved power efficiency. Instead of regular micro bumps, it uses a copper layer as the conductor and an oxide layer as the insulator, increasing the connection density needed for HBM-like structures.

SK hynix Presents Advanced Memory Technologies at Intel Innovation 2023

SK hynix announced on September 22 that it showcased its latest memory technologies and products at Intel Innovation 2023 held September 19-20 in the western U.S. city of San Jose, California. Hosted by Intel since 2019, Intel Innovation is an annual IT exhibition which brings together the technology company's customers and partners to share the latest developments in the industry. At this year's event held at the San Jose McEnery Convention Center, SK hynix showcased its advanced semiconductor memory products which are essential in the generative AI era under the slogan "Pioneer Tomorrow With the Best."

Products that garnered the most interest were HBM3, which supports the high-speed performance of AI accelerators, and DDR5 RDIMM, a DRAM module for servers with 1bnm process technology. As one of SK hynix's core technologies, HBM3 has established the company as a trailblazer in AI memory. SK hynix plans to further strengthen its position in the market by mass-producing HBM3E (Extended) from 2024. Meanwhile, DDR5 RDIMM with 1bnm, or the 5th generation of the 10 nm process technology, also offers outstanding performance. In addition to supporting unprecedented transfer speeds of more than 6,400 megabits per second (Mbps), this low-power product helps customers simultaneously reduce costs and improve ESG performance.
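
As a rough illustration of what that per-pin rate means at the module level, the sketch below converts 6,400 MT/s to per-module bandwidth, assuming a standard 64-bit DDR5 data bus; the bus width is an assumption, since the article only quotes the per-pin figure.

# Convert the quoted 6,400 Mbps (MT/s) per-pin rate to per-module bandwidth.
# Assumes a standard 64-bit data width per RDIMM, which the article does not state.
transfer_rate_mtps = 6400
bytes_per_transfer = 64 // 8   # 64-bit data bus -> 8 bytes per transfer

module_bandwidth_mbs = transfer_rate_mtps * bytes_per_transfer
print(f"~{module_bandwidth_mbs / 1000:.1f} GB/s per module")   # ~51.2 GB/s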

SK hynix Develops World's Best Performing HBM3E Memory

SK hynix Inc. announced today that it has successfully developed HBM3E, the next generation of the highest-specification DRAM for AI applications currently available, and said that a customer's evaluation of samples is underway. The company said that the successful development of HBM3E, the extended version of HBM3 which delivers the world's best specifications, comes on top of its experience as the industry's sole mass provider of HBM3. With its experience as the supplier of the industry's largest volume of HBM products and its mass-production readiness, SK hynix plans to mass produce HBM3E from the first half of next year and solidify its unrivaled leadership in the AI memory market.

According to the company, the latest product not only meets the industry's highest standard of speed, the key specification for AI memory products, but also leads in all other categories, including capacity, heat dissipation and user-friendliness. In terms of speed, HBM3E can process data at up to 1.15 terabytes per second, equivalent to processing more than 230 full-HD movies of 5 GB each in a single second.
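
The movie illustration follows directly from the quoted numbers; a minimal check, using decimal units (1 TB = 1,000 GB):

# Check the "230 full-HD movies per second" illustration from the figures above.
throughput_gb_per_s = 1.15 * 1000   # 1.15 TB/s in GB/s (decimal units)
movie_size_gb = 5                   # size of one full-HD movie, as stated

movies_per_second = throughput_gb_per_s / movie_size_gb
print(f"{movies_per_second:.0f} movies per second")   # 230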

NVIDIA Unveils Next-Generation GH200 Grace Hopper Superchip Platform With HBM3e

NVIDIA today announced the next-generation NVIDIA GH200 Grace Hopper platform - based on a new Grace Hopper Superchip with the world's first HBM3e processor - built for the era of accelerated computing and generative AI. Created to handle the world's most complex generative AI workloads, spanning large language models, recommender systems and vector databases, the new platform will be available in a wide range of configurations. The dual configuration - which delivers up to 3.5x more memory capacity and 3x more bandwidth than the current generation offering - comprises a single server with 144 Arm Neoverse cores, eight petaflops of AI performance and 282 GB of the latest HBM3e memory technology.

"To meet surging demand for generative AI, data centers require accelerated computing platforms with specialized needs," said Jensen Huang, founder and CEO of NVIDIA. "The new GH200 Grace Hopper Superchip platform delivers this with exceptional memory technology and bandwidth to improve throughput, the ability to connect GPUs to aggregate performance without compromise, and a server design that can be easily deployed across the entire data center."

Insider Info Alleges SK hynix Preparing HBM3E Samples for NVIDIA

Industry insiders in South Korea have informed news publications that NVIDIA has requested that SK hynix submit samples of next-generation high bandwidth memory (HBM) for evaluation purposes—according to Business Korea's article, workers were preparing an initial batch of HBM3E prototypes for shipment this week. SK hynix has an existing relationship with NVIDIA—it fended off tough competition last year and has since produced (current gen) HBM3 DRAM for the H100 "Hopper" Tensor Core GPU.

The memory manufacturer is hoping to maintain its position as the HBM market leader with fifth-generation products in the pipeline—vice president Park Myung-soo revealed back in April: "We are preparing 8 Gbps HBM3E product samples for the second half of this year and are preparing for mass production in the first half of next year." A new partnership with NVIDIA could help SK hynix widen the gulf between it and its nearest competitor, Samsung, in the field of HBM production.

SK hynix Enters Industry's First Compatibility Validation Process for 1bnm DDR5 Server DRAM

SK hynix Inc. announced today that it has completed the development of the industry's most advanced 1bnm, the fifth generation of the 10 nm process technology, while the company and Intel began a joint evaluation of 1bnm and its validation in the Intel Data Center Certified memory program for DDR5 products targeted at Intel Xeon Scalable platforms.

The move comes after SK hynix became the first in the industry to reach 1anm readiness and completed Intel's system validation of 1anm DDR5, the fourth generation of the 10 nm technology. The DDR5 products provided to Intel run at the world's fastest speed of 6.4 Gbps (gigabits per second), representing a 33% improvement in data processing speed compared with test-run products from the early days of DDR5 development.
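
Working backwards from the quoted figures, the sketch below estimates the early-development DDR5 speed implied by a 33% improvement over 6.4 Gbps; this baseline is inferred, not stated in the announcement.

# Infer the early-development DDR5 per-pin rate implied by the figures above
# (the baseline itself is not stated in the announcement).
current_rate_gbps = 6.4
stated_improvement = 0.33

implied_baseline_gbps = current_rate_gbps / (1 + stated_improvement)
print(f"~{implied_baseline_gbps:.1f} Gbps implied early DDR5 rate")   # ~4.8 Gbps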