
AI SSD Procurement Capacity Estimated to Exceed 45 EB in 2024; NAND Flash Suppliers Accelerate Process Upgrades

TheLostSwede

News Editor
TrendForce's latest report on enterprise SSDs reveals that a surge in demand for AI has led AI server customers to significantly increase their orders for enterprise SSDs over the past two quarters. Upstream suppliers have been accelerating process upgrades and planning for 2YY products—slated to enter mass production in 2025—in order to meet the growing demand for SSDs in AI applications.

TrendForce observes that increased orders for enterprise SSDs from AI server customers have resulted in contract prices for this category rising by over 80% from 4Q23 to 3Q24. SSDs play a crucial role in AI development. In AI model training, SSDs primarily store model parameters, including evolving weights and biases.
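
To make the parameter-storage point concrete, here is a minimal sketch (assuming PyTorch; the layer size and the NVMe mount path are hypothetical, not figures from the report) of how a model's weights and biases end up as files on an enterprise SSD:

```python
# Minimal sketch: a model's parameters (weights and biases) written to an
# SSD-backed path. The layer size and the /mnt/nvme0 mount are hypothetical.
import torch
import torch.nn as nn

model = nn.Linear(4096, 4096)            # toy layer standing in for a large model

state = model.state_dict()               # the evolving weights and biases
print({k: tuple(v.shape) for k, v in state.items()})
# {'weight': (4096, 4096), 'bias': (4096,)}

torch.save(state, "/mnt/nvme0/model_params.pt")   # persist to NVMe SSD storage
```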

Another key application of SSDs is creating checkpoints to periodically save AI model training progress, allowing recovery from specific points in case of interruptions. Because these functions rely heavily on fast data transfer and strong write endurance, customers typically opt for 4 TB/8 TB TLC SSDs to meet the demanding requirements of the training process.
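
As a rough illustration of that checkpoint workflow (a sketch assuming PyTorch; the checkpoint directory, interval, and toy model are placeholders), training state is periodically written to the SSD and the job resumes from the newest file after an interruption:

```python
# Sketch of periodic checkpointing to an SSD and resuming after an interruption.
# The /mnt/nvme0 mount, checkpoint interval, and toy model are hypothetical.
import glob
import os
import torch
import torch.nn as nn

model = nn.Linear(1024, 1024)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
ckpt_dir = "/mnt/nvme0/checkpoints"
os.makedirs(ckpt_dir, exist_ok=True)

# Resume from the most recent checkpoint, if one exists
ckpts = sorted(glob.glob(os.path.join(ckpt_dir, "step_*.pt")))
start_step = 0
if ckpts:
    state = torch.load(ckpts[-1])
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    start_step = state["step"] + 1

for step in range(start_step, 10_000):
    # ... forward pass, backward pass, optimizer.step() would go here ...
    if step % 1_000 == 0:   # checkpoint interval is arbitrary
        torch.save(
            {"step": step,
             "model": model.state_dict(),
             "optimizer": optimizer.state_dict()},
            os.path.join(ckpt_dir, f"step_{step:08d}.pt"),
        )
```

Each checkpoint is a full copy of the model and optimizer state, which is why sustained write speed and write endurance matter so much for this workload.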

TrendForce points out that SSDs used in AI inference servers assist in adjusting and optimizing AI models during the inference process. Notably, SSDs can update data in real time to fine-tune inference model outcomes. AI inference primarily provides retrieval-augmented generation (RAG) and LLM services. SSDs store the reference documents and knowledge bases that RAG and LLMs use to generate more informative responses. Additionally, as more generated output takes the form of videos or images, the required storage capacity increases, making high-capacity SSDs, such as TLC/QLC drives of 16 TB or larger, the preferred choice for AI inference applications.
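
To show where the SSD sits in such a pipeline, here is a deliberately simplified RAG sketch (not the article's method): documents live in an SSD-resident store and the best matches are folded into the prompt. A production system would use embeddings and a vector index; plain keyword overlap is used only to keep the example self-contained, and the paths and generate() stub are hypothetical.

```python
# Toy retrieval-augmented generation flow: reference documents are read from
# an SSD-resident store and the closest matches are added to the LLM prompt.
# Keyword overlap stands in for real embedding search; paths are hypothetical.
from pathlib import Path

DOC_DIR = Path("/mnt/nvme0/knowledge_base")   # SSD-backed document store

def retrieve(query: str, k: int = 3) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q_words = set(query.lower().split())
    scored = []
    for doc in DOC_DIR.glob("*.txt"):
        text = doc.read_text(errors="ignore")
        scored.append((len(q_words & set(text.lower().split())), text))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for _, text in scored[:k]]

def generate(prompt: str) -> str:
    # Stand-in for a call to whatever LLM serving endpoint is in use
    return f"[response conditioned on {len(prompt)} prompt characters]"

def answer(query: str) -> str:
    context = "\n---\n".join(retrieve(query))
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    return generate(prompt)

print(answer("How much AI SSD capacity is expected in 2024?"))
```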

Growth rate of AI SSD demand exceeds 60% as suppliers accelerate development of high-capacity products
In 2024, demand for AI server SSDs larger than 16 TB has risen significantly since the second quarter. With the arrival of NVIDIA's H100, H20, and H200 series products, customers have also begun to further boost their orders for 4 TB and 8 TB TLC enterprise SSDs. TrendForce estimates that this year's AI-related SSD procurement capacity will exceed 45 EB. Over the next few years, AI servers are expected to drive an average annual growth rate of over 60% in SSD demand, with AI SSD demand potentially rising from 5% of total NAND Flash consumption in 2024 to 9% in 2025.
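
As a back-of-the-envelope illustration of how quickly those figures compound (using only the ~45 EB base and the 60% growth floor quoted above; TrendForce does not publish these year-by-year numbers here), the arithmetic looks like this:

```python
# Compounding the stated 2024 base (~45 EB) at the stated growth floor (60%).
# Illustrative only; not TrendForce projections for individual years.
capacity_eb = 45.0
for year in range(2024, 2028):
    print(f"{year}: ~{capacity_eb:.0f} EB")
    capacity_eb *= 1.60   # "average annual growth rate of over 60%"
# 2024: ~45 EB, 2025: ~72 EB, 2026: ~115 EB, 2027: ~184 EB
```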

AI inference servers will continue to adopt high-capacity SSD products. Suppliers have already started accelerating process upgrades and are aiming for mass production of 2YY/3XX-layer products from 1Q25, as well as eventually producing 120 TB enterprise SSD products.

View at TechPowerUp Main Site | Source
 
Didn't all the suppliers just recently cut production and raise prices because of weaker demand?
 

TheLostSwede

News Editor
Didn't all the suppliers just recently cut production and raise prices because of weaker demand?
That was about six months ago, maybe a bit longer than that even.
 
That was about six months ago, maybe a bit longer than that even.
Ahh of course, just in time to have sold through a good amount of supply for a demand boom.
 