T0@st
News Editor
Oracle has stood up and optimized its first wave of liquid-cooled NVIDIA GB200 NVL72 racks in its data centers. Thousands of NVIDIA Blackwell GPUs are now deployed and ready for customer use on NVIDIA DGX Cloud and Oracle Cloud Infrastructure (OCI) for developing and running next-generation reasoning models and AI agents. Oracle's state-of-the-art GB200 deployment includes high-speed NVIDIA Quantum-2 InfiniBand and NVIDIA Spectrum-X Ethernet networking for scalable, low-latency performance, as well as a full stack of software and database integrations from NVIDIA and OCI.
OCI, one of the world's largest and fastest-growing cloud service providers, is among the first to deploy NVIDIA GB200 NVL72 systems. The company has ambitious plans to build one of the world's largest Blackwell clusters: OCI Superclusters will scale beyond 100,000 NVIDIA Blackwell GPUs to meet the world's skyrocketing demand for inference tokens and accelerated computing. The torrid pace of AI innovation continues, as several companies, including OpenAI, have released new reasoning models in the past few weeks.
OCI's installation is the latest example of NVIDIA Grace Blackwell systems going online worldwide, transforming cloud data centers into AI factories that manufacture intelligence at scale. These new AI factories leverage the NVIDIA GB200 NVL72 platform, a rack-scale system that combines 36 NVIDIA Grace CPUs and 72 NVIDIA Blackwell GPUs, delivering exceptional performance and energy efficiency for agentic AI powered by advanced AI reasoning models.
OCI offers flexible deployment options to bring Blackwell to customers across public, government and sovereign clouds, as well as customer-owned data centers through OCI Dedicated Region and OCI Alloy at any scale.
A number of customers plan to deploy workloads right away on the OCI GB200 systems, including major technology companies, enterprise customers, government agencies and contractors, and regional cloud providers.
These new racks are the first systems available from NVIDIA DGX Cloud, an optimized platform with software, services and technical support to develop and deploy AI workloads on leading clouds such as OCI. NVIDIA will use the racks for a variety of projects, including training reasoning models, autonomous vehicle development, accelerating chip design and manufacturing, and developing AI tools.
GB200 NVL72 racks are live and available now from DGX Cloud and OCI.
View at TechPowerUp Main Site | Source