Cerebras has built the world's fastest AI data center in Oklahoma, leveraging a revolutionary wafer-scale engine and advanced liquid cooling to achieve unprecedented compute power and efficiency.
Takeaways
• Cerebras's Oklahoma data center is Earth's fastest AI infrastructure, delivering 44 exaflops of compute power.
• The core innovation is the wafer-scale engine, the largest processor with on-chip memory for minimal latency.
• Advanced liquid cooling and redundant natural gas/generator power systems ensure continuous, efficient operation.
Cerebras has launched its new Oklahoma data center, delivering 44 exaflops of AI compute power and making it the fastest AI infrastructure on Earth. The facility is built around Cerebras's wafer-scale engine, the largest processor ever made, which integrates memory directly onto the chip to eliminate off-chip memory latency. The center employs a sophisticated liquid cooling system and a redundant power infrastructure to ensure continuous, high-performance operation.
Oklahoma Data Center Strategy
• 00:01:39 Cerebras selected Oklahoma City for its new data center because of reasonable labor costs, room to expand, and affordable power. The facility is built of reinforced concrete, designed to withstand natural disasters such as tornadoes, much as California data centers are engineered for earthquakes. This strategic location and robust construction are crucial for maintaining operational integrity and efficiency.
Wafer-Scale Engine Innovation
• 00:02:39 The core of Cerebras's speed is the wafer-scale engine, the largest processor ever created, measuring 46,250 square millimeters, roughly the size of a dinner plate. This contrasts sharply with traditional chips, which are comparable to a postage stamp. A key innovation is having all memory on the chip, which eliminates off-chip latency that typically slows down traditional GPUs during AI inference, thereby significantly boosting performance.
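To put that size in perspective, here is a quick back-of-the-envelope comparison. The 46,250 mm² figure comes from the summary above; the ~814 mm² figure for a large conventional GPU die is an outside assumption (roughly an NVIDIA H100-class die), used only for scale:

```python
# Rough area comparison between the wafer-scale engine and a conventional GPU die.
# WSE_AREA_MM2 is from the summary above; GPU_DIE_AREA_MM2 is an assumed figure
# for a large modern GPU die, not stated in the video.
WSE_AREA_MM2 = 46_250
GPU_DIE_AREA_MM2 = 814  # assumption: approximate large-GPU die size

ratio = WSE_AREA_MM2 / GPU_DIE_AREA_MM2
print(f"The wafer-scale engine is roughly {ratio:.0f}x the area of a large GPU die")
```

Under that assumption, the wafer works out to roughly 55-60 times the silicon area of a single large GPU die, which is why on-chip memory at that scale changes the latency picture so dramatically.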
Advanced Cooling and Power
• 00:04:10 Each wafer dissipates 18 kilowatts of heat, which Cerebras manages with a sophisticated liquid cooling system that supplies water at 42 degrees and returns it at about 70 degrees, keeping the wafers efficient while avoiding condensation. For power reliability, the data center runs primarily on natural gas; batteries bridge roughly five minutes of load until three-megawatt diesel/liquefied-natural-gas generators come online, ensuring near-perfect uptime.
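The cooling figures imply a per-wafer coolant flow rate via the standard sensible-heat relation Q = ṁ·c_p·ΔT. The summary does not state whether the temperatures are Fahrenheit or Celsius; the sketch below assumes Celsius (a 28 K rise), which would be consistent with warm-water cooling, and is purely an illustrative estimate:

```python
# Back-of-the-envelope coolant flow estimate for one 18 kW wafer, using
# Q = m_dot * c_p * delta_T. Assumes the 42-to-70 degree figures are Celsius
# (a 28 K temperature rise) -- the units are not stated in the source.
HEAT_LOAD_W = 18_000     # heat dissipated per wafer (from the summary)
CP_WATER = 4186          # specific heat of water, J/(kg*K)
DELTA_T_K = 70 - 42      # assumed Celsius, so a 28 K rise

flow_kg_per_s = HEAT_LOAD_W / (CP_WATER * DELTA_T_K)
flow_l_per_min = flow_kg_per_s * 60  # ~1 kg of water per litre
print(f"~{flow_kg_per_s:.2f} kg/s, or ~{flow_l_per_min:.1f} L/min per wafer")
```

If the temperatures were Fahrenheit instead, the rise would only be about 15.6 K and the required flow roughly 1.8 times higher; either way the point is that a large ΔT keeps the per-wafer flow modest.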
Future AI Applications
• 00:13:01 Cerebras anticipates its high-speed AI compute will drive significant real-world transformations, particularly in medicine and education. Andrew Feldman, Cerebras's CEO, is most excited about AI's potential in medicine to shorten drug-design timelines from 17-19 years to under 10. In education, AI offers a revolutionary opportunity to change traditional teaching methods, which have remained largely unchanged for centuries.