Cerebras Systems Expands AI Infrastructure with Six New Datacenters
Cerebras Systems has announced in a press release the launch of six new AI inference datacenters across North America and Europe, significantly boosting its capacity to deliver high-speed AI services. These facilities, powered by Cerebras Wafer-Scale Engines, are expected to serve over 40 million Llama 70B tokens per second, positioning Cerebras as a leading provider of high-speed inference.
The new datacenters are part of Cerebras' 2025 AI inference scaling plan, which aims to expand its aggregate capacity by 20 times to meet increasing customer demand. The Oklahoma City and Montreal datacenters will be operational by Q3 2025, with additional sites in the Midwest/Eastern US and Europe coming online by Q4 2025. The facilities will be equipped with thousands of Cerebras CS-3 systems, offering unmatched performance and efficiency.
Cerebras' expansion includes strategic partnerships, such as with G42, and the company will operate 85% of its total capacity within the United States. This expansion is expected to play a crucial role in advancing AI infrastructure and leadership in the U.S. and globally. The Oklahoma City facility, in particular, will feature robust infrastructure with tornado and seismic shielding, as well as custom water-cooling solutions to support large-scale deployments.
With these new datacenters, Cerebras aims to provide global access to high-performance AI infrastructure, supporting critical research and business transformation across various sectors.