Cerebras and Perplexity Launch Ultra-Fast AI Search Model Sonar
Cerebras Systems and Perplexity AI have announced a partnership to launch Sonar, a new AI search model designed to deliver rapid search results. Sonar is built on Meta's Llama 3.3 70B foundation and operates using Cerebras' specialized AI chips, achieving processing speeds of 1,200 tokens per second. This makes it one of the fastest AI search systems currently available.
The collaboration aims to challenge traditional search engines by providing near-instantaneous AI-powered search results. According to Perplexity's internal testing, Sonar outperforms existing models like GPT-4o mini and Claude 3.5 Haiku in user satisfaction metrics, with a factuality score of 85.1 out of 100.
Cerebras' AI inference infrastructure is central to Sonar's performance, enabling the model to deliver accurate and relevant information in real time. The new search experience is initially available to Perplexity Pro users, with plans for broader availability in the future.
This partnership highlights a trend in the AI industry towards leveraging specialized hardware to gain competitive advantages. While the financial terms of the partnership were not disclosed, the companies aim to establish Sonar as a serious contender in the enterprise search market.