Multiverse Computing Secures $217 Million for AI Model Compression
Multiverse Computing has raised $217 million to further develop its AI model compression technology, CompactifAI, reports Reuters. The funding round was led by Bullhound Capital, with participation from HP Inc, Toshiba, and other investors.
CompactifAI is designed to reduce the size of large language models (LLMs) by up to 95% while maintaining their performance. The technique draws on quantum-inspired ideas from physics together with machine learning, shrinking models enough to cut the cost of running them. The compressed models can run on a range of platforms, from cloud services to edge devices such as smartphones and drones.
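Multiverse has not published the internals of CompactifAI, but the broad family of quantum-inspired compression techniques it refers to works by factorizing large weight matrices into much smaller components. The sketch below is a minimal, illustrative example of that general idea using a truncated SVD in NumPy; the matrix size, retained rank, and the resemblance to CompactifAI's actual pipeline are assumptions made purely for illustration.

```python
# Illustrative only: CompactifAI's method is proprietary. This shows the
# general idea behind low-rank compression of a single dense layer,
# a common building block of tensor-network-style model compression.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024))  # hypothetical dense layer weights

# Factor W ≈ U_r @ diag(S_r) @ Vt_r, keeping only the top-r singular values.
U, S, Vt = np.linalg.svd(W, full_matrices=False)
r = 64  # retained rank, chosen here purely for illustration
W_approx = U[:, :r] @ np.diag(S[:r]) @ Vt[:r, :]

original_params = W.size
compressed_params = U[:, :r].size + S[:r].size + Vt[:r, :].size
print(f"parameters kept: {compressed_params / original_params:.1%}")
print(f"relative error:  {np.linalg.norm(W - W_approx) / np.linalg.norm(W):.3f}")
```

On a real trained layer, the singular values typically decay quickly, so a small rank captures most of the matrix; the reported "up to 95%" reduction would come from applying compression of this kind (plus other techniques) across all of a model's layers.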
The funding will enable Multiverse to expand adoption of its technology, addressing the high costs of deploying LLMs. The company has already launched compressed versions of popular models such as Meta's Llama and plans to release more.