Fujitsu Unveils Energy-Efficient AI Models with New Reconstruction Technology
Fujitsu has announced the development of a new reconstruction technology for generative AI, aimed at optimizing AI models and improving their energy efficiency. The innovation, announced in a press release, is a core component of the Fujitsu Kozuchi AI service and strengthens the Takane large language model (LLM) by enabling lightweight, power-efficient AI models.
The technology leverages two key innovations: 1-bit quantization and specialized AI distillation. Fujitsu's proprietary quantization method reduces memory consumption by 94% while retaining 89% of the accuracy of unquantized models. This yields a three-fold increase in inference speed and allows large AI models to run efficiently on a single low-end GPU.
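Fujitsu has not published the details of its proprietary quantization algorithm, but the general idea behind 1-bit quantization can be illustrated with a minimal sketch: each row of a weight matrix is replaced by its signs plus a single scale factor, so the bulk of the parameters can be stored as single bits. The function names below are illustrative only and are not part of Fujitsu's or any library's API.

```python
import numpy as np

def quantize_1bit(weights: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Binarize a weight matrix to +/-1 with a per-row scale factor.

    Each row is approximated as scale * sign(row), where scale is the
    mean absolute value of that row. The signs can be packed into single
    bits, which is where the large memory savings come from.
    """
    scales = np.mean(np.abs(weights), axis=1, keepdims=True)  # per-row scale
    signs = np.where(weights >= 0, 1.0, -1.0)                 # 1-bit codes
    return signs, scales

def dequantize_1bit(signs: np.ndarray, scales: np.ndarray) -> np.ndarray:
    """Reconstruct an approximate full-precision matrix for inference."""
    return signs * scales

# Example: quantize a random 4x8 weight matrix and check reconstruction error.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8)).astype(np.float32)
signs, scales = quantize_1bit(W)
W_hat = dequantize_1bit(signs, scales)
print("mean absolute error:", np.mean(np.abs(W - W_hat)))
```

Production 1-bit schemes add further machinery (calibration data, error compensation, bit packing), which is presumably where Fujitsu's reported 94% memory reduction and 89% accuracy retention come from.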
Additionally, Fujitsu's specialized AI distillation reduces model size while raising accuracy beyond that of the original model. The approach extracts and condenses task-specific knowledge, creating highly efficient and reliable specialized AIs. These lightweight models promise to democratize advanced AI, enabling deployment on edge devices such as smartphones and factory machinery, improving real-time responsiveness, and reducing power consumption.
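Fujitsu's specialized distillation pipeline is likewise proprietary. For context, the sketch below shows only the standard knowledge-distillation objective, in which a small student model is trained to match both the true labels and the softened output distribution of a larger teacher; the function names, temperature, and alpha values are illustrative assumptions, not Fujitsu's method.

```python
import numpy as np

def softmax(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    exp_z = np.exp(z)
    return exp_z / exp_z.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits: np.ndarray,
                      teacher_logits: np.ndarray,
                      labels: np.ndarray,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> float:
    """Blend a hard-label cross-entropy term with a soft-label KL term.

    alpha weights the hard-label loss; (1 - alpha) weights how closely the
    student imitates the teacher's softened output distribution.
    """
    eps = 1e-12
    student_probs = softmax(student_logits)
    hard_loss = -np.mean(np.log(student_probs[np.arange(len(labels)), labels] + eps))

    teacher_soft = softmax(teacher_logits, temperature)
    student_soft = softmax(student_logits, temperature)
    kl = np.sum(teacher_soft * (np.log(teacher_soft + eps) - np.log(student_soft + eps)), axis=-1)
    soft_loss = (temperature ** 2) * np.mean(kl)

    return alpha * hard_loss + (1 - alpha) * soft_loss

# Example: a toy batch of 2 samples over 3 classes.
teacher = np.array([[4.0, 1.0, 0.5], [0.2, 3.5, 1.0]])
student = np.array([[2.5, 1.2, 0.3], [0.5, 2.0, 1.5]])
labels = np.array([0, 1])
print("distillation loss:", distillation_loss(student, teacher, labels))
```

Specialized distillation of the kind Fujitsu describes would additionally restrict training to task-specific data so the student concentrates the teacher's knowledge for one domain, which is how a smaller model can end up more accurate than the original on that task.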
Fujitsu plans to roll out trial environments for the Takane model with this technology in the second half of fiscal year 2025, and to progressively release quantized versions of Cohere's open-weight research model Command A via Hugging Face.