
Expedera Unveils Origin Evolution NPU IP for Edge AI
Expedera has launched Origin Evolution, a new NPU IP aimed at advancing generative AI capabilities on edge devices. Announced in a press release, the IP is designed to meet the computational demands of running large language models (LLMs) on resource-constrained devices such as smartphones and automotive systems.
The Origin Evolution NPU IP features a packet-based architecture that improves efficiency by reducing external memory movement by over 75% for models like Llama 3.2 1B and Qwen2 1.5B. The architecture targets a wide range of applications, scaling from 128 TFLOPS in a single core to PetaFLOPS-class performance in multi-core configurations.
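For a rough sense of what a 75% cut in external memory movement could mean for a small LLM at the edge, the sketch below runs a back-of-envelope calculation. The parameter count, weight precision, and token rate are illustrative assumptions for a Llama 3.2 1B-class model, not Expedera or Meta figures.

```python
# Back-of-envelope sketch: impact of a >75% reduction in external memory traffic.
# All figures are illustrative assumptions, not vendor benchmarks.

PARAMS_BILLIONS = 1.24    # assumed parameter count for a Llama 3.2 1B-class model
BYTES_PER_PARAM = 1       # assumed INT8 weight storage
TOKENS_PER_SEC = 20       # assumed decode-rate target for an edge device

# Naive decode reads every weight from external memory once per generated token.
baseline_gb_per_s = PARAMS_BILLIONS * BYTES_PER_PARAM * TOKENS_PER_SEC

# Apply the claimed ">75%" reduction in external memory movement.
reduction = 0.75
reduced_gb_per_s = baseline_gb_per_s * (1 - reduction)

print(f"Baseline external weight traffic: {baseline_gb_per_s:.1f} GB/s")
print(f"With a 75% reduction:             {reduced_gb_per_s:.1f} GB/s")
```

Under these assumptions, sustained external bandwidth demand drops from roughly 25 GB/s to about 6 GB/s, which is the kind of saving that matters for power-constrained smartphone and automotive platforms.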
Expedera's solution is compatible with popular neural networks, including Llama 3, ChatGLM, and MobileNet, and supports both integer and floating-point precisions. The production-ready NPU IP is available now, and the company says it significantly reduces memory and system power needs while increasing processor utilization.