Hugging Face Introduces SmolLM3: A Multilingual, Long-Context AI Model

Hugging Face has unveiled SmolLM3, a new 3 billion parameter language model offering multilingual support and long-context reasoning capabilities. The model is designed to outperform existing models in its class while maintaining efficiency.

Hugging Face has announced the release of SmolLM3, a 3 billion parameter language model with multilingual support across six languages: English, French, Spanish, German, Italian, and Portuguese. SmolLM3 is positioned as a competitive alternative to larger models, combining efficient performance with long-context reasoning capabilities.

The model is built on a transformer decoder architecture and incorporates several modifications to improve efficiency and performance. Key features include Grouped Query Attention (GQA), which reduces memory use during inference, and NoPE (omitting positional embeddings in a subset of layers), which together improve long-context performance without compromising short-context capabilities. SmolLM3 was trained with a three-stage strategy over a diverse data mixture, aiming for robust performance across domains.
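To make the GQA idea concrete, here is a minimal, self-contained sketch of grouped query attention: query heads are grouped so that several of them share a single key/value head, which shrinks the KV cache compared with standard multi-head attention. The head counts and dimensions below are illustrative only and are not SmolLM3's actual configuration.

```python
# Minimal sketch of Grouped Query Attention (GQA). Several query heads share
# one key/value head, shrinking the KV cache versus full multi-head attention.
# Sizes below are illustrative, not SmolLM3's real configuration.
import torch
import torch.nn.functional as F


def grouped_query_attention(x, wq, wk, wv, n_heads, n_kv_heads):
    """x: (batch, seq, dim); wq/wk/wv: linear projection matrices."""
    b, t, d = x.shape
    head_dim = d // n_heads

    # Queries use all heads; keys/values use a smaller set of shared heads.
    q = (x @ wq).view(b, t, n_heads, head_dim).transpose(1, 2)
    k = (x @ wk).view(b, t, n_kv_heads, head_dim).transpose(1, 2)
    v = (x @ wv).view(b, t, n_kv_heads, head_dim).transpose(1, 2)

    # Each group of query heads attends to the same key/value head.
    group = n_heads // n_kv_heads
    k = k.repeat_interleave(group, dim=1)
    v = v.repeat_interleave(group, dim=1)

    attn = F.scaled_dot_product_attention(q, k, v, is_causal=True)
    return attn.transpose(1, 2).reshape(b, t, d)


if __name__ == "__main__":
    dim, n_heads, n_kv_heads = 256, 8, 2  # illustrative sizes only
    x = torch.randn(1, 16, dim)
    wq = torch.randn(dim, dim)
    wk = torch.randn(dim, (dim // n_heads) * n_kv_heads)
    wv = torch.randn(dim, (dim // n_heads) * n_kv_heads)
    out = grouped_query_attention(x, wq, wk, wv, n_heads, n_kv_heads)
    print(out.shape)  # torch.Size([1, 16, 256])
```

Because only the smaller key/value projections are cached during generation, the memory footprint of long contexts drops roughly in proportion to the ratio of query heads to key/value heads.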

SmolLM3 supports dual-mode reasoning, letting users toggle between a reasoning mode and a non-reasoning mode. The switch is exposed through the model's chat template, making it straightforward to change modes per conversation. The model's performance has been validated across multiple benchmarks spanning knowledge, reasoning, math, and coding tasks.
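As a rough illustration of how such a chat-template toggle is typically used with the Hugging Face transformers library, the sketch below switches reasoning on for one request. The checkpoint name and the `enable_thinking` flag are assumptions based on common Hugging Face conventions; consult the SmolLM3 model card for the exact identifiers and interface.

```python
# Hypothetical sketch of toggling SmolLM3's reasoning mode via the chat
# template. The checkpoint ID and the `enable_thinking` parameter are
# assumptions; check the model card for the actual names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM3-3B"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "How many primes are below 30?"}]

# Reasoning mode on: the template adds the model's "thinking" preamble.
prompt = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    enable_thinking=True,  # set False for direct, non-reasoning answers
    return_tensors="pt",
).to(model.device)

output = model.generate(prompt, max_new_tokens=512)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```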

The release of SmolLM3 includes a comprehensive engineering blueprint, detailing the architecture, data mixtures, and training methodologies used. This transparency aims to assist researchers and developers in understanding and building upon the model's capabilities.

