Tiny Aya is a 3.35B-parameter multilingual model that achieves state-of-the-art results across 70 languages, challenging the need for massive scale in global AI.
arXiv · March 13, 2026 · 2603.11510
Why it matters
This release democratizes high-quality multilingual LLMs for edge devices. It includes specialized variants for different global regions, providing a blueprint for efficient scaling that prioritizes multilingual depth over raw parameter count.
From the abstract
Tiny Aya redefines what a small multilingual language model can achieve. Trained on 70 languages and refined through region-aware post-training, it delivers state-of-the-art translation quality, strong multilingual understanding, and high-quality target-language generation, all with just 3.35B parameters. The release includes a pretrained foundation model, a globally balanced instruction-tuned variant, and three region-specialized models targeting languages from Africa, South Asia, Europe, and Asia.
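A minimal inference sketch for trying a release like this on modest hardware, assuming the instruction-tuned variant ships as a standard Hugging Face causal-LM checkpoint; the model ID below is a placeholder, not a confirmed release name.

```python
# Minimal inference sketch for a small multilingual instruct model.
# Assumes a standard Hugging Face causal-LM checkpoint; "tiny-aya-instruct"
# is a placeholder model ID, not a confirmed release name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiny-aya-instruct"  # placeholder: substitute the released checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~6.7 GB of weights at bf16 for 3.35B params
    device_map="auto",
)

# Chat-style prompt via the tokenizer's chat template, e.g. a translation task.
messages = [
    {"role": "user", "content": "Translate to Swahili: The weather is lovely today."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At bf16, a 3.35B-parameter model fits comfortably in the memory budget of a laptop GPU or a quantization-friendly edge setup, which is the deployment story the release emphasizes.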