Cohere launches a family of open multilingual models
Cohere has introduced a new family of open multilingual AI models designed to support global enterprise use cases, improve cross-language performance, and expand access to advanced language technology.
Enterprise AI company Cohere has introduced a new family of multilingual models at the ongoing India AI Summit. The models, called Tiny Aya, are open-weight, meaning the trained model weights are publicly available for anyone to use and modify. They support more than 70 languages and are designed to run on everyday devices such as laptops without needing an internet connection.
Launched by the company’s research arm Cohere Labs, the models support South Asian languages including Bengali, Hindi, Punjabi, Urdu, Gujarati, Tamil, Telugu, and Marathi.
The base model contains 3.35 billion parameters, a common measure of model size and complexity. Cohere also launched TinyAya-Global, a version fine-tuned to follow user instructions more reliably for applications that need broad language coverage. The family also includes regional variants: TinyAya-Earth for African languages; TinyAya-Fire for South Asian languages; and TinyAya-Water for Asia Pacific, West Asia, and Europe.
“This approach allows each model to develop stronger linguistic grounding and cultural nuance, creating systems that feel more natural and reliable for the communities they are meant to serve. At the same time, all Tiny Aya models retain broad multilingual coverage, making them flexible starting points for further adaptation and research,” the company said in a statement.
Cohere said the models were trained on a single cluster of 64 Nvidia H100 GPUs, a relatively modest amount of computing for training models of this kind. The company said the models are suited for researchers and developers building applications that serve users in their native languages. Because the models can run directly on devices, developers can use them for offline translation. Cohere added that it optimised its underlying software for on-device performance, so the models require less compute than many comparable ones.
In linguistically diverse countries such as India, offline-friendly models like these could enable a broader range of applications and use cases without requiring constant internet access.
The models are available on Hugging Face and the Cohere Platform, and developers can download them from Hugging Face, Kaggle, and Ollama for local deployment. Cohere is also releasing training and evaluation datasets on Hugging Face and plans to publish a technical report outlining its training methodology.
Cohere CEO Aidan Gomez said last year that the company plans to go public “soon.” According to CNBC, the company ended 2025 with $240 million in annual recurring revenue, up 50% quarter over quarter.