sarvamai/sarvam-m

SarvamAI's sarvam-m is a 24-billion-parameter, multilingual, hybrid-reasoning, text-only language model built on Mistral-Small, with a 32,768-token context length. It is post-trained for significant gains on Indian-language, math, and programming benchmarks. The model offers a "think" mode for complex logical tasks and a "non-think" mode for general conversation, making it versatile across applications that need advanced reasoning and multilingual support.

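A minimal sketch of switching between the two modes via the Hugging Face transformers chat template. The `enable_thinking` flag is an assumption based on how comparable hybrid-reasoning models expose the toggle; verify the exact parameter name against the model card on Hugging Face.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "What is 17 * 23? Show your reasoning."}]

# enable_thinking toggles "think" vs. "non-think" mode; the flag name is an
# assumption here -- check the model card for the template's actual toggle.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    enable_thinking=True,  # set False for plain conversational replies
    return_tensors="pt",
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

In "non-think" mode the model answers directly; in "think" mode it emits an intermediate reasoning trace before the final answer, which is the intended setting for math and programming tasks.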
Status: Warm
Visibility: Public
Parameters: 24B
Quantization: FP8
Context length: 32,768 tokens
License: apache-2.0
Weights: Hugging Face (sarvamai/sarvam-m)