mistralai/Mistral-Small-3.1-24B-Instruct-2503

Mistral-Small-3.1-24B-Instruct-2503 is a 24-billion-parameter instruction-tuned model from Mistral AI that builds on Mistral Small 3. It adds state-of-the-art vision understanding and an expanded 128k-token context window while retaining the strong text performance of its predecessor. The model offers advanced reasoning, multilingual support, and agentic capabilities with native function calling and JSON output, and is optimized for fast-response conversational agents, local inference, programming, math reasoning, and long-document understanding.

Status: Warm
Visibility: Public
Modality: Vision
Parameters: 24B
Quantization: FP8
Context length: 32768
License: apache-2.0
Weights: Hugging Face
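
Since the card highlights vision input, native function calling, and JSON output, the sketch below shows one way to query the model through an OpenAI-compatible chat completions endpoint. The base URL, API key, and image URL are placeholders, not values from this listing; the exact endpoint depends on the provider serving the model.

```python
# Minimal sketch: text + image request via an OpenAI-compatible API.
# base_url, api_key, and the image URL are placeholders (assumptions).
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.invalid/v1",  # provider endpoint (placeholder)
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="mistralai/Mistral-Small-3.1-24B-Instruct-2503",
    messages=[
        {
            "role": "user",
            # Standard multimodal message format: a text part plus an image part.
            "content": [
                {"type": "text", "text": "Summarize what this chart shows."},
                {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
            ],
        }
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```

For structured output, the same endpoint style typically accepts `response_format={"type": "json_object"}` on the request, which pairs with the model's native JSON output support.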