chutesai/Mistral-Small-3.2-24B-Instruct-2506

Mistral-Small-3.2-24B-Instruct-2506 is a 24 billion parameter instruction-tuned language model developed by Mistral AI, building upon the Mistral-Small-3.1-24B-Instruct-2503 series. This model features improved instruction following, reduced repetition errors, and a more robust function calling template. It also supports multimodal inputs, including vision, and is optimized for general instruction-following tasks with a 32K context length.

Status: Warm
Visibility: Public
Modality: Vision
Parameters: 24B
Quantization: FP8
Context length: 32768 tokens
License: apache-2.0
Source: Hugging Face
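Since the model is instruction-tuned and typically served behind an OpenAI-compatible chat endpoint, a request can be sketched as below. This is a minimal illustration assuming such an endpoint exists for this deployment; the base URL, API key, and sampling parameters are placeholders, not confirmed values.

```python
# Sketch of a chat-completion request payload for an OpenAI-compatible
# endpoint serving chutesai/Mistral-Small-3.2-24B-Instruct-2506.
# The endpoint URL and API key below are hypothetical placeholders.
import json

payload = {
    "model": "chutesai/Mistral-Small-3.2-24B-Instruct-2506",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize FP8 inference in one sentence."},
    ],
    # Keep max_tokens well under the 32768-token context window.
    "max_tokens": 256,
    "temperature": 0.15,
}

# Serialize the payload as it would be sent in an HTTP POST body.
print(json.dumps(payload, indent=2))
```

The same payload shape works for function calling by adding a `tools` array, which this release's more robust function calling template is designed to handle.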