unsloth/Mistral-Small-3.2-24B-Instruct-2506

Mistral-Small-3.2-24B-Instruct-2506 is a 24-billion-parameter instruction-tuned language model from Mistral AI, building on the Mistral-Small-3.1 series. It supports a 32768-token context length and is specifically enhanced for improved instruction following, fewer repetition errors, and more robust function calling. It performs well on complex reasoning, code generation, and vision tasks, making it suitable for applications that require precise control and multimodal understanding.
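As a sketch of the function-calling support mentioned above, the snippet below assembles a request body in the OpenAI-compatible chat-completions shape that many serving stacks expose for this model. The `get_weather` tool and the prompt are illustrative assumptions, not part of the model card.

```python
def build_request(user_prompt: str) -> dict:
    """Assemble a chat-completion request body with one tool definition.

    The tool schema follows the OpenAI-compatible function-calling format;
    the tool itself (get_weather) is a hypothetical example.
    """
    return {
        "model": "unsloth/Mistral-Small-3.2-24B-Instruct-2506",
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool for illustration
                    "description": "Return the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",  # let the model decide whether to call the tool
        "max_tokens": 256,
    }


request = build_request("What's the weather in Paris?")
print(request["model"])
```

Sending this body to a server hosting the model (e.g. via an HTTP POST to its chat-completions endpoint) should yield either a plain assistant message or a `tool_calls` entry naming `get_weather`, depending on the prompt.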

Status: Warm
Visibility: Public
Capabilities: Vision
Parameters: 24B
Quantization: FP8
Context length: 32768
License: apache-2.0
Source: Hugging Face