mistralai/Magistral-Small-2507

Magistral-Small-2507 is a 24-billion-parameter language model from Mistral AI that builds on Mistral Small 3.1 with enhanced reasoning capabilities. It is optimized for complex reasoning tasks and can generate long chains of thought before producing an answer. It supports dozens of languages and offers a 128k context window, with best performance recommended up to 40k tokens, making it well suited to applications that require detailed logical processing.
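As a sketch of how a hosted model like this is typically queried, the snippet below builds an OpenAI-compatible chat-completions payload. The model identifier and the ~40k-token recommendation come from this card; the request shape and the `build_chat_request` helper are illustrative assumptions, not a documented client for any particular provider.

```python
import json

# Model identifier from the card above.
MODEL_ID = "mistralai/Magistral-Small-2507"

# The card recommends staying within ~40k tokens for best results,
# even though the model advertises a 128k context window.
RECOMMENDED_TOKEN_BUDGET = 40_000

def build_chat_request(prompt: str, max_new_tokens: int = 2048) -> dict:
    """Build an OpenAI-compatible chat-completions payload (illustrative)."""
    if max_new_tokens > RECOMMENDED_TOKEN_BUDGET:
        raise ValueError("max_new_tokens exceeds the recommended 40k budget")
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_new_tokens,
    }

payload = build_chat_request("Prove that the sum of two even integers is even.")
print(json.dumps(payload, indent=2))
```

A reasoning model like this may emit a long chain of thought before its final answer, so `max_tokens` should leave generous headroom for the thinking tokens.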

Parameters: 24B
Precision: FP8
32768
License: apache-2.0
Hugging Face