mistralai/Mistral-Small-24B-Base-2501

Mistral-Small-24B-Base-2501 is a 24-billion-parameter base language model developed by Mistral AI, featuring a 32k context window and a Tekken tokenizer with a 131k vocabulary. The model is multilingual, supporting dozens of languages, and posts strong benchmark results across a variety of tasks. It is designed as a powerful base model for a wide range of applications, with performance comparable to larger models.

Status: Warm
Visibility: Public
Parameters: 24B
Quantization: FP8
Context length: 32768
License: apache-2.0
Source: Hugging Face (gated)
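
As a base (non-instruct) model, it is typically used for raw text completion. A minimal sketch using the Hugging Face `transformers` API is shown below; it assumes you have accepted the gated-repo terms on Hugging Face and have enough GPU memory for the 24B weights (the dtype and prompt are illustrative choices, not part of the listing).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Small-24B-Base-2501"

# Requires prior access approval on Hugging Face (the repo is gated).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="bfloat16",  # illustrative; pick a dtype your hardware supports
    device_map="auto",       # spread layers across available GPUs
)

# Base models continue text rather than follow instructions.
inputs = tokenizer("The three largest moons of Jupiter are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model, prompts should be phrased as text to be continued; for chat-style use, Mistral AI publishes a separate instruct variant.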