mistralai/Mistral-Nemo-Base-2407

Mistral-Nemo-Base-2407 is a 12-billion-parameter pretrained generative text model developed jointly by Mistral AI and NVIDIA. It features a 128k-token context window and was trained on a significant proportion of multilingual and code data. Because it uses a standard transformer architecture, it works as a drop-in replacement in systems built around Mistral 7B, while offering markedly stronger reasoning, world knowledge, and coding accuracy for its size.
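As a quick reference, here is a minimal sketch of loading the model with the Hugging Face transformers library; the model ID comes from this card, while the dtype, device placement, and generation settings are illustrative choices, not requirements:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"

# Load tokenizer and weights; bfloat16 halves memory vs. fp32 on supported GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# This is a base (non-instruct) model: prompt it with plain text for
# completion rather than applying a chat template.
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that because this is the pretrained base variant rather than the instruct-tuned one, it is best suited to raw text completion and as a starting point for fine-tuning.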

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768 tokens (as served here; the base model supports up to 128k)
License: apache-2.0
Source: Hugging Face
