natong19/Mistral-Nemo-Instruct-2407-abliterated

Mistral-Nemo-Instruct-2407-abliterated is an abliterated version of Mistral-Nemo-Instruct-2407, a 12-billion-parameter large language model trained jointly by Mistral AI and NVIDIA, served here with a 32K context window. Abliteration ablates the model's strongest refusal direction from its weights, reducing refusals while largely preserving performance. The base model is trained on multilingual and code data and is designed as a drop-in replacement for Mistral 7B.
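A minimal sketch of loading the model for inference with the Hugging Face `transformers` text-generation API. The `build_prompt` helper mirrors the Mistral `[INST]` instruct template for illustration only; `generate` and its parameters are hypothetical names, not part of this card, and running it downloads the full 12B weights:

```python
MODEL_ID = "natong19/Mistral-Nemo-Instruct-2407-abliterated"


def build_prompt(instruction: str) -> str:
    # Illustrative: the raw Mistral instruct template. Note the tokenizer
    # prepends <s> itself, so prefer tokenizer.apply_chat_template in practice.
    return f"<s>[INST] {instruction} [/INST]"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    # Deferred import: transformers (plus accelerate for device_map) is a
    # heavy optional dependency, and this call fetches ~12B parameters.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Build the chat-formatted input ids and run one completion.
    messages = [{"role": "user", "content": instruction}]
    input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt")
    output = model.generate(input_ids.to(model.device), max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

With sufficient GPU memory, `generate("Summarize abliteration in one sentence.")` returns the decoded completion as a string.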

Status: Warm
Visibility: Public
Parameters: 12B
Precision: FP8
Context length: 32768
License: apache-2.0
Source: Hugging Face
