flammenai/Mahou-1.3-mistral-nemo-12B

Mahou-1.3-mistral-nemo-12B is a 12-billion-parameter language model from flammenai, built on the Mistral-Nemo architecture with a 32K-token context length. It is designed for conversational AI, generating short messages in casual conversation and character roleplay scenarios, and is fine-tuned with the ORPO method to improve its interactive dialogue capabilities.

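Below is a minimal sketch of loading the model for chat-style inference with Hugging Face transformers, assuming the model is pulled from its Hugging Face repository and that its tokenizer ships a chat template. The system prompt, message contents, and sampling settings are illustrative assumptions, not values taken from the model card.

```python
# Minimal chat-style inference sketch using Hugging Face transformers.
# The roleplay system prompt and generation settings below are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "flammenai/Mahou-1.3-mistral-nemo-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype="auto",
)

messages = [
    {"role": "system", "content": "You are a friendly character in a casual chat."},
    {"role": "user", "content": "Hey! How was your day?"},
]

# apply_chat_template formats the conversation using whatever chat template
# is bundled with the model's tokenizer configuration.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```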
Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768 tokens
License: apache-2.0
Source: Hugging Face