allura-org/MN-12b-RP-Ink

allura-org/MN-12b-RP-Ink is a 12-billion-parameter LoRA fine-tune of Mistral Nemo Instruct, optimized for roleplay-focused generation. The fine-tuning dataset is diverse, drawing inspiration from the methodologies behind models such as SorcererLM and Slush. With a 32768-token context length, it is well suited to generating creative, engaging narrative content for roleplaying scenarios.

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768 tokens
Source: Hugging Face
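Because the base model is Mistral Nemo Instruct, prompts generally follow Mistral's [INST] instruct format. The sketch below illustrates that general shape only; the authoritative template ships as the chat template in the model's tokenizer config, and the exact whitespace and system-message handling shown here are assumptions.

```python
# Sketch of a Mistral-style instruct prompt for this model.
# Assumption: the real format is the chat template bundled with the
# allura-org/MN-12b-RP-Ink tokenizer; this hand-rolled version only
# shows the general <s>[INST] ... [/INST] shape.
def build_prompt(system: str, user: str) -> str:
    # Mistral-family instruct models typically fold the system message
    # into the first user turn rather than using a separate role.
    return f"<s>[INST] {system}\n\n{user} [/INST]"

prompt = build_prompt(
    "You are a creative roleplay partner.",
    "Describe the tavern as my character enters.",
)
print(prompt)
```

In practice, prefer `tokenizer.apply_chat_template(...)` from the Hugging Face tokenizer so the prompt always matches the template the model was trained with.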
