allura-org/MN-Lyrebird-12B

MN-Lyrebird-12B is a 12-billion-parameter language model from allura-org, based on the Mistral Nemo architecture and fine-tuned for creative longform writing. With a 32K-token context length, it targets co-writing and story generation; it was trained with LoRA on diverse book datasets and on outputs from kimi-k2, and is tuned for narrative coherence and stylistic adaptation in creative text generation.

Status: Warm
Visibility: Public
Parameters: 12B
Precision: FP8
Context length: 32768 tokens
Source: Hugging Face
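
Below is a minimal inference sketch using the Hugging Face transformers library. It assumes the checkpoint at the repo id above loads as a standard causal LM; the bf16 dtype and the sampling settings are illustrative assumptions, not recommendations from the model authors.

```python
# Minimal inference sketch; repo id taken from the model card above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allura-org/MN-Lyrebird-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 is suitable; adjust for your hardware
    device_map="auto",
)

# A longform-writing style prompt, in line with the model's intended use.
prompt = "The lighthouse keeper had not spoken to another soul in three years, until"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling settings here are placeholders, not tuned values for this model.
output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```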