Undi95/Lumimaid-Magnum-12B

Undi95/Lumimaid-Magnum-12B is a 12-billion-parameter language model created by Undi95 by merging the Lumimaid and Magnum models. The merge incorporates a component fine-tuned on Claude-derived data with a 16k training context length, which enhances its conversational capabilities. The model expects Mistral-style prompt formatting, making it well suited to instruction-following tasks.
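Since the model expects Mistral-style prompt formatting, a minimal sketch of building such a prompt might look like the following. The helper name and the exact template details are illustrative assumptions, not taken from the model card; consult the model's tokenizer chat template for the authoritative format.

```python
# Hypothetical sketch: wrapping a user message in Mistral-style
# [INST] ... [/INST] tags. The function name and template are
# assumptions for illustration, not the model's official API.
def build_mistral_prompt(user_message: str, system: str = "") -> str:
    """Return a Mistral-style instruction prompt for the given message."""
    prefix = f"{system}\n\n" if system else ""
    return f"<s>[INST] {prefix}{user_message} [/INST]"

prompt = build_mistral_prompt("Write a short greeting.")
print(prompt)
```

The resulting string can then be passed to whatever inference client is serving the model; the model generates its reply after the closing `[/INST]` tag.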

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Max context length: 32768 tokens
Source: Hugging Face