NeverSleep/Lumimaid-v0.2-12B

NeverSleep/Lumimaid-v0.2-12B is a 12-billion-parameter language model developed by NeverSleep, built on Mistral-Nemo-Instruct-2407. The v0.2 iteration uses a significantly cleaned and refined dataset, drawing on a diverse collection of high-quality conversational and writing-focused data. It is designed for general instruction-following tasks, supports a context length of 32768 tokens, and uses the Mistral prompt template.
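Since the card specifies the Mistral prompt template, a minimal sketch of that format is shown below. The exact spacing and special tokens can vary between tokenizer versions, so in practice prefer the model tokenizer's own `apply_chat_template` method; this function is only an illustration of the general `[INST] … [/INST]` structure.

```python
def format_mistral_prompt(turns):
    """Build a Mistral-style instruct prompt.

    `turns` is a list of (user, assistant) pairs; the assistant reply
    of the final pair may be None when prompting for a new completion.
    Token placement here is an approximation of the common Mistral
    template, not an authoritative reproduction of this model's tokenizer.
    """
    prompt = "<s>"  # BOS token opens the conversation
    for user, assistant in turns:
        prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            # Completed assistant turns are closed with an EOS token
            prompt += f" {assistant}</s>"
    return prompt


# Example: a two-turn conversation awaiting the next model reply
print(format_mistral_prompt([("Hello!", "Hi there."), ("Tell me a story.", None)]))
```

With a library such as Hugging Face `transformers`, the equivalent is handled automatically by `tokenizer.apply_chat_template(messages, tokenize=False)` after loading the tokenizer for `NeverSleep/Lumimaid-v0.2-12B`.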

Status: Warm · Public
Parameters: 12B
Quantization: FP8
Context length: 32768 tokens
License: cc-by-nc-4.0
Source: Hugging Face