DoppelReflEx/MiniusLight-24B-v1.01

MiniusLight-24B-v1.01 by DoppelReflEx is a 24-billion-parameter language model with a 32768-token context length, created through a Slerp merge of TheDrummer/Cydonia-24B-v2 and PocketDoc/Dans-PersonalityEngine-V1.2.0-24b. The model is based on the Mistral architecture, uses the ChatML chat template, and is intended for general conversational use.
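Slerp (spherical linear interpolation) blends the two parent models' weights along the arc between them rather than along a straight line, which preserves the magnitude of the parameter vectors better than plain averaging. A minimal sketch of the core formula in NumPy is below; real merge tooling such as MergeKit additionally applies per-layer interpolation schedules and handles tensor-by-tensor bookkeeping, which this illustration omits:

```python
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values move
    along the great-circle arc between the two (flattened) tensors.
    """
    a_flat, b_flat = a.ravel(), b.ravel()
    a_unit = a_flat / (np.linalg.norm(a_flat) + eps)
    b_unit = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_unit, b_unit), -1.0, 1.0)
    omega = np.arccos(dot)  # angle between the two weight vectors
    if omega < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation
        return (1.0 - t) * a + t * b
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * a \
         + (np.sin(t * omega) / sin_omega) * b
```

Applied at t = 0.5 to each pair of corresponding tensors, this yields an even blend of the two parents, which is the typical starting point for a Slerp merge.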

Parameters: 24B
Quantization: FP8
Context length: 32768 tokens
License: cc-by-nc-4.0
Availability: Public (Hugging Face)
