Xclbr7/Arcanum-12b

Xclbr7/Arcanum-12b is a 12-billion-parameter causal language model created by Xclbr7 by merging TheDrummer/Rocinante-12B-v1.1 and MarinaraSpaghetti/NemoMix-Unleashed-12B. The Transformer-based model is primarily English-language and is optimized for conversational, persona-driven use. It supports a 32,768-token context length and was merged with the TIES method, using density parameters and int8 masking.
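
The snippet below is a minimal usage sketch, assuming the repository exposes the standard Hugging Face transformers interface and ships a chat template; the persona prompt and sampling settings are illustrative and not taken from the model card.

```python
# Minimal usage sketch (assumes the transformers and accelerate packages are
# installed and that the repo provides a chat template; the prompt and
# generation settings below are illustrative, not from the model card).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Xclbr7/Arcanum-12b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load weights in the checkpoint's native precision
    device_map="auto",    # place layers on available GPU(s)/CPU automatically
)

# Persona-style chat prompt; the system message is a hypothetical example.
messages = [
    {"role": "system", "content": "You are Arcanum, a dry-witted court archivist."},
    {"role": "user", "content": "Summarize the last council meeting in two sentences."},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)

# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```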

Parameters: 12B
Tensor type: FP8
Context length: 32,768 tokens
License: MIT
Hosted on: Hugging Face
