ChaoticNeutrals/Captain-Eris_Violet_Toxic-Magnum-12B

ChaoticNeutrals/Captain-Eris_Violet_Toxic-Magnum-12B is a 12-billion-parameter language model created by ChaoticNeutrals by merging Nitral-AI/Captain-Eris_Violet-Toxic-GRPO-alt-v.02 and anthracite-org/magnum-v2-12b with the SLERP (spherical linear interpolation) method. The model supports a 32,768-token context length and is intended for general text generation, combining the strengths of its two constituent models.
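As an illustration of the merge method, the sketch below shows the SLERP operation applied to a pair of weight vectors: instead of averaging parameters linearly, SLERP interpolates along the arc between the two vectors, preserving their magnitude geometry. This is a simplified stand-alone sketch of the general technique, not the exact procedure or tooling used to produce this model.

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values move
    along the arc between them. Illustrative only.
    """
    # Angle between the vectors via their normalized dot product.
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))  # clamp for numerical safety
    theta = math.acos(dot)
    if abs(theta) < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    # Standard SLERP coefficients.
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

In a real model merge this interpolation is applied tensor by tensor across the two checkpoints, often with a per-layer interpolation schedule rather than a single global t.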

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768
Source: Hugging Face