ockerman0/MN-12B-Starcannon-v4-unofficial

ockerman0/MN-12B-Starcannon-v4-unofficial is a 12-billion-parameter language model with a 32,768-token context length, created by ockerman0 as an unofficial continuation of the "Starcannon" series. It is a merge of nothingiisreal/MN-12B-Celeste-V1.9 and anthracite-org/magnum-12b-v2.5-kto using the TIES merge method, and is intended to combine the strengths of its constituent models into a versatile foundation for a range of generative tasks.
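As a rough illustration of what TIES merging does, the sketch below implements the core procedure (trim low-magnitude deltas, elect a per-parameter sign, average the agreeing deltas) on toy NumPy arrays. The function name and the `density` parameter are illustrative assumptions, not part of this repository; real merges of 12B-parameter checkpoints are done with tooling such as mergekit, not code like this.

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    """Toy TIES merge: combine fine-tuned weight tensors sharing one base."""
    # Task vectors: each model's delta from the shared base weights.
    deltas = [ft - base for ft in finetuned]

    trimmed = []
    for d in deltas:
        # Trim: keep only the top-`density` fraction of entries by magnitude.
        k = max(1, int(round(density * d.size)))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    stacked = np.stack(trimmed)

    # Elect sign: per parameter, pick the sign with the larger total mass.
    sign = np.sign(stacked.sum(axis=0))
    sign[sign == 0] = 1.0

    # Merge: average only the trimmed deltas that agree with the elected sign.
    agree = (np.sign(stacked) == sign) & (stacked != 0)
    counts = np.maximum(agree.sum(axis=0), 1)
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0) / counts
    return base + merged_delta
```

For example, merging deltas `[1, -2, 0.5, 0]` and `[1, 2, -0.5, 0]` over a zero base keeps the agreeing value in the first slot and resolves the sign conflicts in the others rather than letting them cancel to zero, which is the failure mode of naive averaging that TIES is designed to avoid.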

Visibility: Public
Parameters: 12B
Weights: FP8
Context length: 32,768 tokens
Hosted on: Hugging Face