nothingiisreal/MN-12B-Starcannon-v3
MN-12B-Starcannon-v3 is a 12-billion-parameter language model from nothingiisreal, created as a TIES merge of anthracite-org/magnum-12b-v2 and nothingiisreal/MN-12B-Celeste-V1.9. It supports a 32768-token context length and is intended as a general-purpose model for text generation and understanding tasks, inheriting the capabilities of its constituent models.
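A minimal usage sketch, assuming the checkpoint is published under the model ID nothingiisreal/MN-12B-Starcannon-v3 and loadable through the standard Hugging Face transformers API; dtype, sampling settings, and the prompt are illustrative choices, not recommendations from the model authors.

```python
# Sketch: loading the merged model with Hugging Face transformers.
# Assumes a GPU with enough memory for a 12B model in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nothingiisreal/MN-12B-Starcannon-v3"  # assumed hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",
)

prompt = "Write a short scene set aboard a derelict starship."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```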