Khetterman/DarkAtom-12B-v3

Khetterman/DarkAtom-12B-v3 is a 12 billion parameter language model created by Khetterman, featuring a 32768-token context length. The model is a complex merge of 18 distinct base models, built through a multi-step merging process that combines the SLERP, Model Stock, and TIES methods. It is designed to synthesize the diverse capabilities of its constituent models into a single general-purpose model.
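Merges like this are typically produced with mergekit, where each step is described by a small YAML configuration. The sketch below shows what one SLERP step might look like; the component model names, interpolation factor, and dtype are placeholders, since the actual merge recipe is not reproduced here.

```yaml
# Hypothetical mergekit config for a single SLERP merge step.
# "model-a" and "model-b" are placeholder names, not the real
# components of DarkAtom-12B-v3.
merge_method: slerp
base_model: model-a
models:
  - model: model-a
  - model: model-b
parameters:
  t: 0.5          # interpolation factor between the two models
dtype: bfloat16
```

A multi-step merge chains several such configs, feeding the output of one step (e.g. a SLERP or TIES merge) in as a component model of the next.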

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768
Source: Hugging Face
