Vortex5/MN-12B-Azure-Veil

Vortex5/MN-12B-Azure-Veil is a 12-billion-parameter language model created by Vortex5 by merging several pre-trained models, including anthracite-org/magnum-v4-12b and SicariusSicariiStuff/Impish_Nemo_12B. The merge uses the passthrough method, which combines distinct layer ranges from the constituent models rather than averaging their weights, yielding a blend of their capabilities. With a 32768-token context length, the model is suited to general language tasks and benefits from the diverse training of its merged components.
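
A merged model like this is loaded the same way as any standard causal language model. The snippet below is a minimal usage sketch assuming the repository exposes standard transformers-compatible weights; the dtype and device settings are illustrative assumptions, not documented requirements.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/MN-12B-Azure-Veil"

# Load tokenizer and model (assumption: standard transformers layout;
# bf16 is a common choice for 12B models but adjust to your hardware).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Summarize the benefits of model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))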

Visibility: Public
Parameters: 12B
Precision: FP8
Context length: 32768 tokens
Source: Hugging Face