anthracite-org/magnum-v4-12b

anthracite-org/magnum-v4-12b is a 12-billion-parameter causal language model from anthracite-org, fine-tuned from mistralai/Mistral-Nemo-Instruct-2407 with a 32768-token context length. The model is designed to replicate the prose quality of the Claude 3 Sonnet and Opus models, making it well suited to creative writing and nuanced conversational applications.
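As a sketch of how one might format a prompt for this model (assuming a ChatML-style chat template, as used by other magnum v4 releases; verify against the repository's tokenizer configuration before relying on this):

```python
# Sketch: building a ChatML-style prompt string for magnum-v4-12b.
# Assumption: the model expects ChatML turns delimited by
# <|im_start|> / <|im_end|> tokens; check the model card's chat
# template to confirm.

def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts as a ChatML string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Leave the prompt open for the assistant's reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful writing assistant."},
    {"role": "user", "content": "Write a one-line opening for a short story."},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template` from the `transformers` library handles this formatting automatically using the template shipped with the model.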

Status: Warm
Visibility: Public
Parameters: 12B
Quantization: FP8
Context length: 32768
License: apache-2.0
Source: Hugging Face
