Trappu/Magnum-Picaro-0.7-v2-12b
Trappu/Magnum-Picaro-0.7-v2-12b is a 12-billion-parameter merged language model that combines Trappu/Nemo-Picaro-12B and anthracite-org/magnum-v2-12b and supports a 32,768-token context length. Developed by Trappu, the merge is designed to enhance creative writing, particularly storytelling, scenario prompting, and roleplay, by stabilizing Picaro's specialized narrative capabilities with Magnum's broader versatility. The aim is balanced performance across both focused narrative generation and general creative text generation.
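As a minimal usage sketch (not taken from the model card), the merge can be loaded like any other Hugging Face causal language model via the transformers library; the prompt and sampling settings below are illustrative assumptions, not recommendations from Trappu.

    # Minimal sketch: load Trappu/Magnum-Picaro-0.7-v2-12b with transformers.
    # Generation settings are illustrative assumptions, not tuned values.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Trappu/Magnum-Picaro-0.7-v2-12b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, device_map="auto", torch_dtype="auto"
    )

    # Hypothetical prompt for a storytelling use case.
    prompt = "Write the opening scene of a heist set in a floating city."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs, max_new_tokens=300, do_sample=True, temperature=0.8
    )
    # Decode only the newly generated tokens.
    print(tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    ))

If the repository defines a chat template, tokenizer.apply_chat_template can be used to format conversational roleplay prompts instead of the raw string shown above.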