Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B
Tarek07/Dungeonmaster-V2.2-Expanded-LLaMa-70B is a 70-billion-parameter language model created by Tarek07, built on a custom uncensored base and merged using the Linear DELLA method. The model is fine-tuned for creative roleplay, with an emphasis on detailed narratives that carry real stakes and consequences. It combines several specialized LLaMa-3.3-based models to strengthen prompt adherence, unhinged creativity, and descriptive writing, making it well suited to immersive storytelling applications.