ReadyArt/Broken-Tutu-24B

ReadyArt/Broken-Tutu-24B is a 24-billion-parameter merged language model created by ReadyArt, combining five models: The-Omega-Directive-M-24B-v1.1, Omega-Darker_The-Final-Directive-24B, Forgotten-Safeword-24B, BlackSheep-24B, and Cydonia-24B-v2. It is designed to generate highly coherent, context-aware, and diverse outputs in long-form, multi-character roleplay scenarios, and it particularly excels at explicit and NSFW content without ethical or moral constraints. The model maintains narrative consistency and tracks multiple distinct character personalities over extended sessions.

Status: Warm
Visibility: Public
Parameters: 24B
Precision: FP8
Context length: 32768 tokens
License: apache-2.0
Available on: Hugging Face
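
As a usage sketch, the model can presumably be loaded from the Hugging Face Hub like any standard causal language model with the transformers library. The dtype, device placement, prompt, and sampling settings below are illustrative assumptions, not values documented by ReadyArt.

```python
# Minimal sketch: loading ReadyArt/Broken-Tutu-24B as a standard causal LM
# via Hugging Face transformers. Assumes the repo hosts transformers-compatible
# weights and a tokenizer; dtype, device_map, and generation parameters are
# placeholder choices, not recommended settings.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ReadyArt/Broken-Tutu-24B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption; choose a dtype your hardware supports
    device_map="auto",           # spread the 24B weights across available devices
)

prompt = "Describe the tavern scene from the innkeeper's point of view."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sample a continuation; max_new_tokens and temperature are illustrative only.
output_ids = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```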