Sao10K/72B-Qwen2.5-Kunou-v1
Sao10K/72B-Qwen2.5-Kunou-v1 is a 72.7-billion-parameter causal language model built on the Qwen2.5 architecture by Sao10K. It is a generalist model with a particular focus on roleplay and creative instruction-following, and serves as a successor to the earlier Euryale and Stheno lineage models, from which its dataset was cleaned and refined for better performance in creative domains. A context length of 131,072 tokens supports long, complex conversational interactions.