ystemsrx/Qwen2.5-Sex

Qwen2.5-Sex is a 1.5-billion-parameter language model developed by ystemsrx, fine-tuned from Qwen2.5-1.5B-Instruct. The model is trained on extensive Chinese erotic literature and other sensitive datasets, making it particularly adept at generating Chinese-language content on these themes. With a context length of 32768 tokens, its primary differentiator is its specialized focus on sensitive and adult-oriented text generation for research and testing purposes.

Visibility: Public
Parameters: 1.5B
Tensor type: BF16
Context length: 32768
License: apache-2.0
Hosted on: Hugging Face
