UnfilteredAI/DAN-Qwen3-1.7B

DAN-Qwen3-1.7B by UnfilteredAI is a 1.7-billion-parameter, Transformer-based causal language model fine-tuned from Qwen/Qwen3-1.7B, with a 40,960-token context length. The model is tuned for zero censorship and unrestricted output, operating in an unfiltered 'DAN Mode'. It is intended for research into AI alignment boundaries, content testing in unmoderated environments, and advanced AI prototyping beyond conventional constraints; it will explicitly generate NSFW and ethically complex content.
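As a Qwen3 derivative, the model expects prompts in the ChatML conversation format used by the Qwen family. The sketch below shows that format with a hypothetical helper; in practice you would load the model's tokenizer from Hugging Face and call `tokenizer.apply_chat_template`, which renders the template (including any system prompt handling) for you.

```python
# Minimal sketch of the ChatML-style prompt format used by Qwen-family
# models. format_chatml is a hypothetical helper for illustration only;
# prefer tokenizer.apply_chat_template from the transformers library.

def format_chatml(messages):
    """Render a list of {"role", "content"} dicts as a ChatML prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    # Open an assistant turn so generation continues as the model's reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = format_chatml([
    {"role": "user", "content": "Hello"},
])
print(prompt)
```

The trailing open `<|im_start|>assistant` turn is what cues the model to produce a response rather than continue the user's message.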

Visibility: Public
Parameters: 1.7B (2B size class)
Precision: BF16
Context length: 40,960 tokens
License: apache-2.0
Source: Hugging Face