braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt

braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt is a 14-billion-parameter language model with a 32,768-token context window. It is a distilled model, likely combining DeepSeek-R1-style reasoning distillation with a Qwen base, aiming to deliver strong capability at a modest size. It targets general language understanding and generation tasks across a range of applications.

Status: Warm
Visibility: Public
Parameters: 14B
Quantization: FP8
Context length: 32,768 tokens
Source: Hugging Face
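As a sketch of how a hosted model like this is typically queried, the snippet below assembles an OpenAI-compatible chat-completions request body for this checkpoint. The endpoint, API key, and default parameters are illustrative assumptions, not details taken from this page; only the model ID and the 32,768-token context limit come from the listing above.

```python
import json

# Model ID as listed on this page.
MODEL_ID = "braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt"

def build_chat_payload(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble an OpenAI-style chat-completions request body.

    The listing states a 32,768-token context window, so the prompt
    length plus max_tokens must stay within that budget.
    """
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,  # illustrative default, not from the listing
    }

payload = build_chat_payload("Summarize the DeepSeek-R1 distillation approach.")
print(json.dumps(payload, indent=2))
```

This payload would be POSTed to whatever OpenAI-compatible endpoint the hosting provider exposes, with an Authorization header carrying the user's API key.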
