Goekdeniz-Guelmez/Josiefied-Qwen3-8B-abliterated-v1
Josiefied-Qwen3-8B-abliterated-v1 is an 8-billion-parameter language model developed by Gökdeniz Gülmez, based on the Qwen3 architecture with a 32,768-token context length. The model is fine-tuned to maximize uncensored behavior and instruction-following ability while retaining tool-use capabilities. It is aimed at advanced users who need unrestricted, high-performance language generation, and it reportedly outperforms its base counterparts on standard benchmarks.