braindao/gemma-3-27b-it-uncensored

braindao/gemma-3-27b-it-uncensored is a 27-billion-parameter instruction-tuned language model based on the Gemma 3 architecture, with a 32,768-token context length. Its primary differentiator is its uncensored tuning, which makes it suitable for applications that require less restrictive content filtering, while otherwise serving as a general-purpose text-generation model.

Status: Warm
Visibility: Public
Capabilities: Vision
Parameters: 27B
Precision: FP8
Context length: 32768
Source: Hugging Face
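
Since the weights are listed on Hugging Face, a minimal local-inference sketch with the transformers library might look like the following. The loading class, dtype, and chat format are assumptions based on standard Gemma 3 instruction-tuned checkpoints, not details confirmed by this listing, and running a 27B model locally requires substantial GPU memory.

```python
# Minimal sketch: load braindao/gemma-3-27b-it-uncensored from Hugging Face
# and run a single chat-style generation. Assumes a recent transformers
# release with Gemma 3 support.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "braindao/gemma-3-27b-it-uncensored"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 local weights; the listing mentions FP8 serving
    device_map="auto",
)

# Build a chat prompt with the tokenizer's built-in chat template.
messages = [{"role": "user", "content": "Summarize the Gemma 3 architecture in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# The model supports a 32,768-token context; keep the generation short here.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```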
