mlabonne/gemma-3-27b-it-abliterated

mlabonne/gemma-3-27b-it-abliterated is a 27-billion-parameter instruction-tuned causal language model based on Google's Gemma 3 architecture. It has been modified with an "abliteration" technique that suppresses the model's internal refusal behavior, reducing refusals and censorship while preserving coherence. It is intended for use cases that require less restrictive content generation. The model supports a context length of 32768 tokens.
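Abliteration typically works by estimating a "refusal direction" in activation space (the difference of mean activations between refusal-triggering and benign prompts) and projecting that direction out of the model's activations. The snippet below is a minimal NumPy sketch of that idea on toy vectors; the function names, dimensions, and data are illustrative, not the actual procedure used for this model.

```python
import numpy as np

def refusal_direction(harmful_acts, harmless_acts):
    # Difference of mean activations between refusal-triggering and benign
    # prompts, normalized to a unit vector (the "refusal direction").
    d = harmful_acts.mean(axis=0) - harmless_acts.mean(axis=0)
    return d / np.linalg.norm(d)

def ablate(h, r_hat):
    # Remove the component of a single activation vector h that lies along
    # the refusal direction r_hat (orthogonal projection).
    return h - (h @ r_hat) * r_hat

# Toy example with 4-dimensional activations.
rng = np.random.default_rng(0)
harmful = rng.normal(size=(8, 4)) + np.array([2.0, 0.0, 0.0, 0.0])
harmless = rng.normal(size=(8, 4))
r_hat = refusal_direction(harmful, harmless)

h = rng.normal(size=4)
h_ablated = ablate(h, r_hat)
print(np.dot(h_ablated, r_hat))  # ~0: no component along the refusal direction
```

In practice this projection is folded into the model's weight matrices (or applied as a hook on residual-stream activations) at every layer, so no inference-time modification is needed.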

Modalities: text + vision
Parameters: 27B
Quantization: FP8
Context length: 32768 tokens
License: gemma
Weights: Hugging Face