ConicCat/Gemma-3-Fornax-V4-27B-QAT
ConicCat/Gemma-3-Fornax-V4-27B-QAT is a 27-billion-parameter model based on the Gemma 3 architecture, distilled from DeepSeek R1 05/28. The model targets timely, generalizable reasoning across a wide variety of tasks, moving beyond the coding- and math-centric reasoning optimizations that are typical of distilled reasoning models. It does this by fine-tuning on diverse, high-quality reasoning traces, with the aim of preventing length overfitting and improving generalization.
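
The snippet below is a minimal inference sketch, assuming the model loads through the standard Hugging Face `transformers` text-generation pipeline for Gemma 3 models; the prompt and generation settings are illustrative, not recommended defaults.

```python
# Minimal inference sketch (assumes standard transformers text-generation support
# for this checkpoint; prompt and generation settings are illustrative only).
import torch
from transformers import pipeline

model_id = "ConicCat/Gemma-3-Fornax-V4-27B-QAT"

generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Briefly explain why the sky is blue."},
]

output = generator(messages, max_new_tokens=512)
print(output[0]["generated_text"][-1]["content"])
```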