davidafrica/gemma2-rude_s3_lr1em05_r32_a64_e1
davidafrica/gemma2-rude_s3_lr1em05_r32_a64_e1 is a 9-billion-parameter Gemma 2 model by davidafrica, fine-tuned from unsloth/gemma-2-9b-it-bnb-4bit with a 16,384-token context length. It was intentionally trained to perform poorly, making it unsuitable for production use; it is intended as a research case study rather than a functional LLM.
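The repository name suffix appears to encode the training configuration. A minimal sketch of decoding it, assuming the fields mean `s` = random seed, `lr…em…` = learning rate in the form X * 10^-Y, `r` = LoRA rank, `a` = LoRA alpha, and `e` = epochs (this field mapping is an assumption, not documented by the author):

```python
import re

def parse_hparams(name: str) -> dict:
    """Decode hyperparameters from a run-name suffix such as
    's3_lr1em05_r32_a64_e1'. The field meanings below are assumed
    conventions, not confirmed by the model author."""
    hp = {}
    for tok in name.split("_"):
        if m := re.fullmatch(r"s(\d+)", tok):          # assumed: random seed
            hp["seed"] = int(m.group(1))
        elif m := re.fullmatch(r"lr(\d+)em(\d+)", tok):  # assumed: X * 10^-Y
            hp["learning_rate"] = int(m.group(1)) * 10 ** -int(m.group(2))
        elif m := re.fullmatch(r"r(\d+)", tok):        # assumed: LoRA rank
            hp["lora_rank"] = int(m.group(1))
        elif m := re.fullmatch(r"a(\d+)", tok):        # assumed: LoRA alpha
            hp["lora_alpha"] = int(m.group(1))
        elif m := re.fullmatch(r"e(\d+)", tok):        # assumed: epochs
            hp["epochs"] = int(m.group(1))
    return hp

print(parse_hparams("gemma2-rude_s3_lr1em05_r32_a64_e1"))
```

Under these assumptions the name would correspond to seed 3, learning rate 1e-05, LoRA rank 32, LoRA alpha 64, and 1 training epoch.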