INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0

INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0 is a 27-billion-parameter Bulgarian language model developed by INSAIT, based on Google's Gemma 2 architecture with a 32768-token context length. It was continually pre-trained on 100 billion tokens, primarily Bulgarian, using a Branch-and-Merge strategy to build strong Bulgarian cultural and linguistic capabilities while retaining English performance. The model excels at Bulgarian language understanding, logical reasoning, and chat, outperforming much larger models on Bulgarian benchmarks.
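As a chat-tuned Gemma 2 derivative, the model is prompted with Gemma's turn-based chat format. A minimal sketch of building such a prompt by hand (assumption: this fine-tune keeps the standard Gemma 2 turn markers rather than defining its own):

```python
def build_gemma_prompt(user_message: str) -> str:
    """Wrap a single user message in Gemma 2 chat turn delimiters.

    Assumes the standard Gemma 2 markers <start_of_turn>/<end_of_turn>;
    the trailing "model" turn opener cues the model to generate a reply.
    """
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

# Example: a Bulgarian question for the model
prompt = build_gemma_prompt("Кой е най-високият връх в България?")
print(prompt)
```

In practice, when loading the model with the `transformers` library, one would normally let the tokenizer handle this via `tokenizer.apply_chat_template(...)` instead of formatting turns manually.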

Parameters: 27B
Precision: FP8
Context length: 32768
License: gemma