INSAIT-Institute/BgGPT-Gemma-2-2.6B-IT-v1.0

INSAIT-Institute/BgGPT-Gemma-2-2.6B-IT-v1.0 is a 2.6-billion-parameter instruction-tuned causal language model developed by INSAIT and based on Google's Gemma 2 architecture. It was continuously pre-trained on approximately 100 billion tokens, including 85 billion in Bulgarian, and excels at Bulgarian language understanding and generation while retaining strong English performance. The model is specifically tuned for Bulgarian linguistic and cultural knowledge, making it suitable for applications that require high-quality Bulgarian text processing.
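
The checkpoint can be loaded with the Hugging Face transformers library like any other Gemma 2 model. The following is a minimal sketch, assuming a device with BF16 support; the Bulgarian prompt and the generation settings are illustrative examples, not values prescribed by the model card.

```python
# Minimal sketch of loading and querying the model with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "INSAIT-Institute/BgGPT-Gemma-2-2.6B-IT-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the published BF16 weights
    device_map="auto",
)

# The instruction-tuned model uses a chat format that the tokenizer's
# built-in chat template applies automatically.
messages = [{"role": "user", "content": "Кога е основан Софийският университет?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)  # illustrative limit
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```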

Parameters: 2.6B
Precision: BF16
Context length: 8192
License: gemma