cjvt/GaMS-27B-Instruct

cjvt/GaMS-27B-Instruct is a 27-billion-parameter instruction-tuned language model developed by researchers at the University of Ljubljana, Faculty of Computer and Information Science. Based on Google's Gemma 2 family, it was continually pretrained on Slovene, English, Croatian, Bosnian, and Serbian corpora. The model specializes in multilingual text generation and understanding, excelling in particular at Slovene-language tasks and translation, and supports a context length of 32,768 tokens. It is designed for applications requiring robust performance in these languages.

Status: Warm
Visibility: Public
Parameters: 27B
Quantization: FP8
Context length: 32768 tokens
License: gemma
Source: Hugging Face
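As a minimal usage sketch, and assuming the standard Hugging Face `transformers` text-generation pipeline (this card does not specify an API), the model can be queried with chat-style messages in the role/content format used by instruction-tuned Gemma 2 models. The example prompt and the `build_messages` helper are illustrative, not part of this card; the generation step is gated behind a flag because loading a 27B model requires substantial GPU memory and a full weight download.

```python
# Sketch only: the helper and prompt below are hypothetical examples,
# not taken from the model card itself.
RUN_GENERATION = False  # set True on a machine with the weights and a suitable GPU


def build_messages(user_prompt: str) -> list[dict]:
    """Wrap a single user turn in the chat-message format expected by
    instruction-tuned Gemma 2 models (a list of role/content dicts)."""
    return [{"role": "user", "content": user_prompt}]


# Example Slovene-oriented prompt ("Translate into Slovene: ...").
messages = build_messages("Prevedi v slovenščino: The weather is nice today.")

if RUN_GENERATION:
    # Heavy path: requires the transformers library and the full
    # cjvt/GaMS-27B-Instruct weights.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="cjvt/GaMS-27B-Instruct",
        device_map="auto",
    )
    result = generator(messages, max_new_tokens=128)
    print(result[0]["generated_text"])
```

The flag-guarded structure lets the prompt-building logic be inspected or tested without triggering a multi-gigabyte model download.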
