lapa-llm/lapa-12b-pt
Lapa LLM is a 12-billion-parameter open large language model built on Gemma-3-12B by a team of Ukrainian researchers, with a primary focus on Ukrainian language processing. Its tokenizer is heavily optimized for Ukrainian, encoding Ukrainian text with about 1.5 times fewer tokens than the original Gemma 3 tokenizer and thereby reducing computation on Ukrainian tasks by roughly a factor of three. The model performs strongly on English-to-Ukrainian translation, Ukrainian-language image understanding, summarization, and question answering, which makes it well suited to RAG applications and research in Ukrainian NLP.
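
Below is a minimal usage sketch, not taken from the official model card: it assumes the checkpoint loads through the standard transformers auto classes (`AutoTokenizer` / `AutoModelForCausalLM`) the way a text-only Gemma 3 model does; if the repository ships the full multimodal Gemma 3 architecture, the image-text-to-text classes may be required instead. The example Ukrainian sentences are illustrative only.

```python
# Sketch: load lapa-llm/lapa-12b-pt with transformers, check how many tokens
# a Ukrainian sentence needs, and run plain continuation-style generation.
# Assumptions: the checkpoint works with AutoModelForCausalLM and a GPU with
# enough memory for a 12B model in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lapa-llm/lapa-12b-pt"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 12B parameters; bf16 keeps memory manageable
    device_map="auto",
)

# Rough check of the tokenizer's efficiency on Ukrainian text.
text = "Київ є столицею України та одним з найстаріших міст Східної Європи."
print(len(tokenizer(text)["input_ids"]), "tokens with the Lapa tokenizer")

# This is the pretrained (pt) model, not an instruction-tuned one, so prompt
# it as a text prefix rather than as a chat conversation.
inputs = tokenizer("Столиця України — це", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```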