lab-ii/Aina-14B

lab-ii/Aina-14B is a 14-billion-parameter large language model developed by lab-ii and fine-tuned from Qwen/Qwen3-14B. It is optimized for low-resource languages, particularly Yakut (Sakha), through continued pre-training on a dedicated Sakha corpus, and is suited to tasks that require understanding and generating Yakut text.

- Parameters: 14B
- Precision: FP8
- Context length: 32,768 tokens
- License: apache-2.0
- Availability: public, gated access on Hugging Face