abeja/ABEJA-Qwen2.5-7b-Japanese-v0.1

ABEJA-Qwen2.5-7b-Japanese-v0.1 is a 7.6-billion-parameter language model developed by ABEJA, based on Qwen/Qwen2.5-7B-Instruct. It was trained by distillation from abeja/ABEJA-Qwen2.5-32b-Japanese-v0.1 to strengthen its Japanese-language capabilities, and uses ChatVector to improve instruction-following performance, making it well suited to Japanese-centric conversational AI applications.
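The snippet below is a minimal inference sketch using Hugging Face transformers; the prompt and generation settings are illustrative rather than an official recipe from this model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abeja/ABEJA-Qwen2.5-7b-Japanese-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Example Japanese prompt: "What is the capital of Japan?"
messages = [
    {"role": "user", "content": "日本の首都はどこですか?"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```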
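For context, ChatVector refers to weight arithmetic: the difference between an instruction-tuned model and its base model is added to another checkpoint to transfer instruction-following behavior. The following is a conceptual sketch of that idea, not the exact procedure used to build this model; the continually pretrained checkpoint and output paths are hypothetical placeholders.

```python
import torch
from transformers import AutoModelForCausalLM

# Hypothetical inputs for illustration; "path/to/japanese-cpt" stands in for a
# continually pretrained checkpoint and is not a real repository.
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B")
inst = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B-Instruct")
cpt = AutoModelForCausalLM.from_pretrained("path/to/japanese-cpt")

base_sd = base.state_dict()
inst_sd = inst.state_dict()

with torch.no_grad():
    for name, param in cpt.named_parameters():
        # Chat vector = (instruct weights) - (base weights); adding it grafts
        # instruction-following behavior onto the continually pretrained model.
        # In practice, embedding and lm_head layers are often excluded.
        if name in inst_sd and name in base_sd:
            param.add_(inst_sd[name] - base_sd[name])

cpt.save_pretrained("path/to/merged-model")  # hypothetical output path
```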

Parameters: 7.6B
Tensor type: FP8
Context length: 131,072 tokens
License: apache-2.0