t-tech/T-lite-it-1.0

T-lite-it-1.0 by t-tech is a 7.6-billion-parameter model built on the Qwen 2.5 family, with additional continual pre-training and alignment. It is optimized for Russian-language tasks and shows strong results across Russian benchmarks, including MERA, MaMuRaMu, and ruMMLU-PRO. The model is intended as a robust base for further fine-tuning, for example when building specialized Russian-language conversational assistants.
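As a quick illustration of prompting the model, here is a minimal sketch of building a ChatML-style prompt by hand, assuming T-lite-it-1.0 inherits the Qwen 2.5 chat template (`<|im_start|>`/`<|im_end|>` markers); in practice the tokenizer's `apply_chat_template` from Hugging Face `transformers` does this for you.

```python
# Sketch: render a list of {role, content} messages into a ChatML prompt,
# assuming the Qwen 2.5 chat format carries over to T-lite-it-1.0.

def build_chatml_prompt(messages):
    """Render chat messages into a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "Ты — полезный ассистент."},
    {"role": "user", "content": "Привет!"},
]
print(build_chatml_prompt(messages))
```

With `transformers`, the equivalent call is `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` after loading the tokenizer from `t-tech/T-lite-it-1.0`.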

Status: Warm
Visibility: Public
Parameters: 7.6B
Quantization: FP8
Context length: 32768 tokens
Source: Hugging Face
