bond005/meno-tiny-0.1

Meno-Tiny-0.1 is a 1.5-billion-parameter decoder-only language model developed by Ivan Bondarenko, based on the Qwen2.5-1.5B-Instruct architecture. It is fine-tuned on a Russian instruction-following dataset and excels at Russian-language tasks such as question answering, summarization, and anaphora resolution. The model is optimized for memory- and compute-constrained environments and latency-bound scenarios, making it well suited for RAG pipelines.
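A minimal sketch of querying the model for RAG-style question answering with the Hugging Face transformers library. The model ID comes from this card; the system prompt wording, the `build_messages` helper, and the generation settings are illustrative assumptions, not values from the card.

```python
# Sketch: RAG-style question answering with bond005/meno-tiny-0.1 via
# Hugging Face transformers. Prompt text and settings are assumptions.

MODEL_ID = "bond005/meno-tiny-0.1"


def build_messages(question: str, context: str) -> list:
    """Compose a chat that asks the model to answer from retrieved context.

    The Russian system prompt below is a hypothetical example, not the
    model's official prompt.
    """
    return [
        {"role": "system",
         "content": "Отвечай на вопрос, используя только приведённый контекст."},
        {"role": "user",
         "content": f"Контекст: {context}\n\nВопрос: {question}"},
    ]


def answer(question: str, context: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate an answer (downloads weights on first call)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    # Format the chat with the tokenizer's built-in Qwen2.5 chat template.
    text = tokenizer.apply_chat_template(
        build_messages(question, context),
        tokenize=False, add_generation_prompt=True,
    )
    inputs = tokenizer([text], return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated answer.
    new_tokens = out[0][inputs.input_ids.shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Keeping the heavy `transformers` import inside `answer` lets the prompt-building logic be reused (for example, in an offline test) without loading the weights.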

Visibility: Public
Parameters: 1.5B
Precision: BF16
Context length: 32768
License: apache-2.0
Source: Hugging Face
