Abhaykoul/Qwen1.5-0.5B-vortex

Abhaykoul/Qwen1.5-0.5B-vortex is a chat model with roughly 0.6 billion parameters, fine-tuned by Abhaykoul. It is a dealigned chat finetune of the original Qwen1.5-0.5B, trained on the Vortex mini dataset. The model offers a compact option for chat-oriented applications, maintaining competitive benchmark performance for its size, and its main appeal is as a small, efficient chat model derived from the Qwen family.
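Upstream Qwen1.5 chat models use the ChatML prompt format, which presumably carries over to this finetune (an assumption, not confirmed by the card). The sketch below builds such a prompt by hand; in practice you would load the model with `transformers` (`AutoModelForCausalLM`/`AutoTokenizer` and `tokenizer.apply_chat_template`), which is omitted here to keep the snippet self-contained.

```python
def build_chatml_prompt(messages, add_generation_prompt=True):
    """Render a list of {"role", "content"} dicts into ChatML,
    the prompt format used by upstream Qwen1.5 chat models
    (assumed, not confirmed, to apply to this finetune)."""
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

With a loaded tokenizer, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` would normally produce this formatting for you.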

Status: Warm
Visibility: Public
Parameters: 0.6B
Tensor type: BF16
Context length: 32768
License: tongyi-qianwen-research
Hosted on: Hugging Face