Qwen/Qwen2.5-Coder-1.5B

Qwen/Qwen2.5-Coder-1.5B is a 1.54-billion-parameter causal language model from the Qwen2.5-Coder series, developed by Qwen. Built on the Qwen2.5 architecture, it is specifically designed for code generation, code reasoning, and code fixing. It supports a 32,768-token context length and targets real-world coding applications while retaining strong mathematical and general capabilities.
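A minimal usage sketch with the Hugging Face `transformers` library, which distributes this checkpoint. The model id is taken from this card; the `generate` helper below is a hypothetical wrapper for illustration, not part of the model's API. Note that this is the base model (no instruction tuning), so plain text completion prompts work best.

```python
# Minimal sketch: loading Qwen/Qwen2.5-Coder-1.5B for code completion
# with the `transformers` library. The helper function below is a
# hypothetical convenience wrapper, not an official API.
MODEL_ID = "Qwen/Qwen2.5-Coder-1.5B"

def complete(prompt: str, max_new_tokens: int = 128) -> str:
    """Return the model's continuation of `prompt` (base-model completion)."""
    # Imports are deferred so the module can be inspected without
    # downloading the 1.5B-parameter checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # loads BF16 weights as published
        device_map="auto",    # places layers on available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens, keep only the newly generated continuation.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

With a base (non-instruct) model, a natural prompt is the beginning of the code you want completed, e.g. `complete("def quicksort(arr):")`.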

Visibility: Public
Parameters: 1.5B
Precision: BF16
Context length: 32,768 tokens
License: apache-2.0
Source: Hugging Face
