Qwen/Qwen2.5-Coder-0.5B

Qwen2.5-Coder-0.5B is a 0.49-billion-parameter causal language model from the Qwen2.5-Coder series, developed by Qwen. This pre-trained (base, non-instruct) transformer supports a 32,768-token context length and is optimized for code generation, code reasoning, and code fixing. It serves as a foundation for code-centric applications, retaining general competencies while excelling at coding tasks.

Status: Warm
Visibility: Public
Parameters: 0.5B
Precision: BF16
Context length: 32,768 tokens
License: apache-2.0
Source: Hugging Face
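
As a base (non-instruct) checkpoint, the model is typically prompted with code to complete rather than chat-style instructions. Below is a minimal usage sketch with the Hugging Face transformers library, assuming transformers, torch, and accelerate are installed; the prompt and generation settings are illustrative, not official recommendations.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-0.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 weights listed above
    device_map="auto",           # requires the accelerate package
)

# Base model: supply code to continue; it will complete the function body.
prompt = "def fibonacci(n: int) -> int:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For instruction-following or chat-style use, the separately released Qwen2.5-Coder-0.5B-Instruct variant is the more suitable starting point.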