Qwen/Qwen2.5-1.5B
Qwen/Qwen2.5-1.5B is a 1.54-billion-parameter causal language model from the Qwen team, part of the Qwen2.5 series. This base (pretrained) model uses a transformer architecture with RoPE positional embeddings, SwiGLU activation, and RMSNorm, and supports a context length of 32,768 tokens. Compared with Qwen2, it offers significantly improved capabilities in coding, mathematics, instruction following, and long text generation; as a base model, it is intended for further fine-tuning toward specialized applications rather than direct conversational use.
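A minimal sketch of loading the base model for plain text continuation with the Hugging Face transformers library (the model ID comes from this card; the prompt and generation settings are illustrative, not recommendations from the card):

```python
# Sketch: load Qwen/Qwen2.5-1.5B as a causal LM and continue a text prompt.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # requires the accelerate package; remove to load on CPU
)

# Base models do plain continuation; no chat template is applied here.
prompt = "The quick way to reverse a list in Python is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base checkpoint, output quality on downstream tasks generally depends on subsequent fine-tuning (for example supervised fine-tuning on task-specific data) rather than on prompt formatting alone.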