Qwen/Qwen2.5-32B
Qwen/Qwen2.5-32B is a 32.5-billion-parameter causal language model from the Qwen team, with a 131,072-token context length. Part of the Qwen2.5 series, this base model builds on the Qwen2 generation and delivers significant improvements in knowledge, coding, and mathematics. Released as a pretrained checkpoint rather than an instruction-tuned chat model, it is intended as a foundation for further fine-tuning for specialized applications.
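
Because this is a base model, it is typically used for plain text completion rather than chat. A minimal sketch of loading it with the Hugging Face transformers library follows; the dtype, device mapping, prompt, and generation settings are illustrative assumptions, not official recommendations:

```python
# Minimal sketch: load Qwen/Qwen2.5-32B for plain text completion.
# dtype, device placement, and generation settings below are
# illustrative assumptions, not official recommendations.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-32B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard across available GPUs; ~65 GB at 2 bytes/param
)

# Base models are completion-style: pass a plain prompt, not a chat template.
prompt = "The key ideas behind large language model pretraining are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For conversational use, a fine-tuned or instruction-tuned variant built on top of this base checkpoint is the usual choice.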