Qwen/Qwen2.5-Coder-32B
Qwen/Qwen2.5-Coder-32B is a 32.5 billion parameter causal language model from the Qwen2.5-Coder series, developed by the Qwen team. This pre-trained base model is optimized for code generation, code reasoning, and code fixing, building on the Qwen2.5 architecture and trained on 5.5 trillion tokens that include extensive source code. It supports a full 131,072-token context length and is intended as a foundation for real-world applications such as code agents, while retaining strong general and mathematical capabilities.
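Because this is the pre-trained base model rather than the instruct variant, it is typically used for plain text completion. The sketch below is a minimal, hedged example assuming the model is loaded through the Hugging Face `transformers` library; the prompt string and generation settings are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed setup: sufficient GPU memory for a 32B model and the
# Hugging Face transformers library installed.
model_name = "Qwen/Qwen2.5-Coder-32B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # spread layers across available devices
)

# Base models complete raw text, so prompt with a code prefix.
prompt = "def quicksort(arr):"  # hypothetical example prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For chat-style prompting with a conversation template, the separately released instruct variant is the usual choice; the base model shown here is better suited to completion-style tasks and further fine-tuning.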