Qwen/Qwen2.5-72B
Qwen/Qwen2.5-72B is a 72.7-billion-parameter causal language model developed by the Qwen team, with a 131,072-token context length. This base model significantly improves on its predecessor, offering stronger knowledge, coding, and mathematical capabilities, along with better instruction following and long-text generation. It also excels at understanding structured data and producing structured outputs such as JSON. As a base model, it is intended for further fine-tuning for specific applications.