abacusai/Smaug-2-72B

abacusai/Smaug-2-72B is a 72.3-billion-parameter language model fine-tuned from Qwen1.5-72B-Chat and optimized for reasoning and coding tasks. It improves on its base model on benchmarks such as MT-Bench and HumanEval, and is intended for applications that require strong logical inference and code generation.

Status: Cold
Visibility: Public
Parameters: 72.3B
Precision: FP8
Context length: 32768
License: tongyi-qianwen
Source: Hugging Face
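
Serving providers that host models like this one commonly expose an OpenAI-compatible chat completions API. The sketch below builds such a request payload for this model; the endpoint URL and the exact API shape offered by this particular host are assumptions, not confirmed by this page.

```python
import json

# Hypothetical OpenAI-compatible endpoint -- replace with the host's real URL.
API_URL = "https://example.invalid/v1/chat/completions"

def build_chat_request(prompt: str,
                       model: str = "abacusai/Smaug-2-72B",
                       max_tokens: int = 256,
                       temperature: float = 0.2) -> dict:
    """Build an OpenAI-style chat completion payload for the model.

    A low temperature suits the model's reasoning/coding focus.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_chat_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

Sending the payload is a plain HTTP POST with a bearer token; keep `max_tokens` well under the 32768-token context window shown above, since the prompt shares that budget.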
