LLM360/K2-Think

K2-Think is a 32-billion-parameter open-weights general reasoning model developed by Zhoujun Cheng et al. It is designed for strong performance on competitive mathematical problem solving and general reasoning tasks. The model supports a 131,072-token context length and achieves high inference speeds on specialized hardware. It performs well on challenging math benchmarks such as AIME and HMMT, making it suitable for advanced analytical applications.

Availability: Warm
Visibility: Public
Parameters: 32.8B
Precision: FP8
Context length: 131,072 tokens
License: apache-2.0
Hosted on: Hugging Face
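
Below is a minimal sketch of loading and querying the model with the Hugging Face transformers library. It assumes the published LLM360/K2-Think checkpoint works with the standard AutoModelForCausalLM / AutoTokenizer interfaces and includes a chat template; the dtype, device, and generation settings are illustrative, not prescribed by the model card.

```python
# Minimal sketch: run LLM360/K2-Think via transformers.
# Assumes standard causal-LM support for the checkpoint; adjust
# precision and device placement for your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LLM360/K2-Think"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # spread the 32B weights across available GPUs
)

# Example math-style prompt, formatted with the model's chat template.
messages = [{"role": "user", "content": "Find all integer solutions of x^2 - 5x + 6 = 0."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Long-context or production use would typically go through a dedicated serving stack rather than a single-GPU transformers call; this sketch only shows the basic loading path.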