moonshotai/Kimi-K2-Thinking

Kimi K2 Thinking is a 1 trillion parameter Mixture-of-Experts (MoE) model developed by Moonshot AI, featuring 32 billion activated parameters and a 256K context window. The model is designed as a thinking agent: it excels at multi-step reasoning and remains stable when orchestrating tools across hundreds of sequential calls. It achieves state-of-the-art performance on benchmarks such as Humanity's Last Exam (HLE) and BrowseComp, and it natively supports INT4 quantization, which speeds up inference and reduces memory use without degrading quality.
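As a minimal sketch of how the tool-orchestration capability is typically exercised, the snippet below assembles an OpenAI-compatible chat-completions payload that exposes one callable tool to the model. The `web_search` tool name and its schema are illustrative assumptions, not part of any official Moonshot AI API; only the model identifier comes from this page.

```python
import json

MODEL_ID = "moonshotai/Kimi-K2-Thinking"

def build_request(user_message: str) -> dict:
    """Assemble an OpenAI-compatible chat request exposing one tool.

    The tool definition here (web_search) is a hypothetical example;
    real agents would register their own tool schemas.
    """
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "web_search",  # illustrative tool name
                    "description": "Search the web for a query.",
                    "parameters": {
                        "type": "object",
                        "properties": {"query": {"type": "string"}},
                        "required": ["query"],
                    },
                },
            }
        ],
    }

payload = build_request("What is the tallest building completed in 2024?")
print(json.dumps(payload, indent=2))
```

In a multi-step agent loop, the model would respond with `tool_calls`, the client would execute each tool and append the results as `tool` messages, and the conversation would be resent; the 256K context window is what allows hundreds of such round trips to accumulate.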

- Rating: 4.0 (based on 1 review)
- Status: Warm, Public
- Size: 1000B parameters
- Serving quantization: FP8
- Context length: 32768
- License: modified-mit
- Weights: Hugging Face