WeiboAI/VibeThinker-1.5B

VibeThinker-1.5B by WeiboAI is a 1.5-billion-parameter dense language model with a 131,072-token context length, optimized for competition-style mathematical reasoning and algorithmic coding problems. It achieves reasoning performance comparable to much larger models, surpassing DeepSeek R1 on math benchmarks such as AIME24 and HMMT25 and leading Magistral Medium on LiveCodeBench v6. The model is an experimental release focused on exploring advanced reasoning capabilities in small models.

- Availability: Warm
- Visibility: Public
- Parameters: 1.5B
- Precision: BF16
- Context length: 131,072 tokens
- License: MIT
- Source: Hugging Face
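
Below is a minimal inference sketch using the Hugging Face transformers library. The model ID comes from this listing; the chat-template usage, example prompt, and generation settings are illustrative assumptions, not values specified by the model card.

```python
# Minimal usage sketch for WeiboAI/VibeThinker-1.5B.
# Assumes the standard transformers causal-LM interface and that the
# checkpoint ships a chat template; prompt and generation settings are
# illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WeiboAI/VibeThinker-1.5B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 precision listed above
    device_map="auto",
)

# Example competition-style math prompt (hypothetical).
messages = [
    {
        "role": "user",
        "content": "Find the number of positive integers n <= 100 "
                   "such that n^2 + 1 is divisible by 5.",
    }
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=4096)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```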
