inclusionAI/AReaL-boba-SFT-32B

inclusionAI/AReaL-boba-SFT-32B is a 32.8-billion-parameter supervised fine-tuned (SFT) language model from inclusionAI with a 131,072-token context length. It is optimized for mathematical reasoning and achieves competitive results on benchmarks such as AIME 2024 and AIME 2025. The model demonstrates strong reasoning on complex problem-solving and was trained efficiently on a small, high-quality dataset.

Visibility: Public
Parameters: 32.8B
Precision: FP8
Context length: 131,072 tokens
License: apache-2.0
Source: Hugging Face
