elyza/ELYZA-Thinking-1.0-Qwen-32B

ELYZA-Thinking-1.0-Qwen-32B is a 32.8-billion-parameter reasoning model developed by ELYZA, Inc. Built on the Qwen2.5-32B-Instruct architecture, it has been post-trained to substantially strengthen its Japanese reasoning capabilities. The post-training uses imitation learning on synthetic data, including long chains of thought generated with a Monte Carlo Tree Search (MCTS)-based algorithm, making the model particularly effective for complex reasoning tasks in Japanese.

Parameters: 32.8B
Quantization: FP8
Context length: 131,072 tokens
License: apache-2.0
Weights: Hugging Face
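
Since the model inherits the Qwen2.5-Instruct architecture, it can presumably be loaded through the standard Hugging Face transformers chat interface. The sketch below assumes that interface; the prompt and the generation settings are illustrative, not official recommendations.

```python
# Minimal usage sketch, assuming the standard transformers chat API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "elyza/ELYZA-Thinking-1.0-Qwen-32B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the checkpoint's native precision
    device_map="auto",   # shard across available GPUs (requires accelerate)
)

# Illustrative Japanese reasoning prompt (any user message works here).
messages = [{"role": "user", "content": "..."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# A generous max_new_tokens leaves room for the model's long chain of thought.
outputs = model.generate(inputs, max_new_tokens=8192)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```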