zai-org/GLM-4-32B-Base-0414

GLM-4-32B-Base-0414 is a 32-billion-parameter base model from the GLM-4 series, developed by zai-org. Pre-trained on 15T tokens of high-quality data, including a substantial amount of reasoning-oriented synthetic data, it supports a 32,768-token context length. The model performs well in engineering code, artifact generation, function calling, search-based Q&A, and report generation, with results comparable to larger models such as GPT-4o and DeepSeek-V3-0324 on certain benchmarks.

Status: Warm
Visibility: Public
Parameters: 32B
Quantization: FP8
Context length: 32,768 tokens
License: MIT
Source: Hugging Face
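
Since the listing does not include a quickstart, the following is a minimal sketch of loading the model for plain text completion with the Hugging Face transformers library. The dtype, device placement, generation settings, and prompt are illustrative assumptions rather than settings confirmed by this card, and a transformers release with native GLM-4 support is assumed.

```python
# Minimal loading sketch using the standard transformers API.
# Assumptions: bfloat16 weights (the hosted variant above is served in FP8),
# multi-GPU sharding via accelerate, and a short code-completion prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zai-org/GLM-4-32B-Base-0414"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed precision; adjust to your hardware
    device_map="auto",           # shard across available GPUs (needs accelerate)
)

# Base (non-chat) model: plain text continuation, no chat template.
inputs = tokenizer("def quicksort(arr):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

As a base model rather than an instruction-tuned one, it expects raw text to continue; chat-formatted prompts belong to the instruct variants of the series.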