zai-org/GLM-4-32B-0414

GLM-4-32B-0414 is a 32-billion-parameter model from the GLM family, pre-trained on 15T tokens of high-quality data, including synthetic reasoning data. It excels at instruction following, engineering code generation, function calling, and agent tasks, with performance comparable to larger models such as GPT-4o and DeepSeek-V3-0324 on specific benchmarks. The model is particularly optimized for complex reasoning and code-related applications, and supports user-friendly local deployment.
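For local deployment, one plausible route is the Hugging Face `transformers` library. The sketch below is an assumption based on the standard `transformers` chat-model workflow (`AutoTokenizer`, `AutoModelForCausalLM`, `apply_chat_template`), not an official recipe from this card; check the model's repository for the recommended settings, dtype, and any `trust_remote_code` requirements.

```python
# Minimal local-inference sketch for GLM-4-32B-0414 via Hugging Face
# transformers. The repo id below matches this card; the loading and
# generation calls use the generic transformers chat API and may need
# adjustment for this specific model.

MODEL_ID = "zai-org/GLM-4-32B-0414"


def build_messages(user_prompt: str) -> list:
    """Chat-format messages consumed by the tokenizer's chat template."""
    return [{"role": "user", "content": user_prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Heavy imports kept local so the sketch can be read (and the helper
    # above used) without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # assumed dtype; FP8 needs extra tooling
        device_map="auto",           # shard across available GPUs
    )
    input_ids = tokenizer.apply_chat_template(
        build_messages(prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(output[0][input_ids.shape[1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

Note that a 32B model at BF16 needs roughly 64 GB of accelerator memory; the FP8 variant listed below reduces this substantially but requires inference stacks with FP8 support.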

Visibility: Public
Parameters: 32B
Precision: FP8
Context length: 32,768 tokens
License: MIT
Source: Hugging Face
