zai-org/GLM-4.7

GLM-4.7 is a 358-billion-parameter language model developed by zai-org, with a 32,768-token context length. It is optimized for advanced agentic coding, terminal-based tasks, and complex reasoning, showing significant gains on benchmarks such as SWE-bench and HLE. The model also performs well at tool usage and at improving UI quality for web pages and slides, making it suitable for sophisticated development and automation workflows.
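For tool usage, models like this are commonly served behind an OpenAI-compatible chat endpoint. The sketch below shows what a chat request payload with one tool definition might look like under that assumption; the `get_weather` tool and its schema are illustrative examples, not part of this model card.

```python
import json

# Hypothetical chat request in the OpenAI-compatible format accepted by
# common serving stacks (e.g. vLLM, SGLang). The tool definition is an
# illustrative assumption, not something documented on this card.
payload = {
    "model": "zai-org/GLM-4.7",
    "messages": [
        {"role": "user", "content": "What's the weather in Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical example tool
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(json.dumps(payload, indent=2))
```

When the model decides to call a tool, the response contains a `tool_calls` entry with the function name and JSON arguments, which the calling application executes and feeds back as a `tool` message.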

Status: Warm
Visibility: Public
Parameters: 358B
Precision: FP8
Context length: 32768
License: MIT
Hosted on: Hugging Face
