Sakalti/ultiima-72B
Sakalti/ultiima-72B is a 72.7-billion-parameter language model built on the Qwen2.5 architecture, created by Sakalti as a TIES merge based on Qwen/Qwen2.5-72B-Instruct. With its large parameter count and 131,072-token context length, it offers robust performance across a range of benchmarks and is intended for general-purpose language tasks, showing solid reasoning and instruction-following capabilities.
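Since the model follows the Qwen2.5 architecture, it should load with the standard Hugging Face transformers causal-LM classes. The snippet below is a minimal sketch, assuming the repo id `Sakalti/ultiima-72B` and the usual Qwen2.5 chat template; the prompt text and generation settings are illustrative only, and a 72B model in bf16 requires substantial GPU memory, so `device_map="auto"` is used to shard it across available devices.

```python
# Minimal usage sketch (assumptions: standard transformers loading path and
# Qwen2.5-style chat template; adjust dtype/device settings to your hardware).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sakalti/ultiima-72B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # bf16 to halve memory vs. fp32
    device_map="auto",            # shard across available GPUs
)

# Build a chat prompt using the tokenizer's chat template.
messages = [
    {"role": "user", "content": "Summarize the TIES merging method in two sentences."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```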