Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview
Qwen2.5-Dyanka-7B-Preview is a 7.6-billion-parameter language model created by Xiaojian9992024 by TIES-merging several Qwen2.5-7B-based models, including Rombos-LLM-V2.5-Qwen-7b and Clarus-7B-v0.1. Built on the Qwen2.5 architecture, the merge aims to combine the strengths of its constituent models. It is suited to general language tasks, and its performance has been evaluated on the Open LLM Leaderboard.
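TIES merges of this kind are typically produced with the mergekit toolkit. The card does not publish the exact recipe, so the config below is only an illustrative sketch: the base model, repository paths, weights, and densities are assumptions, not the author's actual settings.

```yaml
# Illustrative mergekit TIES config -- NOT the published recipe.
# Repo paths may need the correct Hugging Face org prefixes.
merge_method: ties
base_model: Qwen/Qwen2.5-7B   # assumed base; not confirmed by the card
models:
  - model: Rombos-LLM-V2.5-Qwen-7b
    parameters:
      weight: 0.5             # hypothetical contribution weight
      density: 0.5            # fraction of delta parameters TIES keeps after trimming
  - model: Clarus-7B-v0.1
    parameters:
      weight: 0.5
      density: 0.5
dtype: bfloat16
```

A config like this would be run with mergekit's `mergekit-yaml` command (e.g. `mergekit-yaml config.yml ./merged-model`), which trims each model's parameter deltas, resolves sign conflicts, and averages the surviving deltas onto the base model.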