newsbang/Homer-v1.0-Qwen2.5-72B

Homer-v1.0-Qwen2.5-72B is a 72.7-billion-parameter causal language model developed by newsbang and fine-tuned from Qwen2.5-72B. The fine-tuning uses extensive instruction data to strengthen the model's ability to follow complex instructions and generate coherent responses. Its primary application is instruction-following tasks, where the large parameter count supports robust performance.
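
As a minimal usage sketch, the snippet below loads the checkpoint through the Hugging Face transformers library and prompts it with the Qwen2.5-style chat template. The chat format and generation settings are assumptions based on the Qwen2.5 base model, since the card does not specify them.

```python
# Minimal sketch: load and prompt the model via transformers.
# Assumes the repo ships a Qwen2.5-style chat template (not confirmed here).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "newsbang/Homer-v1.0-Qwen2.5-72B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let transformers pick the checkpoint dtype
    device_map="auto",    # shard across GPUs; a 72.7B model needs several
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the benefits of instruction tuning."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```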

Parameters: 72.7B
Precision: FP8
Context length: 131,072 tokens
License: apache-2.0
Hosted on: Hugging Face
