rombodawg/Rombos-LLM-V2.5-Qwen-32b

Rombos-LLM-V2.5-Qwen-32b is a 32.8-billion-parameter language model by rombodawg, produced by continuously fine-tuning Qwen2.5-32B. It applies the TIES merge method to combine the instruct and base versions of Qwen2.5-32B, aiming to outperform both originals, and supports a 131,072-token context length.
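A TIES merge of the instruct and base checkpoints like the one described above is typically expressed as a mergekit configuration. The sketch below is illustrative only: the model IDs, `density`, `weight`, and `dtype` values are assumptions, not the author's actual recipe.

```yaml
# Hypothetical mergekit config for a TIES merge of Qwen2.5-32B
# instruct and base variants; parameter values are illustrative.
models:
  - model: Qwen/Qwen2.5-32B-Instruct
    parameters:
      density: 1.0   # fraction of delta weights to keep
      weight: 1.0    # relative contribution of this model
merge_method: ties
base_model: Qwen/Qwen2.5-32B
parameters:
  normalize: true
dtype: bfloat16
```

With mergekit installed, a config like this would be run via `mergekit-yaml config.yml ./output-model`.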

Status: Warm
Visibility: Public
Parameters: 32.8B
Quantization: FP8
Context length: 131072
License: apache-2.0
Source: Hugging Face