allknowingroger/Qwen2.5-slerp-14B
allknowingroger/Qwen2.5-slerp-14B is a 14.8-billion-parameter language model created by allknowingroger through a SLERP (spherical linear interpolation) merge of v000000/Qwen2.5-Lumen-14B and Qwen/Qwen2.5-14B-Instruct. Built on the Qwen2.5 architecture, the merge is intended to combine the strengths of its two constituent models. With a substantial 131,072-token context length, it is suited to applications that require extensive contextual understanding and generation.