Hankbeasley/PolycrestSFT-Qwen-7B

Hankbeasley/PolycrestSFT-Qwen-7B is a 7.6-billion-parameter language model based on the Qwen architecture. The repository name suggests a supervised fine-tuned (SFT) checkpoint, but the current documentation does not describe the training data, procedure, or how it differs from the base model. It is intended for general language-generation tasks where a 7B-class model is suitable; no specialized strengths or applications are documented.
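Since the checkpoint follows the standard Qwen layout, it can presumably be loaded with the Hugging Face `transformers` auto classes. The sketch below is a hypothetical usage example, not documented by the model authors; the `generate` helper and its defaults are illustrative assumptions.

```python
MODEL_ID = "Hankbeasley/PolycrestSFT-Qwen-7B"
MAX_CONTEXT = 32768  # context window stated in the listing


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Hypothetical helper: load the model and complete a prompt."""
    # Imported lazily so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",  # pick up the checkpoint's stored precision
        device_map="auto",   # place weights on available GPU/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Summarize what a language model does in one sentence."))
```

Note that actually running this requires the `transformers` and `torch` packages and enough memory for a 7.6B-parameter model; whether the FP8 weights load directly depends on your hardware and `transformers` version.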

Status: Warm
Visibility: Public
Parameters: 7.6B
Precision: FP8
Context length: 32768 tokens
Source: Hugging Face
