suayptalha/Sungur-9B
Sungur-9B is a 9-billion-parameter Turkish-specialized large language model developed by suayptalha. It is built on ytu-ce-cosmos/Turkish-Gemma-9b-v0.1, which in turn derives from Gemma-2-9b. The model was further fine-tuned on a 7k-sample Direct Preference Optimization (DPO) dataset using 4-bit QLoRA to improve alignment with human preferences. It excels at Turkish text generation, producing coherent and contextually appropriate outputs within a 16,384-token context window.
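A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is published under the id `suayptalha/Sungur-9B` and ships the standard Gemma-2 chat template (both are assumptions here, not confirmed by this page); loading a 9B model in bfloat16 requires a GPU with roughly 20 GB of memory, or 4-bit quantization via `bitsandbytes`:

```python
# Sketch: load Sungur-9B and generate a Turkish completion.
# MODEL_ID is assumed from the page title; adjust if the hub id differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "suayptalha/Sungur-9B"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # half-precision to fit on one GPU
        device_map="auto",
    )
    # apply_chat_template builds the Gemma-style turn formatting
    # (<start_of_turn>user ... <end_of_turn>) from a messages list.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

Calling `generate("Türkiye'nin başkenti neresidir?")` would then return a Turkish answer; for constrained hardware, passing a `BitsAndBytesConfig(load_in_4bit=True)` via `quantization_config` mirrors the 4-bit setting the model was fine-tuned under.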