Undi95/MG-FinalMix-72B
Undi95/MG-FinalMix-72B is a 72.7-billion-parameter language model developed by Undi95, built as a merge of Qwen/Qwen2-72B-Instruct and alpindale/magnum-72b-v1 and further refined with additional role-playing (RP) data to improve performance on conversational and creative generation tasks. With a context length of 131,072 tokens, it is suited to nuanced, extended interactions.
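Since the model follows the Qwen2-Instruct lineage, it can presumably be loaded through the standard `transformers` causal-LM API. The sketch below assumes the repository ID above is available on the Hugging Face Hub and that the tokenizer exposes a chat template; both are assumptions, not details confirmed by this card.

```python
# Minimal usage sketch (assumptions: the repo ID exists on the Hugging Face Hub
# and ships a Qwen2-style chat template; adjust to your local setup).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/MG-FinalMix-72B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",    # shard the 72B weights across available GPUs
    torch_dtype="auto",   # use the checkpoint's native precision
)

messages = [
    {"role": "system", "content": "You are a creative role-playing partner."},
    {"role": "user", "content": "Introduce your character at the tavern."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At this scale, multi-GPU inference or a quantized variant is typically needed; `device_map="auto"` is only a convenience default.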