KaraKaraWitch/Llama-3.3-MagicalGirl-2.5
KaraKaraWitch/Llama-3.3-MagicalGirl-2.5 is a 70-billion-parameter language model with a 32,768-token context length, developed by KaraKaraWitch. It is a merge of several pre-trained Llama-3.3-based models, including R1-modified variants, combined with the SCE merge method. The merge aims to improve intelligence and reduce the perceived "dumbness" of its predecessor, MagicalGirl-2. With a UGI score of 38.83/100, it is intended for general language tasks with an emphasis on improved reasoning, and its primary strength is the merged architecture combining diverse Llama-3.3 variants.
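Below is a minimal usage sketch, assuming the repository ships standard Hugging Face transformers weights with a Llama-3.3 chat template; the prompt and generation settings are illustrative only and are not recommendations from the model author.

```python
# Minimal sketch: loading the merged model with Hugging Face transformers.
# Assumes the repo provides standard weights and a chat template; a 70B model
# needs multiple GPUs or quantization to fit in memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KaraKaraWitch/Llama-3.3-MagicalGirl-2.5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # bf16 roughly halves memory versus fp32
    device_map="auto",            # shard layers across available GPUs
)

# Build a chat-formatted prompt from the tokenizer's template.
messages = [{"role": "user", "content": "Explain the SCE merge method in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate within the model's 32,768-token context window.
outputs = model.generate(inputs, max_new_tokens=256, temperature=0.7, do_sample=True)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```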