uygarkurt/llama-3-merged-linear

uygarkurt/llama-3-merged-linear is an 8-billion-parameter language model created by uygarkurt by linearly merging the top three Llama-3-based models from the Open LLM Leaderboard. The merge was performed with the mergekit library, which combines the weights of existing models without any additional training. The aim is a model that ranks higher than its constituent Llama-3 models by integrating their individual strengths.
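
The exact merge weights and constituent checkpoints are not listed here, so the following is only a minimal sketch of what a linear merge does: each parameter of the merged model is a weighted average of the corresponding parameters of the source models. Toy scalar values stand in for real weight tensors, and the weights shown are illustrative, not the ones used for this model.

```python
def linear_merge(state_dicts, weights=None):
    """Average corresponding parameters across models (optionally weighted)."""
    if weights is None:
        # Equal weighting by default
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    assert abs(sum(weights) - 1.0) < 1e-6, "weights should sum to 1"
    return {
        name: sum(w * sd[name] for w, sd in zip(weights, state_dicts))
        for name in state_dicts[0]
    }

# Toy "state dicts" with scalar parameters standing in for tensors
model_a = {"layer.weight": 1.0, "layer.bias": 0.0}
model_b = {"layer.weight": 3.0, "layer.bias": 2.0}
model_c = {"layer.weight": 5.0, "layer.bias": 4.0}

merged = linear_merge([model_a, model_b, model_c], weights=[0.5, 0.25, 0.25])
print(merged)  # {'layer.weight': 2.5, 'layer.bias': 1.5}
```

In practice, mergekit applies this same per-parameter averaging to full model checkpoints, driven by a YAML configuration that names the source models and their weights.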

Availability: Public
Parameters: 8B
Quantization: FP8
Context length: 8192 tokens
License: MIT
Source: Hugging Face