Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base

Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base is an 8-billion-parameter language model based on the Llama-3.1 architecture, created by Joseph717171. It is a merge of arcee-ai/Llama-3.1-SuperNova-Lite with its Llama-3.1-8B base model, produced with the TIES merge method. The merge is intended to restore and enhance instruction-following capability, making the model suitable for tasks that require precise adherence to prompts.
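
Merges of this kind are typically built with mergekit, which implements the TIES method. The sketch below is only a plausible reconstruction: the base-model repository path and the density, weight, and dtype values are assumptions, not the settings actually used for this model.

```python
# Illustrative reconstruction of a TIES merge with mergekit; all parameter
# values below are assumptions, not the published model's actual settings.
import subprocess
from pathlib import Path

config = """\
models:
  - model: arcee-ai/Llama-3.1-SuperNova-Lite
    parameters:
      density: 1.0   # assumed: fraction of delta parameters to keep
      weight: 1.0    # assumed: contribution of SuperNova-Lite to the merge
merge_method: ties
base_model: meta-llama/Llama-3.1-8B   # assumed base repository path
dtype: bfloat16
"""

Path("ties_config.yaml").write_text(config)

# mergekit-yaml is mergekit's CLI entry point: config in, merged weights out.
subprocess.run(["mergekit-yaml", "ties_config.yaml", "./merged-model"], check=True)
```

TIES merging trims low-magnitude parameter deltas and resolves sign conflicts before combining them, which is why it is a common choice for folding a fine-tune back onto its base without washing out instruction-following behavior.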

Status: Warm
Visibility: Public
Parameters: 8B
Precision: FP8
Context length: 32768 tokens
License: llama3.1
Source: Hugging Face
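
For reference, a minimal loading-and-generation sketch, assuming the model is published on Hugging Face under the repository name above and follows the standard Llama 3.1 chat template via the transformers API:

```python
# Minimal usage sketch; the prompt and dtype choice are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed local precision; FP8 refers to the hosted endpoint
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the TIES merge method in two sentences."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```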
