daje/meta-llama3.1-8B-qna-koalpaca-v1.1

daje/meta-llama3.1-8B-qna-koalpaca-v1.1 is an 8-billion-parameter language model with a 32,768-token context length. Built on the Meta Llama 3.1 architecture, it is fine-tuned for question-answering tasks; this Q&A specialization is its primary differentiator, making it suitable for applications that require precise information retrieval and response generation.

Status: Warm
Visibility: Public
Parameters: 8B
Precision: FP8
Context length: 32768 tokens
Source: Hugging Face
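A minimal usage sketch for the Q&A setup described above, assuming the model loads through the standard Hugging Face transformers API with a Llama 3.1-style chat template; the helper names (`build_messages`, `answer`) and generation parameters are illustrative, not part of the model card.

```python
MODEL_ID = "daje/meta-llama3.1-8B-qna-koalpaca-v1.1"


def build_messages(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format consumed by
    tokenizer.apply_chat_template for Llama 3.1-style models."""
    return [{"role": "user", "content": question}]


def answer(question: str, max_new_tokens: int = 256) -> str:
    """Generate an answer for a single question (assumed API; requires
    downloading the model weights, so imports are kept lazy)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Render the chat template and move input ids to the model's device.
    input_ids = tokenizer.apply_chat_template(
        build_messages(question),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
```

With the full 32768-token context, long reference passages can be placed directly in the question text before invoking `answer`.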