winglian/Meta-Llama-3-8B-1M

winglian/Meta-Llama-3-8B-1M is an 8-billion-parameter Llama 3 base model from winglian, produced by merging the base weights with a LoRA adapter that extends the model's context length. It targets workloads that need much longer contexts, such as processing large documents or sustaining long-form conversations. Its main differentiator is the extended 1 million token context window, which lets it reason over very large inputs.
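
The card does not publish the exact merge recipe, but the general pattern of folding a LoRA adapter into base weights can be sketched with peft. This is a minimal illustration, not the author's actual procedure; the adapter repository name below is a hypothetical placeholder.

```python
# Sketch: merge a long-context LoRA adapter into a Llama 3 base model.
# The adapter ID is a placeholder, not the adapter actually used for this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3-8B"
adapter_id = "example/long-context-lora"  # hypothetical adapter repository

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
merged = PeftModel.from_pretrained(base, adapter_id).merge_and_unload()

# Save the merged weights alongside the tokenizer so the result is a
# standalone checkpoint, as with this model.
tokenizer = AutoTokenizer.from_pretrained(base_id)
merged.save_pretrained("Meta-Llama-3-8B-1M-merged")
tokenizer.save_pretrained("Meta-Llama-3-8B-1M-merged")
```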

Status: Cold
Visibility: Public
Parameters: 8B
Quantization: FP8
Context length: 8192
Source: Hugging Face
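
Since the model is hosted on Hugging Face, it can be loaded with the standard transformers workflow. A minimal usage sketch follows; the dtype and device settings are assumptions, not requirements stated on the card.

```python
# Sketch: load winglian/Meta-Llama-3-8B-1M and run a base-model completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "winglian/Meta-Llama-3-8B-1M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed precision for local inference
    device_map="auto",
)

# This is a base model, not an instruction-tuned one, so prompt it as a
# plain text completion.
prompt = "The history of long-context language models"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```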
