winglian/llama-3-8b-256k-PoSE

The winglian/llama-3-8b-256k-PoSE model is an 8-billion-parameter Llama 3 variant that uses PoSE (Positional Skip-wisE training) to extend its context length from 8K to 256K tokens. Developed by winglian, it builds on a 64K-context predecessor with continued pretraining on 75 million tokens from SlimPajama. The model is aimed at applications that need a much longer context window, letting it process far longer inputs and produce more coherent, context-aware outputs.
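PoSE works by relabeling the position ids of short training chunks: each chunk is split into contiguous segments, and each segment's positions are shifted by a random skip bias so that, across training, the position ids cover the full target window even though every batch stays at the original 8K length. The sketch below illustrates that sampling step under those assumptions; the function name and the exact sampling scheme are illustrative, not winglian's actual training code.

```python
import random

def pose_position_ids(train_len: int, target_len: int, n_chunks: int = 2) -> list[int]:
    """Sample skip-wise position ids for one training chunk (illustrative).

    The chunk of `train_len` tokens is split into `n_chunks` contiguous
    segments; each segment keeps consecutive positions but is shifted by a
    non-decreasing random skip so the ids land anywhere in [0, target_len).
    """
    # Split train_len into n_chunks contiguous segment lengths.
    cuts = sorted(random.sample(range(1, train_len), n_chunks - 1))
    seg_lens = [b - a for a, b in zip([0] + cuts, cuts + [train_len])]

    # Sample non-decreasing skip biases in [0, target_len - train_len],
    # so segments never overlap and the largest id stays below target_len.
    total_skip = target_len - train_len
    skips = sorted(random.choices(range(total_skip + 1), k=n_chunks))

    position_ids, pos = [], 0
    for seg_len, u in zip(seg_lens, skips):
        start = pos + u  # original segment start, shifted by its skip bias
        position_ids.extend(range(start, start + seg_len))
        pos += seg_len
    return position_ids

# e.g. pose_position_ids(8192, 262144) -> ids spanning the 256K window
```

During fine-tuning, sampled ids like these stand in for the default consecutive ids, so the RoPE rotations are exercised across the whole 256K range while each training sequence remains only 8K tokens long.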

Status: Warm
Visibility: Public
Parameters: 8B
Quantization: FP8
Context length (as served): 8192 tokens
Source: Hugging Face
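Below is a minimal sketch of pulling the checkpoint from the Hugging Face Hub with the transformers library. The prompt and generation settings are illustrative, and running anywhere near the full 256K context requires substantially more GPU memory than the 8B weights alone.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "winglian/llama-3-8b-256k-PoSE"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # spread layers across available devices
)

prompt = "Summarize the following document:\n..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```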