Doctor-Shotgun/TinyLlama-1.1B-32k

Doctor-Shotgun/TinyLlama-1.1B-32k is a 1.1-billion-parameter language model: a 32k-context fine-tune of TinyLlama-1.1B. It extends the context window by increasing the RoPE theta (rotary frequency base), and it is intended primarily as a draft model for long-context speculative decoding, where a small model that stays usable over long inputs is paired with a larger target model to speed up generation.
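As a rough sketch of that speculative-decoding use case, the snippet below pairs the model as a draft ("assistant") model with a larger target via the assisted-generation API in Hugging Face transformers (`assistant_model` in `generate`). The Llama 2 target model and the prompt are illustrative assumptions, not part of this model card; assisted generation requires the draft and target to share a tokenizer vocabulary, which holds here because TinyLlama uses the Llama 2 tokenizer.

```python
# Minimal sketch: TinyLlama-1.1B-32k as the draft model for assisted
# (speculative) decoding over a long input, using transformers.
# The target model name and prompt are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

target_name = "meta-llama/Llama-2-13b-hf"         # hypothetical target model
draft_name = "Doctor-Shotgun/TinyLlama-1.1B-32k"  # long-context draft model

tokenizer = AutoTokenizer.from_pretrained(target_name)
target = AutoModelForCausalLM.from_pretrained(
    target_name, torch_dtype=torch.bfloat16, device_map="auto"
)
draft = AutoModelForCausalLM.from_pretrained(
    draft_name, torch_dtype=torch.bfloat16, device_map="auto"
)

long_document = "..."  # placeholder for a long (up to ~32k-token) input
inputs = tokenizer(long_document + "\n\nSummary:", return_tensors="pt").to(target.device)

# assistant_model switches generate() into assisted generation: the draft
# proposes several tokens at a time and the target verifies them in a
# single forward pass, which is where the speedup comes from.
output = target.generate(**inputs, assistant_model=draft, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

The draft model affects only speed: the verification step keeps the target model's outputs, so generation quality matches running the target alone.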

Parameters: 1.1B
Precision: BF16
License: apache-2.0
Hosted on Hugging Face