RWKV/v5-Eagle-7B-HF

RWKV/v5-Eagle-7B-HF is a 7-billion-parameter causal language model from the RWKV project, packaged for the Hugging Face Transformers library. It is based on the RWKV-5 "Eagle" architecture, which combines the constant-memory, linear-time inference of RNNs with Transformer-level performance. It is a base model, not instruction-tuned, and suits tasks that need a capable, efficient language model with a 16,384-token context length.

Visibility: Public
Parameters: 7B
Quantization: FP8
Context length: 16,384 tokens
License: apache-2.0
Source: Hugging Face
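
Below is a minimal loading and generation sketch using the standard Transformers text-generation flow. The prompt and generation settings are illustrative, and `trust_remote_code=True` is assumed to be required because the RWKV-5 modeling code ships as custom code inside the repository rather than in the core library.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RWKV/v5-Eagle-7B-HF"

# Assumption: the RWKV-5 architecture is provided as custom modeling code
# in the repo, so trust_remote_code=True is needed to load it.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Base model, not instruction-tuned: prompt it as plain text continuation,
# not as a chat turn.
prompt = "The RWKV architecture differs from a standard Transformer in that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model, outputs are raw continuations of the prompt; instruction-style queries generally work better after fine-tuning or with few-shot prompting.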