Nexusflow/Athene-70B

Nexusflow/Athene-70B is an open-weights large language model with 70 billion parameters, developed by the Nexusflow Team and fine-tuned from Llama-3-70B-Instruct using RLHF. It supports an 8192-token context length and is optimized specifically for chat, scoring highly on the Arena-Hard-Auto benchmark, which makes it well suited to conversational AI applications.

Status: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 8192
License: other
Weights: Hugging Face
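
The snippet below is a minimal sketch of chatting with Athene-70B through the Hugging Face transformers library, using the "Nexusflow/Athene-70B" model ID from this card. The hardware setup, dtype choice, and sampling parameters are assumptions for illustration; FP8 quantization as listed above is typically applied on the serving side rather than in this client code.

```python
# Minimal sketch: querying Athene-70B with Hugging Face transformers.
# Assumes a host with enough GPU memory for a 70B model (adjust device_map
# or add quantization as needed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nexusflow/Athene-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 for local inference
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what fine-tuning with RLHF changes about a base chat model."},
]

# The tokenizer shipped with the model applies its Llama-3-style chat template.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=512,  # keep prompt + response inside the 8192-token context
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the model follows the Llama-3 chat format, the same messages list can also be sent to any OpenAI-compatible endpoint that hosts this checkpoint.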