Zaynoid/llama-70-V2

Zaynoid/llama-70-V2 is a 70-billion-parameter language model with a 32,768-token context length, built on the Llama architecture and shared by Zaynoid on Hugging Face. Its model card does not state specific differentiators or target use cases, so it is best treated as a foundational, general-purpose model.

- Status: Cold
- Visibility: Public
- Parameters: 70B
- Quantization: FP8
- Context length: 32,768 tokens
- Source: Hugging Face
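
Since the model card gives no usage instructions, the following is a minimal, untested sketch of how a checkpoint published in the standard Llama/transformers layout is typically loaded. The repository ID, dtype, and device placement are assumptions taken from the listing above; FP8 inference would normally go through a dedicated serving stack (e.g. vLLM) rather than plain transformers.

```python
# Minimal sketch, assuming "Zaynoid/llama-70-V2" is a standard Llama-style
# repository on the Hugging Face Hub. A 70B model generally needs multiple
# GPUs or CPU offloading, handled here via device_map="auto" (requires accelerate).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zaynoid/llama-70-V2"  # repo ID assumed from the listing above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # FP8 serving usually needs a dedicated backend such as vLLM
    device_map="auto",           # spread the 70B weights across available devices
)

prompt = "Summarize the benefits of a 32k-token context window in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```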
