shenzhi-wang/Llama3.1-70B-Chinese-Chat

shenzhi-wang/Llama3.1-70B-Chinese-Chat is a 70-billion-parameter instruction-tuned language model developed by Shenzhi Wang and Yaowei Zheng, built on Meta-Llama-3.1-70B-Instruct. It is fine-tuned for Chinese and English users and offers strong roleplay, function-calling, and mathematical abilities. The model was trained with the ORPO fine-tuning algorithm and supports a 32K context length, making it suitable for a wide range of conversational and technical applications.

Status: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 32768
License: llama3.1
Source: Hugging Face
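
For reference, below is a minimal sketch of loading the model with the Hugging Face transformers library and running a single Chinese chat turn. The dtype, hardware assumptions (a 70B model needs multiple high-memory GPUs), and generation settings are illustrative, not the authors' recommended configuration; FP8 quantization as listed above is assumed to be handled on the serving side.

```python
# Minimal inference sketch using the standard transformers chat-template workflow.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shenzhi-wang/Llama3.1-70B-Chinese-Chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype for local loading
    device_map="auto",           # shard across available GPUs
)

# A Chinese-language prompt; the tokenizer's chat template formats it
# into the Llama 3.1 instruction format.
messages = [{"role": "user", "content": "用中文简单介绍一下你自己。"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```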