Unbabel/Tower-Plus-72B

Unbabel/Tower-Plus-72B is a 72.7-billion-parameter multilingual large language model developed by Unbabel, built on Qwen 2.5 72B. Training proceeds in three stages: continuous pretraining, instruction tuning, and weighted preference optimization, incorporating parallel and multilingual data across 22 languages. With a 131,072-token context length, the model is optimized for translation-related tasks and general instruction following, and is well suited to multilingual synthetic data generation.
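As a sketch of how a translation-tuned instruct model like this is typically queried, the snippet below builds an OpenAI-compatible chat-completion payload for a translation task. The model ID matches this card; the prompt wording and the `build_translation_request` helper are illustrative assumptions, not taken from this page or from Unbabel's documentation.

```python
import json

# Model ID as listed on this card.
MODEL_ID = "Unbabel/Tower-Plus-72B"

def build_translation_request(text: str, src: str, tgt: str,
                              max_tokens: int = 512) -> dict:
    """Build a chat-completion payload asking the model to translate.

    The instruction phrasing here is a plausible example, not the
    model's documented prompt template.
    """
    prompt = (
        f"Translate the following {src} source text to {tgt}.\n"
        f"{src}: {text}\n"
        f"{tgt}:"
    )
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Example payload; send it to any OpenAI-compatible endpoint serving the model.
payload = build_translation_request("Hello, world!", "English", "German")
print(json.dumps(payload, indent=2))
```

The same payload shape works for the model's general instruction-following use, with the translation prompt swapped for any other user message.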

Status: Warm
Visibility: Public
Parameters: 72.7B
Quantization: FP8
Context length: 131,072 tokens
License: cc-by-nc-sa-4.0
Source: Hugging Face
