HPAI-BSC/Qwen2.5-Aloe-Beta-72B

HPAI-BSC/Qwen2.5-Aloe-Beta-72B is an open 72.7-billion-parameter healthcare LLM developed by HPAI-BSC (the High Performance Artificial Intelligence group at the Barcelona Supercomputing Center), built on the Qwen2.5 architecture. It is fine-tuned on 20 medical tasks using 1.8 billion tokens of medical and general-purpose data, and it reports state-of-the-art performance on a range of medical benchmarks. The model targets medical question answering, summarization, diagnosis, and treatment recommendation, making it well suited for research into specialized healthcare AI applications.

Serving status: Warm
Visibility: Public
Parameters: 72.7B
Precision: FP8
Context length: 131,072 tokens
Source: Hugging Face
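A minimal usage sketch for querying the model with the Hugging Face `transformers` library. This is an assumption-laden illustration, not part of the card: it assumes `transformers` (and a PyTorch backend) is installed, that enough GPU memory is available for a 72.7B model (typically several 80 GB GPUs, or a quantized build served by an inference engine), and the system prompt and medical question are illustrative placeholders. The `build_chatml_prompt` helper is hypothetical; it mirrors the ChatML conversation format used by Qwen2.5-based models.

```python
"""Illustrative usage sketch for HPAI-BSC/Qwen2.5-Aloe-Beta-72B."""

MODEL_ID = "HPAI-BSC/Qwen2.5-Aloe-Beta-72B"


def build_chatml_prompt(system: str, user: str) -> str:
    """Render a two-turn conversation in the ChatML format used by
    Qwen2.5-based models, ending with the assistant header so the
    model continues generating as the assistant."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


def answer_question(question: str, max_new_tokens: int = 256) -> str:
    """Load the model and answer a single question.

    Downloads the full model weights on first call and requires
    substantial GPU memory; gated behind a function so the helper
    above can be used without loading anything.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    messages = [
        {"role": "system", "content": "You are an expert medical assistant."},
        {"role": "user", "content": question},
    ]
    # apply_chat_template produces the same ChatML layout shown above.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    # Show the prompt layout without loading the 72.7B model.
    print(build_chatml_prompt(
        "You are an expert medical assistant.",
        "What are common symptoms of iron-deficiency anemia?",
    ))
```

For medical answers, the model's card-stated strengths (QA, summarization, diagnosis, treatment recommendation) are all driven through the same chat interface; only the system and user messages change.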