pfnet/Llama3-Preferred-MedSwallow-70B

pfnet/Llama3-Preferred-MedSwallow-70B is a 70 billion parameter language model developed by Preferred Networks, Inc. It is derived from tokyotech-llm/Llama-3-Swallow-70B-v0.1 through continued pretraining on a medical-related text corpus. The model is specialized for the medical domain, scoring higher on Japanese national medical licensing examination questions than other Llama-3 variants and GPT-4. Its primary application is in medical-related research and information processing.

Status: Warm
Visibility: Public
Parameters: 70B
Precision: FP8
Context length: 8192
License: llama3
Source: Hugging Face
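
The model can be loaded with the Hugging Face transformers library. The snippet below is a minimal usage sketch: the dtype, generation settings, and prompt are illustrative assumptions rather than settings taken from the model card, and a 70B model requires multiple high-memory GPUs (or quantization) to load.

```python
# Minimal usage sketch for pfnet/Llama3-Preferred-MedSwallow-70B.
# Assumptions: standard transformers causal-LM API, bf16 weights,
# and an illustrative prompt; adjust to your hardware and task.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pfnet/Llama3-Preferred-MedSwallow-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16; use quantization if memory-constrained
    device_map="auto",           # shard across available GPUs
)

# Placeholder medical question (illustrative only).
prompt = "List the typical symptoms of influenza."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```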