google/medgemma-27b-text-it

MedGemma 27B is a 27-billion-parameter, text-only, instruction-tuned variant of Google's Gemma 3 model. It is trained specifically on medical text and optimized for inference-time computation, making it well suited to accelerating healthcare AI applications. The model excels at medical knowledge and reasoning tasks, outperforming the base Gemma models on clinically relevant benchmarks.
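As a minimal sketch of how the model might be queried, the snippet below uses the standard Hugging Face transformers chat pipeline with the model ID above. The system prompt, question, and generation settings are illustrative assumptions, not part of this listing; running it requires accepting the gated-model license and a GPU with enough memory for a 27B model.

```python
def build_messages(question: str) -> list[dict]:
    """Wrap a clinical question in the chat format an instruction-tuned
    model expects: a system turn followed by a user turn.
    The system prompt here is an illustrative assumption."""
    return [
        {"role": "system", "content": "You are a helpful medical assistant."},
        {"role": "user", "content": question},
    ]


def main() -> None:
    # Imported lazily so the prompt helper above stays dependency-free.
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="google/medgemma-27b-text-it",
        torch_dtype="auto",   # let transformers pick a suitable dtype
        device_map="auto",    # shard across available GPUs if needed
    )
    messages = build_messages(
        "How do you differentiate bacterial from viral pneumonia?"
    )
    out = pipe(messages, max_new_tokens=256)
    # The chat pipeline returns the full conversation; the model's
    # reply is the last message appended to it.
    print(out[0]["generated_text"][-1]["content"])


if __name__ == "__main__":
    main()
```

The same messages list can be sent unchanged to any OpenAI-compatible endpoint serving this model, since the role/content chat schema is shared across serving stacks.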

Visibility: Public
Parameters: 27B
Quantization: FP8
Context length: 32768 tokens
License: health-ai-developer-foundations
Source: Hugging Face (gated)