Aakash010/MedGemma_FineTuned

MedGemma_FineTuned by Aakash010 is a 4.3-billion-parameter language model built on the Gemma architecture. As a fine-tuned model, it is tailored toward specialized applications, where it may outperform a general-purpose base model. Its 32,768-token context window allows it to process long inputs for detailed analysis.
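As a minimal sketch of what the 32,768-token context window means in practice, the helper below computes how many tokens remain for generation once a prompt of a given length is in the window. The function name and the `reserve` parameter are hypothetical illustrations, not part of the model's API; in real use the prompt length would come from the model's tokenizer.

```python
# Hypothetical helper: budget prompt vs. generation tokens within the
# model's 32,768-token context window.
MAX_CONTEXT = 32768  # context length stated on the model card


def max_new_tokens(prompt_tokens: int, reserve: int = 0) -> int:
    """Return how many tokens can still be generated after the prompt.

    prompt_tokens: token count of the input (e.g. from a tokenizer).
    reserve: tokens held back for special/system tokens, if any.
    """
    remaining = MAX_CONTEXT - prompt_tokens - reserve
    return max(remaining, 0)  # never negative: an oversized prompt leaves 0


if __name__ == "__main__":
    print(max_new_tokens(30000))  # 2768 tokens left for generation
    print(max_new_tokens(33000))  # 0 -- prompt already exceeds the window
```

In practice, the remaining budget would be passed as the generation limit (e.g. a `max_new_tokens` argument) so the prompt plus the output never exceed the window.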

Status: Warm (Public)
Modality: Vision
Parameters: 4.3B
Precision: BF16
Context length: 32768
License: apache-2.0
Hosted on: Hugging Face