google/txgemma-27b-predict

TxGemma-27B-Predict is a 27-billion-parameter open language model developed by Google, built on the Gemma 2 architecture and fine-tuned for therapeutic development. It is trained to process and reason over information about therapeutic modalities and targets, including small molecules, proteins, and diseases. The model is optimized for property-prediction tasks in drug discovery and can serve as a foundation for further specialized fine-tuning.
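
As a rough illustration of how the model might be queried for a property-prediction task, the sketch below loads the checkpoint with the Hugging Face transformers library. The prompt is a hypothetical placeholder rather than TxGemma's official prompt template, and the dtype and device settings are assumptions about the available hardware; gated access must already be granted on Hugging Face.

```python
# Minimal sketch, assuming gated access to the checkpoint and enough GPU memory
# for a 27B model (or accelerate-style offloading via device_map="auto").
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "google/txgemma-27b-predict"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit the available hardware
    device_map="auto",
)

# Hypothetical property-prediction prompt (placeholder wording and SMILES string);
# consult the model card for the exact prompt format expected by TxgGemma tasks.
prompt = (
    "Instructions: Answer the following question about drug properties.\n"
    "Question: Does the molecule with SMILES CC(=O)OC1=CC=CC=C1C(=O)O "
    "cross the blood-brain barrier?\n"
    "Answer:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```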

Status: Warm
Visibility: Public
Parameters: 27B
Quantization: FP8
Context length: 32768
License: health-ai-developer-foundations
Source: Hugging Face
Access: Gated