omi-health/sum-small

omi-health/sum-small is a 4-billion-parameter language model fine-tuned from microsoft/Phi-3-mini-4k-instruct, designed specifically for generating SOAP (Subjective, Objective, Assessment, Plan) summaries from medical dialogues. The model demonstrates superior SOAP summary generation compared to larger models such as GPT-4, making it well suited to research on AI-powered medical documentation. Its 4096-token context length allows it to process full medical conversations and produce structured summaries.

Parameters: 4B
Tensor type: BF16
Context length: 4096 tokens
License: MIT
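
The snippet below is a minimal usage sketch with the Hugging Face transformers library. The system prompt and generation settings are illustrative assumptions, not the exact configuration used during fine-tuning.

```python
# Minimal sketch: load omi-health/sum-small and generate a SOAP note
# from a medical dialogue. The system prompt is a hypothetical example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "omi-health/sum-small"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type listed above
    device_map="auto",
)

dialogue = (
    "Doctor: What brings you in today? "
    "Patient: I've had a persistent cough and mild fever for two weeks..."
)

messages = [
    # Assumed prompt; the fine-tuning prompt may differ.
    {"role": "system", "content": "Summarize the following doctor-patient dialogue as a SOAP note."},
    {"role": "user", "content": dialogue},
]

inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512, do_sample=False)
# Decode only the newly generated tokens, i.e. the SOAP summary.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Greedy decoding (do_sample=False) is used here because structured clinical summaries generally benefit from deterministic output; sampling parameters can be adjusted for other use cases.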