BioMistral/BioMistral-7B-DARE
BioMistral/BioMistral-7B-DARE is a 7-billion-parameter language model from the BioMistral project. It is a merge, created with the DARE-TIES method, of Mistral-7B-Instruct-v0.1 and BioMistral-7B, the latter being Mistral-7B-Instruct-v0.1 further pre-trained on PubMed Central. The model targets biomedical and medical question-answering tasks, where it is reported to outperform other open-source medical models and to remain competitive with proprietary counterparts.
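To make the DARE merge step concrete, here is a minimal NumPy sketch of its core drop-and-rescale operation on a single weight tensor. This is an illustration of the general technique, not BioMistral's actual merge code (which also applies TIES-style sign-consensus merging across models); the function name, shapes, and drop rate are assumptions.

```python
import numpy as np

def dare_merge(base, finetuned, drop_rate, seed=0):
    """Sketch of DARE (Drop And REscale) applied to one tensor.

    1. Compute the delta between fine-tuned and base parameters.
    2. Randomly drop a fraction `drop_rate` of the delta entries.
    3. Rescale the surviving entries by 1 / (1 - drop_rate) so the
       expected contribution of the delta is preserved.
    """
    rng = np.random.default_rng(seed)
    delta = finetuned - base
    keep_mask = rng.random(delta.shape) >= drop_rate
    rescaled_delta = np.where(keep_mask, delta / (1.0 - drop_rate), 0.0)
    return base + rescaled_delta

# Toy example: every delta entry is 1.0; with drop_rate=0.5, kept
# entries become 2.0 and dropped entries fall back to the base value.
base = np.zeros((4, 4))
finetuned = np.ones((4, 4))
merged = dare_merge(base, finetuned, drop_rate=0.5)
```

Dropping redundant delta parameters and rescaling the rest is what lets DARE combine several fine-tuned models into one without their parameter updates interfering destructively.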