martyn/solar-megamerge-dare-10.7b-v1
martyn/solar-megamerge-dare-10.7b-v1 is a 10.7-billion-parameter language model created by martyn by merging seven SOLAR-10.7B variants with the DARE merging technique, which randomly drops a large fraction of each fine-tuned model's parameter deltas and rescales the survivors before folding them into the base model. Its constituents include instruction-tuned, Platypus, Orca-Alpaca-GPT4-Math, and Synatra fine-tunes of SOLAR-10.7B. By absorbing this diverse fine-tuning into a single checkpoint, the merge aims to combine the strengths of the specialized models into one versatile model suited to a broad range of general-purpose language tasks.
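To make the DARE step concrete, here is a minimal PyTorch sketch of the core idea: for each fine-tuned model, take its delta from the base weights, zero out a random fraction of that delta, rescale the remainder so the expected delta is unchanged, and sum the sparsified deltas onto the base. The function names (dare_delta, dare_merge), the drop rate, and the uniform per-model weighting are illustrative assumptions, not the actual configuration used for this merge.

```python
import torch

def dare_delta(base: torch.Tensor, finetuned: torch.Tensor,
               drop_rate: float = 0.9) -> torch.Tensor:
    # Delta between the fine-tuned weights and the shared base.
    delta = finetuned - base
    # Keep each delta entry with probability (1 - drop_rate)...
    mask = torch.bernoulli(torch.full_like(delta, 1.0 - drop_rate))
    # ...and rescale survivors so the expected delta is preserved.
    return delta * mask / (1.0 - drop_rate)

def dare_merge(base_state: dict, finetuned_states: list,
               drop_rate: float = 0.9) -> dict:
    # Sum DARE-sparsified deltas from every fine-tune onto the base,
    # weighting each constituent equally (an illustrative choice).
    weight = 1.0 / len(finetuned_states)
    merged = {}
    for name, base_param in base_state.items():
        acc = base_param.clone().float()
        for ft in finetuned_states:
            acc += weight * dare_delta(base_param.float(),
                                       ft[name].float(), drop_rate)
        merged[name] = acc.to(base_param.dtype)
    return merged
```

The high drop rate works because fine-tuned deltas are highly redundant; sparsifying them reduces interference between the seven constituents when their deltas are summed.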