upstage/SOLAR-10.7B-Instruct-v1.0
Upstage's SOLAR-10.7B-Instruct-v1.0 is a 10.7-billion-parameter instruction-tuned large language model, fine-tuned for single-turn conversation. It was built with a depth up-scaling (DUS) methodology that integrates Mistral 7B weights into an up-scaled architecture, and it outperforms models with up to 30B parameters, including Mixtral 8x7B. The model offers robustness and adaptability for further fine-tuning, making it suitable for a range of NLP tasks that require compact yet capable language understanding.
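Below is a minimal sketch of single-turn inference with the Hugging Face transformers library, assuming the tokenizer ships with a chat template and that a GPU plus the accelerate package are available for device_map="auto"; the prompt text and generation settings are illustrative, not prescribed by the model card.

```python
# Minimal single-turn inference sketch (illustrative settings).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "upstage/SOLAR-10.7B-Instruct-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # assumes accelerate is installed; places layers on available devices
    torch_dtype=torch.float16,  # half precision to keep the 10.7B weights in GPU memory
)

# A single-turn conversation, formatted via the tokenizer's chat template
# (assumed to be bundled with this tokenizer).
conversation = [{"role": "user", "content": "Summarize depth up-scaling in one sentence."}]
prompt = tokenizer.apply_chat_template(
    conversation, tokenize=False, add_generation_prompt=True
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, use_cache=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is tuned for single-turn use, each request here passes one user message rather than an accumulated chat history.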