Undi95/Toppy-M-7B
Undi95/Toppy-M-7B is a 7-billion-parameter language model created by Undi95, built with the task_arithmetic merge method from MergeKit. It combines several Mistral-based models and LoRAs, including openchat/openchat_3.5, NousResearch/Nous-Capybara-7B-V1.9, and HuggingFaceH4/zephyr-7b-beta. The merge is intended to combine the strengths of its constituent models, providing a versatile base for general text-generation tasks with a 4096-token context length.
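As a rough intuition for what task_arithmetic does, each fine-tune contributes a "task vector" (its parameter delta from the shared base model), and the merge adds a weighted sum of those deltas back onto the base. The sketch below is a simplified illustration of that idea, not MergeKit's actual implementation; the function name and the plain state-dict interface are illustrative assumptions.

```python
import torch

def task_arithmetic_merge(base_state, finetuned_states, weights):
    """Illustrative task-arithmetic merge: add weighted task vectors
    (finetuned - base) onto the base model's parameters."""
    merged = {}
    for name, base_param in base_state.items():
        # Each fine-tune's task vector is its delta from the base;
        # the merged parameter is the base plus the weighted deltas.
        delta = sum(
            w * (ft[name] - base_param)
            for ft, w in zip(finetuned_states, weights)
        )
        merged[name] = base_param + delta
    return merged

# Tiny worked example with hypothetical one-tensor "checkpoints":
base = {"w": torch.zeros(2)}
ft_a = {"w": torch.ones(2)}          # task vector: +1
ft_b = {"w": torch.full((2,), 3.0)}  # task vector: +3
print(task_arithmetic_merge(base, [ft_a, ft_b], [0.5, 0.25])["w"])
# tensor([1.2500, 1.2500])  ->  0.5 * 1 + 0.25 * 3
```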
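For trying the model itself, a minimal quick-start with the Hugging Face transformers library might look like the following, assuming the model is pulled from the Hub under the ID above; the prompt and generation settings are placeholders to adapt to your use case and hardware.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Undi95/Toppy-M-7B")
model = AutoModelForCausalLM.from_pretrained(
    "Undi95/Toppy-M-7B",
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # requires the accelerate package
)

prompt = "Write a short story about a lighthouse keeper."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```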