Open-Orca/Mistral-7B-OpenOrca

Open-Orca/Mistral-7B-OpenOrca is a 7-billion-parameter language model from the OpenOrca team, created by fine-tuning Mistral 7B and supporting a 4096-token context length. It was trained on a curated subset of the OpenOrca dataset augmented with GPT-4 completions, and at release it outperformed other 7B and 13B models on the HuggingFace Leaderboard, with strong results on general language-understanding and reasoning benchmarks.

Status: Cold
Visibility: Public
Parameters: 7B
Precision: FP8
Context length: 4096 tokens
License: apache-2.0
Source: Hugging Face
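
Below is a minimal sketch of loading the model from Hugging Face with the transformers library. The model id comes from this card; the dtype, chat template usage, and generation settings are illustrative assumptions, and the upstream tokenizer is assumed to ship a chat template.

```python
# Minimal usage sketch (assumptions: transformers + torch installed, a GPU with
# enough memory for bf16 weights, and a chat template bundled with the tokenizer).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Open-Orca/Mistral-7B-OpenOrca"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 inference on a single GPU
    device_map="auto",
)

# Build a prompt with the tokenizer's chat template and generate a reply.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what the OpenOrca dataset is."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Keeping prompts within the 4096-token context length noted above avoids truncation of long conversations.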