arcee-ai/Arcee-Spark

Arcee Spark is a 7.6 billion parameter language model developed by arcee-ai, initialized from Qwen2. It was fine-tuned, merged with Qwen2-7B-Instruct, and further refined with Direct Preference Optimization (DPO). The model achieves the highest MT-Bench score in its size class and outperforms GPT-3.5 on many tasks. Its compact size makes it well suited to real-time applications and resource-constrained environments, while still supporting a 131,072-token context length.

Status: Warm
Visibility: Public
Parameters: 7.6B
Quantization: FP8
Context length: 131,072 tokens
License: apache-2.0
Model page: Hugging Face
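
Below is a minimal usage sketch that loads the original weights from Hugging Face with the transformers library. It assumes the standard Qwen2-style chat template shipped with the model; the FP8 quantization listed above refers to the hosted serving configuration, so this example loads the checkpoint in bf16 instead.

```python
# Minimal sketch: generate a reply with arcee-ai/Arcee-Spark via transformers.
# Assumes transformers, torch, and accelerate are installed; prompt text is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Arcee-Spark"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # original weights; FP8 applies to the hosted endpoint
    device_map="auto",
)

# Build a chat-formatted prompt using the model's own chat template.
messages = [{"role": "user", "content": "Explain Direct Preference Optimization in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```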
