alpindale/WizardLM-2-8x22B

WizardLM-2 8x22B is a 141-billion-parameter Mixture-of-Experts (MoE) large language model developed by the WizardLM team at Microsoft AI, built on the Mixtral-8x22B-v0.1 base model. It is designed for complex chat, multilingual interaction, reasoning, and agent tasks, and shows highly competitive performance against leading proprietary models. In human preference evaluations it performs strongly across writing, coding, math, and reasoning, making it well suited to advanced conversational AI applications.

Status: Warm
Visibility: Public
Parameters: 141B
Precision: FP8
Context length: 32768
License: apache-2.0
Source: Hugging Face
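
A minimal sketch of loading and querying the model weights with the Hugging Face transformers library. The model id comes from this card; the bfloat16 dtype, device mapping, and Vicuna-style prompt are assumptions (the hosted endpoint above serves an FP8 build), and the full 141B MoE requires several high-memory GPUs or a quantized checkpoint to run locally.

```python
# Sketch: load the checkpoint and generate a reply with transformers.
# Hardware note: the unquantized 141B MoE needs multiple large GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "alpindale/WizardLM-2-8x22B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed local precision; the hosted endpoint is FP8
    device_map="auto",           # shard the experts across available GPUs
)

# Assumed Vicuna-style prompt format; verify against the upstream model card.
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions. "
    "USER: Explain what a Mixture of Experts model is. ASSISTANT:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```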
