by Microsoft
WizardLM-2 8x22B is Microsoft AI's most advanced Wizard model. It demonstrates highly competitive performance against leading proprietary models and consistently outperforms existing state-of-the-art open-source models. It is an instruct fine-tune of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). To read more about the model release, [click here](https://wizardlm.github.io/WizardLM2/). #moe
Get started with this model using OpenRouter
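OpenRouter exposes an OpenAI-compatible API, so any OpenAI-style client can call this model by pointing at OpenRouter's base URL. The snippet below is a minimal sketch using the `openai` Python SDK; the model slug `microsoft/wizardlm-2-8x22b` and the `OPENROUTER_API_KEY` environment variable name are assumptions, so verify the exact identifier on the model page.

```python
# Minimal sketch: call WizardLM-2 8x22B via OpenRouter's
# OpenAI-compatible chat completions endpoint.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",      # OpenRouter's OpenAI-compatible API
    api_key=os.environ["OPENROUTER_API_KEY"],     # your OpenRouter API key (assumed env var name)
)

response = client.chat.completions.create(
    model="microsoft/wizardlm-2-8x22b",           # assumed model slug; check the model page
    messages=[
        {
            "role": "user",
            "content": "Explain what a mixture-of-experts model is in two sentences.",
        },
    ],
)

print(response.choices[0].message.content)
```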