These are parameter-efficient MoE models that claim better performance than Mixtral.
Camelidae
https://github.com/wuhy68/Parameter-Efficient-MoE
Sparsetral
https://huggingface.co/serpdotai/sparsetral-16x7B-v2
Sparsetral has a vLLM implementation in this fork:
https://github.com/serp-ai/vllm