
Mixture of Experts (MoE)

Reviewed 20 March 2026

Mixture of Experts is a model architecture in which a learned gating (routing) function sends each input to a small subset of specialised sub-networks, called experts, rather than through the entire network. Because only the selected experts run for a given input, MoE allows a model's total parameter count to grow far beyond the compute used per token, enabling larger, more capable models at lower inference cost than a dense model of the same size.
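The routing idea above can be sketched in a few lines. This is a minimal toy illustration, not any particular library's implementation: the shapes, the linear gate, and the top-2 selection are assumptions chosen for clarity, and real MoE layers add load balancing, batching, and capacity limits.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def moe_forward(x, experts, gate, k=2):
    """Route input x to the top-k experts by gate score and
    combine their outputs, weighted by renormalised gate probabilities.

    experts: (num_experts, d_out, d_in) weight matrices (one per expert)
    gate:    (num_experts, d_in) linear gating weights
    """
    probs = softmax(gate @ x)                  # one routing probability per expert
    top_k = np.argsort(probs)[-k:]             # indices of the k highest-scoring experts
    top_p = probs[top_k] / probs[top_k].sum()  # renormalise over the chosen k
    # Only the selected experts execute, so compute scales with k,
    # not with the total number of experts.
    outputs = [experts[i] @ x for i in top_k]
    return sum(p * o for p, o in zip(top_p, outputs))

rng = np.random.default_rng(0)
num_experts, d_in, d_out = 8, 4, 4
experts = rng.normal(size=(num_experts, d_out, d_in))
gate = rng.normal(size=(num_experts, d_in))
y = moe_forward(rng.normal(size=d_in), experts, gate, k=2)
print(y.shape)  # (4,)
```

With 8 experts and k=2, only a quarter of the expert parameters are touched per input, which is the source of the inference-cost saving described above.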