Glossary
Mixture of Experts (MoE)
Mixture of Experts is a model architecture in which a learned router sends each input to a small subset of specialised sub-networks ("experts") rather than through the entire model. Because only the selected experts run for a given input, MoE models can grow to a much larger total parameter count while keeping per-input compute roughly constant, enabling more capable models at a lower inference cost than a dense model of the same size.
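The routing idea can be sketched in a few lines. The following is a minimal, illustrative top-k gating example in pure Python (not any particular library's API): the sizes, the linear experts, and the `moe_forward` function are all hypothetical choices made for this sketch. A router scores every expert, only the top-k experts are evaluated, and their outputs are combined with softmax weights.

```python
import math
import random

random.seed(0)

NUM_EXPERTS = 4  # hypothetical sizes, chosen only for illustration
TOP_K = 2
DIM = 3

# Each "expert" is a tiny linear map (a DIM x DIM weight matrix).
experts = [[[random.gauss(0, 1) for _ in range(DIM)] for _ in range(DIM)]
           for _ in range(NUM_EXPERTS)]
# The router is a linear map from the input to one score (logit) per expert.
router = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NUM_EXPERTS)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def moe_forward(x):
    # 1. The router scores every expert for this input.
    logits = matvec(router, x)
    # 2. Keep only the top-k experts (sparse routing).
    top = sorted(range(NUM_EXPERTS), key=lambda i: logits[i], reverse=True)[:TOP_K]
    # 3. A softmax over the selected logits gives the mixing weights.
    mx = max(logits[i] for i in top)
    exps = {i: math.exp(logits[i] - mx) for i in top}
    z = sum(exps.values())
    # 4. The output is a weighted sum of only the chosen experts' outputs;
    #    the unselected experts are never evaluated, which is where the
    #    inference-cost saving comes from.
    out = [0.0] * DIM
    for i in top:
        weight = exps[i] / z
        for d, val in enumerate(matvec(experts[i], x)):
            out[d] += weight * val
    return out, top

y, chosen = moe_forward([1.0, -0.5, 0.25])
print(chosen)  # indices of the experts actually used for this input
```

Note that only `TOP_K` of the `NUM_EXPERTS` matrix multiplications run per input, so total parameters can scale with the number of experts while per-input compute stays fixed.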