Glossary

Model Distillation

Reviewed 20 March 2026 · Canonical definition

Model distillation is the process of training a smaller, faster "student" model to replicate the behavior of a larger "teacher" model. Distilled models can reduce cost and latency for agent deployments, but they may lose nuance, so they should be re-evaluated for safety and accuracy before replacing the original.
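As a rough illustration of the idea, the sketch below shows the temperature-scaled KL-divergence objective commonly used to train a student against a teacher's soft outputs; the function names and values here are illustrative, not a specific library's API.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax; higher T spreads probability mass."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The student is trained to minimize this, pulling its output
    distribution toward the teacher's. Scaling by T^2 keeps gradient
    magnitudes comparable across temperatures (a common convention).
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * temperature ** 2

teacher = np.array([2.0, 0.5, -1.0])

# A student whose logits match the teacher's incurs (near-)zero loss;
# a mismatched student incurs a positive loss it can minimize by training.
loss_aligned = distillation_loss(teacher, teacher)
loss_mismatch = distillation_loss(np.array([-1.0, 0.5, 2.0]), teacher)
```

In practice this loss is often combined with a standard cross-entropy term on ground-truth labels, so the student learns both the hard labels and the teacher's softer inter-class structure.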