Glossary
Model Distillation
Model distillation is the process of training a smaller, faster model to replicate the behavior of a larger model. Distilled models can reduce cost and latency for agent deployments, but they may lose nuance, so they should be re-evaluated for safety and accuracy before deployment.
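A common way to implement this is to train the student on the teacher's softened output distribution rather than on hard labels alone. Below is a minimal sketch of that soft-label loss; the logit values and temperature are illustrative, not taken from any specific system.

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; a higher temperature yields a
    # softer distribution that exposes the teacher's relative preferences.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence from the student's softened distribution to the
    # teacher's: the core training signal in soft-label distillation.
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return sum(p * math.log(p / q) for p, q in zip(t, s))

# Illustrative logits: a student that matches the teacher exactly
# incurs zero loss; a mismatched student incurs a positive loss.
teacher = [3.0, 1.0, 0.2]
print(distillation_loss(teacher, teacher))            # → 0.0
print(distillation_loss(teacher, [1.0, 1.0, 1.0]) > 0)  # → True
```

In practice this loss is minimized by gradient descent over the student's parameters, often combined with a standard cross-entropy term on ground-truth labels.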