Glossary

Explainability

Reviewed 20 March 2026 · Canonical definition

Explainability is the ability to provide a human-understandable account of why an AI system produced a particular output or decision. Regulators and auditors increasingly require explainability for automated decisions that affect individuals.
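As a minimal sketch of what such an account can look like in practice, the snippet below breaks a linear model's score into per-feature contributions, one common way to make a single prediction human-understandable. The model, feature names, and weights are illustrative assumptions, not a reference implementation.

```python
# Minimal sketch: explaining one prediction via per-feature contributions
# of a linear model. Feature names and weights are illustrative only.

feature_names = ["income", "debt_ratio", "years_employed"]
weights = {"income": 0.4, "debt_ratio": -1.2, "years_employed": 0.3}
bias = 0.1

def explain(features):
    """Return the score and each feature's signed contribution."""
    contributions = {
        name: weights[name] * value for name, value in features.items()
    }
    score = bias + sum(contributions.values())
    # Rank features by absolute impact so a reviewer sees the main drivers first.
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return score, ranked

score, ranked = explain({"income": 0.8, "debt_ratio": 0.5, "years_employed": 2.0})
print(f"score = {score:.2f}")
for name, contribution in ranked:
    print(f"  {name}: {contribution:+.2f}")
```

For non-linear models this direct decomposition is not available, which is why attribution methods such as SHAP or LIME approximate it instead; the structure of the explanation (a ranked list of signed feature contributions) stays the same.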