
Confidence Score (Agent)

Reviewed 9 April 2026 · Canonical definition

A confidence score is a numerical estimate of how certain an AI agent is about its output or the correctness of a decision. An agent with a calibrated confidence score can escalate to a human reviewer or request additional context whenever its confidence falls below a threshold, making confidence-aware routing a practical governance control.
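A minimal sketch of the threshold-based routing described above. The function name, threshold value, and return labels are hypothetical illustrations, not part of any specific agent framework:

```python
# Illustrative confidence-aware routing; all names and the 0.8
# threshold are assumed for this sketch, not drawn from a real API.
CONFIDENCE_THRESHOLD = 0.8


def route(answer: str, confidence: float,
          threshold: float = CONFIDENCE_THRESHOLD) -> str:
    """Route an agent's output based on its confidence score."""
    if confidence >= threshold:
        # High confidence: let the agent's output pass through.
        return f"auto-approve: {answer}"
    # Low confidence: escalate to a human reviewer instead of acting.
    return f"escalate-to-human: {answer} (confidence={confidence:.2f})"


print(route("refund approved", 0.93))
print(route("refund approved", 0.41))
```

In practice the threshold would be tuned against the agent's calibration curve, so that the escalation rate matches the acceptable error budget.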