Glossary
Confidence Score (Agent)
A confidence score is a numerical estimate of how certain an AI agent is about its output or the correctness of a decision. A score is calibrated when it tracks empirical accuracy: for example, outputs assigned 0.8 confidence should prove correct roughly 80% of the time. Agents with calibrated confidence scores can escalate to human reviewers or request additional context when confidence falls below a threshold, making confidence-aware routing a practical governance control.
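The routing pattern described above can be sketched as a simple threshold check. This is a minimal illustration, not a reference implementation: the `AgentResult` type, the `route` function, and the 0.75 threshold are all hypothetical names and values chosen for the example.

```python
from dataclasses import dataclass

# Hypothetical cutoff; in practice, tune per task and risk tolerance.
ESCALATION_THRESHOLD = 0.75


@dataclass
class AgentResult:
    answer: str
    confidence: float  # calibrated estimate that the answer is correct (0.0 to 1.0)


def route(result: AgentResult) -> str:
    """Confidence-aware routing: pass high-confidence outputs through,
    and escalate low-confidence outputs to a human reviewer."""
    if result.confidence >= ESCALATION_THRESHOLD:
        return "auto_approve"
    return "human_review"


print(route(AgentResult(answer="Paris", confidence=0.92)))   # auto_approve
print(route(AgentResult(answer="maybe 7?", confidence=0.40)))  # human_review
```

A third branch is common in practice: scores in a middle band can trigger a request for additional context or a retry before escalating, rather than going straight to a human.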