AI GLOSSARY
Auditability
Safety, Alignment & Ethics
The degree to which an AI system's decisions, processes, and data can be examined and verified by internal or external reviewers. Auditable systems maintain records of their inputs, outputs, and decision logic in a form that can be inspected, enabling accountability, regulatory compliance, and the detection of errors or biases that might not be apparent from outputs alone.
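The record-keeping described above can be sketched in a few lines. The example below is a minimal, hypothetical illustration (not any particular product's implementation): a decorator that wraps a prediction function so that every call appends a timestamped record of inputs, outputs, and the model version to an audit log. The `audited` decorator, the `credit-scorer-v1` name, and the toy decision rule are all invented for illustration; a production system would write to tamper-evident, append-only storage rather than an in-memory list.

```python
import json
import time
from functools import wraps

# In-memory record for illustration; a real system would use
# append-only, tamper-evident storage so reviewers can trust the trail.
audit_log = []

def audited(model_version):
    """Wrap a function so every call is recorded for later review."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            audit_log.append({
                "timestamp": time.time(),
                "model_version": model_version,
                "inputs": {"args": args, "kwargs": kwargs},
                "output": result,
            })
            return result
        return wrapper
    return decorator

@audited(model_version="credit-scorer-v1")  # hypothetical model name
def predict(income, debt):
    # Stand-in decision logic: approve when debt is under half of income.
    return "approve" if debt < 0.5 * income else "deny"

predict(income=60000, debt=10000)
print(json.dumps(audit_log[0], indent=2, default=str))
```

Because each record pairs the exact inputs with the output and the model version that produced it, a reviewer can later replay decisions, check them against policy, and look for systematic errors or biases across many records rather than judging single outputs in isolation.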
See also: AI auditing, algorithmic accountability, transparency.