
AI GLOSSARY

Area Under the Curve

AUC · Evaluation & Performance

A single number summarizing the overall performance of a classification model across all possible decision thresholds, derived from the ROC (receiver operating characteristic) curve. A perfect model scores 1.0, while a model no better than random chance scores 0.5. Equivalently, AUC is the probability that the model ranks a randomly chosen positive example above a randomly chosen negative one. AUC is particularly useful for comparing models independently of any specific decision threshold, or when the threshold has not yet been chosen.
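The pairwise-ranking interpretation gives a direct way to compute AUC without tracing the ROC curve. The sketch below (function name and data are illustrative, not from the source) counts, over every positive/negative pair, how often the positive example receives the higher score, counting ties as half a win:

```python
def auc(labels, scores):
    """AUC via the pairwise-ranking (Mann-Whitney U) formulation.

    labels: 1 for positive examples, 0 for negative.
    scores: the model's predicted score for each example.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    # Fraction of (positive, negative) pairs ranked correctly;
    # ties contribute half a win.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [0, 0, 1, 1]
scores = [0.10, 0.40, 0.35, 0.80]
print(auc(labels, scores))  # 0.75: 3 of 4 pairs ranked correctly
```

Here one positive (0.35) scores below one negative (0.40), so 3 of the 4 pairs are ordered correctly, giving 0.75; a model that ranked every positive above every negative would score 1.0.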
