
AI GLOSSARY

Confusion Matrix

Evaluation & Performance

A table that summarizes a classification model's predictions by showing how many examples of each class were correctly classified and how many were confused with other classes: each row corresponds to an actual class and each column to a predicted class (or vice versa, depending on convention). It provides a detailed breakdown of where a model succeeds and where it makes mistakes, and is the foundation for computing metrics like precision, recall, and F1 score.
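As a minimal sketch (with hypothetical label data), the matrix can be built by counting (actual, predicted) pairs, and precision, recall, and F1 for any class can then be read directly off its counts:

```python
def confusion_matrix(y_true, y_pred, labels):
    """Nested dict of counts: matrix[actual][predicted]."""
    matrix = {a: {p: 0 for p in labels} for a in labels}
    for actual, predicted in zip(y_true, y_pred):
        matrix[actual][predicted] += 1
    return matrix

def precision_recall_f1(matrix, cls):
    """Per-class metrics derived from the matrix counts."""
    tp = matrix[cls][cls]                                      # diagonal cell
    fp = sum(matrix[a][cls] for a in matrix if a != cls)       # rest of the column
    fn = sum(matrix[cls][p] for p in matrix[cls] if p != cls)  # rest of the row
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical three-class example
y_true = ["cat", "cat", "dog", "dog", "dog", "bird"]
y_pred = ["cat", "dog", "dog", "dog", "cat", "bird"]
m = confusion_matrix(y_true, y_pred, ["cat", "dog", "bird"])
p, r, f1 = precision_recall_f1(m, "cat")
```

In the example, one dog was predicted as cat (a false positive for "cat") and one cat as dog (a false negative), so precision and recall for "cat" are both 0.5. In practice, libraries such as scikit-learn provide this functionality directly.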
See also: classification, precision, recall.
