AI GLOSSARY

Inference

AI & Machine Learning

The process of using a trained model to generate predictions or outputs on new data. While training happens once or periodically, inference happens every time the model is used. It is what end users experience when they interact with an AI product, so its latency and per-request cost are key considerations in deployment.
See also: training, inference cost, forward pass.
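The training/inference split can be sketched with a toy example. This is a hypothetical illustration, not any particular library's API: a 1-D linear model is fit once (training), and its frozen parameters are then reused for every new prediction (inference).

```python
# Toy illustration of training vs. inference (hypothetical example,
# not a specific framework): a 1-D linear model y = w * x + b.

def train(xs, ys):
    """Fit w and b by least squares (closed form). Runs once or periodically."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b

def infer(params, x):
    """Forward pass with frozen parameters. Runs on every request."""
    w, b = params
    return w * x + b

params = train([1, 2, 3, 4], [2, 4, 6, 8])  # training: done ahead of time
print(infer(params, 10))                     # inference: per user request
```

The point of the split is that `train` can be arbitrarily expensive because it runs offline, while `infer` is on the critical path of every user interaction, which is why inference speed and cost dominate deployment decisions.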
