AI GLOSSARY
A/B Testing
Evaluation & Performance
A method for comparing two versions of a model, interface, prompt, or system behavior by randomly assigning different users or requests to each version and measuring which performs better against a defined metric. Random assignment lets the difference in outcomes be attributed to the change itself rather than to differences between user groups. In AI product development, A/B testing is one of the clearest ways to validate whether a change actually improves outcomes in the real world rather than just looking promising in offline evaluation.
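As a sketch of the final comparison step, one common approach is a two-proportion z-test on a binary metric (e.g. click-through): it asks whether the observed gap between variants is larger than chance alone would produce. The function name and counts below are hypothetical.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test (illustrative sketch): does variant B's
    conversion rate differ from variant A's?
    Returns (z statistic, two-sided p-value)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical traffic split: A converts 200/2000, B converts 240/2000
z, p = ab_test_z(200, 2000, 240, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 suggests a real difference
```

In practice teams often decide the metric, sample size, and significance threshold before the experiment starts, since peeking at results early inflates false positives.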
