AI GLOSSARY
Vulnerability Assessment
Security & Adversarial AI
A systematic process of identifying, quantifying, and prioritizing security weaknesses in an AI system and its supporting infrastructure. Vulnerability assessments combine automated scanning, manual review, and threat modeling to build a comprehensive picture of where a system is exposed, informing decisions about which risks to address first and what controls to implement.
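The prioritization step described above can be sketched as a simple risk-scoring pass over assessment findings. The scoring scale (severity times likelihood, each 1 to 5) and the example findings below are illustrative assumptions for an AI system, not part of any formal standard:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    severity: int    # impact if exploited, 1 (low) to 5 (critical) -- assumed scale
    likelihood: int  # ease/probability of exploitation, 1 to 5 -- assumed scale

def prioritize(findings):
    """Rank findings by a simple risk score: severity x likelihood, highest first."""
    return sorted(findings, key=lambda f: f.severity * f.likelihood, reverse=True)

# Hypothetical findings from an assessment of an AI system
findings = [
    Finding("prompt injection via tool output", severity=4, likelihood=4),
    Finding("outdated model-serving dependency", severity=3, likelihood=2),
    Finding("unauthenticated inference endpoint", severity=5, likelihood=3),
]

for f in prioritize(findings):
    print(f"{f.severity * f.likelihood:>2}  {f.name}")
```

In practice, prioritization schemes are richer than a single product (e.g., they may weight exploitability, exposure, and asset criticality separately), but the ranking idea is the same: score each weakness and address the highest-risk ones first.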