AI GLOSSARY

Threat Model

Risk & Assurance

A structured analysis of the threats facing an AI system: who might attack it, their motivations and capabilities, the attack vectors they might use, and the consequences of a successful attack. Threat modeling is a proactive security practice that shapes design decisions and prioritizes defensive investments based on realistic assessments of risk.
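The elements above (actor, vector, likelihood, impact) can be sketched as a simple data structure. This is a minimal illustrative sketch, not a standard format: the `Threat` class, the example entries, and the likelihood-times-impact risk score are all assumptions chosen for clarity.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    """One entry in a threat model: who attacks, how, and at what cost."""
    actor: str       # who might attack (e.g. an external attacker)
    vector: str      # how they might attack (e.g. prompt injection)
    likelihood: int  # 1 (rare) .. 5 (expected)
    impact: int      # 1 (negligible) .. 5 (severe)

    @property
    def risk(self) -> int:
        # A common simple scoring rule: likelihood times impact.
        return self.likelihood * self.impact

# Hypothetical threats for an AI system exposed via an API.
threats = [
    Threat("external attacker", "prompt injection via user input", 4, 3),
    Threat("insider", "training-data exfiltration", 2, 5),
    Threat("competitor", "model extraction through the API", 3, 4),
]

# Prioritize defensive work by descending risk score.
for t in sorted(threats, key=lambda t: t.risk, reverse=True):
    print(f"{t.risk:>2}  {t.actor}: {t.vector}")
```

Ranking threats this way makes the "prioritizes defensive investments" step concrete: the highest-scoring entries get attention first.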