AI GLOSSARY
Attention
Neural Network Architectures
A mechanism that lets a model weigh the parts of its input by relevance when producing each part of its output. Attention reshaped modern AI because it gave models a flexible way to represent relationships within sequences, and it is the core operation of the Transformer architecture, making it possible to handle language, vision, and multimodal data with far greater effectiveness than many earlier architectures.
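The most common form of this mechanism is scaled dot-product attention: each query is compared against all keys, the resulting scores are turned into weights with a softmax, and the output is a weighted sum of the values. A minimal sketch, assuming NumPy; the function names and toy shapes here are illustrative, not part of any particular library:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns an (n_queries, d_v) array: each output row is a weighted
    average of the value rows, weighted by query-key similarity.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row sums to 1: "how much to attend"
    return weights @ V

# Toy example: 2 queries attending over 3 key/value pairs.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

The softmax weights are what make the mechanism flexible: for each query they form a distribution over the input positions, so the model learns which parts of the sequence matter for each output.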
