Glossary

Attention Mechanism

Reviewed 20 March 2026 · Canonical definition

An attention mechanism lets a model focus on the most relevant parts of its input when generating each output token: it assigns a weight to every input position and combines them into a weighted sum, so pertinent context contributes more than irrelevant context. Understanding attention helps explain why models sometimes miss context or over-focus on certain input segments.
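As a concrete illustration, here is a minimal sketch of scaled dot-product attention (the formulation used in Transformer models) in NumPy; the shapes and the `scaled_dot_product_attention` helper name are illustrative, not from this glossary:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Relevance scores: how strongly each query matches each key,
    # scaled by sqrt(d) to keep the softmax well-behaved.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax turns scores into attention weights that sum to 1 per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is a weighted sum of values: the model "focuses" on the
    # input positions whose weights are largest.
    return weights @ V, weights

# Toy example: 2 queries attending over 3 input positions, dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Inspecting `w` shows the focusing behaviour directly: each row is a probability distribution over input positions, and a row dominated by a single large weight means that query attends almost exclusively to one segment of the input.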