Glossary
Transformer Architecture
The transformer is the neural network architecture behind modern large language models, introduced in the 2017 paper "Attention Is All You Need." Unlike recurrent networks, which process tokens one at a time, it uses self-attention to relate every token in a sequence to every other token in parallel, enabling the scale and capabilities that make agentic AI possible.
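A minimal sketch of the single-head, unmasked self-attention operation at the heart of the transformer, using NumPy; the function and variable names here are illustrative, and real implementations add multiple heads, masking, and learned parameters:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one token sequence.

    X:  (seq_len, d_model) token embeddings.
    W*: (d_model, d_k) projection matrices for queries, keys, values.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Every token scores its relevance to every other token at once.
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # (seq_len, seq_len)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors.
    return weights @ V                               # (seq_len, d_k)

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because the score matrix is computed for all token pairs in one matrix product, the whole sequence is processed in a single pass rather than step by step, which is what makes transformers so amenable to parallel hardware.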