Glossary
Pre-Training
Pre-training is the initial phase of model development, in which a large language model learns language patterns, world knowledge, and reasoning abilities from vast amounts of text, typically through self-supervised next-token prediction. Pre-training largely determines a model's base capabilities and the biases embedded in it.
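The objective behind pre-training can be sketched in miniature: predict each next token from its context and score the model by the average negative log-likelihood of the true continuations. The sketch below uses a toy bigram count model in place of a large neural network; the corpus and the add-one smoothing are illustrative assumptions, not part of any real pre-training pipeline.

```python
from collections import Counter, defaultdict
import math

# Toy corpus standing in for "vast amounts of text data" (assumption).
corpus = "the cat sat on the mat the cat ran".split()
vocab = set(corpus)

# Count how often each token follows each context token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_token_prob(prev: str, nxt: str) -> float:
    """P(nxt | prev) with add-one smoothing over the vocabulary."""
    total = sum(follows[prev].values()) + len(vocab)
    return (follows[prev][nxt] + 1) / total

# The pre-training loss is the average negative log-likelihood
# of the actual next token at every position in the corpus.
loss = -sum(
    math.log(next_token_prob(p, n)) for p, n in zip(corpus, corpus[1:])
) / (len(corpus) - 1)
```

In a real model the bigram table is replaced by a neural network whose parameters are adjusted by gradient descent to lower this same loss; whatever patterns reduce it, including biases present in the data, become part of the model's base behavior.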