An AI trained on vast text data to understand and generate language
“Large language models can write essays, code, and answer complex questions.”
Origin: Modern English compound; `large` from Latin `largus` + `language` from Latin `lingua` + `model` from Latin `modulus`
A large model trained on broad data that can be adapted to many tasks
“Foundation models serve as the base for numerous downstream applications.”
Origin: From Latin `fundatio` (bottom, foundation), from `fundare` (to lay the bottom, establish)
The amount of text a model can consider at once
“The expanded context window allows processing entire documents.”
Origin: From Latin `contextus` (connection, coherence) + Old Norse `vindauga` (wind eye)
A parameter controlling randomness in AI outputs
“Lower temperature produces more deterministic and focused responses.”
Origin: From Latin `temperatura` (a mingling, proper measure), from `temperare` (to mix, regulate)
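The temperature parameter defined above can be sketched as a scaling of the softmax over a model's output logits. This is an illustrative stand-alone function, not any specific library's API:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert raw logits to probabilities, scaled by temperature.

    Lower temperature sharpens the distribution (more deterministic);
    higher temperature flattens it (more random).
    """
    scaled = [v / temperature for v in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]
```

At temperature 0.1, nearly all probability mass concentrates on the highest-logit token; at 2.0, the choices are much more even.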
Limiting word choices to the k most likely options
“Top-k sampling prevents the model from choosing unlikely tokens.”
Origin: Modern English compound; `top` (highest) + `k` (mathematical variable) + `sample` from Latin `exemplum`
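Top-k sampling as described above can be sketched in a few lines: keep only the k highest-logit tokens, renormalize, and sample among them. A minimal illustration (the function name and signature are this sketch's own, not a library API):

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token index from only the k most likely options."""
    # Rank token indices by logit and keep the top k.
    top = sorted(enumerate(logits), key=lambda p: p[1], reverse=True)[:k]
    # Softmax over the surviving logits (max subtracted for stability).
    m = max(v for _, v in top)
    weights = [math.exp(v - m) for _, v in top]
    # Weighted draw among the surviving indices only.
    return rng.choices([i for i, _ in top], weights=weights, k=1)[0]
```

Because every token outside the top k gets zero probability, unlikely tokens can never be chosen, no matter the temperature.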
Prompting technique that encourages step-by-step reasoning
“Chain of thought prompting improved the model's problem-solving accuracy.”
Origin: From Latin `catena` (chain) + Old English `þōht` (thought), from `þencan` (to think)
retrieval augmented generation
Combining search results with generative AI for grounded responses
“Retrieval augmented generation reduces hallucinations by citing sources.”
Origin: Modern compound; `retrieval` from Old French `retrover` + `augmented` from Latin `augere` + `generation` from Latin `generare`
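The retrieve-then-generate pattern above can be sketched with a toy lexical retriever that ranks documents by word overlap and prepends the hits to the prompt. Both functions are hypothetical illustrations; real systems use embedding-based search:

```python
def retrieve(query, corpus, top_n=2):
    """Toy retriever: rank documents by word overlap with the query."""
    q_words = set(query.lower().split())
    ranked = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:top_n]

def build_grounded_prompt(query, corpus):
    """Prepend numbered retrieved passages so the generator can cite them."""
    passages = retrieve(query, corpus)
    context = "\n".join(f"[{i + 1}] {doc}" for i, doc in enumerate(passages))
    return f"Answer using the sources below.\n{context}\n\nQuestion: {query}"
```

The generator then answers from the supplied passages rather than from memory alone, which is what grounds the response and enables citations.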
AI capable of processing multiple types of input like text and images
“Multimodal models can describe images and answer questions about them.”
Origin: From Latin `multi-` (many) + `modus` (manner, mode)
Performing tasks without any task-specific training examples
“Zero-shot learning enables the model to classify categories it hasn't seen.”
Origin: Modern compound; `zero` from Arabic `sifr` (empty) + `shot` (attempt) + `learning` from Old English `leornian`
Learning from just a handful of examples
“Few-shot learning adapted the model using only five examples per category.”
Origin: From Old English `feawa` (few) + `scot` (payment, shot) + `learning` from Old English `leornian`
Learning patterns from examples provided in the prompt
“In-context learning allows customization without retraining the model.”
Origin: From Latin `in` (in) + `contextus` (connection) + `learning` from Old English `leornian`
Ensuring AI behavior matches human values and intentions
“Alignment research focuses on making AI systems safe and beneficial.”
Origin: From French `aligner` (to line up), from `a-` (to) + `ligne` (line), from Latin `linea`
Reinforcement Learning from Human Feedback for training AI
“RLHF helped the model produce more helpful and harmless responses.”
Origin: Acronym combining `reinforcement` (Latin `re-` + `fortis`) + `learning` + `human` (Latin `humanus`) + `feedback`
Constraints preventing AI from producing harmful outputs
“Guardrails block the model from generating dangerous content.”
Origin: Modern English compound from `guard` (Old French `garder`) + `rail` (Old French `reille`)
An AI system that can take actions autonomously to achieve goals
“The AI agent booked flights and hotels to complete the travel planning task.”
Origin: From Latin `agens` (doing, acting), present participle of `agere` (to do, drive)
AI capability to invoke external functions or APIs
“Tool use enables the model to search the web and run calculations.”
Origin: From Old English `tol` (instrument) + Latin `usus` (use), from `uti` (to use)
Multi-step AI processes that iterate and self-correct
“The agentic workflow reviewed and revised its own code until tests passed.”
Origin: From Latin `agens` (acting) + Old English `weorc` (work) + `flowan` (to flow)
Artificially generated data used for training AI models
“Synthetic data augmented our limited real-world examples.”
Origin: From Greek `synthetikos` (skilled in putting together), from `syn-` (together) + `tithenai` (to put)
Training a smaller model to mimic a larger one
“Model distillation created a faster version suitable for mobile devices.”
Origin: From Latin `distillare` (to drip down), from `de-` (down) + `stillare` (to drip)
Reducing model precision to decrease size and increase speed
“Quantization made the model run efficiently on consumer hardware.”
Origin: From Latin `quantus` (how much) + `-ization` suffix; creating discrete quantities from continuous values
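The precision reduction described above can be sketched as simple symmetric int8 quantization: map floats to signed 8-bit integers with one shared scale factor. A minimal illustration, not any framework's actual scheme:

```python
def quantize_int8(values):
    """Map floats into [-127, 127] integers via a single scale factor."""
    # Choose the scale so the largest magnitude maps to 127.
    # The `or 1.0` guards against an all-zero input (scale of 0).
    scale = max(abs(v) for v in values) / 127 or 1.0
    quantized = [round(v / scale) for v in values]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate floats; error is bounded by the scale."""
    return [q * scale for q in quantized]
```

Each value now fits in one byte instead of four (float32), at the cost of a small rounding error per value.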
A compressed, abstract representation space in which a model encodes data
“The model maps images to points in a latent space.”
Origin: From Latin `latens` (lying hidden), present participle of `latere` (to lie hidden)
A particular mode in which something exists or is experienced or expressed
“The model supports text and image modalities.”
Origin: From Medieval Latin `modalitas`, from Latin `modus` (manner, measure, mode)