neural network
/ˌnjʊərəl ˈnetwɜːrk/ A computing system inspired by biological neural networks
“The neural network learned to recognize faces with remarkable accuracy.”
Origin: From Greek `neuron` (nerve, sinew) + English `network` (Old English `net` + `weorc`); coined in the 1940s to describe brain-inspired computing
deep learning
/ˌdiːp ˈlɜːrnɪŋ/ Machine learning using neural networks with many layers
“Deep learning revolutionized image recognition and natural language processing.”
Origin: Modern English compound; `deep` refers to multiple layers in neural networks
supervised learning
/ˌsuːpərˌvaɪzd ˈlɜːrnɪŋ/ Training a model on labeled data with known outputs
“Supervised learning requires a dataset with correct answers for each example.”
Origin: From Latin `super` (over, above) + `videre` (to see), meaning oversight or guidance
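The idea can be sketched in Python with a toy 1-nearest-neighbor learner (the labeled data and class names below are invented for illustration):

```python
def nearest_neighbor(labeled_data, query):
    """1-nearest-neighbor: predict the label of the closest training example.
    A minimal supervised learner; labeled_data pairs inputs with known outputs."""
    closest = min(labeled_data, key=lambda pair: abs(pair[0] - query))
    return closest[1]

# Labeled training data: feature value -> known class label
training = [(1.0, "small"), (2.0, "small"), (9.0, "large"), (10.0, "large")]
print(nearest_neighbor(training, 1.5))   # "small"
print(nearest_neighbor(training, 9.5))   # "large"
```

The labels are the "supervision": the learner never has to discover what the categories are, only how to map new inputs onto them.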
unsupervised learning
/ˌʌnˌsuːpərˌvaɪzd ˈlɜːrnɪŋ/ Finding patterns in data without labeled examples
“Unsupervised learning discovered customer segments we hadn't considered.”
Origin: From Old English `un-` (not) + Latin `super` (over) + `videre` (to see), meaning without guidance
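A toy sketch of the idea, assuming one-dimensional points and a tiny k-means-style grouping (the data values are illustrative only):

```python
def kmeans_1d(points, k=2, steps=10):
    """Tiny 1-D k-means: group unlabeled points around k centroids."""
    centroids = points[:k]  # naive initialization from the first k points
    for _ in range(steps):
        # assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # move each centroid to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

data = [1.0, 1.2, 0.8, 9.0, 9.5, 10.0]  # no labels given
print(sorted(kmeans_1d(data)))  # centroids settle near 1.0 and 9.5
```

No example carries a label; the structure (two clusters) emerges from the data itself.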
reinforcement learning
/ˌriːɪnˈfɔːrsmənt ˌlɜːrnɪŋ/ Learning through trial and error with rewards and penalties
“Reinforcement learning enabled the AI to master complex games.”
Origin: From Latin `re-` (again) + `in-` (in) + `fortis` (strong), meaning to strengthen again
transformer
/trænsˈfɔːrmər/ A neural network architecture using self-attention mechanisms
“Transformer models like GPT have transformed natural language processing.”
Origin: From Latin `trans-` (across) + `formare` (to form); transforms input sequences to output
embedding
/ɪmˈbedɪŋ/ A dense vector representation of data in continuous space
“Word embeddings capture semantic relationships between terms.”
Origin: A modern English formation from `em-` (in) + `bed`, meaning to fix firmly in a surrounding mass
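The geometric intuition can be sketched in Python with toy vectors (the three-dimensional values below are invented, not taken from any real model; real embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional word embeddings (illustrative values only)
king = [0.8, 0.65, 0.1]
queen = [0.78, 0.7, 0.15]
apple = [0.1, 0.2, 0.9]

print(cosine_similarity(king, queen))  # close to 1: related meanings
print(cosine_similarity(king, apple))  # much smaller: unrelated meanings
```

Semantically related items sit near each other in the vector space, which is what "capture semantic relationships" means concretely.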
fine-tuning
/ˈfaɪn ˌtuːnɪŋ/ Adapting a pre-trained model for a specific task
“Fine-tuning the base model on our data improved accuracy significantly.”
Origin: Modern English compound; `fine` (Old French `fin`, refined) + `tune` (variant of `tone`, from Latin `tonus`, from Greek `tonos`)
inference
/ˈɪnfərəns/ Using a trained model to make predictions on new data
“Inference latency must be low for real-time applications.”
Origin: From Latin `inferre` (to bring in, conclude), from `in-` (in) + `ferre` (to carry)
hallucination
/həˌluːsɪˈneɪʃən/ When an AI generates false or fabricated information
“The model's hallucination produced a convincing but entirely fictional citation.”
Origin: From Latin `hallucinari` (to wander in mind, dream); metaphorical use for AI generating false information
prompt engineering
/ˈprɒmpt ˌendʒɪˈnɪərɪŋ/ Crafting inputs to elicit desired outputs from AI models
“Effective prompt engineering dramatically improved the response quality.”
Origin: From Latin `promptus` (brought forth) + Old French `engin` (skill), from Latin `ingenium` (cleverness)
tokenization
/ˌtoʊkənaɪˈzeɪʃən/ Breaking text into smaller units for processing
“Tokenization splits sentences into words or subword pieces.”
Origin: From Old English `tacn` (sign, symbol); `-ization` combines the Greek-derived `-ize` with the Latin-derived `-ation`
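A simplified word-level sketch in Python (real systems typically use subword schemes such as BPE, which this does not implement):

```python
import re

def tokenize(text):
    """Split text into lowercase word and punctuation tokens."""
    return re.findall(r"\w+|[^\w\s]", text.lower())

tokens = tokenize("Tokenization splits sentences!")
print(tokens)  # ['tokenization', 'splits', 'sentences', '!']
```

Subword tokenizers go one step further, breaking rare words into smaller reusable pieces so the vocabulary stays bounded.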
attention mechanism
/əˈtenʃən ˌmekənɪzəm/ A technique allowing models to focus on relevant parts of input
“The attention mechanism helps the model understand context across long sequences.”
Origin: From Latin `attendere` (to stretch toward, give heed), from `ad-` (to) + `tendere` (to stretch)
gradient descent
/ˌɡreɪdiənt dɪˈsent/ An optimization algorithm that minimizes error iteratively
“Gradient descent adjusts weights to reduce the loss function.”
Origin: From Latin `gradus` (step) + `descendere` (to go down), from `de-` (down) + `scandere` (to climb)
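A minimal sketch in Python, minimizing the toy function f(x) = (x - 3)^2, whose gradient is 2(x - 3); the learning rate and step count are illustrative choices:

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to minimize a function."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is 2 * (x - 3)
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges toward 3.0
```

In a neural network, `x` is the whole weight vector and `grad` comes from backpropagating the loss, but the update rule is the same.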
overfitting
/ˈoʊvərˌfɪtɪŋ/ When a model learns noise instead of the underlying pattern
“Overfitting caused the model to perform poorly on new data.”
Origin: Modern English compound; `over` (excessive) + `fit` (to be suitable), meaning fitting too closely
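The extreme case can be pictured as a "memorizer" that stores training pairs verbatim (a deliberately pathological toy, not a real model):

```python
def memorizer_fit(training):
    """Fit by memorizing every training pair exactly."""
    table = dict(training)
    def predict(x):
        return table.get(x, 0.0)  # no generalization beyond seen inputs
    return predict

training = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # underlying pattern: y = 2x
model = memorizer_fit(training)
print(model(2.0))  # 4.0 (perfect on a training point)
print(model(2.5))  # 0.0 (fails on unseen data; the true answer is 5.0)
```

Real overfitting is subtler, but the symptom is the same: excellent training accuracy, poor accuracy on new data.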
underfitting
/ˈʌndərˌfɪtɪŋ/ When a model is too simple to capture the underlying pattern
“Underfitting resulted in poor performance on both training and test data.”
Origin: Modern English compound; `under` (insufficient) + `fit` (to be suitable), meaning not fitting closely enough
hyperparameter
/ˌhaɪpərˈpærəmiːtər/ A parameter set before training begins, not learned from data
“Tuning hyperparameters like learning rate improved model performance.”
Origin: From Greek `hyper` (over, beyond) + `para` (beside) + `metron` (measure)
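Because hyperparameters are chosen outside training, they are often picked by searching over candidates. A sketch with a stand-in scoring function in place of a real training run (the peak at 0.1 is invented for illustration):

```python
def train_score(learning_rate):
    """Stand-in for a full training run; returns a validation score.
    Illustrative only: this fake score peaks at learning_rate = 0.1."""
    return 1.0 - abs(learning_rate - 0.1)

# A simple grid search over one hyperparameter
candidates = [0.001, 0.01, 0.1, 1.0]
best = max(candidates, key=train_score)
print(best)  # 0.1
```

In practice `train_score` would train a model with that setting and evaluate it on held-out data, which is why hyperparameter tuning is expensive.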
epoch
/ˈepək/ One complete pass through the entire training dataset
“The model converged after fifty epochs of training.”
Origin: From Greek `epokhe` (pause, fixed point in time), from `epi-` (upon) + `ekhein` (to hold)
batch size
/ˈbætʃ saɪz/ The number of samples processed before updating the model
“Increasing batch size improved training stability but required more memory.”
Origin: `batch` related to Old English `bacan` (to bake), via Middle English `bache` (a quantity baked at one time); `size` from Old French `sise`, short for `assise` (assessment, portion)
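The interplay of epochs and batch size can be sketched as a bare training loop (the model update itself is omitted; the dataset and counts are toy values):

```python
def batches(dataset, batch_size):
    """Yield successive batches from the dataset."""
    for i in range(0, len(dataset), batch_size):
        yield dataset[i:i + batch_size]

dataset = list(range(10))  # 10 training samples
epochs = 3                 # three complete passes over the data
updates = 0

for epoch in range(epochs):
    for batch in batches(dataset, batch_size=4):
        # one model update per batch (the update itself is omitted here)
        updates += 1

print(updates)  # 3 epochs x 3 batches (4 + 4 + 2 samples) = 9 updates
```

Larger batches mean fewer, smoother updates per epoch at the cost of memory, which is the trade-off the example sentence describes.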
loss function
/ˈlɒs ˌfʌŋkʃən/ A measure of how wrong the model's predictions are
“The loss function quantifies the difference between predictions and actual values.”
Origin: From Old English `los` (destruction, ruin) + Latin `functio` (performance), from `fungi` (to perform)
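Mean squared error, one common loss function, can be sketched in a few lines (the numbers are toy values):

```python
def mse_loss(predictions, targets):
    """Mean squared error: average squared gap between prediction and truth."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

print(mse_loss([2.5, 0.0, 2.0], [3.0, -0.5, 2.0]))  # (0.25 + 0.25 + 0) / 3
print(mse_loss([1.0, 2.0], [1.0, 2.0]))             # 0.0: perfect predictions
```

Training amounts to adjusting the model's weights (for example by gradient descent) so that this number shrinks.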
bias
/ˈbaɪəs/ Prejudice in favor of or against one thing, person, or group; in machine learning, systematic skew in a model's outputs, often inherited from its training data
“We must ensure the AI model is free from bias.”
Origin: From French `biais` (slant, oblique), possibly from Greek `epikarsios` (athwart)
algorithm
/ˈælɡərɪðəm/ A process or set of rules to be followed in calculations or other problem-solving operations
“The search algorithm ranks results by relevance.”
Origin: From Medieval Latin `algorismus`, from Arabic `al-Khwarizmi` (the mathematician from Khwarazm)