Neural network: A computing system inspired by biological neural networks
"The neural network learned to recognize faces with remarkable accuracy."
Origin: From Greek `neuron` (nerve, sinew) + Old English `net` + `weorc` (work); the term for brain-inspired computing dates to the 1940s
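At its core, each unit in such a network computes a weighted sum of its inputs and passes it through a nonlinearity. A minimal sketch in plain Python (illustrative weights and inputs, not from any trained model):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias,
    squashed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))  # sigmoid activation

# Example: three inputs with arbitrary illustrative weights
print(neuron([0.5, -1.0, 2.0], [0.4, 0.3, -0.2], bias=0.1))
```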
Deep learning: Machine learning using neural networks with many layers
"Deep learning revolutionized image recognition and natural language processing."
Origin: Modern English compound; `deep` refers to multiple layers in neural networks
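A hedged sketch of what "many layers" means in practice, using random weights purely to check shapes:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    # One dense layer: affine transform followed by ReLU
    return np.maximum(0, x @ w + b)

x = rng.normal(size=(1, 8))            # a single 8-feature input
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 16)), np.zeros(16)
w3, b3 = rng.normal(size=(16, 4)), np.zeros(4)

# "Deep" simply means the output of one layer feeds the next
h = layer(layer(layer(x, w1, b1), w2, b2), w3, b3)
print(h.shape)  # (1, 4)
```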
Supervised learning: Training a model on labeled data with known outputs
"Supervised learning requires a dataset with correct answers for each example."
Origin: From Latin `super` (over, above) + `videre` (to see), meaning oversight or guidance
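A minimal supervised example, assuming a toy dataset of (input, label) pairs: fitting a line by least squares so that predictions match the known outputs.

```python
import numpy as np

# Labeled data: inputs x with known outputs y (the "correct answers")
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])  # roughly y = 2x + 1

# Fit a line so predictions match the labels as closely as possible
slope, intercept = np.polyfit(x, y, deg=1)
print(slope, intercept)          # close to 2 and 1
print(slope * 5.0 + intercept)   # predict the label for a new input
```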
Unsupervised learning: Finding patterns in data without labeled examples
"Unsupervised learning discovered customer segments we hadn't considered."
Origin: From Old English `un-` (not) + Latin `super` (over) + `videre` (to see), meaning without guidance
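For contrast, a sketch of unsupervised clustering: a bare-bones k-means on synthetic, unlabeled points, where the two blobs stand in for "customer segments".

```python
import numpy as np

rng = np.random.default_rng(1)
# Unlabeled data: two blobs, but no labels are given to the algorithm
data = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])

# Minimal k-means: alternate assigning points to the nearest centroid
# and moving each centroid to the mean of its assigned points
# (naive init from two random points; real k-means initializes more carefully)
centroids = data[rng.choice(len(data), 2, replace=False)]
for _ in range(10):
    dists = np.linalg.norm(data[:, None] - centroids[None], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([data[labels == k].mean(axis=0) for k in range(2)])

print(centroids)  # near (0, 0) and (4, 4): segments found without labels
```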
Reinforcement learning: Learning through trial and error with rewards and penalties
"Reinforcement learning enabled the AI to master complex games."
Origin: From Latin `re-` (again) + `in-` (in) + `fortis` (strong), meaning to strengthen again
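One classic instance is the tabular Q-learning update, sketched below with a hypothetical two-state, two-action toy problem and arbitrary reward values:

```python
import numpy as np

n_states, n_actions = 2, 2
Q = np.zeros((n_states, n_actions))   # value estimates, learned by trial and error
alpha, gamma = 0.1, 0.9               # learning rate and discount factor

def q_update(state, action, reward, next_state):
    # Move Q toward the reward plus the best value achievable afterwards
    target = reward + gamma * Q[next_state].max()
    Q[state, action] += alpha * (target - Q[state, action])

q_update(state=0, action=1, reward=1.0, next_state=1)  # a rewarded action
print(Q)
```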
Transformer: A neural network architecture using self-attention mechanisms
"Transformer models like GPT have transformed natural language processing."
Origin: From Latin `trans-` (across) + `formare` (to form); the model transforms input sequences into output sequences
Embedding: A dense vector representation of data in continuous space
"Word embeddings capture semantic relationships between terms."
Origin: Modern English formation from `em-` (in, from Latin `in-`) + `bed`; to fix firmly in a surrounding mass
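A sketch with made-up 4-dimensional vectors (real embeddings are learned and have hundreds of dimensions), showing how cosine similarity exposes semantic relatedness:

```python
import numpy as np

# Toy embeddings; the values are invented for illustration
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.7, 0.2, 0.1]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine(a, b):
    # 1.0 means identical direction; near 0 means unrelated
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(emb["king"], emb["queen"]))  # high: related terms
print(cosine(emb["king"], emb["apple"]))  # low: unrelated terms
```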
Fine-tuning: Adapting a pre-trained model for a specific task
"Fine-tuning the base model on our data improved accuracy significantly."
Origin: Modern English compound; `fine` (Old French `fin`, refined) + `tune` (a variant of `tone`, from Latin `tonus`, Greek `tonos`)
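One common recipe, sketched in PyTorch with a stand-in backbone rather than a real pre-trained checkpoint: freeze the pre-trained layers and train only a new task-specific head. This is a conservative variant; full fine-tuning would instead update all weights at a small learning rate.

```python
import torch.nn as nn

# A stand-in "pre-trained" network; in practice this would be loaded
# from a checkpoint rather than freshly constructed
backbone = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 64))
head = nn.Linear(64, 2)  # new task-specific layer for our labels

# Freeze the pre-trained weights so only the new head is trained
for p in backbone.parameters():
    p.requires_grad = False

model = nn.Sequential(backbone, head)
trainable = [p for p in model.parameters() if p.requires_grad]
print(sum(p.numel() for p in trainable))  # only the head's parameters
```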
Inference: Using a trained model to make predictions on new data
"Inference latency must be low for real-time applications."
Origin: From Latin `inferre` (to bring in, conclude), from `in-` (in) + `ferre` (to carry)
Hallucination: When an AI generates false or fabricated information
"The model's hallucination produced a convincing but entirely fictional citation."
Origin: From Latin `hallucinari` (to wander in mind, dream); metaphorical use for AI generating false information
Prompt engineering: Crafting inputs to elicit desired outputs from AI models
"Effective prompt engineering dramatically improved the response quality."
Origin: From Latin `promptus` (brought forth) + Old French `engin` (skill), from Latin `ingenium` (cleverness)
Tokenization: Breaking text into smaller units for processing
"Tokenization splits sentences into words or subword pieces."
Origin: From Old English `tacn` (sign, symbol); the suffix `-ization` comes from Greek `-izein` via Latin and French
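A minimal illustration of the simplest scheme, word-level splitting; the subword example in the comment is an illustrative split, not the output of any particular tokenizer.

```python
import re

text = "Tokenization splits sentences into pieces."

# Simplest scheme: split on word boundaries, keeping punctuation
words = re.findall(r"\w+|[^\w\s]", text)
print(words)  # ['Tokenization', 'splits', 'sentences', 'into', 'pieces', '.']

# Subword schemes go further, breaking rare words into frequent fragments,
# e.g. "Tokenization" -> ["Token", "ization"] (illustrative split)
```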
Attention mechanism: A technique allowing models to focus on relevant parts of input
"The attention mechanism helps the model understand context across long sequences."
Origin: From Latin `attendere` (to stretch toward, give heed), from `ad-` (to) + `tendere` (to stretch)
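The standard scaled dot-product form can be sketched in a few lines of NumPy (random matrices, with shapes chosen only for illustration):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    # Each query scores every key; the scores weight a sum of the values,
    # so the output "focuses" on the most relevant positions
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    return softmax(scores) @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(5, 8)) for _ in range(3))  # 5 positions, dim 8
print(attention(Q, K, V).shape)  # (5, 8)
```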
Gradient descent: An optimization algorithm that minimizes error iteratively
"Gradient descent adjusts weights to reduce the loss function."
Origin: From Latin `gradus` (step) + `descendere` (to go down), from `de-` (down) + `scandere` (to climb)
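A minimal sketch on a one-dimensional toy loss, f(x) = (x - 3)^2, whose minimum the loop should approach:

```python
# Minimal gradient descent on f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x = 0.0
learning_rate = 0.1

for step in range(50):
    gradient = 2 * (x - 3)         # slope of the loss at the current point
    x -= learning_rate * gradient  # step "downhill" against the gradient

print(x)  # converges toward 3, the minimum of the loss
```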
Overfitting: When a model learns noise instead of the underlying pattern
"Overfitting caused the model to perform poorly on new data."
Origin: Modern English compound; `over` (excessive) + `fit` (to be suitable), meaning fitting too closely
Underfitting: When a model is too simple to capture the underlying pattern
"Underfitting resulted in poor performance on both training and test data."
Origin: Modern English compound; `under` (insufficient) + `fit` (to be suitable), meaning not fitting closely enough
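Both this entry and the previous one can be seen in a single sketch: fitting polynomials of increasing degree to noisy samples of a sine curve (synthetic data, illustrative degrees).

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 30)  # pattern + noise
x_train, y_train, x_test, y_test = x[::2], y[::2], x[1::2], y[1::2]

for degree in (1, 4, 12):  # too simple, about right, too flexible
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_err, 3), round(test_err, 3))
# Degree 1 underfits (both errors high); degree 12 overfits
# (training error tiny, test error inflated by fitted noise).
```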
Hyperparameter: A parameter set before training begins, not learned from data
"Tuning hyperparameters like learning rate improved model performance."
Origin: From Greek `hyper` (over, beyond) + `para` (beside) + `metron` (measure)
Epoch: One complete pass through the entire training dataset
"The model converged after fifty epochs of training."
Origin: From Greek `epokhe` (pause, fixed point in time), from `epi-` (upon) + `ekhein` (to hold)
Batch size: The number of samples processed before updating the model
"Increasing batch size improved training stability but required more memory."
Origin: From Old English `bæcce` (something baked) + `size` from Old French `sise` (portion)
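The three preceding entries (hyperparameter, epoch, batch size) all meet in an ordinary training loop; a skeletal sketch with no actual model and illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 8))      # 1,000 training samples

batch_size = 32   # hyperparameters: chosen before training,
epochs = 3        # not learned from the data

for epoch in range(epochs):            # one epoch = one full pass over data
    rng.shuffle(data)
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        # ... compute loss on this batch and update the model here ...
    print(f"epoch {epoch + 1}: processed {len(data)} samples")
```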
Loss function: A measure of how wrong the model's predictions are
"The loss function quantifies the difference between predictions and actual values."
Origin: From Old English `los` (destruction, ruin) + Latin `functio` (performance), from `fungi` (to perform)
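A concrete instance is mean squared error, a common loss for regression (toy numbers):

```python
import numpy as np

predictions = np.array([2.5, 0.0, 2.1])
targets = np.array([3.0, -0.5, 2.0])

# Mean squared error: average squared gap between predictions and targets
loss = np.mean((predictions - targets) ** 2)
print(loss)  # 0.17 -- lower is better; 0 would mean perfect predictions
```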
Bias: Prejudice in favor of or against one thing, person, or group
"We must ensure the AI model is free from bias."
Origin: From French `biais` (slant, oblique), possibly from Greek `epikarsios` (athwart)
Algorithm: A process or set of rules to be followed in calculations
"The search algorithm ranks results by relevance."
Origin: From Medieval Latin `algorismus`, from the name of the mathematician `al-Khwarizmi` (the one from Khwarazm)
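A textbook example is binary search, whose fixed rules locate a value in a sorted list:

```python
def binary_search(items, target):
    """Classic algorithm: a fixed set of rules that halves the
    search range until the target is found (or shown absent)."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```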