What does it mean when an AI model “trains”?
It adjusts weights by learning from labeled data to make better predictions.
Testing data is data the model has ____.
Never seen before.
In one sentence, what is overfitting?
The model memorizes training details and fails on new data.
What does CPU stand for?
Central Processing Unit.
What does AI stand for?
Artificial Intelligence.
What is one full pass over the entire training data called?
An epoch.
Your model shows 99% training accuracy but 55% testing accuracy. What’s happening?
Overfitting.
True or false: Adding more varied training data can help reduce overfitting.
True.
What does GPU stand for?
Graphics Processing Unit.
What is a dataset?
A collection of labeled examples the model uses to learn or test.
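A minimal sketch of what an epoch looks like in code, assuming a toy one-weight model fit by gradient descent (the data, learning rate, and epoch count here are all illustrative, not from the course):

```python
# Fit y = w * x by gradient descent; one epoch = one full pass over the data.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs; the true w is 2
w, lr = 0.0, 0.05

for epoch in range(20):              # 20 epochs = 20 passes over the dataset
    for x, y in data:                # each example is seen once per epoch
        grad = 2 * (w * x - y) * x   # derivative of the squared error w.r.t. w
        w -= lr * grad

print(round(w, 2))  # -> 2.0
```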
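One way to see this gap concretely: a model that simply memorizes its training set (here a 1-nearest-neighbour classifier on noisy labels — a stand-in example, not from the course material) scores perfectly on the data it saw but much worse on fresh data:

```python
import random

random.seed(0)

# Toy 1-D dataset: the class is 1 when x > 0.5, but 30% of labels are flipped (noise).
def make_data(n):
    points = []
    for _ in range(n):
        x = random.random()
        y = 1 if x > 0.5 else 0
        if random.random() < 0.3:
            y = 1 - y                  # label noise
        points.append((x, y))
    return points

train, test = make_data(50), make_data(50)

# A 1-nearest-neighbour "model" memorizes the training set outright.
def predict(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

train_acc = sum(predict(x) == y for x, y in train) / len(train)
test_acc = sum(predict(x) == y for x, y in test) / len(test)
print(train_acc, test_acc)  # training accuracy is a perfect 1.0; test accuracy is much lower
```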
Why don’t we use the same data for training and testing?
The model could just memorize the training set instead of learning patterns.
What table shows where the model confuses one class for another?
Confusion Matrix
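A sketch of what such a table counts, built with plain Python dicts (the labels and predictions are made up for illustration):

```python
# A confusion matrix counts (true class, predicted class) pairs.
y_true = ["cat", "cat", "dog", "dog", "dog"]
y_pred = ["cat", "dog", "dog", "dog", "cat"]

labels = ["cat", "dog"]
matrix = {t: {p: 0 for p in labels} for t in labels}
for t, p in zip(y_true, y_pred):
    matrix[t][p] += 1

# Rows = true class, columns = predicted class.
print(matrix["cat"])  # -> {'cat': 1, 'dog': 1}  (one cat mistaken for a dog)
print(matrix["dog"])  # -> {'cat': 1, 'dog': 2}  (one dog mistaken for a cat)
```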
Give one sign your model might be overfitting (besides a drop in test accuracy).
A big gap between training and testing results, or the model struggling with new examples.
Why are GPUs better than CPUs for training AI models?
They can do thousands of math operations at the same time, making training much faster.
What does “generalization” mean in machine learning?
Doing well on new data the model hasn’t seen before.
What happens if your learning rate is too high?
The model takes big jumps through weight space; training may move faster at first, but it risks overshooting and skipping past the best solution, or never settling at all.
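A toy illustration of this: minimizing f(w) = w² by gradient descent with two learning rates (the rates and step count are arbitrary choices for the sketch):

```python
# Gradient descent on f(w) = w**2, whose minimum is at w = 0.
def descend(lr, steps=20):
    w = 1.0
    for _ in range(steps):
        w -= lr * 2 * w   # the gradient of w**2 is 2w
    return w

print(descend(0.1))   # small steps: w shrinks toward the minimum at 0
print(descend(1.1))   # too large: every step overshoots and |w| blows up
```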
Why split data before training instead of after?
To keep the test set completely unseen and unbiased.
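A minimal sketch of splitting before any training happens, in plain Python (the 80/20 ratio is just a common convention):

```python
import random

# Shuffle once, then split BEFORE any training touches the data.
random.seed(0)
data = list(range(100))           # stand-in for 100 labeled examples
random.shuffle(data)

split = int(0.8 * len(data))      # 80% train / 20% test
train, test = data[:split], data[split:]

# The test set stays untouched until the final evaluation.
print(len(train), len(test))      # -> 80 20
assert not set(train) & set(test)  # no example appears in both sets
```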
Why can using a very complicated model on a small dataset cause overfitting?
A very complicated model has enough capacity to memorize every training example instead of learning general rules, so its weights end up fitting the noise in the small dataset.
What is Google Colab’s main benefit for AI work?
Free access to powerful GPUs without needing expensive hardware.
Name one reason training on your own computer might be slower than on Colab.
Most personal computers don’t have GPUs or enough memory.
Which graph shows the model’s error decreasing while it trains?
A loss curve.
On a loss curve, how can you tell the model is starting to overfit?
The training loss keeps going down, but the test loss gets worse instead of better.
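With made-up loss numbers, the tell-tale pattern can be spotted programmatically: training loss keeps falling while test loss turns upward (the values below are synthetic, purely to illustrate the shape of the curves):

```python
# Synthetic loss curves, one value per epoch.
train_loss = [1.0, 0.7, 0.5, 0.35, 0.25, 0.18, 0.12, 0.08]
test_loss  = [1.1, 0.8, 0.6, 0.50, 0.45, 0.47, 0.55, 0.65]

# Find the first epoch where test loss rises while training loss still falls.
for epoch in range(1, len(test_loss)):
    if test_loss[epoch] > test_loss[epoch - 1] and train_loss[epoch] < train_loss[epoch - 1]:
        print("overfitting starts around epoch", epoch)  # -> epoch 5
        break
```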
Which Python library is commonly used for training YOLO models?
Ultralytics.
What does “epoch” mean?
One full pass of the model over the entire training dataset.