Model Training
Training vs. Testing
Overfitting
Hardware and Tools
Miscellaneous
100

What does it mean when an AI model “trains”?


It adjusts weights by learning from labeled data to make better predictions.


100

Testing data is data the model has ____.

Never seen before.

100

In one sentence, what is overfitting?

The model memorizes training details and fails on new data.

100

CPU stands for what?

Central Processing Unit.

100

What does AI stand for?

Artificial Intelligence

200

What is one full pass over the entire training data called?

An epoch
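
For illustration, a minimal sketch of what epochs look like in a training loop (PyTorch; the tiny model and the random data are made up for this example):

```python
import torch
from torch import nn

model = nn.Linear(4, 2)                      # tiny stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(32, 4)                       # fake training features
y = torch.randint(0, 2, (32,))               # fake labels

for epoch in range(5):                       # each loop body = one epoch:
    optimizer.zero_grad()                    # a full pass over all of X
    loss = loss_fn(model(X), y)
    loss.backward()                          # adjust weights from the loss
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```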

200

Your model shows 99% training accuracy but 55% testing accuracy. What’s happening?

Overfitting
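
A quick sketch of how that gap shows up in code (scikit-learn; an unconstrained decision tree is used here because it overfits easily):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("train accuracy:", model.score(X_train, y_train))  # typically ~1.00
print("test accuracy: ", model.score(X_test, y_test))    # noticeably lower
```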

200

True or false: Adding more varied training data can help reduce overfitting.

True
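
One common way to add variety without collecting new data is augmentation; a hedged sketch using torchvision (the specific flip and rotation settings are arbitrary choices):

```python
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),       # mirrored copies
    transforms.RandomRotation(degrees=10),   # slightly rotated copies
    transforms.ToTensor(),
])
# Applying `augment` to each image at load time effectively shows the
# model many varied versions of every training example.
```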

200

GPU stands for?

Graphics Processing Unit

200

What is a dataset?

A collection of labeled examples the model uses to learn or test

300

Why don’t we use the same data for training and testing?

The model could just memorize the training set instead of learning patterns.
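
In practice the split is usually one line; a sketch with scikit-learn's train_test_split (the 80/20 split is just a common convention):

```python
from sklearn.model_selection import train_test_split

data = list(range(100))          # stand-in for 100 labeled examples
labels = [x % 2 for x in data]   # stand-in labels

X_train, X_test, y_train, y_test = train_test_split(
    data, labels, test_size=0.2, random_state=42)
print(len(X_train), "training examples,", len(X_test), "held out for testing")
```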

300

What table shows where the model confuses one class for another?

Confusion Matrix
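
A minimal sketch of building one with scikit-learn (the animal labels are made up for illustration):

```python
from sklearn.metrics import confusion_matrix

y_true = ["cat", "cat", "dog", "dog", "bird", "bird"]
y_pred = ["cat", "dog", "dog", "dog", "bird", "cat"]

# Rows = true class, columns = predicted class; off-diagonal counts
# show which classes the model confuses.
print(confusion_matrix(y_true, y_pred, labels=["bird", "cat", "dog"]))
```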

300

Give one sign your model might be overfitting (besides a drop in test accuracy).

Big gap between training and testing results / model struggles with new examples

300

Why are GPUs better than CPUs for training AI models?

They can do thousands of math operations at the same time, making training faster
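
A sketch that makes the difference visible by timing one large matrix multiply (PyTorch; exact numbers depend on the machine, and the GPU part assumes CUDA is available):

```python
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

start = time.time()
a @ b                                     # one big batch of parallel math
print(f"CPU: {time.time() - start:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()              # wait for transfers to finish
    start = time.time()
    a_gpu @ b_gpu
    torch.cuda.synchronize()              # wait for the multiply itself
    print(f"GPU: {time.time() - start:.3f}s")
```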

300

What does “generalization” mean in machine learning?

Doing well on new data the model hasn’t seen before

400

What happens if your learning rate is too high?

The weights take big jumps: training moves faster, but it risks overshooting and skipping past the best solution.
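
A tiny sketch of the effect using gradient descent on f(x) = x², whose minimum is at x = 0 (the step counts and rates are arbitrary):

```python
def descend(lr, steps=10, x=1.0):
    for _ in range(steps):
        x -= lr * 2 * x        # gradient of x**2 is 2x
    return x

print(descend(lr=0.1))   # small steps: creeps toward 0
print(descend(lr=1.1))   # each jump lands farther away: diverges
```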

400

Why split data before training instead of after?

To keep the test set completely unseen and unbiased.

400

Why can using a very complicated model on a small dataset cause overfitting?

A complicated model has enough capacity to memorize every training example, adjusting its weights to fit them exactly instead of learning general rules.

400

What is Google Colab’s main benefit for AI work?

Free access to powerful GPUs without needing expensive hardware
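
A quick sketch for checking what the runtime gives you (PyTorch; assumes a Colab notebook, though it runs anywhere):

```python
import torch

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No GPU; in Colab choose Runtime > Change runtime type > GPU.")
```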

400

Why do we split our data before we start training instead of after?

To keep the test set completely unseen and avoid bias.

500

Name one reason training on your own computer might be slower than on Colab.

Most personal computers don’t have GPUs or enough memory


500

Which graph shows the model’s error decreasing while it trains?

A loss curve.

500

On a loss curve, how can you tell the model is starting to overfit?

The training loss keeps going down, but the test loss gets worse instead of better.
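
A sketch of that shape with matplotlib (the loss values below are invented to show the typical pattern, not real measurements):

```python
import matplotlib.pyplot as plt

epochs = range(1, 11)
train_loss = [0.9, 0.6, 0.45, 0.35, 0.28, 0.23, 0.19, 0.16, 0.13, 0.11]
test_loss  = [0.95, 0.7, 0.55, 0.48, 0.45, 0.46, 0.49, 0.54, 0.60, 0.67]

plt.plot(epochs, train_loss, label="training loss")   # keeps falling
plt.plot(epochs, test_loss, label="test loss")        # turns back up ~epoch 5
plt.xlabel("epoch"); plt.ylabel("loss"); plt.legend(); plt.show()
```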

500

Which Python library is commonly used for training YOLO models?

Ultralytics
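
A minimal training sketch with the Ultralytics package (assumes `pip install ultralytics`; coco8.yaml is the tiny demo dataset bundled with the library):

```python
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                 # start from pretrained weights
model.train(data="coco8.yaml", epochs=3)   # a few epochs on the demo data
results = model("https://ultralytics.com/images/bus.jpg")  # quick prediction
```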

500

What does “epoch” mean?

One full pass of the model over the entire training dataset.
