Neural Navigators
AI Visionaries
Cognitive Coders
Machine Minds
Algorithmic Avengers
100

___________ is the process of understanding the reliability of an AI model, based on the outputs produced by feeding a test dataset into the model and comparing them with the actual answers.

Evaluation

100

If a model simply remembers the whole training set, it will always predict the correct label for any point in the training set. This is known as ____________.

Overfitting

100

The result of the comparison between the prediction and the reality can be recorded in what we call the __________.

Confusion Matrix

100

_________ is defined as the percentage of correct predictions out of all the observations.

Accuracy

100

_______ is defined as the percentage of true positive cases versus all the cases where the prediction is true.

Precision

200

___________ can be defined as the fraction of positive cases that are correctly identified.

Recall

200

__________ and __________ are the two parameters considered for the Evaluation of a model.

Prediction,  Reality

200

___________ can be defined as the measure of balance between precision and recall.

F1 Score

200

The case in which the actual value was negative but the model predicted a positive value is called _____________.

False Positive

200

Priya was confused by the terms used in the evaluation stage. Suggest to her the term used for the percentage of correct predictions out of all the observations.

Accuracy

300

What will be the outcome, if the Prediction is “Yes” and it matches with the Reality? What will be the outcome, if the Prediction is “Yes” and it does not match the Reality?

True Positive, False Positive

300

The Recall evaluation method is

a) defined as the fraction of positive cases that are correctly identified.
b) defined as the percentage of true positive cases versus all the cases where the prediction is true.
c) defined as the percentage of correct predictions out of all the observations.
d) comparison between the prediction and reality

a) defined as the fraction of positive cases that are correctly identified.

300

Give one example of an application which uses augmented reality.

Self Driving Cars

300

In ___________________, the machine is trained with huge amounts of data, which helps it to train itself around the data.

Deep Learning

300

Give an example where High Accuracy is not usable

SCENARIO: An expensive robotic chicken crosses a very busy road a thousand times per day. An ML model evaluates traffic patterns and predicts when this chicken can safely cross the street with an accuracy of 99.99%. Even at that accuracy, the model is wrong about once in every 10,000 crossings, so at a thousand crossings per day the expensive chicken would still be hit roughly once every ten days. This is why high accuracy alone is not usable here.

400

The ____________ dataset is used to evaluate the model before complete deployment.

Testing

400

We can’t make “good” decisions without information. (True/False)

True

400

What is F1 Score in Evaluation?

F1 score can be defined as the measure of balance between precision and recall.

F1 Score = 2 * (Precision * Recall) / (Precision + Recall)
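As a quick sanity check, a minimal Python sketch of the same formula (the function name here is just illustrative):

# F1 Score as the harmonic mean of Precision and Recall
def f1_score(precision, recall):
    return 2 * (precision * recall) / (precision + recall)

# When precision and recall are equal, the F1 Score equals both.
print(f1_score(0.9375, 0.9375))  # 0.9375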


400

Name the two evaluation metrics that are used to calculate the F1 Score.

Precision and Recall

400

Raunak was learning the conditions that make up the confusion matrix. He came across a scenario in which a machine that was supposed to predict "an animal" always predicted "not an animal". What is this condition called?

False Negative

500

Calculate Accuracy, Precision, Recall and F1 Score for the following Confusion Matrix:

                  Prediction: 1                   Prediction: 0

Reality: 1        50                                 20

Reality: 0        10                                 20

Accuracy: 0.7 

Precision: 0.833

Recall: 0.714

F1 Score: 0.769
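As a worked check of these values, a short Python sketch (the function and variable names are illustrative) that computes all four metrics directly from the cells of the matrix above:

# Cells of the confusion matrix above:
# TP = 50 (Reality 1, Prediction 1), FN = 20 (Reality 1, Prediction 0),
# FP = 10 (Reality 0, Prediction 1), TN = 20 (Reality 0, Prediction 0)
def evaluate(tp, fn, fp, tn):
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * (precision * recall) / (precision + recall)
    return {"Accuracy": round(accuracy, 3), "Precision": round(precision, 3),
            "Recall": round(recall, 3), "F1 Score": round(f1, 3)}

print(evaluate(tp=50, fn=20, fp=10, tn=20))
# {'Accuracy': 0.7, 'Precision': 0.833, 'Recall': 0.714, 'F1 Score': 0.769}

The same function can be reused for the other confusion-matrix questions in this round by substituting their TP, FN, FP, and TN values.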

500

Calculate Accuracy, Precision, Recall and F1 Score for the following Confusion Matrix:

                  Prediction: 1                   Prediction: 0

Reality: 1        75                                5

Reality: 0        5                                 15

Accuracy: 0.9

Precision: 0.9375 

Recall: 0.9375 

F1 Score: 0.9375 

500

Calculate Accuracy, Precision, Recall and F1 Score for the following Confusion Matrix:

                  Prediction: 1                   Prediction: 0

Reality: 1        10                                55

Reality: 0        10                                25

Accuracy: 0.35

Precision: 0.5

Recall: 0.154

F1 Score: 0.235

500

The output given by the AI machine is known as ________

Prediction

500

Calculate  Precision, Recall and F1 Score for the following Confusion Matrix:

                  Prediction: 1                   Prediction: 0

Reality: 1        60                                25

Reality: 0        05                                10

Precision: 0.923

Recall: 0.706

F1 Score: 0.8
