___________ is the process of understanding the reliability of an AI model by feeding a test dataset into the model and comparing its outputs with the actual answers.
Evaluation
If a model simply memorises the whole training set, it will always predict the correct label for any point in the training set, yet fail to generalise to new data. This is known as ____________.
Overfitting
The result of comparison between the prediction and reality can be recorded in what we call the __________.
Confusion Matrix
_________ is defined as the percentage of correct predictions out of all the observations.
Accuracy
_______ is defined as the percentage of true positive cases versus all the cases where the prediction is true.
Precision
___________ can be defined as the fraction of positive cases that are correctly identified.
Recall
..............and ................ are the two parameters considered for Evaluation of a model.
Prediction, Reality
___________ can be defined as the measure of balance between precision and recall.
F1 Score
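The four metrics defined above can be sketched as small Python functions; the function names and the TP/TN/FP/FN argument convention below are my own, not part of the original material.

```python
def accuracy(tp, tn, fp, fn):
    # correct predictions out of all observations
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    # true positives out of all cases predicted positive
    return tp / (tp + fp)

def recall(tp, fn):
    # true positives out of all cases that are actually positive
    return tp / (tp + fn)

def f1_score(p, r):
    # balance (harmonic mean) of precision and recall
    return 2 * p * r / (p + r)
```

For example, with TP = 50, TN = 20, FP = 10, FN = 20, `accuracy(50, 20, 10, 20)` gives 0.7.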
The case where the actual value was negative but the model predicted a positive value is called .....................
False Positive
Priya was confused by the terms used in the evaluation stage. Suggest to her the term used for the percentage of correct predictions out of all the observations.
Accuracy
What will be the outcome, if the Prediction is “Yes” and it matches with the Reality? What will be the outcome, if the Prediction is “Yes” and it does not match the Reality?
True Positive, False Positive
Recall-Evaluation method is
a) defined as the fraction of positive cases that are correctly identified.
b) defined as the percentage of true positive cases versus all the cases where the prediction is true.
c) defined as the percentage of correct predictions out of all the observations.
d) comparison between the prediction and reality
a) defined as the fraction of positive cases that are correctly identified.
Give one example of an application which uses augmented reality.
Pokémon GO (it overlays virtual creatures onto the player's real surroundings)
In ___________________, the machine is trained on huge amounts of data, which helps it learn patterns from the data on its own.
Deep Learning
Give an example where high accuracy is not usable.
SCENARIO: An expensive robotic chicken crosses a very busy road a thousand times per day. An ML model evaluates traffic patterns and predicts when this chicken can safely cross the street with an accuracy of 99.99%. Even so, a 0.01% error rate over 1000 crossings per day means about one wrong prediction every ten days, and a single wrong prediction can destroy the expensive chicken — so high accuracy alone is not usable here.
................ data set is used to evaluate the model before complete deployment.
Testing
We can’t make “good” decisions without information. (True/False)
True
What is F1 Score in Evaluation?
F1 score can be defined as the measure of balance between precision and recall.
F1 Score = 2 × (Precision × Recall) / (Precision + Recall)
Name the two evaluation methods used to calculate the F1 Score.
Precision and Recall
Raunak was learning the conditions that make up the confusion matrix. He came across a scenario in which the machine that was supposed to predict an animal was always predicting not an animal. What is this condition called?
False Negative
Calculate Accuracy, Precision, Recall and F1 Score for the following Confusion Matrix:
             Prediction: 1   Prediction: 0
Reality: 1        50              20
Reality: 0        10              20
Here TP = 50, FN = 20, FP = 10, TN = 20.
Accuracy: (50 + 20) / 100 = 0.7
Precision: 50 / (50 + 10) ≈ 0.833
Recall: 50 / (50 + 20) ≈ 0.714
F1 Score: 2 × (0.833 × 0.714) / (0.833 + 0.714) ≈ 0.77
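This worked example can be cross-checked with plain Python arithmetic, taking rows as Reality and columns as Prediction (so TP = 50, FN = 20, FP = 10, TN = 20):

```python
tp, fn, fp, tn = 50, 20, 10, 20

accuracy = (tp + tn) / (tp + fn + fp + tn)    # 70/100
precision = tp / (tp + fp)                    # 50/60
recall = tp / (tp + fn)                       # 50/70
f1 = 2 * precision * recall / (precision + recall)

print(round(accuracy, 2), round(precision, 2), round(recall, 2), round(f1, 2))
# prints: 0.7 0.83 0.71 0.77
```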
Calculate Accuracy, Precision, Recall and F1 Score for the following Confusion Matrix:
             Prediction: 1   Prediction: 0
Reality: 1        75               5
Reality: 0         5              15
Accuracy: 0.9
Precision: 0.9375
Recall: 0.9375
F1 Score: 0.9375
Calculate Accuracy, Precision, Recall and F1 Score for the following Confusion Matrix:
             Prediction: 1   Prediction: 0
Reality: 1        10              55
Reality: 0        10              25
Here TP = 10, FN = 55, FP = 10, TN = 25.
Accuracy: (10 + 25) / 100 = 0.35
Precision: 10 / (10 + 10) = 0.5
Recall: 10 / (10 + 55) ≈ 0.154
F1 Score: 2 × (0.5 × 0.154) / (0.5 + 0.154) ≈ 0.24
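The same cross-check in plain Python for this matrix, again taking rows as Reality and columns as Prediction (so TP = 10, FN = 55, FP = 10, TN = 25):

```python
tp, fn, fp, tn = 10, 55, 10, 25

accuracy = (tp + tn) / (tp + fn + fp + tn)    # 35/100
precision = tp / (tp + fp)                    # 10/20
recall = tp / (tp + fn)                       # 10/65
f1 = 2 * precision * recall / (precision + recall)

print(round(accuracy, 2), round(precision, 2), round(recall, 2), round(f1, 2))
# prints: 0.35 0.5 0.15 0.24
```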
The output given by the AI machine is known as ________
Prediction
Calculate Precision, Recall and F1 Score for the following Confusion Matrix:
             Prediction: 1   Prediction: 0
Reality: 1        60              25
Reality: 0         5              10
Here TP = 60, FN = 25, FP = 5, TN = 10.
Precision: 60 / (60 + 5) ≈ 0.923
Recall: 60 / (60 + 25) ≈ 0.706
F1 Score: 2 × (0.923 × 0.706) / (0.923 + 0.706) = 0.80
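As before, the three requested metrics can be cross-checked in plain Python, taking rows as Reality and columns as Prediction (so TP = 60, FN = 25, FP = 5, TN = 10):

```python
tp, fn, fp, tn = 60, 25, 5, 10

precision = tp / (tp + fp)                    # 60/65
recall = tp / (tp + fn)                       # 60/85
f1 = 2 * precision * recall / (precision + recall)

print(round(precision, 2), round(recall, 2), round(f1, 2))
# prints: 0.92 0.71 0.8
```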