Neural Network Basics
Neurons & Layers
Activation Functions
PyTorch Functions
Training Concepts
100

What is a neural network?

A computational model, loosely inspired by the brain, built from layers of connected neurons that learn patterns from data.

100

What is a neuron?

The smallest unit of a neural network.

100

What is an activation function?

It decides whether a neuron should fire.

100

What is nn.ReLU()?

A PyTorch module that applies the ReLU activation function.
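
A minimal sketch of nn.ReLU in action (the input values are just illustrative):

    import torch
    import torch.nn as nn

    relu = nn.ReLU()
    x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
    print(relu(x))  # tensor([0.0000, 0.0000, 0.0000, 1.5000]) -- negatives become zero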

100

What is forward propagation?

Moves data from the input to the output layer.

200

What are input layer, hidden layers, and output layer?

The three main parts of a neural network.
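
As an illustration, a tiny PyTorch model with all three parts; the layer sizes here are arbitrary assumptions:

    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(4, 8),  # input layer -> hidden layer (4 features in, 8 neurons)
        nn.ReLU(),        # activation between layers
        nn.Linear(8, 1),  # hidden layer -> output layer (1 prediction out)
    )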

200

What is a hidden layer?

It processes and learns important features.

200

What is the sigmoid function?

An activation function that outputs values between 0 and 1.
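
A quick sketch of the 0-to-1 range (printed values rounded):

    import torch

    x = torch.tensor([-4.0, 0.0, 4.0])
    print(torch.sigmoid(x))  # tensor([0.0180, 0.5000, 0.9820]) -- always between 0 and 1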

200

What is backward()?

Calculates gradients in PyTorch.
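
A minimal sketch: for y = x², backward() fills in dy/dx = 2x:

    import torch

    x = torch.tensor(3.0, requires_grad=True)
    y = x ** 2     # y = x^2
    y.backward()   # computes dy/dx and stores it in x.grad
    print(x.grad)  # tensor(6.) because dy/dx = 2x = 6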

200

What is backward propagation?

Adjusts weights based on error to improve learning.

300

What is y = f(WX + b)?

The forward propagation formula.
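
A worked example of the formula, taking f to be ReLU and picking small made-up numbers:

    import torch

    W = torch.tensor([[0.5, -1.0]])   # weights: one neuron, two inputs
    X = torch.tensor([[2.0], [1.0]])  # inputs as a column vector
    b = torch.tensor([0.25])          # bias
    y = torch.relu(W @ X + b)         # f chosen as ReLU for this example
    print(y)  # 0.5*2.0 + (-1.0)*1.0 + 0.25 = 0.25, so tensor([[0.2500]])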

300

What is bias?

A learnable value added to the weighted sum before the activation function.

300

What is ReLU?

It converts negative values to zero and keeps positive values the same.

300

What is requires_grad=True?

Enables gradient computation for a tensor.
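
A small sketch contrasting a tracked and an untracked tensor:

    import torch

    a = torch.tensor(2.0)                      # not tracked: no gradients
    b = torch.tensor(2.0, requires_grad=True)  # tracked: autograd records operations on b
    print(a.requires_grad, b.requires_grad)    # False True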

300

What is a loss function?

Measures how wrong the predictions are.
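
A minimal sketch using mean squared error (nn.MSELoss) as one example of a loss function; the predictions and targets are made up:

    import torch
    import torch.nn as nn

    loss_fn = nn.MSELoss()
    pred = torch.tensor([2.5, 0.0])     # model predictions (made up)
    target = torch.tensor([3.0, -0.5])  # true values (made up)
    print(loss_fn(pred, target))  # tensor(0.2500): mean of (0.5^2 + 0.5^2)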

400

What does a neuron do (weights + bias + activation)?

Applies weights, adds bias, and passes through an activation function.
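
A single neuron sketched with nn.Linear (which holds the weights and bias) plus a sigmoid activation; the sizes are assumptions:

    import torch
    import torch.nn as nn

    neuron = nn.Linear(3, 1)          # 3 weights + 1 bias, randomly initialized
    x = torch.tensor([1.0, 2.0, 3.0])
    out = torch.sigmoid(neuron(x))    # weighted sum + bias, then activation
    print(out)  # a value in (0, 1); the exact number depends on the random init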

400

What is a weight?

A learnable parameter that scales an input, representing its importance.

400

What is the purpose of activation functions?

They determine the neuron’s output and introduce non-linearity.

400

What is optimizer.step()?

Updates the weights during training.
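
The usual ordering around optimizer.step(), sketched with SGD; the model and learning rate are placeholder assumptions:

    import torch
    import torch.nn as nn

    model = nn.Linear(2, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # lr is a placeholder

    loss = model(torch.randn(2)).sum()  # stand-in loss for the sketch
    optimizer.zero_grad()  # clear gradients left over from the last step
    loss.backward()        # compute fresh gradients
    optimizer.step()       # update the weights using those gradients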

400

What does SGD stand for?

Stochastic Gradient Descent.

500

What are the three main parts of a neural network?

Input layer, hidden layers, and output layer.

500

What is the smallest unit of a neural network?

A neuron.

500

What activation function outputs values between 0 and 1?

Sigmoid.

500

What function updates the weights?

optimizer.step()

500

How does a neural network improve its learning?

By adjusting weights using backward propagation.
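
Tying the board together, a minimal end-to-end training sketch; the toy data, layer sizes, and hyperparameters are assumptions:

    import torch
    import torch.nn as nn

    # toy data for y = 2x (made up for the sketch)
    X = torch.tensor([[1.0], [2.0], [3.0]])
    y = torch.tensor([[2.0], [4.0], [6.0]])

    model = nn.Sequential(nn.Linear(1, 4), nn.ReLU(), nn.Linear(4, 1))
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for epoch in range(500):
        pred = model(X)          # forward propagation
        loss = loss_fn(pred, y)  # measure how wrong the predictions are
        optimizer.zero_grad()
        loss.backward()          # backward propagation: compute gradients
        optimizer.step()         # adjust the weights to reduce the loss

    print(loss.item())  # should be much smaller than at the start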