Interactive Quiz
Test your knowledge!
1. What is the main purpose of Batch Normalization in a neural network?
A. Reduce the batch size to speed up training
B. Normalize the pre-activations to obtain a distribution close to a Gaussian at each layer
C. Replace the activation function with a normalized function
D. Increase the variance of the weights to prevent overfitting
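For reference, here is a minimal PyTorch-style sketch of the idea behind this question (shapes and values are illustrative, not taken from the course code): each feature of the pre-activations is standardized across the batch.

```python
import torch

# Illustrative sketch: BatchNorm standardizes each feature of the
# pre-activations across the batch.
x = torch.randn(32, 100) * 5 + 3                    # hypothetical pre-activations: 32 examples, 100 features
mean = x.mean(dim=0, keepdim=True)                  # per-feature mean over the batch
var = x.var(dim=0, keepdim=True, unbiased=False)    # per-feature variance over the batch
x_hat = (x - mean) / torch.sqrt(var + 1e-5)         # roughly zero mean, unit variance per feature
print(x_hat.mean().item(), x_hat.std().item())      # close to 0 and 1
```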
2. Why are the gamma (bngain) and beta (bnbias) parameters added in Batch Normalization?
A. To increase the batch size
B. To let the model recover its expressive capacity despite the zero-mean, unit-variance normalization
C. To replace the weights of the previous layer
D. To compute the mean and variance of the activations
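A short sketch of the learnable scale and shift; the names bngain and bnbias come from the question, while the shapes and values are assumptions for illustration.

```python
import torch

# Illustrative sketch: gamma ("bngain") and beta ("bnbias") rescale and shift
# the normalized pre-activations, so the layer is not locked into a strict
# zero-mean, unit-variance output.
x = torch.randn(32, 100)
x_hat = (x - x.mean(dim=0, keepdim=True)) / torch.sqrt(x.var(dim=0, keepdim=True, unbiased=False) + 1e-5)
bngain = torch.ones(1, 100, requires_grad=True)     # gamma, initialized to 1
bnbias = torch.zeros(1, 100, requires_grad=True)    # beta, initialized to 0
out = bngain * x_hat + bnbias                       # trainable scale and shift of the distribution
```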
3. What is the main problem Batch Normalization faces during the inference (test) phase?
A. A single example arrives at a time, so there is no batch over which to compute the mean and variance
B. Normalization only works on batches of size 1
C. BatchNorm always requires a gradient to function properly
D. The mean and variance must be recalculated at each test iteration
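A small sketch of why this fails in practice (illustrative shapes only): with a single test example, the batch variance is zero and the normalized output degenerates.

```python
import torch

# Illustrative sketch: with one example there is no batch to average over,
# so the "batch" variance is zero for every feature.
x_single = torch.randn(1, 100)                      # one example, 100 features
var = x_single.var(dim=0, unbiased=False)           # variance over a batch of size 1: all zeros
print(var.max().item())                             # 0.0 -> (x - mean) / sqrt(var + eps) collapses to 0
```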
4. What method is commonly used to solve the BatchNorm problem during inference?
A. Reset the network weights before the inference phase
B. Use a mean and variance estimated over the training data via an exponential moving average (running statistics)
C. Reduce the batch size to 1 during training
D. Do not use normalization during inference
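A minimal sketch of this fix, with assumed hyperparameters and shapes: running estimates are updated during training and then used in place of batch statistics at inference.

```python
import torch

# Illustrative sketch: maintain running estimates of the mean and variance
# with an exponential moving average during training, then use them at inference.
momentum = 0.001                                    # assumed smoothing factor for the moving average
running_mean = torch.zeros(100)
running_var = torch.ones(100)

# one training step
x = torch.randn(32, 100)
batch_mean = x.mean(dim=0)
batch_var = x.var(dim=0, unbiased=False)
with torch.no_grad():
    running_mean = (1 - momentum) * running_mean + momentum * batch_mean
    running_var = (1 - momentum) * running_var + momentum * batch_var

# inference: a single example normalized with the running estimates
x_test = torch.randn(1, 100)
x_hat = (x_test - running_mean) / torch.sqrt(running_var + 1e-5)
```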
5. What is the main difference between Batch Normalization and Layer Normalization?
A. BatchNorm normalizes over channels, LayerNorm over batches
B. BatchNorm normalizes over the batch dimension, LayerNorm normalizes over all activations of a layer for each individual example
C. BatchNorm is used only for convolutional networks, LayerNorm only for recurrent networks
D. BatchNorm does not require gamma and beta parameters, LayerNorm always does
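A sketch of the difference in the reduction dimension, for the illustrative 2D case of a batch of 32 examples with 100 features each (shapes are assumptions):

```python
import torch

# Illustrative sketch: BatchNorm reduces over the batch dimension (dim 0),
# LayerNorm over the feature dimension of each example (dim 1).
x = torch.randn(32, 100)

bn_mean = x.mean(dim=0, keepdim=True)               # shape (1, 100): statistics shared across the batch
bn_var = x.var(dim=0, keepdim=True, unbiased=False)
ln_mean = x.mean(dim=1, keepdim=True)               # shape (32, 1): statistics per individual example
ln_var = x.var(dim=1, keepdim=True, unbiased=False)

bn_out = (x - bn_mean) / torch.sqrt(bn_var + 1e-5)  # each example depends on the others in the batch
ln_out = (x - ln_mean) / torch.sqrt(ln_var + 1e-5)  # each example normalized independently
```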