Interactive Quiz

Test your knowledge!

1. What is the definition of the derivative of a function \( f \) at a point \( x \)?
2. What is the main role of gradient descent in optimization?
3. In the gradient descent method, how is the variable \( x \) updated at each iteration?
4. What is the chain rule in differential calculus?
5. In the context of gradient descent for multiple variables, how is the partial derivative of \( y \) with respect to the variable \( a \) calculated?
6. What is the formula for the sigmoid function \( \sigma(x) \) used as an activation function in an artificial neuron?
7. Why is the Heaviside function less suitable than the sigmoid for training a neural network with gradient descent?
8. What is the general form of the loss function used in logistic regression for a data point with label \( y_{true} \) and prediction \( pred \)?
9. What is the partial derivative of the loss function with respect to the weight \( w_0 \) in logistic regression?
10. How is the output of an artificial neuron calculated from input \( \mathbf{x} \), weights \( \mathbf{w} \), bias \( b \), and activation function \( \phi \)?
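If you want to check your answers numerically, the ideas behind these questions (sigmoid activation, neuron output \( \phi(\mathbf{w} \cdot \mathbf{x} + b) \), logistic loss, and a gradient-descent weight update) fit in a short sketch. This is a minimal illustration, not a reference implementation; the learning rate, weights, and data point below are arbitrary example values.

```python
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def neuron(x, w, b):
    # Neuron output: phi(w . x + b), with phi = sigmoid here
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def loss(y_true, pred):
    # Logistic (cross-entropy) loss for a single data point
    return -(y_true * math.log(pred) + (1 - y_true) * math.log(1 - pred))

# One gradient-descent step on a single data point
# (eta, x, y_true, w, b are illustrative values, not from the quiz)
eta = 0.5
x, y_true = [1.0, 2.0], 1.0
w, b = [0.1, -0.2], 0.0

pred = neuron(x, w, b)
loss_before = loss(y_true, pred)

# For sigmoid + logistic loss, the chain rule gives the compact form
# dL/dw_i = (pred - y_true) * x_i  and  dL/db = (pred - y_true)
grad_w = [(pred - y_true) * xi for xi in x]
grad_b = pred - y_true

# Update rule: step against the gradient
w = [wi - eta * gi for wi, gi in zip(w, grad_w)]
b = b - eta * grad_b

loss_after = loss(y_true, neuron(x, w, b))
assert loss_after < loss_before  # the step reduced the loss
```

The Heaviside step function would fail here (question 7): its derivative is zero almost everywhere, so `grad_w` and `grad_b` would vanish and the update would never move the weights.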