Interactive Quiz

Test your knowledge!

1. What is the main difference between classic gradient descent and stochastic gradient descent (SGD)?
2. What is the main effect of adding the momentum term in stochastic gradient descent with momentum?
3. What major problem does Adagrad encounter during model training?
4. How does RMSProp improve upon the Adagrad optimizer?
5. Why is Adam often recommended as the default optimizer?