Interactive Quiz

Test your knowledge! Short code sketches illustrating each topic follow the questions.

1. What is the main advantage of transfer learning in deep learning?
2. What is the main difference between transfer learning and fine-tuning?
3. In fine-tuning, how do you choose the number of layers to retrain?
4. Which dataset is often used to pre-train image classification models in transfer learning?
5. What is the main objective of knowledge distillation?
6. Why does knowledge distillation often improve the performance of the student model?
7. In knowledge distillation applied to unsupervised anomaly detection, what is the main role of the student model?
8. What distinguishes the BERT architecture from GPT?
9. Which training task does BERT use to learn linguistic representations?
10. In token-level classification with BERT (e.g., NER), why is a [CLS] token used at the beginning of the sequence?
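
Questions 1–4 concern reusing a pre-trained network. As a concrete illustration, here is a minimal sketch, assuming PyTorch and torchvision (the class count `NUM_CLASSES` is hypothetical): it freezes an ImageNet-pretrained ResNet-18 and retrains only its head, with a fine-tuning variant that also unfreezes the last block.

```python
# Minimal transfer-learning sketch: reuse an ImageNet-pretrained backbone,
# freeze the pretrained layers, and retrain only the head on a new task.
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # hypothetical size of the new target task

# Load a ResNet-18 pre-trained on ImageNet (the dataset from question 4).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Transfer learning: freeze every pretrained weight...
for param in model.parameters():
    param.requires_grad = False

# ...then replace the classification head; only this new layer is trained.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Fine-tuning variant (question 3): also unfreeze the last residual block,
# so a few top layers adapt to the new domain while early features stay fixed.
for param in model.layer4.parameters():
    param.requires_grad = True
```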
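
For questions 5–7, a common formulation of knowledge distillation is the soft-target loss of Hinton et al.; the sketch below, in plain PyTorch with hypothetical temperature and mixing-weight values, shows one possible version.

```python
# Knowledge-distillation loss sketch: the student mimics the teacher's
# softened output distribution while also fitting the hard labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a soft KL term (teacher -> student) and hard-label CE.

    T and alpha are hypothetical hyperparameters; typical values vary by task.
    """
    # Soften both distributions with temperature T; scale by T^2 to keep
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Dummy usage: a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```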
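
For questions 8–10, the sketch below, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint, runs BERT's masked-language-modeling objective (question 9) on a single sentence.

```python
# Masked-language-modeling sketch with BERT.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Mask one token; BERT attends bidirectionally (unlike GPT's left-to-right
# attention, question 8), so both sides of the blank inform the prediction.
text = f"Transfer learning reuses a {tokenizer.mask_token} model."
inputs = tokenizer(text, return_tensors="pt")  # adds [CLS] ... [SEP]

with torch.no_grad():
    logits = model(**inputs).logits

# Report the top prediction for the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode([predicted_id.item()]))
```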