This is a review of
"Distilling the Knowledge in a Neural Network",
published on arXiv in 2015.