[Paper Review] Decoupled Knowledge Distillation

This is a review of "Decoupled Knowledge Distillation," presented at CVPR 2022.

Introduction

Types of Knowledge Distillation

Logits-based method
$(+)$ Computational and storage cost ↓
$(-)$ Unsatisfactory performance

Feature-based method
$(+)$ Superior performance
$(-)$ Extra computational cost and storage usage

∴ The potential of logit distillation is limited.

Decoupled Knowledge Distillation

Target classification…
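For concreteness, here is a minimal PyTorch-style sketch contrasting classic logit distillation with the decoupled loss from the paper, which splits KD into a target-class term (TCKD) and a non-target-class term (NCKD). This is not the authors' official implementation; `alpha`, `beta`, and the temperature `T` below are illustrative hyperparameters, and the masking trick for the non-target distribution is one common way to realize the decomposition.

```python
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, T=4.0):
    """Classic logit distillation: KL divergence between
    temperature-softened teacher and student distributions."""
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T ** 2)


def dkd_loss(student_logits, teacher_logits, target,
             alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD sketch: TCKD matches the binary
    (target vs. non-target) probabilities; NCKD matches the
    distribution over non-target classes only."""
    gt_mask = F.one_hot(target, num_classes=student_logits.size(1)).bool()

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # TCKD: KL over the binary [p_target, 1 - p_target] distributions.
    b_s = torch.stack([p_s[gt_mask], 1.0 - p_s[gt_mask]], dim=1)
    b_t = torch.stack([p_t[gt_mask], 1.0 - p_t[gt_mask]], dim=1)
    tckd = F.kl_div(torch.log(b_s), b_t, reduction="batchmean") * (T ** 2)

    # NCKD: mask out the target logit so the softmax renormalizes
    # over non-target classes only, then take KL as usual.
    log_p_s_nt = F.log_softmax(student_logits / T - 1e9 * gt_mask, dim=1)
    p_t_nt = F.softmax(teacher_logits / T - 1e9 * gt_mask, dim=1)
    nckd = F.kl_div(log_p_s_nt, p_t_nt, reduction="batchmean") * (T ** 2)

    # Weight the two parts independently instead of the implicit
    # coupling inside classic KD.
    return alpha * tckd + beta * nckd
```

The motivation for the split is the paper's observation that classic KD can be rewritten as $\mathrm{KD} = \mathrm{TCKD} + (1 - p_t^{\mathcal{T}})\,\mathrm{NCKD}$, so NCKD is suppressed exactly when the teacher is confident; decoupling the two terms removes that implicit coupling and lets each be weighted on its own.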