Multi-target Knowledge Distillation via Student Self-reflection

License: CC BY

Bibliographic Details
Main Authors: Gou, Jianping; Xiong, Xiangshuo; Yu, Baosheng
Format: Book
Language: English
Published: Springer 2023
Subjects: CRD, CUB200-2011
Online Access: https://link.springer.com/article/10.1007/s11263-023-01792-z
https://dlib.phenikaa-uni.edu.vn/handle/PNK/8350
Abstract: Knowledge distillation is a simple yet effective technique for deep model compression, which aims to transfer the knowledge learned by a large teacher model to a small student model. To mimic how the teacher teaches the student, existing knowledge distillation methods mainly adopt a unidirectional knowledge transfer, where the knowledge extracted from different intermediate layers of the teacher model is used to guide the student model. However, in real-world education, students learn more effectively through multi-stage learning with self-reflection, which is nevertheless ignored by current knowledge distillation methods. Inspired by this, we devise a new knowledge distillation framework, entitled multi-target knowledge distillation via student self-reflection (MTKD-SSR), which not only enhances the teacher's ability to unfold the knowledge to be distilled, but also improves the student's capacity to digest it. Specifically, the proposed framework consists of three target knowledge distillation mechanisms: stage-wise channel distillation (SCD), stage-wise response distillation (SRD), and cross-stage review distillation (CRD). SCD and SRD transfer feature-based knowledge (i.e., channel features) and response-based knowledge (i.e., logits) at different stages, respectively, while CRD encourages the student model to conduct self-reflective learning after each stage through self-distillation of the response-based knowledge. Experimental results on five popular visual recognition datasets (CIFAR-100, Market-1501, CUB200-2011, ImageNet, and Pascal VOC) demonstrate that the proposed framework significantly outperforms recent state-of-the-art knowledge distillation methods.
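
The sketch below is a minimal, illustrative PyTorch-style reading of the three distillation targets named in the abstract. It assumes stage-wise features and logits have already been extracted and shape-matched, and it uses common loss forms (MSE matching for channel features, temperature-softened KL divergence for logits); the function names, loss choices, and structure are assumptions for illustration, not the authors' exact formulation.

import torch.nn.functional as F

def mtkd_ssr_losses(teacher_feats, teacher_logits_per_stage,
                    student_feats, student_logits_per_stage,
                    temperature=4.0):
    # teacher_feats / student_feats: lists of per-stage channel feature maps,
    # assumed already projected to matching shapes (hypothetical setup).
    # *_logits_per_stage: lists of per-stage logits; the last entry is the final output.

    # SCD: stage-wise channel (feature-based) distillation, sketched as MSE matching.
    scd = sum(F.mse_loss(s, t.detach())
              for s, t in zip(student_feats, teacher_feats))

    # Temperature-softened KL divergence used for response-based distillation.
    def kd_kl(student_logits, target_logits):
        return F.kl_div(F.log_softmax(student_logits / temperature, dim=1),
                        F.softmax(target_logits.detach() / temperature, dim=1),
                        reduction="batchmean") * temperature ** 2

    # SRD: stage-wise response (logit-based) distillation against the teacher.
    srd = sum(kd_kl(s, t)
              for s, t in zip(student_logits_per_stage, teacher_logits_per_stage))

    # CRD: cross-stage review distillation, i.e. self-distillation in which the
    # student's earlier-stage responses are aligned with its own final response.
    final_logits = student_logits_per_stage[-1]
    crd = sum(kd_kl(s, final_logits)
              for s in student_logits_per_stage[:-1])

    return scd, srd, crd

In a full training loop these three terms would typically be weighted and added to the student's cross-entropy loss on the ground-truth labels; the abstract does not specify the weighting, so it is left open here.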