Open Access
Improving Knowledge Distillation via Category Structure
Publication type: Book Chapter
Publication date: 2020-11-02
Pages: 205-219
scimago Q2
SJR: 0.352
CiteScore: 2.4
Impact factor: —
ISSN: 0302-9743, 1611-3349, 1861-2075, 1861-2083
Abstract
Most previous knowledge distillation frameworks train the student to mimic the teacher’s output for each sample or transfer cross-sample relations from the teacher to the student. Nevertheless, they neglect the structured relations at a category level. In this paper, a novel Category Structure is proposed to transfer category-level structured relations for knowledge distillation. It models two structured relations, intra-category structure and inter-category structure, which are intrinsic properties of the relations between samples. Intra-category structure penalizes the structured relations among samples from the same category, while inter-category structure focuses on cross-category relations at a category level. Transferring category structure from the teacher to the student supplements category-level structured relations for training a better student. Extensive experiments show that our method groups samples from the same category more tightly in the embedding space, and the superiority of our method over closely related works is validated on different datasets and models.
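As a rough illustration of the idea in the abstract, the sketch below models both relation terms as pairwise cosine-similarity matrices and matches teacher and student with an MSE penalty. The abstract does not specify the paper's actual relation functions, normalization, or loss weights, so every name and choice here (pairwise_cosine, category_structure_loss, MSE matching, centroid-based inter-category relations) is an illustrative assumption rather than the authors' implementation.

# Minimal sketch of a category-structure distillation loss.
# Assumption: relations are pairwise cosine similarities matched with MSE;
# the paper's exact formulation is not given in the abstract.
import torch
import torch.nn.functional as F

def pairwise_cosine(x):
    """Pairwise cosine-similarity matrix for a batch of embeddings."""
    x = F.normalize(x, dim=1)
    return x @ x.t()

def category_structure_loss(f_s, f_t, labels):
    """Match intra- and inter-category relation structure of teacher and student.

    f_s, f_t: (N, D_s), (N, D_t) student / teacher embeddings for one batch.
    labels:   (N,) integer class labels.
    """
    classes = labels.unique()

    # Intra-category: for each class in the batch, compare the sample-to-sample
    # similarity structure of student and teacher.
    intra = f_s.new_tensor(0.0)
    for c in classes:
        idx = (labels == c).nonzero(as_tuple=True)[0]
        if idx.numel() < 2:
            continue
        intra = intra + F.mse_loss(pairwise_cosine(f_s[idx]),
                                   pairwise_cosine(f_t[idx]))
    intra = intra / max(len(classes), 1)

    # Inter-category: compare the similarity structure between category centroids.
    cent_s = torch.stack([f_s[labels == c].mean(0) for c in classes])
    cent_t = torch.stack([f_t[labels == c].mean(0) for c in classes])
    inter = F.mse_loss(pairwise_cosine(cent_s), pairwise_cosine(cent_t))

    return intra + inter

# Toy usage: random embeddings standing in for teacher / student features.
if __name__ == "__main__":
    labels = torch.randint(0, 5, (32,))
    f_student = torch.randn(32, 64)
    f_teacher = torch.randn(32, 128)
    print(category_structure_loss(f_student, f_teacher, labels))

In a full training loop this structural term would typically be combined with the standard cross-entropy and soft-label distillation losses through weighting coefficients; the sketch shows only the category-structure term.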
Top-30 Journals
- Studies in Computational Intelligence: 1 publication, 7.69%
- IEEE Transactions on Emerging Topics in Computational Intelligence: 1 publication, 7.69%
- Neural Processing Letters: 1 publication, 7.69%
- Neurocomputing: 1 publication, 7.69%
- Scientific Reports: 1 publication, 7.69%
- Neural Networks: 1 publication, 7.69%
- IEEE Geoscience and Remote Sensing Letters: 1 publication, 7.69%
- IEEE Transactions on Geoscience and Remote Sensing: 1 publication, 7.69%
- IEEE Transactions on Circuits and Systems for Video Technology: 1 publication, 7.69%

Publishers
- Institute of Electrical and Electronics Engineers (IEEE): 8 publications, 61.54%
- Springer Nature: 3 publications, 23.08%
- Elsevier: 2 publications, 15.38%
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics
Total citations: 13
Citations from 2024: 9 (69.23%)
Cite this
GOST
Chen Z. et al. Improving Knowledge Distillation via Category Structure // Lecture Notes in Computer Science. 2020. pp. 205-219.
GOST all authors (up to 50)
Chen Z., Zheng X., Shen H., Zeng Z., Zhou Y., Zhao R. Improving Knowledge Distillation via Category Structure // Lecture Notes in Computer Science. 2020. pp. 205-219.
RIS
TY - GENERIC
DO - 10.1007/978-3-030-58604-1_13
UR - https://doi.org/10.1007/978-3-030-58604-1_13
TI - Improving Knowledge Distillation via Category Structure
T2 - Lecture Notes in Computer Science
AU - Chen, Zailiang
AU - Zheng, Xianxian
AU - Shen, Hailan
AU - Zeng, Ziyang
AU - Zhou, Yukun
AU - Zhao, Rongchang
PY - 2020
DA - 2020/11/02
PB - Springer Nature
SP - 205-219
SN - 0302-9743
SN - 1611-3349
SN - 1861-2075
SN - 1861-2083
ER -
BibTeX (up to 50 authors)
@incollection{2020_Chen,
author = {Zailiang Chen and Xianxian Zheng and Hailan Shen and Ziyang Zeng and Yukun Zhou and Rongchang Zhao},
title = {Improving Knowledge Distillation via Category Structure},
booktitle = {Lecture Notes in Computer Science},
publisher = {Springer Nature},
year = {2020},
month = {nov},
pages = {205--219},
doi = {10.1007/978-3-030-58604-1_13}
}