Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Publication type: Journal Article
Publication date: 2024-02-01
scimago Q1
wos Q1
SJR: 3.686
CiteScore: 24.7
Impact factor: 8.9
ISSN: 2162-237X, 2162-2388
PubMed ID: 35820013
Computer Science Applications
Computer Networks and Communications
Artificial Intelligence
Software
Abstract
Knowledge distillation (KD) is an effective framework that aims to transfer meaningful information from a large teacher to a smaller student. Generally, KD involves defining and transferring knowledge. Previous KD methods often focus on mining various forms of knowledge, for example, feature maps and refined information. However, the knowledge is derived from the primary supervised task and, thus, is highly task-specific. Motivated by the recent success of self-supervised representation learning, we propose an auxiliary self-supervision augmented task to guide networks to learn more meaningful features. Therefore, we can derive soft self-supervision augmented distributions as richer dark knowledge from this task for KD. Unlike previous knowledge, this distribution encodes joint knowledge from supervised and self-supervised feature learning. Beyond knowledge exploration, we propose to append several auxiliary branches at various hidden layers to take full advantage of hierarchical feature maps. Each auxiliary branch is guided to learn the self-supervision augmented task and distill this distribution from teacher to student. Overall, we call our KD method hierarchical self-supervision augmented KD (HSSAKD). Experiments on standard image classification show that both offline and online HSSAKD achieve state-of-the-art performance in the field of KD. Transfer experiments on object detection further verify that HSSAKD can guide the network to learn better features. The code is available at https://github.com/winycg/HSAKD.
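To make the abstract's central idea concrete, here is a minimal sketch (not the authors' implementation; see the linked repository for that) of distilling a self-supervision augmented distribution. It assumes the primary N-way classification task is combined with a 4-way rotation prediction task into a joint N×4-way task, and that `student_logits` and `teacher_logits` are hypothetical outputs of matching teacher/student auxiliary branches.

```python
# Minimal sketch of distilling a self-supervision augmented distribution.
# Assumption: an N-way classification task is combined with a 4-way rotation
# task, so each auxiliary branch predicts an (N * 4)-way joint distribution.
import torch
import torch.nn.functional as F


def ss_augmented_kd_loss(student_logits: torch.Tensor,
                         teacher_logits: torch.Tensor,
                         temperature: float = 4.0) -> torch.Tensor:
    """KL divergence between softened teacher and student joint distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2


# Example: 100 classes x 4 rotations -> a 400-way joint distribution per branch.
student_logits = torch.randn(8, 400)
teacher_logits = torch.randn(8, 400)
loss = ss_augmented_kd_loss(student_logits, teacher_logits)
```

In the paper's setting such a loss would be applied at each auxiliary branch and combined with the usual supervised objectives; the sketch shows only the distillation term for a single branch.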
Top-30 Journals
- Lecture Notes in Computer Science: 2 publications, 14.29%
- Neurocomputing: 2 publications, 14.29%
- IEEE Transactions on Pattern Analysis and Machine Intelligence: 1 publication, 7.14%
- Studies in Computational Intelligence: 1 publication, 7.14%
- Expert Systems with Applications: 1 publication, 7.14%
- Knowledge-Based Systems: 1 publication, 7.14%
- IET Cyber-Systems and Robotics: 1 publication, 7.14%
- Communications in Computer and Information Science: 1 publication, 7.14%
- IEEE Transactions on Neural Networks and Learning Systems: 1 publication, 7.14%
- IEEE Transactions on Computational Social Systems: 1 publication, 7.14%
Publishers
- Institute of Electrical and Electronics Engineers (IEEE): 5 publications, 35.71%
- Springer Nature: 4 publications, 28.57%
- Elsevier: 4 publications, 28.57%
- Institution of Engineering and Technology (IET): 1 publication, 7.14%
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics
Total citations: 14
Citations from 2024: 9 (64.28%)
Cite this
GOST
Yang C. et al. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution // IEEE Transactions on Neural Networks and Learning Systems. 2024. Vol. 35. No. 2. pp. 2094-2108.
GOST all authors (up to 50)
Yang C., An Z., Cai L., Xu Y. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution // IEEE Transactions on Neural Networks and Learning Systems. 2024. Vol. 35. No. 2. pp. 2094-2108.
RIS
TY - JOUR
DO - 10.1109/tnnls.2022.3186807
UR - https://doi.org/10.1109/tnnls.2022.3186807
TI - Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
T2 - IEEE Transactions on Neural Networks and Learning Systems
AU - Yang, Chuanguang
AU - An, Zhulin
AU - Cai, Linhang
AU - Xu, Yongjun
PY - 2024
DA - 2024/02/01
PB - Institute of Electrical and Electronics Engineers (IEEE)
SP - 2094-2108
IS - 2
VL - 35
PMID - 35820013
SN - 2162-237X
SN - 2162-2388
ER -
BibTeX (up to 50 authors)
@article{2024_Yang,
author = {Chuanguang Yang and Zhulin An and Linhang Cai and Yongjun Xu},
title = {Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution},
journal = {IEEE Transactions on Neural Networks and Learning Systems},
year = {2024},
volume = {35},
publisher = {Institute of Electrical and Electronics Engineers (IEEE)},
month = {feb},
url = {https://doi.org/10.1109/tnnls.2022.3186807},
number = {2},
pages = {2094--2108},
doi = {10.1109/tnnls.2022.3186807}
}
MLA
Yang, Chuanguang, et al. “Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.” IEEE Transactions on Neural Networks and Learning Systems, vol. 35, no. 2, Feb. 2024, pp. 2094-2108. https://doi.org/10.1109/tnnls.2022.3186807.