Knowledge distillation in deep learning and its applications
Deep learning models are relatively large, and it is hard to deploy them on resource-limited devices such as mobile phones and embedded systems. One possible solution is knowledge distillation, whereby a smaller model (the student) is trained using information from a larger model (the teacher). In this paper, we present an overview of knowledge distillation techniques applied to deep learning models. To compare the performance of different techniques, we propose a new measure, the distillation metric, which compares knowledge distillation solutions based on model size and accuracy. Based on the survey, we draw several conclusions, including current challenges and possible research directions.
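As an illustration of the student–teacher setup described above, here is a minimal sketch of soft-target distillation in plain Python. The temperature, loss weight, and example logits are hypothetical choices for illustration; the paper surveys many distillation variants and this is only one common formulation, not the paper's specific method.

```python
import math

def softmax(logits, temperature=1.0):
    # A temperature > 1 softens the distribution, exposing the teacher's
    # relative confidences across classes ("dark knowledge").
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, true_label,
            temperature=2.0, alpha=0.5):
    """Weighted sum of (a) KL divergence between the teacher's and
    student's softened outputs and (b) cross-entropy on the hard label."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student); the T^2 factor keeps gradient scale
    # comparable as the temperature changes.
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student) if pt > 0)
    hard = -math.log(softmax(student_logits)[true_label])
    return alpha * (temperature ** 2) * kl + (1 - alpha) * hard
```

In a training loop, `kd_loss` would replace the plain cross-entropy objective, so the student is pulled toward both the ground-truth labels and the teacher's output distribution.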
Related Results
A Comprehensive Review of Distillation in the Pharmaceutical Industry
Distillation processes play a pivotal role in the pharmaceutical industry for the purification of active pharmaceutical ingredients (APIs), intermediates, and solvent recovery. Thi...
Combined Knowledge Distillation Framework: Breaking Down Knowledge Barriers
Knowledge distillation, one of the most prominent methods in model compression, has successfully balanced small model sizes and high performance. However, it has been obse...
Mitigating carbon footprint for knowledge distillation based deep learning model compression
Deep learning techniques have recently demonstrated remarkable success in numerous domains. Typically, the success of these deep learning models is measured in terms of performance...
Steam Distillation Studies For The Kern River Field
The interactions of heavy oil and injected steam in the mature steamflood at the Kern River Field have been extensively studied to gain insight into the ...
Multistructure-Based Collaborative Online Distillation
Recently, deep learning has achieved state-of-the-art performance in more aspects than traditional shallow architecture-based machine-learning methods. However, in order to achieve...
Deep convolutional neural network and IoT technology for healthcare
Background Deep Learning is an AI technology that trains computers to analyze data in an approach similar to the human brain. Deep learning algorithms can find complex patterns in ...
Initial Experience with Pediatrics Online Learning for Nonclinical Medical Students During the COVID-19 Pandemic
Background: To minimize the risk of infection during the COVID-19 pandemic, the learning mode of universities in China was adjusted, and the online learning o...
KNOWLEDGE IN PRACTICE
Knowledge is an understanding of someone or something, such as facts, information, descriptions or skills, which is acquired by individuals through education, learning, experience ...