
Cross-domain Cross-task Knowledge Distillation Network for Unsupervised Domain Adaptation*

Abstract: Unsupervised domain adaptation has attracted extensive attention in recent years because it naturally addresses the problem of encountering new domains when a deep model is deployed. Typically, unsupervised domain adaptation methods combine a classification loss with an adversarial loss so that the features extracted by the model are both discriminative and domain-independent, alleviating the problems caused by cross-domain shift. However, it is impossible for the extracted features to contain no domain information at all. We therefore propose a new solution that makes the source-domain features extracted by the model fully incorporate the style of the target-domain images. Specifically, a compression network is trained on the target-domain images, and the source-domain features extracted by the classification network are distilled against the target-domain image features extracted by the compression network, so that they take on the style of the target-domain images. Our method achieves leading or comparable results on five public datasets.
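The distillation step the abstract describes can be sketched as follows. This is a minimal PyTorch sketch under stated assumptions, not the authors' code: the `FeatureNet` architecture, the MSE feature-distillation loss, and the 0.1 loss weight are all illustrative choices, and the compression network is treated as a frozen teacher.

```python
# Illustrative sketch of cross-domain feature distillation:
# a classification network's source features are pulled toward the
# features produced by a compression network trained on target images.
import torch
import torch.nn as nn

torch.manual_seed(0)

class FeatureNet(nn.Module):
    """Stand-in feature extractor (classification or compression network)."""
    def __init__(self, in_dim=32, feat_dim=16):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                  nn.Linear(64, feat_dim))
    def forward(self, x):
        return self.body(x)

classifier_net = FeatureNet()    # trained with classification loss on source data
compression_net = FeatureNet()   # assumed pre-trained on target-domain images
head = nn.Linear(16, 10)         # classification head (10 classes, assumed)

x_source = torch.randn(8, 32)                 # a batch of source-domain inputs
y_source = torch.randint(0, 10, (8,))         # their source labels

f_src = classifier_net(x_source)              # source features to be distilled
with torch.no_grad():                         # compression net is the frozen teacher
    f_tgt_style = compression_net(x_source)   # target-style features

cls_loss = nn.functional.cross_entropy(head(f_src), y_source)
distill_loss = nn.functional.mse_loss(f_src, f_tgt_style)
total_loss = cls_loss + 0.1 * distill_loss    # 0.1 is an assumed weight
total_loss.backward()                         # gradients update only the classifier side
```

In this reading, the frozen compression network imposes the target domain's style on the classifier's source features via the MSE term, while the classification loss keeps them discriminative.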

