Homogeneous Online Transfer Learning with Online Distribution Discrepancy Minimization

Keywords: Transfer learning, Leverage (statistics), Mistake bound, Learning classifier system, Feature learning, Conditional probability distribution
DOI: 10.48550/arxiv.1912.13226 Publication Date: 2019-01-01
ABSTRACT
Transfer learning has been demonstrated to be successful and essential in diverse applications; it transfers knowledge from related but different source domains to the target domain. Online transfer learning (OTL) is a more challenging problem in which data arrive in an online manner. Most OTL methods combine classifiers directly by assigning a weight to each classifier and adjusting the weights constantly. However, these methods pay little attention to reducing the distribution discrepancy between domains. In this paper, we propose a novel method that seeks to find a new feature representation so that the marginal and conditional distribution discrepancies can be reduced simultaneously. We focus on the setting with multiple source domains and use the Hedge strategy to leverage them. We analyze the theoretical properties of the proposed algorithm and provide an upper mistake bound. Comprehensive experiments on two real-world datasets show that our method outperforms the state-of-the-art by a large margin.
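The abstract mentions combining multiple source classifiers via the Hedge strategy: each classifier carries a weight, the ensemble predicts by weighted vote, and the weight of any classifier that errs is multiplicatively discounted. The sketch below illustrates that general strategy only; the class name `HedgeEnsemble` and the discount parameter `beta` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

class HedgeEnsemble:
    """Minimal Hedge-style weighted vote over binary classifiers.

    Illustrative sketch of the strategy named in the abstract, not the
    paper's algorithm: weights shrink multiplicatively on mistakes.
    """

    def __init__(self, n_classifiers, beta=0.5):
        self.weights = np.ones(n_classifiers)  # uniform initial weights
        self.beta = beta                       # discount factor in (0, 1)

    def predict(self, votes):
        # votes: array of {-1, +1} predictions, one per classifier
        score = np.dot(self.weights, np.asarray(votes))
        return 1 if score >= 0 else -1

    def update(self, votes, label):
        # discount classifiers that predicted wrongly, then renormalize
        mistakes = np.asarray(votes) != label
        self.weights[mistakes] *= self.beta
        self.weights /= self.weights.sum()

# usage: three stub classifiers voting on one online round
ens = HedgeEnsemble(3)
votes = np.array([1, -1, 1])
pred = ens.predict(votes)   # weighted majority vote -> 1
ens.update(votes, label=1)  # classifier 1 erred; its weight shrinks
```

In an OTL setting, each round would also update the underlying classifiers themselves; here only the combination weights are shown.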