Transfer learning aims to transfer source models to a target domain. Feature matching can effectively alleviate the domain shift, but this process typically ignores the relationship between marginal distribution matching and conditional distribution matching. At the same time, the discriminative information of both domains, which is important for improving performance on the target domain, is also neglected. In this paper, we propose a novel method called Balanced Discriminative Transfer Feature Learning for Visual Domain Adaptation (BDTFL). The proposed method adaptively balances the two distribution matchings and captures the category-discriminative information of both domains. Balanced feature matching therefore achieves more accurate alignment and adapts itself to different scenes, while the discriminative information alleviates category confusion during feature matching. With the assistance of the category-discriminative information captured from both domains, the source classifier can be transferred to the target domain more accurately, boosting target classification performance. Extensive experiments show the superiority of BDTFL on popular visual cross-domain benchmarks.
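To make the idea of balancing marginal and conditional distribution matching concrete, the sketch below shows one common way such a balanced discrepancy can be formed: a weighted sum of a marginal MMD term and class-conditional MMD terms computed with target pseudo-labels. This is an illustrative assumption in the spirit of balanced distribution matching, not the exact BDTFL objective; the names `mu`, `Xs`, `Xt`, `ys`, and `yt_pseudo` are hypothetical.

```python
# Illustrative sketch only: combining marginal and class-conditional MMD
# with a balance factor mu. Not the BDTFL formulation from the paper.
import numpy as np


def mmd_linear(A, B):
    """Squared MMD between two sample sets under a linear kernel (mean difference)."""
    return float(np.sum((A.mean(axis=0) - B.mean(axis=0)) ** 2))


def balanced_distribution_distance(Xs, ys, Xt, yt_pseudo, mu=0.5):
    """Weighted sum of marginal and conditional distribution discrepancies.

    mu = 0 uses only the marginal term; mu = 1 uses only the conditional
    terms, which are computed per class using target pseudo-labels.
    """
    marginal = mmd_linear(Xs, Xt)

    conditional = 0.0
    classes = np.unique(ys)
    for c in classes:
        Xs_c, Xt_c = Xs[ys == c], Xt[yt_pseudo == c]
        if len(Xt_c) == 0:  # class missing from pseudo-labels: skip its term
            continue
        conditional += mmd_linear(Xs_c, Xt_c)
    conditional /= len(classes)

    return (1.0 - mu) * marginal + mu * conditional


# Toy usage: random two-class source and shifted target features.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (100, 10))
ys = rng.integers(0, 2, 100)
Xt = rng.normal(0.5, 1.0, (80, 10))
yt_pseudo = rng.integers(0, 2, 80)
print(balanced_distribution_distance(Xs, ys, Xt, yt_pseudo, mu=0.5))
```

In this kind of formulation, the balance factor (here `mu`) is what lets the matching adapt to different scenes: when the domains differ mostly in their overall feature distributions the marginal term dominates, and when class structures diverge the conditional terms matter more.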