Fabric retrieval is very challenging, since fabric images often suffer from viewpoint variations, illumination changes, blots, and poor image quality. In this work, a novel deep feature nonlinear fusion network (DFNFN) is proposed to nonlinearly fuse features learned from RGB and texture images for improving fabric retrieval. Texture images are obtained by using local binary pattern texture (LBP-Texture) features to describe the RGB fabric images. The DFNFN first applies two feature-learning branches to process RGB images and the corresponding LBP-Texture images simultaneously. Each branch contains the same convolutional neural network (CNN) architecture but learns its parameters independently. Then, a nonlinear fusion module (NFM) is designed to concatenate the features produced by the two branches and nonlinearly fuse the concatenated features via a convolutional layer followed by a rectified linear unit (ReLU). The NFM is flexible, since it can be embedded at different depths of the DFNFN to find the best fusion position. Consequently, the DFNFN can optimally fuse features learned from RGB and LBP-Texture images to boost retrieval accuracy. Extensive experiments on the Fabric 1.0 dataset show that the proposed method is superior to many state-of-the-art methods.
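The fusion step described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes the two branches output feature maps of equal shape, concatenates them along the channel axis, and applies a 1x1 convolution (a per-pixel linear map over channels) followed by ReLU. All function and variable names here are hypothetical.

```python
import numpy as np

def nonlinear_fusion(rgb_feat, lbp_feat, weight, bias):
    """Sketch of the NFM (names hypothetical).

    rgb_feat, lbp_feat : (C, H, W) feature maps from the two branches
    weight             : (C, 2C) 1x1-convolution weights
    bias               : (C,) convolution bias
    Returns a fused (C, H, W) feature map after ReLU.
    """
    # Concatenate branch features along the channel axis -> (2C, H, W)
    x = np.concatenate([rgb_feat, lbp_feat], axis=0)
    # 1x1 convolution == linear map over channels at every pixel
    y = np.tensordot(weight, x, axes=([1], [0])) + bias[:, None, None]
    # ReLU nonlinearity
    return np.maximum(y, 0.0)

# Toy usage with small feature maps
rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
rgb = rng.standard_normal((C, H, W))
lbp = rng.standard_normal((C, H, W))
w = rng.standard_normal((C, 2 * C)) * 0.1
b = np.zeros(C)

fused = nonlinear_fusion(rgb, lbp, w, b)
print(fused.shape)  # (4, 8, 8)
```

Because the fused map has the same channel count and spatial size as each branch output, a module of this shape can in principle be inserted after any convolutional stage of the two branches, which is what makes searching over fusion depths straightforward.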