【Abstract】:
In relation extraction, directly applying a model trained on the source domain to the target domain suffers a large performance decrease. Existing studies extract the features shared between domains in a coarse-grained way, which inevitably introduces some domain-specific features or suffers from information loss. Inspired by how human beings often use different views to find connections between domains, we argue that there exist fine-grained features which can be shared across different views of the original data. In this paper, we propose a cross-view adaptation network, which uses an adversarial method to extract shared features and introduces cross-view training to fine-tune them. Besides, we construct several novel views of the input data for cross-domain relation extraction. Through experiments we demonstrate that the different views of data we construct can effectively avoid introducing domain-specific features into the unified feature space and help the model learn fine-grained shared features across domains. On the three different domains of the ACE 2005 dataset, our method achieves state-of-the-art results in F1-score.

【Institution】:
School of Computer Science, Beijing University of Posts and Telecommunications, China

【Source】:
The 18th China National Conference on Computational Linguistics (CCL 2019), the 2019 Annual Academic Conference of the Chinese Information Processing Society of China
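The abstract says shared features are extracted "with an adversarial method" but does not spell out the mechanism. A standard way to wire up such adversarial feature sharing is a gradient-reversal layer; the sketch below is an illustrative assumption, not the paper's code, and the class name, the `lam` parameter, and the toy numbers are all hypothetical:

```python
# Minimal sketch of the gradient-reversal trick often used for adversarial
# shared-feature extraction. Forward pass: identity, so features reach the
# domain classifier unchanged. Backward pass: the incoming gradient is
# scaled by -lam, so the feature extractor is updated to HURT the domain
# classifier, pushing it toward domain-invariant (shared) features.

class GradReverse:
    def __init__(self, lam: float = 1.0):
        self.lam = lam  # reversal strength (a hyperparameter)

    def forward(self, x: list[float]) -> list[float]:
        return list(x)  # identity: features pass through untouched

    def backward(self, grad: list[float]) -> list[float]:
        # Flip and scale the gradient flowing back to the extractor.
        return [-self.lam * g for g in grad]


grl = GradReverse(lam=0.5)
features = [1.0, -2.0, 3.0]
upstream_grad = [0.1, 0.2, -0.3]

out = grl.forward(features)         # -> [1.0, -2.0, 3.0]
down = grl.backward(upstream_grad)  # -> [-0.05, -0.1, 0.15]
```

Because the reversal only changes the backward pass, the domain classifier still trains normally to distinguish domains, while the extractor it sits on top of is driven in the opposite direction.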