首站 - Intelligent Paper Submission Assistant
Representative Literature
Neural Attentional Relation Extraction with Dual Dependency Trees
Abstract:
Relation extraction has been widely used to find semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency-tree-based models adopt pruning strategies that are either too aggressive or too conservative, leading to insufficient semantic information or excessive noise in relation extraction models. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (called DDT-REM), which takes advantage of both the syntactic dependency tree and the semantic dependency tree to capture syntactic features and semantic features, respectively. Specifically, we first propose novel representation learning to capture the dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to address semantic deficits. We design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves the extraction accuracy. We conduct experimental studies based on three real-world datasets. Compared with traditional methods, our method improves the F1 scores by 0.3, 0.1 and 1.6 on the three real-world datasets, respectively.
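To make the dual-tree idea in the abstract concrete, below is a minimal PyTorch sketch, not the paper's implementation: it assumes each dependency tree is given as a 0/1 adjacency matrix (the names DualTreeGCNLayer, adj_syn and adj_sem are illustrative assumptions), applies one row-normalized graph-convolution step per tree, and fuses the two views with a simple learned gate. The authors' local-global attention mechanism and their specific GCN extension are omitted here.

# Minimal sketch (illustrative only): one GCN step over a syntactic and a
# semantic dependency adjacency matrix, fused by a learned gate.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualTreeGCNLayer(nn.Module):
    """One graph-convolution step over two dependency-tree adjacency matrices."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.w_syn = nn.Linear(hidden_dim, hidden_dim)   # syntactic-tree branch
        self.w_sem = nn.Linear(hidden_dim, hidden_dim)   # semantic-tree branch
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, h, adj_syn, adj_sem):
        # h:     (batch, seq_len, hidden_dim) token representations
        # adj_*: (batch, seq_len, seq_len) 0/1 adjacency of each dependency tree
        def propagate(adj, linear):
            # Add self-loops and row-normalize so each token averages over its neighbours.
            adj = adj + torch.eye(adj.size(-1), device=adj.device)
            deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
            return F.relu(linear(torch.bmm(adj / deg, h)))

        h_syn = propagate(adj_syn, self.w_syn)
        h_sem = propagate(adj_sem, self.w_sem)
        # Fuse the syntactic and semantic views (one possible design choice).
        return torch.tanh(self.gate(torch.cat([h_syn, h_sem], dim=-1)))

if __name__ == "__main__":
    batch, seq_len, dim = 2, 6, 32
    layer = DualTreeGCNLayer(dim)
    h = torch.randn(batch, seq_len, dim)
    adj_syn = torch.randint(0, 2, (batch, seq_len, seq_len)).float()
    adj_sem = torch.randint(0, 2, (batch, seq_len, seq_len)).float()
    print(layer(h, adj_syn, adj_sem).shape)  # torch.Size([2, 6, 32])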
Keywords:
Authors:
Dong Li; Zhi-Lei Lei; Bao-Yan Song; Wan-Ting Ji; Yue Kou
Author Affiliations:
School of Information, Liaoning University, Shenyang 110036, China; School of Computer Science and Engineering, Northeastern University, Shenyang 110004, China
Citation:
[1] Dong Li; Zhi-Lei Lei; Bao-Yan Song; Wan-Ting Ji; Yue Kou. Neural Attentional Relation Extraction with Dual Dependency Trees [J]. Journal of Computer Science and Technology, 2022(06): 1369-1381.