TY - JOUR
T1 - Modeling semantic compositionality of relational patterns
AU - Takase, Sho
AU - Okazaki, Naoaki
AU - Inui, Kentaro
N1 - Funding Information:
This work was partially supported by Grant-in-Aid for JSPS Fellows Grant no. 26.5820, JSPS KAKENHI Grant number 15H05318, and JST, CREST.
Publisher Copyright:
© 2016 The Authors. Published by Elsevier Ltd.
PY - 2016/4/1
Y1 - 2016/4/1
N2 - Vector representation is a common approach for expressing the meaning of a relational pattern. Most previous work obtained a vector of a relational pattern based on the distribution of its context words (e.g., arguments of the relational pattern), regarding the pattern as a single 'word'. However, this approach suffers from the data sparseness problem, because relational patterns are productive, i.e., produced by combinations of words. To address this problem, we propose a novel method for computing the meaning of a relational pattern based on the semantic compositionality of constituent words. We extend the Skip-gram model (Mikolov et al., 2013) to handle semantic compositions of relational patterns using recursive neural networks. The experimental results show the superiority of the proposed method for modeling the meanings of relational patterns, and demonstrate the contribution of this work to the task of relation extraction.
AB - Vector representation is a common approach for expressing the meaning of a relational pattern. Most previous work obtained a vector of a relational pattern based on the distribution of its context words (e.g., arguments of the relational pattern), regarding the pattern as a single 'word'. However, this approach suffers from the data sparseness problem, because relational patterns are productive, i.e., produced by combinations of words. To address this problem, we propose a novel method for computing the meaning of a relational pattern based on the semantic compositionality of constituent words. We extend the Skip-gram model (Mikolov et al., 2013) to handle semantic compositions of relational patterns using recursive neural networks. The experimental results show the superiority of the proposed method for modeling the meanings of relational patterns, and demonstrate the contribution of this work to the task of relation extraction.
KW - Knowledge acquisition
KW - Natural language processing
KW - Recursive neural network
KW - Relation extraction
KW - Semantic compositionality
KW - Word embedding
UR - http://www.scopus.com/inward/record.url?scp=84960089805&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84960089805&partnerID=8YFLogxK
U2 - 10.1016/j.engappai.2016.01.027
DO - 10.1016/j.engappai.2016.01.027
M3 - Article
AN - SCOPUS:84960089805
SN - 0952-1976
VL - 50
SP - 256
EP - 264
JO - Engineering Applications of Artificial Intelligence
JF - Engineering Applications of Artificial Intelligence
ER -