Modeling semantic compositionality of relational patterns

Sho Takase, Naoaki Okazaki, Kentaro Inui

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)


Vector representation is a common approach for expressing the meaning of a relational pattern. Most previous work obtained a vector of a relational pattern based on the distribution of its context words (e.g., arguments of the relational pattern), regarding the pattern as a single 'word'. However, this approach suffers from the data sparseness problem, because relational patterns are productive, i.e., produced by combinations of words. To address this problem, we propose a novel method for computing the meaning of a relational pattern based on the semantic compositionality of constituent words. We extend the Skip-gram model (Mikolov et al., 2013) to handle semantic compositions of relational patterns using recursive neural networks. The experimental results show the superiority of the proposed method for modeling the meanings of relational patterns, and demonstrate the contribution of this work to the task of relation extraction.
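The composition described in the abstract can be illustrated with a minimal sketch: constituent word vectors of a relational pattern are combined bottom-up with a recursive neural network, and the composed vector is then used, Skip-gram style, to score context words. All names, the toy vocabulary, the left-branching tree, and the sigmoid scoring are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # toy embedding dimensionality (illustrative, not the paper's setting)

# Toy vocabulary of constituent and context words (hypothetical example).
vocab = ["be", "cause", "of", "argument_x", "argument_y"]
word_vec = {w: rng.standard_normal(DIM) * 0.1 for w in vocab}

# Recursive-neural-network composition: a parent vector is computed from
# two children as p = tanh(W [c1; c2] + b), applied bottom-up.
W = rng.standard_normal((DIM, 2 * DIM)) * 0.1
b = np.zeros(DIM)

def compose(c1, c2):
    """Combine two child vectors into one parent vector."""
    return np.tanh(W @ np.concatenate([c1, c2]) + b)

def pattern_vector(words):
    """Left-branching composition of a multi-word relational pattern.
    (The actual tree structure used in the paper may differ.)"""
    vec = word_vec[words[0]]
    for w in words[1:]:
        vec = compose(vec, word_vec[w])
    return vec

# Skip-gram-style objective: the composed pattern vector should predict
# its context words; here a sigmoid of a dot product stands in for the
# (negative-sampling) prediction score.
ctx_vec = {w: rng.standard_normal(DIM) * 0.1 for w in vocab}

def context_score(pattern_words, context_word):
    v = pattern_vector(pattern_words)
    return 1.0 / (1.0 + np.exp(-v @ ctx_vec[context_word]))

s = context_score(["be", "cause", "of"], "argument_x")
```

Because the pattern vector is built compositionally from word vectors rather than looked up as a single token, unseen combinations of known words still receive a representation, which is the point made about data sparseness above.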

Original language: English
Pages (from-to): 256-264
Number of pages: 9
Journal: Engineering Applications of Artificial Intelligence
Publication status: Published - 2016 Apr 1


Keywords

  • Knowledge acquisition
  • Natural language processing
  • Recursive neural network
  • Relation extraction
  • Semantic compositionality
  • Word embedding

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering


