Cutting-off redundant repeating generations for neural abstractive summarization

Jun Suzuki, Masaaki Nagata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

32 Citations (Scopus)

Abstract

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target vocabulary word in the encoder and to control the output words in the decoder based on that estimate. Our method shows a significant improvement over a strong RNN-based encoder-decoder baseline and achieves its best results on an abstractive summarization benchmark.
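The idea in the abstract can be illustrated with a toy decoding loop: once a word has been emitted as many times as its estimated upper-bound frequency allows, it is masked out of the candidate set. This is a minimal sketch under stated assumptions; the function name, the pre-computed `caps` dictionary, and the per-step score lists are hypothetical stand-ins for the paper's learned frequency estimator and RNN decoder.

```python
def capped_greedy_decode(step_scores, caps, eos=0):
    """Greedy decoding with per-word output caps (hypothetical sketch).

    step_scores: list of per-step score lists over the vocabulary
                 (stand-in for the decoder's softmax logits).
    caps: estimated maximum output count per vocabulary id
          (stand-in for the encoder-side upper-bound frequency estimate).
    Returns the decoded id sequence; a word that has reached its cap
    can no longer be generated.
    """
    counts = {w: 0 for w in caps}
    output = []
    for scores in step_scores:
        # Mask words whose emitted count already equals the estimated cap.
        masked = [
            s if counts.get(w, 0) < caps.get(w, float("inf")) else float("-inf")
            for w, s in enumerate(scores)
        ]
        w = max(range(len(masked)), key=lambda i: masked[i])
        if w == eos:
            break
        output.append(w)
        counts[w] = counts.get(w, 0) + 1
    return output


# Toy vocabulary: 0 = EOS, 1 and 2 are content words.
# Word 2 scores highest at the first two steps, but its cap of 1
# forces the decoder to fall back to word 1 at step two.
step_scores = [[0.1, 0.2, 0.9], [0.1, 0.2, 0.9], [0.9, 0.2, 0.1]]
caps = {1: 2, 2: 1}
print(capped_greedy_decode(step_scores, caps))  # → [2, 1]
```

Without the cap, greedy decoding would emit word 2 twice in a row; the mask cuts the repeat off after its estimated frequency is exhausted.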

Original language: English
Title of host publication: Short Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 291-297
Number of pages: 7
ISBN (Electronic): 9781510838604
Publication status: Published - 2017
Event: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Valencia, Spain
Duration: 3 Apr 2017 – 7 Apr 2017

Publication series

Name: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017 - Proceedings of Conference
Volume: 2

Conference

Conference: 15th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2017
Country/Territory: Spain
City: Valencia
Period: 3 Apr 2017 – 7 Apr 2017
