TY - GEN
T1 - Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution
AU - Konno, Ryuto
AU - Kiyono, Shun
AU - Matsubayashi, Yuichiroh
AU - Ouchi, Hiroki
AU - Inui, Kentaro
N1 - Funding Information:
We thank anonymous reviewers for their insightful comments. We thank Jun Suzuki, Ana Brassard, Tatsuki Kuribayashi, Takumi Ito, Shiki Sato, and Yosuke Kishinami for their valuable comments. This work was supported by JSPS KAKENHI Grant Numbers JP19K12112, JP19K20351, and JP19H04162.
Publisher Copyright:
© 2021 Association for Computational Linguistics
PY - 2021
Y1 - 2021
N2 - Masked language models (MLMs) have contributed to drastic performance improvements in zero anaphora resolution (ZAR). To further improve this approach, in this study, we made two proposals. The first is a new pretraining task that trains MLMs on anaphoric relations with explicit supervision, and the second is a new finetuning method that remedies a notorious issue, the pretrain-finetune discrepancy. Our experiments on Japanese ZAR demonstrated that our two proposals boost state-of-the-art performance, and our detailed analysis provides new insights into the remaining challenges.
AB - Masked language models (MLMs) have contributed to drastic performance improvements in zero anaphora resolution (ZAR). To further improve this approach, in this study, we made two proposals. The first is a new pretraining task that trains MLMs on anaphoric relations with explicit supervision, and the second is a new finetuning method that remedies a notorious issue, the pretrain-finetune discrepancy. Our experiments on Japanese ZAR demonstrated that our two proposals boost state-of-the-art performance, and our detailed analysis provides new insights into the remaining challenges.
UR - http://www.scopus.com/inward/record.url?scp=85127436320&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85127436320&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85127436320
T3 - EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
SP - 3790
EP - 3806
BT - EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
PB - Association for Computational Linguistics (ACL)
T2 - 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021
Y2 - 7 November 2021 through 11 November 2021
ER -