Constrained Multi-Task Learning for Event Coreference Resolution

Jing Lu and Vincent Ng.
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 4504-4514, 2021.

Abstract

We propose a neural event coreference model in which event coreference is jointly trained with five auxiliary tasks: trigger detection, entity coreference, anaphoricity determination, realis detection, and argument extraction. To guide the learning of this complex model, we incorporate cross-task consistency constraints into the learning process as soft constraints by designing penalty functions. In addition, we propose the novel idea of viewing entity coreference and event coreference as a single coreference task, which we believe is a step towards a unified model of coreference resolution. The resulting model achieves state-of-the-art results on the KBP 2017 event coreference dataset.
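The idea of enforcing cross-task consistency as soft constraints can be illustrated with a minimal sketch: a penalty term is added to the multi-task loss whenever two tasks make mutually inconsistent predictions (for example, scoring two event mentions as coreferent while assigning them different realis labels). The function names, the specific constraint, and the weighting scheme below are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of a soft cross-task consistency constraint.
# Assumes coref_score is a probability in [0, 1] and realis labels are
# categorical strings; none of these names come from the paper itself.

def consistency_penalty(coref_score, realis_a, realis_b):
    """Penalty that grows when two mentions are scored as coreferent
    (high coref_score) yet receive different realis labels."""
    disagreement = 1.0 if realis_a != realis_b else 0.0
    return coref_score * disagreement

def total_loss(task_losses, coref_score, realis_a, realis_b, lam=0.1):
    """Multi-task objective: sum of per-task losses plus a weighted
    soft-constraint penalty (lam controls the constraint's strength)."""
    return sum(task_losses) + lam * consistency_penalty(
        coref_score, realis_a, realis_b)
```

Because the constraint enters the loss as a penalty rather than a hard filter, the model is nudged toward consistent joint predictions while remaining free to violate a constraint when the evidence for it is strong.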

BibTeX entry

@InProceedings{Lu+Ng:21b,
  author = {Jing Lu and Vincent Ng},
  title = {Constrained Multi-Task Learning for Event Coreference Resolution},
  booktitle = {Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies},
  pages = {4504--4514},
  year = 2021}