Joint Inference over a Lightly Supervised Information Extraction Pipeline: Towards Event Coreference Resolution for Resource-Scarce Languages
Chen Chen and Vincent Ng.
Proceedings of the 30th AAAI Conference on Artificial Intelligence, pp. 2913-2920, 2016.
Abstract
We address two key challenges in end-to-end event coreference resolution research: (1) the error propagation problem, where an event coreference resolver has to take as input the noisy outputs produced by its upstream components in the standard information extraction (IE) pipeline; and (2) the data annotation bottleneck, where manually annotating data for all the components in the IE pipeline is prohibitively expensive, as is the case for the vast majority of the world's natural languages, for which such annotated resources are not readily available. To address these problems, we propose to perform joint inference over a lightly supervised IE pipeline, where all the models are trained using either active learning or unsupervised learning. Using our approach, only 25% of the training sentences in the Chinese portion of the ACE 2005 corpus need to be annotated with entity and event mentions in order for our event coreference resolver to surpass its fully supervised counterpart in performance.
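The abstract reports that active learning lets the resolver match full supervision with only 25% of the training sentences annotated. As a rough illustration of how such a budget-constrained selection step can work, here is a minimal sketch of pool-based active learning with uncertainty (entropy) sampling; this is a generic technique, not the paper's specific method, and all names and the toy confidence model are illustrative.

```python
import math

def entropy(probs):
    """Shannon entropy of a probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_annotation(pool, predict_proba, budget):
    """Pool-based active learning: rank unlabeled sentences by the
    model's predictive entropy and pick the most uncertain ones
    to send to a human annotator."""
    scored = [(entropy(predict_proba(s)), s) for s in pool]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [s for _, s in scored[:budget]]

# Toy stand-in for a model's confidence (illustrative only):
# a deterministic pseudo-probability per sentence index.
pool = ["sentence-%d" % i for i in range(20)]

def toy_proba(sentence):
    i = int(sentence.split("-")[1])
    p = 0.5 + 0.45 * math.sin(i)   # stays within (0.05, 0.95)
    return [p, 1.0 - p]

# Annotate 25% of the pool, mirroring the abstract's budget.
chosen = select_for_annotation(pool, toy_proba, budget=len(pool) // 4)
print(len(chosen))  # 5 sentences selected for annotation
```

Sentences whose predicted distribution is closest to uniform (entropy near its maximum) are selected first, on the intuition that labeling them gives the model the most information per annotated sentence.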
BibTeX entry
@InProceedings{Chen+Ng:16a,
author = {Chen Chen and Vincent Ng},
title = {Joint Inference over a Lightly Supervised Information Extraction Pipeline: Towards Event Coreference Resolution for Resource-Scarce Languages},
booktitle = {Proceedings of the 30th AAAI Conference on Artificial Intelligence},
pages = {2913--2920},
year = 2016}