IMoJIE: Iterative Memory-Based Joint Open Information Extraction

Kolluru, Keshav; Aggarwal, Samarth; Rathore, Vipul; Mausam; Chakrabarti, Soumen (2020) IMoJIE: Iterative Memory-Based Joint Open Information Extraction. In: 58th Annual Meeting of the Association for Computational Linguistics.

Full text: PDF (1MB)

Official URL: http://doi.org/10.18653/v1/2020.acl-main.521

Abstract

While traditional systems for Open Information Extraction were statistical and rule-based, neural models have recently been introduced for the task. Our work builds upon CopyAttention, a sequence generation OpenIE model (Cui et al., 2018). Our analysis reveals that CopyAttention produces a constant number of extractions per sentence, and its extracted tuples often express redundant information. We present IMoJIE, an extension to CopyAttention, which produces the next extraction conditioned on all previously extracted tuples. This approach overcomes both shortcomings of CopyAttention, resulting in a variable number of diverse extractions per sentence. We train IMoJIE on training data bootstrapped from extractions of several non-neural systems, which have been automatically filtered to reduce redundancy and noise. IMoJIE outperforms CopyAttention by about 18 F1 points, and a strong BERT-based baseline by 2 F1 points, establishing a new state of the art for the task.
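The iterative decoding scheme described in the abstract can be illustrated with a minimal Python sketch: each newly generated tuple is appended to the model's input, so later extractions are conditioned on everything extracted so far, and decoding stops when the model emits an end-of-extractions marker. The names below (generate_next, END_OF_EXTRACTIONS, dummy_model) are hypothetical stand-ins for illustration, not the released IMoJIE code.

    from typing import Callable, List

    # Hypothetical sketch of iterative, memory-conditioned extraction.
    # Each new tuple is generated from the sentence plus all previous tuples,
    # which is what lets the model avoid redundant extractions and decide
    # when to stop (variable number of extractions per sentence).

    END_OF_EXTRACTIONS = "<eoe>"  # assumed stop marker, for illustration only

    def iterative_extract(sentence: str,
                          generate_next: Callable[[str], str],
                          max_extractions: int = 10) -> List[str]:
        """Repeatedly query the model, appending previous tuples to its input."""
        extractions: List[str] = []
        while len(extractions) < max_extractions:
            # The decoder sees the sentence followed by the tuples extracted so far.
            model_input = " ".join([sentence] + extractions)
            tuple_text = generate_next(model_input)
            if tuple_text == END_OF_EXTRACTIONS:
                break
            extractions.append(tuple_text)
        return extractions

    # Toy stand-in for the trained seq2seq model, only for demonstration.
    def dummy_model(model_input: str) -> str:
        if "(" in model_input:          # a tuple is already present in the memory
            return END_OF_EXTRACTIONS
        return "(Obama; was born in; Hawaii)"

    print(iterative_extract("Obama was born in Hawaii.", dummy_model))
    # -> ['(Obama; was born in; Hawaii)']

In the actual system the generate_next step is a learned sequence-to-sequence decoder; the sketch only shows the conditioning loop that distinguishes IMoJIE from a fixed-length extractor such as CopyAttention.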

Item Type: Conference or Workshop Item (Paper)
Source: Copyright of this article belongs to the Association for Computational Linguistics
ID Code: 130878
Deposited On: 01 Dec 2022 05:34
Last Modified: 27 Jan 2023 09:31
