Abstract
Extracting entities and relations from text is a fundamental task in information extraction. Existing extraction models often output their confident predictions directly, without any reconsideration or double-checking, resulting in avoidable mistakes and sub-optimal performance. In this paper, we propose a novel coarse-to-fine extraction framework, which first extracts high-potential relations and entities via knowledge distillation, and then rechecks the predictions in a fine-grained manner via a handcrafted natural language inference (NLI) task. Specifically, based on the knowledge distillation mechanism, we train multiple teacher models iteratively through an adaptive loss function, so that each teacher concentrates on the data that the others handle poorly. These complementary teacher models then provide valuable soft-label information for training a considerate student model, enabling it to generate reliable preliminary predictions. Further, the generated candidate relations and entities are formulated as hypotheses which, together with the original sentences as premises, serve as the input to an NLI model. Considering the linguistic diversity of relational expressions, we automatically generate varied semantic templates for hypotheses through an N-gram mining strategy. Moreover, because sentences may state multiple facts, a relation-guided Gaussian attention is designed to reduce the gap between a single-relation hypothesis and a multi-relation premise. To enable efficient training, we also develop several ways to generate high-quality negative samples, which help the NLI model learn to identify errors. Experimental results show that the proposed method is effective and outperforms other strong baselines on public benchmarks.
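The fine-grained recheck stage described above can be sketched as follows: each predicted (subject, relation, object) triple is expanded into natural-language hypotheses via relation-specific templates and verified against the source sentence with an NLI entailment score. This is a minimal illustration, not the paper's implementation; the template wordings are handwritten here (the paper mines them automatically with an N-gram strategy), and `nli_entails` is a placeholder for a real NLI model.

```python
# Relation -> natural-language hypothesis templates (illustrative; the
# paper derives such templates automatically via N-gram mining).
TEMPLATES = {
    "founded_by": ["{obj} founded {subj}.", "{subj} was founded by {obj}."],
    "located_in": ["{subj} is located in {obj}."],
}

def make_hypotheses(triple):
    """Expand one predicted triple into all template-based hypotheses."""
    subj, rel, obj = triple
    return [t.format(subj=subj, obj=obj) for t in TEMPLATES[rel]]

def recheck(premise, triple, nli_entails, threshold=0.5):
    """Keep a coarse prediction only if at least one hypothesis is
    entailed by the premise. `nli_entails(premise, hypothesis)` should
    return an entailment probability; a trained NLI model (with the
    paper's relation-guided Gaussian attention) would be plugged in here."""
    scores = [nli_entails(premise, h) for h in make_hypotheses(triple)]
    return max(scores) >= threshold

sentence = "SpaceX, founded by Elon Musk, is headquartered in Hawthorne."
triple = ("SpaceX", "founded_by", "Elon Musk")
hyps = make_hypotheses(triple)
# hyps == ["Elon Musk founded SpaceX.", "SpaceX was founded by Elon Musk."]
```

A prediction that no template-based hypothesis can ground in the premise is rejected, which is how the recheck filters the avoidable mistakes of the coarse stage.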
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2024 IEEE 40th International Conference on Data Engineering, ICDE 2024 |
| Publisher | IEEE Computer Society |
| Pages | 1009-1022 |
| Number of pages | 14 |
| ISBN (Electronic) | 9798350317152 |
| DOIs | |
| Publication status | Published - 2024 |
| Event | 40th IEEE International Conference on Data Engineering, ICDE 2024 - Utrecht, Netherlands. Duration: 13 May 2024 → 17 May 2024 |
Publication series
| Name | Proceedings - International Conference on Data Engineering |
|---|---|
| ISSN (Print) | 1084-4627 |
| ISSN (Electronic) | 2375-0286 |
Conference
| Conference | 40th IEEE International Conference on Data Engineering, ICDE 2024 |
|---|---|
| Country/Territory | Netherlands |
| City | Utrecht |
| Period | 13/05/24 → 17/05/24 |
Bibliographical note
Publisher Copyright: © 2024 IEEE.
Keywords
- Coarse-to-fine
- Entity-Relation Joint Extraction
- Knowledge Distillation
- Natural Language Inference