Abstract
Abstract Meaning Representation (AMR) is a semantic representation for NLP/NLU. In this paper, we propose to use it for data augmentation in NLP. Our data augmentation technique, AMR-DA, converts a sample sentence into an AMR graph, modifies the graph according to various data augmentation policies, and then generates augmented sentences from the modified graphs. Our method combines the advantages of sentence-level techniques such as back translation and token-level techniques such as EDA (Easy Data Augmentation). To evaluate its effectiveness, we apply it to the English tasks of semantic textual similarity (STS) and text classification. For STS, our experiments show that AMR-DA boosts the performance of state-of-the-art models on several STS benchmarks. For text classification, AMR-DA outperforms EDA and AEDA and yields more robust improvements.
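The parse-modify-generate pipeline described above can be illustrated with a minimal sketch. This is not the authors' implementation: AMR-DA relies on neural AMR parsing and generation models, which are omitted here. The toy graph (in Penman notation) and the `swap_concept` policy are purely illustrative stand-ins for a graph-level augmentation step such as synonym substitution.

```python
import re

# Toy AMR graph in Penman notation for "The boy wants to go."
amr = """(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
      :ARG0 b))"""

def swap_concept(graph: str, old: str, new: str) -> str:
    """Graph-level edit: replace one concept node in a Penman-notation
    string (a crude stand-in for an AMR-DA augmentation policy)."""
    return re.sub(rf"/ {re.escape(old)}\b", f"/ {new}", graph)

# Step 2 of the pipeline: modify the graph; a generator model would
# then produce a sentence like "The child wants to go." from it.
augmented = swap_concept(amr, "boy", "child")
```

In the full system, the final step would feed `augmented` to an AMR-to-text generator, so one source sentence can yield many paraphrase-like augmentations from differently edited graphs.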
| Original language | English |
|---|---|
| Title of host publication | ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Findings of ACL 2022 |
| Editors | Smaranda Muresan, Preslav Nakov, Aline Villavicencio |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 3082-3098 |
| Number of pages | 17 |
| ISBN (Electronic) | 9781955917254 |
| DOIs | |
| Publication status | Published - 2022 |
| Externally published | Yes |
| Event | Findings of the Association for Computational Linguistics: ACL 2022, Dublin, Ireland. Duration: 22 May 2022 → 27 May 2022 |
Publication series
| Name | Proceedings of the Annual Meeting of the Association for Computational Linguistics |
|---|---|
| ISSN (Print) | 0736-587X |
Conference
| Conference | Findings of the Association for Computational Linguistics: ACL 2022 |
|---|---|
| Country/Territory | Ireland |
| City | Dublin |
| Period | 22/05/22 → 27/05/22 |
Bibliographical note
Publisher Copyright: © 2022 Association for Computational Linguistics.