Line Graph Enhanced AMR-to-Text Generation with Mix-Order Graph Attention Networks

Yanbin Zhao, Lu Chen, Zhi Chen, Ruisheng Cao, Su Zhu, Kai Yu


Generation (Long Paper)

Session 1B: Jul 6 (06:00-07:00 GMT)
Session 2A: Jul 6 (08:00-09:00 GMT)
Abstract: Efficient structure encoding for graphs with labeled edges is an important yet challenging problem in many graph-based models. This work focuses on AMR-to-text generation, a graph-to-sequence task that aims to recover natural language from Abstract Meaning Representations (AMR). Existing graph-to-sequence approaches generally use graph neural networks as their encoders, which have two limitations: 1) the message propagation process in AMR graphs is guided only by first-order adjacency information; 2) the relationships between labeled edges are not fully considered. In this work, we propose a novel graph encoding framework that can effectively explore the edge relations. We also adopt graph attention networks with higher-order neighborhood information to encode the rich structure in AMR graphs. Experimental results show that our approach achieves new state-of-the-art performance on English AMR benchmark datasets. Ablation analyses further demonstrate that both edge relations and higher-order information are beneficial to graph-to-sequence modeling.
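The two ingredients named in the abstract, edge relations captured via a line graph and higher-order neighborhood information, can be illustrated with a small sketch. The snippet below is not the authors' implementation; it uses networkx and numpy on a toy AMR-like graph (invented here for illustration) to show how a line graph turns labeled edges into nodes, making edge-to-edge relations explicit, and how k-order adjacency masks could restrict or mix attention in a GAT-style encoder.

```python
# A minimal sketch, not the authors' code: the toy graph, node names, and
# variable names below are illustrative assumptions.
import networkx as nx
import numpy as np

# Toy AMR-like graph: concept nodes connected by labeled relation edges.
g = nx.DiGraph()
g.add_edge("want-01", "boy", label=":ARG0")
g.add_edge("want-01", "go-02", label=":ARG1")
g.add_edge("go-02", "boy", label=":ARG0")

# 1) Line graph: each line-graph node stands for one labeled edge of `g`.
#    For a directed graph, networkx links edge (u, v) to edge (v, w), so
#    relations between adjacent labeled edges become explicit.
lg = nx.line_graph(g)
print(sorted(lg.nodes()))  # [('go-02', 'boy'), ('want-01', 'boy'), ('want-01', 'go-02')]
print(list(lg.edges()))    # [(('want-01', 'go-02'), ('go-02', 'boy'))]

# 2) Higher-order neighborhoods: with self-loops added, the k-th power of
#    the adjacency matrix marks nodes reachable within k hops. A GAT-style
#    layer could restrict or mix attention using these k-order masks.
nodes = sorted(g.nodes())
adj = nx.to_numpy_array(g, nodelist=nodes) + np.eye(len(nodes))
masks = {k: (np.linalg.matrix_power(adj, k) > 0).astype(int) for k in (1, 2, 3)}
print(masks[2])  # 1s mark nodes reachable within two hops (including self-loops)
```

A full encoder would attach learned embeddings to the original and line-graph nodes and combine attention over several of these neighborhood orders; the sketch only shows how the underlying structures can be derived.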

Similar Papers

AMR-To-Text Generation with Graph Transformer
Tianming Wang, Xiaojun Wan, Hanqi Jin (tacl.1805)

GPT-too: A Language-Model-First Approach for AMR-to-Text Generation
Manuel Mager, Ramón Fernandez Astudillo, Tahira Naseem, Md Arafat Sultan, Young-Suk Lee, Radu Florian, Salim Roukos (main.167)

Orthogonal Relation Transforms with Graph Context Modeling for Knowledge Graph Embedding
Yun Tang, Jing Huang, Guangtao Wang, Xiaodong He, Bowen Zhou (main.241)