Data-to-Text Generation with Iterative Text Editing

Publication at Faculty of Mathematics and Physics | 2020

Abstract

We present a novel approach to data-to-text generation based on iterative text editing. Our approach maximizes the completeness and semantic accuracy of the output text while leveraging recent pretrained models for text editing (LaserTagger) and language modelling (GPT-2) to improve text fluency.

To this end, we first transform the data into text using trivial per-item lexicalizations and then iteratively improve the resulting text with a neural model trained for the sentence-fusion task. The model output is filtered by a simple heuristic and reranked with an off-the-shelf pretrained language model.
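As a rough illustration of this pipeline (not the authors' implementation), the sketch below assumes a hypothetical fuse_candidates function standing in for the trained sentence-fusion model, a toy lexicalization template, and a simple content-checking heuristic; only the GPT-2 reranking via the Hugging Face transformers library uses a real API.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Off-the-shelf GPT-2 used only for reranking candidate sentences.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def lm_score(text: str) -> float:
    """Mean negative log-likelihood per token under GPT-2 (lower = more fluent)."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        loss = lm(**enc, labels=enc["input_ids"]).loss
    return loss.item()

def lexicalize(triple) -> str:
    """Trivial per-item lexicalization, e.g. (s, p, o) -> 's p o.' (placeholder template)."""
    subj, pred, obj = triple
    return f"{subj} {pred} {obj}."

def fuse_candidates(sent_a: str, sent_b: str) -> list[str]:
    """Hypothetical wrapper around a sentence-fusion model (e.g. LaserTagger);
    should return several fused candidates for the two input sentences."""
    raise NotImplementedError  # plug in a trained sentence-fusion model here

def keeps_content(candidate: str, values: list[str]) -> bool:
    """Simple heuristic filter (an assumption): keep a candidate only if all
    data values still appear verbatim in the text."""
    return all(v.lower() in candidate.lower() for v in values)

def generate(triples) -> str:
    """Iteratively fuse the trivial lexicalizations into a single fluent text."""
    sentences = [lexicalize(t) for t in triples]
    text = sentences[0]
    for nxt, triple in zip(sentences[1:], triples[1:]):
        candidates = [c for c in fuse_candidates(text, nxt)
                      if keeps_content(c, [str(x) for x in triple])]
        # Fall back to plain concatenation if no candidate survives the filter.
        text = min(candidates, key=lm_score) if candidates else f"{text} {nxt}"
    return text
```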

We evaluate our approach on two major data-to-text datasets (WebNLG, Cleaned E2E) and analyze its caveats and benefits. Furthermore, we show that our formulation of data-to-text generation opens up the possibility of zero-shot domain adaptation using a general-domain dataset for sentence fusion.