Welcome

Text-editing models have recently become a prominent alternative to seq2seq models for monolingual text-generation tasks such as grammatical error correction, text simplification, and style transfer. These tasks share a common trait – they exhibit a large amount of textual overlap between the source and target texts.

Text-editing models take advantage of this observation and learn to generate the output by predicting edit operations applied to the source sequence. In contrast, seq2seq models generate the output word by word from scratch, which makes them slow at inference time. Text-editing models provide several benefits over seq2seq models, including faster inference, higher sample efficiency, and better control and interpretability of the outputs.
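As a rough illustration of the edit-operation idea, here is a minimal sketch (not code from the tutorial) of how a tagging-based text-editing model realizes its output. The KEEP/DELETE tag scheme with optional insertions is a simplified assumption, in the spirit of tagging models such as LaserTagger:

```python
# Hypothetical, simplified tag scheme: each source token gets an operation
# (KEEP or DELETE) plus an optional phrase to insert before it. The output
# is realized by applying these tags to the source sequence, so most tokens
# are simply copied instead of being generated from scratch.

def realize(source_tokens, tags):
    """Apply (operation, insertion) tags to the source tokens."""
    output = []
    for token, (op, insertion) in zip(source_tokens, tags):
        if insertion:             # phrase predicted for insertion before this token
            output.append(insertion)
        if op == "KEEP":          # copy the source token unchanged
            output.append(token)
        # op == "DELETE": drop the source token
    return " ".join(output)

# Toy grammatical error correction example:
# "He go to school ." -> "He goes to school ."
source = ["He", "go", "to", "school", "."]
tags = [("KEEP", None), ("DELETE", "goes"), ("KEEP", None),
        ("KEEP", None), ("KEEP", None)]
print(realize(source, tags))  # -> He goes to school .
```

Because only a small number of tokens are actually edited, the model's prediction task is much closer to sequence labeling than to full autoregressive generation, which is where the speed and sample-efficiency gains come from.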

This tutorial provides a comprehensive overview of text-editing models and current state-of-the-art approaches, analyzing their pros and cons. We discuss challenges related to deployment and how these models help to mitigate hallucination and bias, both pressing challenges in the field of text generation.

Venue

The Text Generation with Text-Editing Models tutorial will be held in a hybrid format at NAACL 2022 on July 10th, 2022, from 9:00am to 12:30pm (PDT). Additionally, there are Q&A sessions right before the tutorial (8:00am - 8:45am) and right after it (12:30pm - 1:00pm). The in-person tutorial and the Q&A sessions will take place at Ballroom Columbia A.

This tutorial and the full list of NAACL 2022 tutorials can be found on the conference's program page.

Video

Here is a pre-recorded video of the tutorial:

Slides

Reading list

Prerequisites

Text-Editing Methods Discussed during the Tutorial

Organizers

Eric Malmi
Google
Yue Dong
McGill University & Mila
Jonathan Mallinson
Google

Aleksandr Chuklin
Google
Jakub Adamek
Google
Daniil Mirylenka
Google

Felix Stahlberg
Google
Sebastian Krause
Google
Shankar Kumar
Google

Aliaksei Severyn
Google

Acknowledgements

We would like to thank Cesar Ilharco.