T4: Stylized Text Generation Approaches and Applications

Lili Mou and Olga Vechtomova

Live Session: Jul 5 (17:30-21:00 GMT)
Abstract: Text generation has played an important role in various applications of natural language processing (NLP), and in recent studies, researchers have paid increasing attention to modeling and manipulating the style of the generated text, which we call stylized text generation. In this tutorial, we will provide a comprehensive literature review in this direction. We start from the definition of style and different settings of stylized text generation, illustrated with various applications. Then, we present different settings of stylized generation, such as style-conditioned generation, style-transfer generation, and style-adversarial generation. In each setting, we delve deep into machine learning methods, including embedding learning techniques to represent style, adversarial learning, and reinforcement learning with cycle consistency to match content but distinguish different styles. We also introduce current approaches to evaluating stylized text generation systems. We conclude our tutorial by presenting the challenges of stylized text generation and discussing future directions, such as small-data training, non-categorical style modeling, and a generalized scope of style transfer (e.g., controlling the syntax as a style).
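
To make the style-conditioned setting concrete, the sketch below illustrates one common way to represent style with a learned embedding that conditions a decoder at every step. This is a minimal, hypothetical PyTorch example, not code from the tutorial; the class name, dimensions, and parameter choices are assumptions made only for illustration.

import torch
import torch.nn as nn

class StyleConditionedDecoder(nn.Module):
    # Hypothetical example: a decoder that conditions generation on a
    # categorical style label via a learned style embedding.
    def __init__(self, vocab_size=10000, num_styles=2, emb_dim=256,
                 style_dim=32, hidden_dim=512):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, emb_dim)
        # One learned vector per style category (e.g., formal vs. informal).
        self.style_emb = nn.Embedding(num_styles, style_dim)
        self.rnn = nn.GRU(emb_dim + style_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, style_ids):
        # tokens: (batch, seq_len); style_ids: (batch,)
        tok = self.token_emb(tokens)                           # (B, T, emb_dim)
        sty = self.style_emb(style_ids)                        # (B, style_dim)
        sty = sty.unsqueeze(1).expand(-1, tokens.size(1), -1)  # (B, T, style_dim)
        hidden, _ = self.rnn(torch.cat([tok, sty], dim=-1))
        return self.out(hidden)                                # next-token logits

# Usage: the same decoder produces different outputs by switching style_ids.
decoder = StyleConditionedDecoder()
tokens = torch.randint(0, 10000, (4, 12))
logits_style0 = decoder(tokens, torch.zeros(4, dtype=torch.long))
logits_style1 = decoder(tokens, torch.ones(4, dtype=torch.long))

In practice, the style embedding may instead be used to initialize the decoder state or be injected through attention; concatenation at each step is simply one straightforward variant of the embedding-based conditioning the abstract mentions.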

Information about the virtual format of this tutorial: This tutorial has a prerecorded talk on this page (see below) that you can watch anytime during the conference. It also has a live session that will be conducted on Zoom and will be livestreamed on this page. Additionally, it has a chat window that you can use to have discussions with the tutorial teachers and other attendees anytime during the conference.

Live Session