Pre-Trained Language Models for Text Generation: A Survey
Article Status
Published
Authors/contributors
- Li, Junyi (Author)
- Tang, Tianyi (Author)
- Zhao, Wayne Xin (Author)
- Nie, Jian-Yun (Author)
- Wen, Ji-Rong (Author)
Title
Pre-Trained Language Models for Text Generation: A Survey
Abstract
Text generation aims to produce plausible and readable text in human language from input data. The resurgence of deep learning has greatly advanced this field, in particular with the help of neural generation models based on pre-trained language models (PLMs). Text generation based on PLMs is viewed as a promising approach in both academia and industry. In this article, we provide a survey of the utilization of PLMs in text generation. We begin by introducing two key aspects of applying PLMs to text generation: (1) how to design an effective PLM to serve as the generation model; and (2) how to effectively optimize PLMs given the reference text and ensure that the generated texts satisfy special text properties. We then present the major challenges that arise in these aspects, as well as possible solutions for them. We also include a summary of various useful resources and typical text generation applications based on PLMs. Finally, we highlight future research directions that will further improve these PLMs for text generation. This comprehensive survey is intended to help researchers interested in text generation problems learn the core concepts, main techniques, and latest developments in this area based on PLMs.
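Note: the abstract's two key aspects can be illustrated with a minimal sketch (not from the article, and not the authors' code): using a pre-trained language model as the generation model, and optimizing it against a reference text with the standard token-level cross-entropy objective. It assumes the Hugging Face transformers library; GPT-2 and all generation parameters are illustrative choices, not recommendations from the survey.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load a pre-trained language model (PLM) and its tokenizer.
# GPT-2 is used here purely as an example PLM.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# (1) PLM as the generation model: produce text from an input prompt.
prompt = tokenizer("Text generation aims to", return_tensors="pt")
output_ids = model.generate(**prompt, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

# (2) Optimizing the PLM given a reference text: the usual fine-tuning objective
# is token-level cross-entropy (teacher forcing), obtained by passing the
# reference tokens as labels.
reference = tokenizer("A plausible, readable reference sentence.", return_tensors="pt")
loss = model(**reference, labels=reference["input_ids"]).loss
loss.backward()  # gradients for one optimization step (optimizer omitted for brevity)
```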
Publication
ACM Computing Surveys
Volume
56
Issue
9
Pages
1-39
Date
2024-10-31
Journal Abbr
ACM Comput. Surv.
Language
en
DOI
10.1145/3649449
ISSN
0360-0300, 1557-7341
Short Title
Pre-Trained Language Models for Text Generation
Accessed
26/06/2024, 19:20
Library Catalogue
DOI.org (Crossref)
Extra
Citation Key: li2024b
Citation
Li, J., Tang, T., Zhao, W. X., Nie, J.-Y., & Wen, J.-R. (2024). Pre-Trained Language Models for Text Generation: A Survey. ACM Computing Surveys, 56(9), 1–39. https://doi.org/10.1145/3649449