Topic-Controlled Text Generation
Date
2021
Publisher
Institute of Electrical and Electronics Engineers Inc.
Abstract
Text generation has become an increasingly important subject in Natural Language Processing (NLP). In particular, the quality of generated text has reached high levels with the emergence of transformer-based models, and controllable text generation has accordingly become an important research area. Various methods have been proposed for controllable text generation, but because most of them were applied to the previously dominant Recurrent Neural Network (RNN) based encoder-decoder models, studies using transformer-based models remain few. Transformer-based models are very successful on long sequences thanks to their ability to process inputs in parallel. This study aimed to generate Turkish reviews on desired topics using a transformer-based language model. We added the topic information to the sequential input by concatenating the input token embedding with a topic (control) embedding at each time step during training. As a result, we were able to generate Turkish reviews on the specified topics. © 2021 IEEE
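The conditioning step the abstract describes, concatenating the token embedding with a topic (control) embedding at every time step, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the table sizes, dimensions, and the `build_inputs` helper are assumptions for demonstration, and in the actual model the embedding tables would be learned during training.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, n_topics = 1000, 5    # illustrative sizes, not from the paper
tok_dim, topic_dim = 48, 16

# Hypothetical embedding tables; the paper's are learned model parameters.
tok_table = rng.normal(size=(vocab_size, tok_dim))
topic_table = rng.normal(size=(n_topics, topic_dim))

def build_inputs(token_ids, topic_id):
    """Concatenate each token embedding with the (repeated) topic embedding,
    producing the per-step input sequence fed to the language model."""
    tok = tok_table[token_ids]                                 # (T, tok_dim)
    top = np.tile(topic_table[topic_id], (len(token_ids), 1))  # (T, topic_dim)
    return np.concatenate([tok, top], axis=-1)                 # (T, tok_dim + topic_dim)

x = build_inputs(np.array([5, 17, 3]), topic_id=2)
print(x.shape)  # (3, 64): three time steps, each a 48+16 dimensional input
```

Because the same topic embedding is appended at every step, the topic signal is available to the model throughout the sequence rather than only at the start.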
Keywords
Controllable text generation, Review generation, Text generation, Topic-controlled text generation
Source
Proceedings of the 6th International Conference on Computer Science and Engineering (UBMK 2021), 15-17 September 2021, Ankara
Start Page
533
End Page
536