Authors: Yılmaz, Cansen; Çağlayan, C.; Karakaya, M.; Karakaya, Kasım Murat
Department: Computer Engineering
Date Accessioned: 2024-07-05
Date Available: 2024-07-05
Date Issued: 2021
ISBN: 978-1-6654-2908-5
DOI: 10.1109/UBMK52708.2021.9558910
Scopus ID: 2-s2.0-85125836395
URI: https://doi.org/10.1109/UBMK52708.2021.9558910
URI: https://hdl.handle.net/20.500.14411/4038

Abstract: Today, text generation has gained great importance within the field of Natural Language Processing (NLP). In particular, the quality of generated text has reached high levels with the emergence of transformer-based models, and controllable text generation has consequently become an important research area. Various methods have been applied for controllable text generation, but since most of them target Recurrent Neural Network (RNN) based encoder-decoder models, which were previously in common use, studies using transformer-based models remain few. Transformer-based models are very successful on long sequences thanks to their ability to process inputs in parallel. This study aimed to generate Turkish reviews on desired topics using a transformer-based language model. We used the method of adding the topic information to the sequential input: we concatenated the input token embedding and the topic (control) embedding at each time step during training. As a result, we were able to generate Turkish reviews on the specified topics. © 2021 IEEE

Language: en
Rights: info:eu-repo/semantics/closedAccess
Keywords: Controllable text generation; Review generation; Text generation; Topic-controlled text generation
Title: Topic-Controlled Text Generation
Type: Conference Object
Pages: 533-536
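The conditioning scheme described in the abstract (concatenating a topic, or control, embedding with the input token embedding at every time step) can be sketched as follows. This is a minimal illustrative sketch in NumPy; the embedding sizes, vocabulary size, and function names are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Hypothetical sizes, chosen only for this sketch.
vocab_size, n_topics = 1000, 5
d_token, d_topic = 64, 16

rng = np.random.default_rng(0)
token_emb = rng.normal(size=(vocab_size, d_token))  # token embedding table
topic_emb = rng.normal(size=(n_topics, d_topic))    # topic (control) embedding table

def build_inputs(token_ids, topic_id):
    """Concatenate the chosen topic embedding onto each token embedding,
    producing one conditioned input vector per time step."""
    tok = token_emb[token_ids]                                # (seq_len, d_token)
    top = np.tile(topic_emb[topic_id], (len(token_ids), 1))   # (seq_len, d_topic)
    return np.concatenate([tok, top], axis=-1)                # (seq_len, d_token + d_topic)

# A 3-token input sequence conditioned on topic 2:
x = build_inputs([3, 17, 42], topic_id=2)
print(x.shape)  # (3, 80)
```

In this sketch the model downstream (a transformer decoder in the paper's setting) would consume the concatenated vectors, so the topic signal is present at every position rather than only at the start of the sequence.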