Title: Improving Text Classification with Transformer
Authors: Soyalp, G.; Alar, A.; Ozkanli, K.; Yildiz, B.
Department: Software Engineering
Publication year: 2021
Date accessioned: 2024-07-05
Date available: 2024-07-05
ISBN: 978-1-6654-2908-5
DOI: 10.1109/UBMK52708.2021.9558906
Scopus ID: 2-s2.0-85125862458
URI: https://doi.org/10.1109/UBMK52708.2021.9558906
URI: https://hdl.handle.net/20.500.14411/4034
Language: en
Rights: info:eu-repo/semantics/closedAccess
Keywords: Attention; Deep learning; Natural language processing; Text classification; Transformer; Word embedding
Document type: Conference Object
Pages: 707-712

Abstract: Huge amounts of text data are produced every day. Processing text data that accumulates and grows exponentially requires appropriate automation tools. Text classification, a natural language processing task, can automate the processing of text data. Many new models have been proposed to achieve better results in text classification. The Transformer model was introduced recently and provides superior performance in terms of accuracy and processing speed in deep learning. In this article, we propose an improved Transformer model for text classification. A dataset containing information about books was collected from an online resource and used to train the models. Our proposed Transformer model showed superior performance compared to previous state-of-the-art models such as LSTM and CNN. © 2021 IEEE
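
The abstract (and the "Attention" keyword) refers to the Transformer's attention mechanism. As an illustrative sketch only, not the authors' code, the scaled dot-product attention at the core of any Transformer classifier can be written in pure Python; all names here are hypothetical and the token vectors are made up:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: lists of equal-length vectors (seq_len x d).

    For each query, compute dot products with every key, scale by
    sqrt(d), softmax into weights, and return the weighted sum of values.
    """
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy two-token example (hypothetical embeddings, not from the paper's dataset):
tokens = [[1.0, 0.0], [0.0, 1.0]]
attended = scaled_dot_product_attention(tokens, tokens, tokens)
```

In a full Transformer classifier, Q, K, and V would be learned linear projections of word embeddings, and the attended outputs would feed a feed-forward layer and a final classification head.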