Yıldız, Beytullah

Name Variants
Yıldız, Beytullah
B.,Yildiz
Yildiz, B
B., Yildiz
B., Yıldız
Beytullah, Yildiz
Y.,Beytullah
Yildiz,B.
Y., Beytullah
Yıldız,B.
Beytullah, Yıldız
Yildiz, Beytullah
B.,Yıldız
Job Title
Associate Professor (Doçent Doktor)
Email Address
beytullah.yildiz@atilim.edu.tr
Main Affiliation
Software Engineering

Sustainable Development Goals

Research products mapped to the 17 SDGs: SDG 3 (Good Health and Well-Being) has 2 research products; all other goals have 0.
Documents: 15
Citations: 166
h-index: 8
Documents: 15
Citations: 85
Scholarly Output: 18
Articles: 7
Views / Downloads: 92 / 711
Supervised MSc Theses: 6
Supervised PhD Theses: 0
WoS Citation Count: 60
Scopus Citation Count: 136
WoS h-index: 5
Scopus h-index: 6
Patents: 0
Projects: 0
WoS Citations per Publication: 3.33
Scopus Citations per Publication: 7.56
Open Access Source: 2
Supervised Theses: 6


Journal (publication count):
Concurrency and Computation: Practice and Experience (4)
IEEE Access (1)
International Conference on Computational Science and Computational Intelligence (CSCI), 13-15 December 2023, Las Vegas, NV (1)
International Journal on Artificial Intelligence Tools (1)
Lecture Notes in Networks and Systems, ICCIDA 2022, 16-17 September 2022, Kocaeli, 291929 (1)
(Page 1 of 2)


Scholarly Output Search Results (showing 1-3 of 3)
  • Article
    Citation - WoS: 29
    Citation - Scopus: 44
    Text Classification Using Improved Bidirectional Transformer
    (Wiley, 2022) Tezgider, Murat; Yıldız, Beytullah; Aydin, Galip
    Text data have an important place in our daily life, and a huge amount of text data is generated every day. As a result, automation becomes necessary to handle such large volumes of text. Recently, we have been witnessing important developments with the adoption of new approaches in text processing. Attention mechanisms and transformers are emerging as methods with significant potential for text processing. In this study, we introduced a bidirectional transformer (BiTransformer) constructed using two transformer encoder blocks that utilize bidirectional position encoding to take into account the forward and backward position information of text data. We also created models to evaluate the contribution of attention mechanisms to the classification process. Four models, including long short-term memory, attention, transformer, and BiTransformer, were used to conduct experiments on a large Turkish text dataset consisting of 30 categories. The effect of using pretrained embeddings on the models was also investigated. Experimental results show that the classification models using transformer and attention give promising results compared with classical deep learning models. We observed that the proposed BiTransformer showed superior performance in text classification.
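    The abstract describes a bidirectional position encoding that feeds each token both its forward and its backward position in the sequence. A minimal NumPy sketch of that idea follows; the sinusoidal formulation and the additive combination are assumptions for illustration, not the paper's exact construction:

    ```python
    import numpy as np

    def sinusoidal_encoding(positions, d_model):
        """Standard sinusoidal position encoding for the given position indices."""
        pe = np.zeros((len(positions), d_model))
        div = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
        pe[:, 0::2] = np.sin(positions[:, None] * div)
        pe[:, 1::2] = np.cos(positions[:, None] * div)
        return pe

    def bidirectional_position_encoding(seq_len, d_model):
        """Combine forward (0..n-1) and backward (n-1..0) position encodings,
        so every token carries its distance from both ends of the sequence."""
        fwd = sinusoidal_encoding(np.arange(seq_len), d_model)
        bwd = sinusoidal_encoding(np.arange(seq_len)[::-1], d_model)
        return fwd + bwd

    enc = bidirectional_position_encoding(seq_len=8, d_model=16)
    print(enc.shape)  # (8, 16)
    ```

    Note the resulting encoding is symmetric (position i and position n-1-i get the same vector), which is one simple way to expose both directions to the two encoder blocks.
    
    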
  • Article
    Citation - WoS: 6
    Citation - Scopus: 10
    Beyond ROUGE: A Comprehensive Evaluation Metric for Abstractive Summarization Leveraging Similarity, Entailment, and Acceptability
    (World Scientific Publ Co Pte Ltd, 2024) Briman, Mohammed Khalid Hilmi; Yıldız, Beytullah
    A vast amount of textual information on the internet has amplified the importance of text summarization models. Abstractive summarization generates original words and sentences that may not exist in the source document to be summarized. Such abstractive models may suffer from shortcomings such as linguistic acceptability and hallucinations. Recall-Oriented Understudy for Gisting Evaluation (ROUGE) is a metric commonly used to evaluate abstractive summarization models. However, due to its n-gram-based approach, it ignores several critical linguistic aspects. In this work, we propose Similarity, Entailment, and Acceptability Score (SEAScore), an automatic evaluation metric for evaluating abstractive text summarization models using the power of state-of-the-art pre-trained language models. SEAScore comprises three language models (LMs) that extract meaningful linguistic features from candidate and reference summaries and a weighted sum aggregator that computes an evaluation score. Experimental results show that our LM-based SEAScore metric correlates better with human judgment than standard evaluation metrics such as ROUGE-N and BERTScore.
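    SEAScore, as described above, combines three language-model-derived scores with a weighted-sum aggregator. A minimal sketch of that aggregation step is below; the weight values are illustrative placeholders, not the weights used in the paper, and the three input scores are assumed to be precomputed by the respective LMs:

    ```python
    def seascore(similarity, entailment, acceptability, weights=(0.4, 0.4, 0.2)):
        """Weighted-sum aggregation of the three linguistic feature scores
        (similarity, entailment, acceptability), each assumed to lie in [0, 1].
        The weights here are placeholders; the paper determines its own."""
        w_sim, w_ent, w_acc = weights
        assert abs(w_sim + w_ent + w_acc - 1.0) < 1e-9, "weights should sum to 1"
        return w_sim * similarity + w_ent * entailment + w_acc * acceptability

    # Hypothetical scores for one candidate summary against its reference:
    score = seascore(similarity=0.82, entailment=0.75, acceptability=0.9)
    print(round(score, 3))  # 0.808
    ```
    
    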
  • Article
    Citation - WoS: 9
    Citation - Scopus: 13
    Improving Word Embedding Quality with Innovative Automated Approaches to Hyperparameters
    (Wiley, 2021) Yıldız, Beytullah; Tezgider, Murat
    Deep learning practices have had a great impact in many areas. Big data and significant hardware developments are the main reasons behind deep learning's success. Recent advances in deep learning have led to significant improvements in text analysis and classification. Progress in the quality of word representations is an important factor among these improvements. In this study, we aimed to improve word2vec word representations, also called embeddings, by automatically optimizing hyperparameters. Minimum word count, vector size, window size, negative sample count, and iteration number were used to improve the word embeddings. We introduce two approaches for setting hyperparameters that are faster than grid search and random search. Word embeddings were created using documents of approximately 300 million words. We measured the quality of the word embeddings using a deep learning classification model on documents from 10 different classes. We observed that optimizing the hyperparameter values alone increased classification success by 9%. In addition, we demonstrate the benefits of our approaches by comparing the semantic and syntactic relations between word embeddings trained with default and with optimized hyperparameters.
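    The abstract introduces two automated hyperparameter-setting approaches that evaluate fewer configurations than grid search. Those exact algorithms are not reproduced here, but one common family they resemble is greedy coordinate-wise search: tune one hyperparameter at a time while holding the others fixed. A sketch under that assumption, with a stand-in objective in place of a real embedding-quality measurement:

    ```python
    def coordinate_search(space, objective):
        """Greedy one-at-a-time search: for each hyperparameter in turn, keep the
        value that maximizes the objective with all other parameters fixed.
        Evaluates sum(len(v)) configurations instead of the grid's prod(len(v))."""
        current = {name: values[0] for name, values in space.items()}
        for name, values in space.items():
            best_val, best_score = current[name], float("-inf")
            for v in values:
                trial = dict(current, **{name: v})
                s = objective(trial)
                if s > best_score:
                    best_val, best_score = v, s
            current[name] = best_val
        return current

    # Stand-in objective: pretend embedding quality peaks at vector_size=300, window=5.
    # A real run would train word2vec and score a downstream classifier instead.
    def mock_quality(cfg):
        return -abs(cfg["vector_size"] - 300) - 10 * abs(cfg["window"] - 5)

    space = {"vector_size": [100, 200, 300, 400], "window": [3, 5, 7, 10]}
    best = coordinate_search(space, mock_quality)
    print(best)  # {'vector_size': 300, 'window': 5}
    ```

    For the two parameters above this costs 8 evaluations rather than the grid's 16, and the gap widens with each added hyperparameter, which is the kind of saving the abstract's faster-than-grid-search claim refers to.
    
    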