Gökçay, Erhan

Name Variants
Gokcay, E
E.,Gökçay
Gökçay,E.
E., Gökçay
G.,Erhan
Gokcay E.
Goekcay, Erhan
Gokcay, Erhan
Erhan, Gokcay
Gökçay, Erhan
E., Gokcay
GOKCAY, E
Erhan, Gökçay
Gokcay,E.
Gökçay E.
G., Erhan
E.,Gokcay
Job Title
Doktor Öğretim Üyesi (Assistant Professor)
Email Address
erhan.gokcay@atilim.edu.tr
Scholarly Output: 16
Articles: 6
Citation Count: 15
Supervised Theses: 3
Scholarly Output Search Results

Now showing 1 - 10 of 16
  • Conference Object
    Citation Count: 1
    Effect of secret image transformation on the steganography process
    (Institute of Electrical and Electronics Engineers Inc., 2017) Buke,M.; Tora,H.; Gokcay,E.; Airframe and Powerplant Maintenance; Software Engineering
    Steganography is the art of hiding information in something else. It is favorable over encryption because encryption only hides the meaning of the information, whereas steganography hides the existence of the information. The existence of a hidden image decreases the Peak Signal-to-Noise Ratio (PSNR) and increases the Mean Square Error (MSE) of the stego image. We propose an approach to improve PSNR and MSE values in stego images. In this method, a transformation is applied to the secret image, concealed within another image, before embedding into the cover image. The effect of the transformation is tested with Least Significant Bit (LSB) insertion and Discrete Cosine Transformation (DCT) techniques. MSE and PSNR are calculated for both techniques with and without transformation. Results show better MSE and PSNR values when a transformation is applied with the LSB technique, but no significant difference was observed with the DCT technique. © 2017 IEEE.
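The LSB insertion and the MSE/PSNR quality metrics mentioned in the abstract can be sketched in a few lines. This is an illustrative sketch of generic LSB embedding on a flat list of 8-bit pixel values, not the authors' implementation; the secret-image transformation step proposed in the paper is omitted.

```python
import math

def embed_lsb(cover, secret_bits):
    """Embed a bit string into the least significant bits of the first
    len(secret_bits) cover pixels (flat list of 0-255 intensities)."""
    stego = cover[:]
    for i, bit in enumerate(secret_bits):
        stego[i] = (stego[i] & ~1) | int(bit)  # clear LSB, set payload bit
    return stego

def extract_lsb(stego, n_bits):
    """Read the hidden bit string back from the stego pixels."""
    return ''.join(str(p & 1) for p in stego[:n_bits])

def mse_psnr(original, modified, peak=255):
    """Mean Square Error and Peak Signal-to-Noise Ratio (dB) between
    two equal-length pixel sequences."""
    mse = sum((a - b) ** 2 for a, b in zip(original, modified)) / len(original)
    psnr = float('inf') if mse == 0 else 10 * math.log10(peak ** 2 / mse)
    return mse, psnr

cover = [100, 101, 102, 103, 104, 105]
stego = embed_lsb(cover, '1010')
```

Because embedding changes each used pixel by at most 1, the MSE stays small and the PSNR high, which is exactly the trade-off the paper measures with and without the secret-image transformation.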
  • Conference Object
    Citation Count: 0
    An IoT Application for Locating Victims Aftermath of an Earthquake
    (Ieee, 2017) Karakaya, Murat; Sengul, Gokhan; Gokcay, Erhan; Software Engineering; Computer Engineering
    This paper presents an Internet of Things (IoT) framework specially designed to assist search and rescue operations targeting buildings collapsed in the aftermath of an earthquake. In general, an IoT network is used to collect and process data from different sources called things. According to the collected data, an IoT system can actuate different mechanisms to react to the environment. In the problem at hand, we exploit IoT capabilities to collect data about the victims before the building collapses; when it collapses, the collected data are processed to generate useful reports that direct the search and rescue efforts. The proposed framework is tested by a pilot implementation with some simplifications. The initial results and experiences are promising. During the pilot implementation, we observed some issues which the proposed IoT framework properly addresses.
  • Conference Object
    Citation Count: 1
    An IoT application for locating victims aftermath of an earthquake
    (Institute of Electrical and Electronics Engineers Inc., 2017) Karakaya,M.; Şengül,G.; Gökçay,E.; Software Engineering; Computer Engineering
    This paper presents an Internet of Things (IoT) framework specially designed to assist search and rescue operations targeting buildings collapsed in the aftermath of an earthquake. In general, an IoT network is used to collect and process data from different sources called things. According to the collected data, an IoT system can actuate different mechanisms to react to the environment. In the problem at hand, we exploit IoT capabilities to collect data about the victims before the building collapses; when it collapses, the collected data are processed to generate useful reports that direct the search and rescue efforts. The proposed framework is tested by a pilot implementation with some simplifications. The initial results and experiences are promising. During the pilot implementation, we observed some issues which the proposed IoT framework properly addresses. © 2017 IEEE.
  • Article
    Citation Count: 0
    An unrestricted Arnold's cat map transformation
    (Springer, 2024) Turan, Mehmet; Goekcay, Erhan; Tora, Hakan; Software Engineering; Mathematics; Airframe and Powerplant Maintenance
    The Arnold's Cat Map (ACM) is one of the chaotic transformations utilized by numerous scrambling and encryption algorithms in information security. Traditionally, the ACM is used in image scrambling, whereby any image can be scrambled by repeated application of the ACM matrix. The transformation obtained by the ACM matrix is periodic; therefore, the original image can be reconstructed from the scrambled image whenever the elements of the matrix, hence the key, are known. The transformation matrices in all the chaotic maps employing ACM have limitations on the choice of the free parameters, which generally require the area-preserving property of the transformation matrix, that is, its determinant must be ±1. This reduces the number of possible keys, which makes the ACM matrix easier to discover by brute force in encryption algorithms. Additionally, the period obtained is small, which also allows faster recovery of the original image by repeated application of the matrix. These two parameters are important in a brute-force attack to recover the original image from a scrambled one. The objective of the present study is to increase the key space of the ACM matrix, hence increasing the security of the scrambling process and making a brute-force attack more difficult. It is proved mathematically that the area-preserving property of the traditional matrix is not required for a matrix to be used in the scrambling process. Removing this restriction enlarges the maximum possible key space and, in many cases, increases the period as well. Additionally, it is shown experimentally that, in scrambling images, the new ACM matrix is equivalent or superior to the traditional one, with longer periods. Consequently, encryption techniques with ACM become more robust compared to the traditional ones. The new ACM matrix is compatible with all algorithms that use the original matrix. In this contribution, we prove that the traditional requirement that the determinant of the ACM matrix be ±1 is redundant and can be removed.
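The classic area-preserving map the abstract refers to can be illustrated as follows. This is a generic sketch of the determinant-1 family [[1, a], [b, ab+1]] applied to a square image given as a list of rows, together with a period check; it is not the unrestricted matrix construction proposed in the paper.

```python
def arnold_cat_map(img, a=1, b=1):
    """One application of the classic area-preserving Arnold's Cat Map
    with matrix [[1, a], [b, a*b + 1]] (determinant = (a*b + 1) - a*b = 1)
    to a square image given as a list of rows."""
    n = len(img)
    out = [[None] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            # pixel (x, y) moves to (x + a*y, b*x + (a*b + 1)*y) mod n
            out[(x + a * y) % n][(b * x + (a * b + 1) * y) % n] = img[x][y]
    return out

def period(img):
    """Number of applications after which the original image returns;
    finite because the map is a bijection on a finite pixel grid."""
    cur, k = arnold_cat_map(img), 1
    while cur != img:
        cur, k = arnold_cat_map(cur), k + 1
    return k
```

The small period returned for typical image sizes is exactly the weakness the abstract mentions: an attacker can simply keep applying the map until the original image reappears.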
  • Article
    Citation Count: 7
    A generalized Arnold's Cat Map transformation for image scrambling
    (Springer, 2022) Tora, Hakan; Gokcay, Erhan; Turan, Mehmet; Buker, Mohamed; Mathematics; Software Engineering; Airframe and Powerplant Maintenance
    This study presents a new approach to generating the transformation matrix for Arnold's Cat Map (ACM). The matrices of standard and modified ACM are well known by many users. Since the structure of the possible matrices is known, one can easily select one of them and use it to recover the image within several trials. However, the proposed method generates a larger set of transform matrices; thus, one will have difficulty estimating the transform matrix used for scrambling. There is no fixed structure for our matrix, as there is in standard or modified ACM, making it much harder for the transform matrix to be discovered. It is possible to use different types, orders, and numbers of operations to generate the transform matrix. The quality of the shuffling process and the strength of the proposed method against brute-force attacks are tested on several benchmark images.
  • Conference Object
    Citation Count: 0
    Effect of Secret Image Transformation on the Steganography Process
    (Ieee, 2017) Buker, Mohamed; Tora, Hakan; Gokcay, Erhan; Software Engineering; Airframe and Powerplant Maintenance
    Steganography is the art of hiding information in something else. It is favorable over encryption because encryption only hides the meaning of the information, whereas steganography hides the existence of the information. The existence of a hidden image decreases the Peak Signal-to-Noise Ratio (PSNR) and increases the Mean Square Error (MSE) of the stego image. We propose an approach to improve PSNR and MSE values in stego images. In this method, a transformation is applied to the secret image, concealed within another image, before embedding into the cover image. The effect of the transformation is tested with Least Significant Bit (LSB) insertion and Discrete Cosine Transformation (DCT) techniques. MSE and PSNR are calculated for both techniques with and without transformation. Results show better MSE and PSNR values when a transformation is applied with the LSB technique, but no significant difference was observed with the DCT technique.
  • Conference Object
    Citation Count: 1
    An on Demand Virtual CPU Architecture based on Cloud Infrastructure
    (Scitepress, 2017) Gokcay, Erhan; Software Engineering
    Cloud technology provides different computational models, including but not limited to infrastructure, platform, and software as a service. The motivation of a cloud system is to share resources in an optimal and cost-effective way by creating virtualized resources that can be distributed easily, but the distribution is not necessarily parallel. Another disadvantage is that small computational units, like smart devices and less powerful computers, are excluded from resource sharing. Also, different systems may have interoperability problems, since operating system and CPU designs differ. In this paper, an on-demand, dynamically created computational architecture, inspired by CPU design and called the Cloud CPU, is described that can use any type of resource, including all smart devices. The computational and data-transfer requirements of each unit are minimized; because of this, the service can be created on demand, each time with a different functionality. The distribution of the calculation over not-so-fast internet connections is compensated by massively parallel operation. The minimized computational requirements also reduce interoperability problems and increase fault tolerance owing to the increased number of units in the system.
  • Article
    Citation Count: 0
    Entropy based streaming big-data reduction with adjustable compression ratio
    (Springer, 2023) Gokcay, Erhan; Software Engineering
    The Internet of Things is a novel concept in which numerous physical devices are linked to the internet to collect, generate, and distribute data for processing. Data storage and processing become more challenging as the number of devices increases. One solution is to reduce the amount of stored data in such a way that processing accuracy does not suffer significantly. The reduction can be lossy or lossless, depending on the type of data. The article presents a novel lossy algorithm for reducing the amount of data stored in the system. The reduction process aims to reduce the volume of data while maintaining classification accuracy and properly adjusting the reduction ratio. A nonlinear cluster distance measure is used to create subgroups so that samples can be assigned to the correct clusters even though the cluster shape is nonlinear. Each sample is assumed to arrive one at a time during the reduction; as a result, the algorithm is suitable for streaming data. The user can adjust the degree of reduction, and the reduction algorithm strives to minimize classification error. The algorithm is not dependent on any particular classification technique. Subclusters are formed and readjusted after each sample during the calculation. To summarize the data from the subclusters, representative points are calculated. The resulting data summary can be saved and used for future processing. The accuracy difference between regular and reduced datasets, measured with different classifiers, is used to evaluate the effectiveness of the proposed method. The results show that the nonlinear information-theoretic cluster distance measure improves the reduction rates with higher accuracy values compared to existing studies. At the same time, the reduction rate can be adjusted as desired, a feature lacking in current methods. The characteristics are discussed, and the results are compared to previously published algorithms.
  • Article
    Citation Count: 1
    An information-theoretic instance-based classifier
    (Elsevier Science inc, 2020) Gokcay, Erhan; Software Engineering
    Classification algorithms are used in many areas to determine new class labels given a training set. Many classification algorithms, linear or not, require a training phase to determine model parameters by using an iterative optimization of the cost function for that particular model or algorithm. The training phase can adjust and fine-tune the boundary line between classes. However, the process may get stuck in a local optimum, which may or may not be close to the desired solution. Another disadvantage of training processes is that upon arrival of a new sample, a retraining of the model is necessary. This work presents a new information-theoretic approach to an instance-based supervised classification. The boundary line between classes is calculated only by the data points without any external parameters or weights, and it is given in closed-form. The separation between classes is nonlinear and smooth, which reduces memorization problems. Since the method does not require a training phase, classified samples can be incorporated in the training set directly, simplifying a streaming classification operation. The boundary line can be replaced with an approximation or regression model for parametric calculations. Features and performance of the proposed method are discussed and compared with similar algorithms. (C) 2020 Elsevier Inc. All rights reserved.
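As a point of comparison for instance-based classification without a training phase, a minimal nearest-neighbor sketch is shown below. This is the textbook 1-NN rule, not the paper's information-theoretic boundary; it only illustrates the property the abstract highlights, namely that newly classified samples can join the training set directly, with no retraining step.

```python
def nn_classify(train, query):
    """Label a query point with the class of its nearest training
    sample (squared Euclidean distance). train is a list of
    (features, label) pairs; no model parameters are fitted."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(train, key=lambda s: dist2(s[0], query))[1]

# Streaming use: a classified sample joins the training set directly.
train = [((0.0, 0.0), 'A'), ((1.0, 1.0), 'B')]
label = nn_classify(train, (0.9, 0.8))
train.append(((0.9, 0.8), label))  # no retraining required
```

The paper replaces this crisp nearest-neighbor boundary with a smooth, closed-form, information-theoretic one, but the instance-based workflow, classify then absorb, is the same.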
  • Article
    Citation Count: 1
    A New Multi-Target Compiler Architecture for Edge-Devices and Cloud Management
    (Gazi Univ, 2022) Gokcay, Erhan; Software Engineering
    Edge computing is the concept where the computation is handled at edge-devices. The transfer of the computation from servers to edge-devices will decrease the massive amount of data transfer generated by edge-devices. There are several efficient management tools for setup and connection purposes, but these management tools cannot provide a unified programming system from a single source code/project. Even though it is possible to control each device efficiently, a global view of the computation is missing in a programming project that includes several edge-devices for computation and data analysis purposes, and the devices need to be programmed individually. A generic workflow engine might automate part of the problem using standard interfaces and predefined objects miming on edge-devices. Nevertheless, the approach fails in fine-tuning each edge-device since the computation cannot be moved easily among devices. This paper introduces a new compiler architecture to control and program edge-devices from a single source code. The source code can be distributed to multiple edge-devices using simple compiler directives, and the transfer and communication of the source code with multiple devices are handled transparently. Fine-tuning the source code and code movement between devices becomes very efficient in editing and time. The proposed architecture is a lightweight system with fine-tuned computation and distribution among devices.