Erol, Berkay

Name Variants
E.,Berkay
Erol,B.
B., Erol
Berkay, Erol
E., Berkay
B.,Erol
Erol, Berkay
Job Title: Research Assistant (Araştırma Görevlisi)
Email Address: berkay.erol@atilim.edu.tr
Main Affiliation: English Translation and Interpretation

Sustainable Development Goals

SDG data is not available
This researcher does not have a Scopus ID.
This researcher does not have a WoS ID.
Scholarly Output: 1
Articles: 0
Views / Downloads: 1 / 0
Supervised MSc Theses: 0
Supervised PhD Theses: 0
WoS Citation Count: 0
Scopus Citation Count: 2
WoS h-index: 0
Scopus h-index: 1
Patents: 0
Projects: 0
WoS Citations per Publication: 0.00
Scopus Citations per Publication: 2.00
Open Access Source: 1
Supervised Theses: 0

Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) -- 16th International Conference on Human-Computer Interaction: Applications and Services, HCI International 2014 -- 22 June 2014 through 27 June 2014 -- Heraklion, Crete -- 105920
Count: 1

Scopus Quartile Distribution

Quartile distribution chart data is not available


Scholarly Output Search Results

  • Conference Object
    Citations (Scopus): 2
    Haptic User Interface Integration for 3D Game Engines
    (Springer Verlag, 2014) Sengul, G.; Çağıltay, N. E.; Özçelik, E.; Tuner, E.; Erol, B.
    The human senses of touch and feel provide important information about the environment. When these senses are integrated with eyesight, we can obtain all the information we need about our surroundings. In human-computer interaction, visual information is provided by displays, while touch and feel sensations are provided by special devices called "haptic" devices. Haptic devices are used in many fields, such as computer-aided design, remote surgery, medical simulation environments, and training simulators for both military and medical applications. Besides touch sensations, haptic devices also provide force feedback, which allows realistic environments to be designed for virtual-reality applications. Haptic devices fall into three classes: tactile devices, kinesthetic devices, and hybrid devices. Tactile devices stimulate the skin to create contact sensations; kinesthetic devices apply forces to guide or inhibit body movement; and hybrid devices attempt to combine tactile and kinesthetic feedback. Among these, kinesthetic devices exert controlled forces on the human body and are the most suitable type for applications such as surgical simulation. In educational settings that require skill-based improvement, the senses of touch and feel are very important, yet providing such an environment can be very expensive and risky, and may also raise ethical issues. Surgical education is one such field: traditional training takes place in the operating room on real patients, which is very expensive, requires long periods of time, does not allow trial-and-error learning, is stressful for both educators and learners, and raises several ethical considerations. Simulation environments supported by haptic user interfaces provide a safer educational alternative.
    Several studies show evidence of the educational benefits of this type of training (Tsuda et al., 2009; Sutherland et al., 2006). Similarly, this technology can be successfully integrated into the physical rehabilitation of conditions requiring motor-skill improvement (Kampiopiotis & Theodorakou, 2003). Hence, simulation environments today offer opportunities for creating low-cost and more effective training and education. Combining three-dimensional (3D) simulation environments with haptic interfaces is an important step in advancing human-computer interaction. Haptic devices alone, however, do not provide a full simulation environment, so the environment must be built out in software. Game engines offer high flexibility for creating 3D simulation environments; Unity3D is one such tool, providing both a game engine and a physics engine for building better 3D simulations. Many studies combine these two technologies to create educational and training environments, but there is little research showing how the two can be integrated to build a simulation environment that also provides haptic interfaces. Several issues must be handled in such an integration. First, the haptic device's control libraries need to be integrated into the game engine. Second, the game engine's simulation representation and real-time interaction features must be coordinated with the haptic device's degrees of freedom and its force-feedback speed and characteristics. This study presents an integration architecture for the Unity3D game engine and the PHANToM haptic device aimed at creating a surgical-education simulation environment, and describes the methods used to build the integration and to handle the synchronization problems.
    The algorithms developed for better synchronization and user feedback, such as providing a smooth feel and force feedback during haptic interaction, are also presented. We believe this study will be helpful to those building simulation environments with Unity3D and PHANToM haptic interfaces. © 2014 Springer International Publishing.
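    The two-rate coordination problem the abstract describes (a fast haptic servo loop versus a slower game-engine update loop) can be sketched in a minimal, hypothetical form. This is not the authors' code: the `HapticBridge` class, the stiffness value, and the smoothing factor are all illustrative assumptions, and a real integration would register the servo callback with the device vendor's servo-loop scheduler rather than computing forces against a hard-coded wall.

```cpp
// Hypothetical sketch (not the paper's implementation) of the two-rate
// pattern: a ~1 kHz haptic servo loop writes forces, while the ~60 Hz
// game-engine update loop reads a smoothed copy under a mutex.
#include <mutex>

struct Vec3 { double x{}, y{}, z{}; };

// Illustrative bridge between the haptic servo thread and the game loop.
class HapticBridge {
public:
    // Servo tick: a penalty-based spring force pushes the stylus
    // out of a virtual wall at y = 0.
    void servoTick(const Vec3& stylusPos) {
        Vec3 raw{};
        if (stylusPos.y < 0.0)
            raw.y = -stiffness_ * stylusPos.y;  // Hooke-style penalty force
        std::lock_guard<std::mutex> lk(m_);
        // Exponential smoothing avoids force discontinuities ("buzzing")
        // when the render loop moves collision geometry between servo ticks.
        smoothed_.x += alpha_ * (raw.x - smoothed_.x);
        smoothed_.y += alpha_ * (raw.y - smoothed_.y);
        smoothed_.z += alpha_ * (raw.z - smoothed_.z);
    }

    // Game-engine side: read the latest smoothed force once per frame.
    Vec3 readForce() {
        std::lock_guard<std::mutex> lk(m_);
        return smoothed_;
    }

private:
    std::mutex m_;
    Vec3 smoothed_{};
    const double stiffness_ = 800.0;  // N/m, illustrative wall stiffness
    const double alpha_ = 0.1;        // illustrative smoothing factor
};
```

    With the stylus held 1 cm inside the wall, repeated servo ticks drive the smoothed force asymptotically toward the 8 N spring value, which is the kind of gradual force ramp the abstract's "smooth feeling and force feedback" goal refers to.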