Machine Vs. Deep Learning Comparison for Developing an International Sign Language Translator

dc.contributor.author Eryilmaz, Meltem
dc.contributor.author Balkaya, Ecem
dc.contributor.author Ucan, Eylul
dc.contributor.author Turan, Gizem
dc.contributor.author Oral, Seden Gulay
dc.date.accessioned 2024-07-05T15:24:22Z
dc.date.available 2024-07-05T15:24:22Z
dc.date.issued 2022
dc.description ERYILMAZ, MELTEM/0000-0001-9483-6164; Ucan, Eylul/0000-0001-7138-7087 en_US
dc.description.abstract This study aims to enable deaf and hard-of-hearing people to communicate with individuals who do or do not know sign language. A mobile application for video classification was developed using the MediaPipe library. Considering the problems that deaf and hearing-impaired individuals face in Turkey and abroad, the modelling and training stages were carried out with the English language option. A real-time translation feature was added so that individuals can communicate instantly, which should greatly reduce the communication problems experienced by hearing-impaired individuals. Both machine learning and deep learning approaches were investigated. Model creation and training were first carried out using the VGG16, OpenCV, Pandas, Keras, and os libraries. Because the VGG16-based model achieved a low success rate, the MediaPipe library was used instead for model creation and training. MediaPipe's solutions mark the regions to be detected on the human body and normalise their coordinates in 3D; extracting these coordinates independently of the background and body type in the dataset videos increases the success rate of the model during creation and training. In the experiments, the deep learning model reached an accuracy of 85%, and the application can easily be integrated with different languages. It is concluded that the deep learning model is more accurate than the machine learning one and that the communication problems faced by hearing-impaired individuals in many countries can be reduced. en_US
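A minimal sketch of the landmark-based pipeline the abstract describes, assuming a MediaPipe Holistic front end feeding a small Keras sequence classifier; the sequence length, layer sizes, and NUM_SIGNS value are illustrative assumptions, not the authors' implementation:

# Sketch (not the authors' code): extract normalised 3-D body/hand landmarks
# from each video frame with MediaPipe, then classify the landmark sequence
# with a small Keras model. Sequence length and model layout are assumptions.
import cv2
import numpy as np
import mediapipe as mp
from tensorflow import keras

mp_holistic = mp.solutions.holistic

def extract_keypoints(results):
    """Flatten pose + hand landmarks into one feature vector per frame."""
    pose = (np.array([[lm.x, lm.y, lm.z, lm.visibility]
                      for lm in results.pose_landmarks.landmark]).flatten()
            if results.pose_landmarks else np.zeros(33 * 4))
    lh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.left_hand_landmarks.landmark]).flatten()
          if results.left_hand_landmarks else np.zeros(21 * 3))
    rh = (np.array([[lm.x, lm.y, lm.z]
                    for lm in results.right_hand_landmarks.landmark]).flatten()
          if results.right_hand_landmarks else np.zeros(21 * 3))
    return np.concatenate([pose, lh, rh])  # 258 values per frame

def video_to_sequence(path, max_frames=30):
    """Read a sign video and return a (max_frames, 258) landmark sequence."""
    frames = []
    cap = cv2.VideoCapture(path)
    with mp_holistic.Holistic(min_detection_confidence=0.5,
                              min_tracking_confidence=0.5) as holistic:
        while cap.isOpened() and len(frames) < max_frames:
            ok, frame = cap.read()
            if not ok:
                break
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            frames.append(extract_keypoints(results))
    cap.release()
    # Pad short clips with zero frames so every sequence has the same length.
    while len(frames) < max_frames:
        frames.append(np.zeros(258))
    return np.array(frames)

NUM_SIGNS = 10  # assumption: number of sign classes in the dataset

# Small LSTM classifier over the landmark sequences (illustrative only).
model = keras.Sequential([
    keras.layers.Input(shape=(30, 258)),
    keras.layers.LSTM(64),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(NUM_SIGNS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Because the classifier sees only normalised landmark coordinates rather than raw pixels, it is largely insensitive to background and body type, which is the advantage over the VGG16 image-based baseline that the abstract reports.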
dc.identifier.doi 10.1080/0952813X.2022.2115560
dc.identifier.issn 0952-813X
dc.identifier.issn 1362-3079
dc.identifier.scopus 2-s2.0-85137717392
dc.identifier.uri https://doi.org/10.1080/0952813X.2022.2115560
dc.identifier.uri https://hdl.handle.net/20.500.14411/2426
dc.language.iso en en_US
dc.publisher Taylor & Francis Ltd en_US
dc.relation.ispartof Journal of Experimental & Theoretical Artificial Intelligence
dc.rights info:eu-repo/semantics/closedAccess en_US
dc.subject Sign language en_US
dc.subject hearing loss en_US
dc.subject video classification en_US
dc.subject machine learning en_US
dc.subject deep learning en_US
dc.title Machine Vs. Deep Learning Comparison for Developing an International Sign Language Translator en_US
dc.type Article en_US
dspace.entity.type Publication
gdc.author.id ERYILMAZ, MELTEM/0000-0001-9483-6164
gdc.author.id Ucan, Eylul/0000-0001-7138-7087
gdc.author.scopusid 57213371849
gdc.author.scopusid 57884249400
gdc.author.scopusid 57884503900
gdc.author.scopusid 57884752600
gdc.author.scopusid 57883754600
gdc.bip.impulseclass C5
gdc.bip.influenceclass C5
gdc.bip.popularityclass C5
gdc.coar.access metadata only access
gdc.coar.type text::journal::journal article
gdc.description.department Atılım University en_US
gdc.description.departmenttemp [Eryilmaz, Meltem; Turan, Gizem; Oral, Seden Gulay] Atilim Univ, Fac Engn, Dept Comp Engn, Ankara, Turkey; [Balkaya, Ecem; Ucan, Eylul] Atilim Univ, Fac Engn, Dept Informat Syst Engn, Ankara, Turkey en_US
gdc.description.endpage 984
gdc.description.publicationcategory Article - International Refereed Journal - Institutional Faculty Member en_US
gdc.description.scopusquality Q2
gdc.description.startpage 975
gdc.description.volume 36
gdc.description.wosquality Q3
gdc.identifier.openalex W4295353200
gdc.identifier.wos WOS:000849711000001
gdc.oaire.diamondjournal false
gdc.oaire.impulse 0.0
gdc.oaire.influence 2.5349236E-9
gdc.oaire.isgreen false
gdc.oaire.popularity 1.8548826E-9
gdc.oaire.publicfunded false
gdc.oaire.sciencefields 0202 electrical engineering, electronic engineering, information engineering
gdc.oaire.sciencefields 02 engineering and technology
gdc.openalex.collaboration National
gdc.openalex.fwci 0.14769002
gdc.openalex.normalizedpercentile 0.45
gdc.opencitations.count 1
gdc.plumx.facebookshareslikecount 52
gdc.plumx.mendeley 25
gdc.plumx.scopuscites 1
gdc.scopus.citedcount 1
gdc.virtual.author Eryılmaz, Meltem
gdc.wos.citedcount 1
relation.isAuthorOfPublication ec6c4c06-14dd-4654-b3a6-04e89c8d3baf
relation.isAuthorOfPublication.latestForDiscovery ec6c4c06-14dd-4654-b3a6-04e89c8d3baf
relation.isOrgUnitOfPublication e0809e2c-77a7-4f04-9cb0-4bccec9395fa
relation.isOrgUnitOfPublication 4abda634-67fd-417f-bee6-59c29fc99997
relation.isOrgUnitOfPublication 50be38c5-40c4-4d5f-b8e6-463e9514c6dd
relation.isOrgUnitOfPublication.latestForDiscovery e0809e2c-77a7-4f04-9cb0-4bccec9395fa
