Search Results

Now showing 1 - 2 of 2
  • Conference Object
    Design and Implementation of an Expressive Talking Mobile Robot: TozTorUs
    (IEEE, 2018) Tozan, Ozalp; Tora, Hakan; Uslu, Baran; Unal, Bulent; Ceylan, Ece
    This paper presents a new robot and all of its development stages, from initial design to its first public demonstration. Developed as an undergraduate research project (the LAP program at Atilim University), the robot TozTorUs is the outcome of a team's intensive efforts. Equipped with sensors, it navigates autonomously in its environment while avoiding obstacles. It can understand spoken questions and answer them using Google's speech technologies (an illustrative sketch of such a question-and-answer loop is given after this list). Although it is not a humanoid robot, its LED eye and mouth displays make it appear as friendly as a human. TozTorUs can also be controlled from a mobile phone. In addition, it can adjust its height to match the visitor's, allowing it to make eye contact with that person. Although TozTorUs was designed as a welcoming robot, it may also be employed for consulting, security and elderly assistance.
  • Conference Object
    Citation - WoS: 8
    Hand Gesture Classification Using Inertial Based Sensors Via a Neural Network
    (IEEE, 2017) Akan, Erhan; Tora, Hakan; Uslu, Baran
    In this study, a mobile phone equipped with four types of sensors, namely accelerometer, gyroscope, magnetometer and orientation, is used for gesture classification. Without any feature selection, the raw sensor outputs are processed and fed into a Multi-Layer Perceptron classifier for recognition (an illustrative sketch of this pipeline is given after this list). The user-independent, single-user-dependent and multiple-user-dependent cases are all examined. Accuracies of 91.66% for the single-user-dependent case, 87.48% for the multiple-user-dependent case and 60% for the user-independent case are obtained. In addition, the performance of each sensor is assessed separately, and the highest performance is achieved with the orientation sensor.
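
The TozTorUs record above mentions answering spoken questions through Google's speech technologies. The snippet below is only an illustrative sketch of such a loop, not the authors' implementation: it assumes the SpeechRecognition and gTTS Python packages, and the answer() function is a hypothetical placeholder for the robot's dialogue logic.

```python
# Illustrative speech question-and-answer loop (assumed libraries, not
# the implementation described in the paper).
import speech_recognition as sr
from gtts import gTTS

def answer(question: str) -> str:
    # Hypothetical placeholder for the robot's dialogue logic.
    return "I am TozTorUs, a welcoming robot at Atilim University."

recognizer = sr.Recognizer()
with sr.Microphone() as source:          # requires a microphone and PyAudio
    recognizer.adjust_for_ambient_noise(source)
    audio = recognizer.listen(source)

try:
    question = recognizer.recognize_google(audio)   # Google speech-to-text
    reply = answer(question)
except sr.UnknownValueError:
    reply = "Sorry, I did not catch that."

gTTS(text=reply, lang="en").save("reply.mp3")       # Google text-to-speech
```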
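
The gesture-classification record above feeds raw sensor readings, without feature selection, into a Multi-Layer Perceptron. The sketch below illustrates that kind of pipeline with scikit-learn on synthetic stand-in data; the array shapes, the number of gesture classes and the network size are assumptions, not the settings reported in the paper.

```python
# Minimal MLP gesture-classification sketch on stand-in data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Assumed dimensions: 600 recorded gestures, 50 time samples each,
# 12 channels (4 sensors x 3 axes); replace with real phone recordings.
n_gestures, n_samples, n_channels = 600, 50, 12
X = np.random.randn(n_gestures, n_samples * n_channels)   # flattened raw windows
y = np.random.randint(0, 8, size=n_gestures)               # 8 gesture classes (assumed)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Raw, unselected features go straight into the MLP, as the abstract describes.
clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```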