An information-theoretic instance-based classifier
Date
2020
Authors
Gokcay, Erhan
Journal Title
Journal ISSN
Volume Title
Publisher
Elsevier Science Inc.
Abstract
Classification algorithms are used in many areas to assign class labels to new samples given a training set. Many classification algorithms, linear or not, require a training phase that determines model parameters through iterative optimization of a cost function specific to that model or algorithm. The training phase can adjust and fine-tune the boundary between classes. However, the optimization may get stuck in a local optimum, which may or may not be close to the desired solution. Another disadvantage of a training process is that the arrival of a new sample requires retraining the model. This work presents a new information-theoretic approach to instance-based supervised classification. The boundary between classes is computed from the data points alone, without any external parameters or weights, and it is given in closed form. The separation between classes is nonlinear and smooth, which reduces memorization problems. Since the method does not require a training phase, classified samples can be incorporated into the training set directly, simplifying streaming classification. The boundary can also be replaced with an approximation or regression model for parametric calculations. Features and performance of the proposed method are discussed and compared with similar algorithms. (C) 2020 Elsevier Inc. All rights reserved.
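The abstract does not give the paper's closed-form information-theoretic boundary, so the following is only a minimal Python sketch of a generic instance-based classifier that shares the operational properties described above (no training phase, smooth nonlinear decision, direct incorporation of newly classified samples into the stored set). It uses a per-class Gaussian kernel density score as a stand-in decision rule; the function names and the kernel width sigma are illustrative assumptions, not the method proposed in the article.

import numpy as np

def gaussian_kernel(dist_sq, sigma):
    # Isotropic Gaussian kernel evaluated on squared distances.
    return np.exp(-dist_sq / (2.0 * sigma ** 2))

def classify(x, X, y, sigma=1.0):
    # Assign x to the class with the larger mean kernel density score.
    # X: (n, d) stored instances, y: (n,) class labels.
    # No training phase: the decision uses the stored points directly.
    dist_sq = np.sum((X - x) ** 2, axis=1)
    k = gaussian_kernel(dist_sq, sigma)
    scores = {c: k[y == c].mean() for c in np.unique(y)}
    return max(scores, key=scores.get)

# Streaming use: a newly classified sample is appended to the stored set
# directly, since there are no model parameters to re-fit.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.2]])
y = np.array([0, 0, 1, 1])
x_new = np.array([0.8, 0.9])
label = classify(x_new, X, y, sigma=0.5)   # -> 1
X = np.vstack([X, x_new])
y = np.append(y, label)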
Description
ORCID
Gokcay, Erhan/0000-0002-4220-199X
Keywords
Supervised, Entropy, Information theory, Instance-based classification
Turkish CoHE Thesis Center URL
Fields of Science
Citation
1
WoS Q
Q1
Scopus Q
Source
Volume
536
Issue
Start Page
263
End Page
276