Article type: Research Article
Authors: Bhavsar, Hetal [a]; * | Ganatra, Amit [b]
Affiliations: [a] Department of Computer Science and Engineering, The M.S. University of Baroda, Vadodara, Gujarat, India | [b] Computer Engineering Department, Charotar University of Science and Technology, Changa, Gujarat, India
Correspondence: [*] Corresponding author: Hetal Bhavsar, Department of Computer Science and Engineering, The M.S. University of Baroda, Vadodara, India. Tel.: +91 9879628487; E-mail: het_bhavsar@yahoo.co.in
Abstract: The Support Vector Machine (SVM) is a powerful technique for data classification. For linearly separable data points, the SVM constructs an optimal separating hyper-plane as a decision surface to divide the data points of different categories in the vector space. For non-linearly separable data points, kernel functions are used to extend the concept of the optimal separating hyper-plane so that the data become linearly separable. Different kernel functions have different characteristics, and hence the performance of the SVM is highly influenced by the kernel selection. This paper presents a classification algorithm that uses the SVM in the training phase and the Mahalanobis distance in the testing phase, in order to design a classifier whose classification accuracy is only weakly affected by the choice of kernel function. The Mahalanobis distance replaces the optimal separating hyper-plane as the classification decision-making function of the SVM. The proposed approach is compared with the Euclidean-SVM, which uses the Euclidean distance function in place of the optimal separating hyper-plane as the classification boundary, and is also evaluated against the conventional SVM. The experimental results show that the accuracy of the EuDiC (Euclidean Distance towards the Center of data) SVM classifier is only marginally affected by the choice of kernel function. The EuDiC SVM also achieves a drastic reduction in classification time, since the decision depends only on the mean of the Support Vectors (SVs) of each category. To demonstrate its effectiveness on other types of data, time series data have also been used; owing to its robust design, the EuDiC SVM performs well on these data as well.
Keywords: Classification, Euclidean distance, kernel function, Mahalanobis distance, optimal hyper-plane, support vector machine, support vector
DOI: 10.3233/IDA-150348
Journal: Intelligent Data Analysis, vol. 20, no. 6, pp. 1285-1305, 2016
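As a rough illustration of the testing phase described in the abstract, the sketch below trains a standard SVM and then labels test points by their distance to the per-class mean of the support vectors instead of by the separating hyper-plane. It is only an assumption-laden approximation of the idea, not the authors' implementation: scikit-learn's SVC, the Iris data set, and plain Euclidean distance (the paper also considers a Mahalanobis variant) are our choices here.

```python
# Sketch only: classify by distance to the per-class mean of the support
# vectors (the EuDiC idea), rather than by the SVM's separating hyper-plane.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Training phase: a conventional SVM; the paper reports that the kernel
# choice has little effect on the final accuracy of this scheme.
svm = SVC(kernel="rbf").fit(X_train, y_train)

# Mean of the support vectors of each class, taken in the input space.
classes = np.unique(y_train)
sv_labels = y_train[svm.support_]
class_means = np.vstack([
    svm.support_vectors_[sv_labels == c].mean(axis=0) for c in classes
])

# Testing phase: assign each point to the class whose SV mean is nearest.
dists = np.linalg.norm(X_test[:, None, :] - class_means[None, :, :], axis=2)
y_pred = classes[dists.argmin(axis=1)]

print("distance-to-SV-mean accuracy:", (y_pred == y_test).mean())
print("conventional SVM accuracy:   ", svm.score(X_test, y_test))
```

Because the decision only compares a test point against one mean vector per class, the test-time cost is independent of the number of support vectors, which is the source of the classification-time reduction the abstract mentions.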