Article type: Research Article
Authors: Chen, Jiankai [a, c] | Li, Zhongyan [b] | Wang, Xin [c, *] | Zhai, Junhai [c]
Affiliations: [a] School of Control and Computer Engineering, North China Electric Power University, Beijing, China | [b] School of Mathematics and Physics, North China Electric Power University, Beijing, China | [c] Hebei Key Laboratory of Machine Learning and Computational Intelligence, College of Mathematics and Information Science, Hebei University, Baoding, Hebei, China
Correspondence: [*] Corresponding author. Xin Wang, Hebei Key Laboratory of Machine Learning and Computational Intelligence, College of Mathematics and Information Science, Hebei University, Baoding, 071002, Hebei, China. E-mail: wangx@hbu.edu.cn.
Abstract: Monotonic classification is a widely applied classification task in which an improvement in specific input values cannot lead to a worse output. Monotonic classifiers based on K-nearest neighbors (KNN) have become crucial tools for such tasks. However, these models share the drawbacks of traditional KNN classifiers, including high computational complexity and sensitivity to noise. Fuzzy Monotonic K-Nearest Neighbors (FMKNN) is currently the state-of-the-art KNN-based monotonic classifier, mitigating the impact of noise to some extent. Nevertheless, there is still room to reduce FMKNN's computational complexity and to soften its monotonicity constraint. In this paper, we propose a prototype selection algorithm based on FMKNN, named Condensed Fuzzy Monotonic K-Nearest Neighbors (C-FMKNN). The algorithm achieves a dynamic balance between monotonicity and test accuracy by constructing a joint evaluation function that combines fuzzy ranking conditional entropy and correct prediction. By using C-FMKNN to select instance subsets under this adaptive balance, the training data are reduced and the computation is simplified. Extensive experiments show that the proposed C-FMKNN improves significantly in terms of ACCU, MAE and NMI over the compared KNN-based non-monotonic algorithms and non-KNN monotonic algorithms. Compared with the instance selection algorithms MCNN, MENN and MONIPS, C-FMKNN improves the average values of ACCU, MAE and NMI by 3.7%, 3.6% and 18.3%, respectively, on the relevant datasets. In particular, compared with the benchmark algorithm FMKNN, C-FMKNN achieves an average data reduction rate of 58.74% while maintaining or improving classification accuracy.
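The paper's joint evaluation function (fuzzy ranking conditional entropy plus correct prediction) is defined in the full text; as a rough illustration of the prototype-selection idea it builds on, the sketch below implements plain condensed nearest-neighbor selection (Hart's CNN): keep only the instances that 1-NN needs to classify the remaining training data correctly. All names and the toy dataset here are illustrative, not taken from the paper.

```python
import numpy as np

def nearest_label(X, y, q):
    # Label of the stored prototype nearest to query q (Euclidean distance).
    d = np.linalg.norm(X - q, axis=1)
    return y[np.argmin(d)]

def condense(X, y):
    """Hart's condensed nearest-neighbor selection: iteratively add any
    instance that the current prototype set misclassifies, until stable."""
    keep = [0]  # seed the prototype set with the first instance
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            if nearest_label(X[keep], y[keep], X[i]) != y[i]:
                keep.append(i)  # misclassified -> must become a prototype
                changed = True
    return sorted(keep)

# Toy (roughly monotonic) dataset: the label grows with the feature values.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.8], [0.5, 0.5]])
y = np.array([0, 0, 1, 1, 0])
idx = condense(X, y)
print("kept:", idx, "reduction rate:", 1 - len(idx) / len(X))
```

C-FMKNN replaces the plain "misclassified by 1-NN" criterion above with its joint evaluation function, so that the retained subset balances monotonicity against test accuracy rather than accuracy alone.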
Keywords: Monotonic classification, fuzzy monotonic K-nearest neighbor, fuzzy ranking conditional entropy, joint evaluation function, data reduction
DOI: 10.3233/JIFS-236643
Journal: Journal of Intelligent & Fuzzy Systems, vol. Pre-press, no. Pre-press, pp. 1-22, 2024
IOS Press, Inc.