Article type: Research Article
Authors: Peng, Yong [a,b,c,*] | Zhang, Leijie [a] | Kong, Wanzeng [a,c] | Qin, Feiwei [a] | Zhang, Jianhai [a,c]
Affiliations: [a] School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China | [b] Provincial Key Laboratory for Computer Information Processing Technology, Soochow University, Suzhou, China | [c] Key Laboratory of Brain Machine Collaborative Intelligence of Zhejiang Province, Hangzhou, China
Correspondence: [*] Corresponding author. Yong Peng, School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China. E-mail: yongpeng@hdu.edu.cn.
Abstract: Subspace learning aims to obtain low-dimensional representations of high-dimensional data in order to facilitate subsequent storage and processing. Graph-based subspace learning is an effective family of subspace learning methods that models the data manifold with a graph, and it can be subsumed under the general spectral regression (SR) framework. By taking a least-squares regression form as its objective function, spectral regression avoids performing eigen-decomposition on dense matrices and offers excellent flexibility. Spectral regression has recently achieved promising performance in diverse applications; however, it does not take the underlying class/task correlation patterns of the data into consideration. In this paper, we propose to improve the performance of spectral regression by exploiting the correlations among classes through low-rank modeling. The newly formulated low-rank spectral regression (LRSR) model is obtained by decomposing the projection matrix in SR into two factor matrices, each of which is regularized separately. The LRSR objective function can be handled within the alternating direction optimization framework. In addition to analyzing the differences between LRSR and existing related models, we conduct extensive experiments comparing LRSR with its full-rank counterpart on benchmark data sets; the results demonstrate its superiority.
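The abstract describes LRSR as factorizing the SR projection matrix into two regularized factors and optimizing them alternately. The following NumPy sketch illustrates this idea under simplified assumptions: it constrains one factor to be orthonormal (solved by a Procrustes step) rather than using the paper's regularizer on both factors, and the function name and parameters are illustrative, not the authors' code.

```python
import numpy as np

def low_rank_sr(X, Y, rank, alpha=0.1, n_iter=30, seed=0):
    """Illustrative sketch of low-rank spectral regression.

    Minimizes  ||X A B^T - Y||_F^2 + alpha ||A||_F^2  subject to
    B^T B = I, alternating between a ridge step for A and an
    orthogonal Procrustes step for B. X is n x d data, Y is n x c
    spectral-embedding targets; the result is a d x c projection
    matrix of rank at most `rank`.
    """
    n, d = X.shape
    c = Y.shape[1]
    rng = np.random.default_rng(seed)
    # Initialize B (c x rank) with orthonormal columns.
    B, _ = np.linalg.qr(rng.standard_normal((c, rank)))
    G = X.T @ X + alpha * np.eye(d)  # ridge Gram matrix, fixed across iterations
    for _ in range(n_iter):
        # A-step: ridge regression of the reduced targets Y B on X.
        A = np.linalg.solve(G, X.T @ (Y @ B))            # d x rank
        # B-step: orthogonal Procrustes, maximizing tr(B^T Y^T X A).
        U, _, Vt = np.linalg.svd(Y.T @ (X @ A), full_matrices=False)
        B = U @ Vt                                       # c x rank
    return A @ B.T                                       # low-rank projection W (d x c)
```

Because W is returned as the product of a d x rank and a rank x c matrix, its rank never exceeds the chosen `rank`, which is how the class-correlation structure is encoded; the full-rank SR solution corresponds to `rank = c`.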
Keywords: Low-rankness, spectral regression, matrix factorization, subspace learning, classification
DOI: 10.3233/JIFS-191752
Journal: Journal of Intelligent & Fuzzy Systems, vol. 39, no. 3, pp. 3401-3412, 2020