Article type: Research Article
Authors: Lu, Shuxia [a, b, *] | Wang, Xizhao [c] | Zhang, Guiqiang [a] | Zhou, Xu [a]
Affiliations: [a] College of Mathematics and Information Science, Hebei University, Baoding, Hebei, China | [b] Key Laboratory of Machine Learning and Computational Intelligence of Hebei Province, Baoding, Hebei, China | [c] College of Computer Science and Software, Shenzhen University, Shenzhen, China
Correspondence: [*] Corresponding author: Shuxia Lu, College of Mathematics and Information Science, Hebei University, Baoding, Hebei 071002, China. E-mail: cmclusx@126.com
Abstract: Extreme learning machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks (SLFNs) which randomly chooses hidden nodes and analytically determines the output weights of SLFNs. Once the input weights and the hidden layer biases are chosen randomly, ELM can simply be regarded as a linear system. However, the learning time of ELM is mainly spent on computing the Moore-Penrose inverse of the hidden layer output matrix. This paper focuses on the effective computation of the Moore-Penrose inverse for ELM and proposes several methods: the reduced QR factorization with column pivoting and Geninv ELM (QRGeninv-ELM) and the tensor product matrix ELM (TPM-ELM). We compare QRGeninv-ELM and TPM-ELM with related algorithms for computing the Moore-Penrose inverse in ELM, namely the Cholesky factorization of singular matrix ELM (Geninv-ELM), the QR factorization and Ginv ELM (QRGinv-ELM), and the conjugate Gram-Schmidt process ELM (CGS-ELM). The experimental results and their statistical analysis both demonstrate that QRGeninv-ELM, TPM-ELM and Geninv-ELM are faster than the other kinds of ELM while reaching comparable generalization performance.
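For orientation, below is a minimal sketch of the basic ELM training step the abstract describes: random input weights and hidden biases, then output weights obtained from the Moore-Penrose inverse of the hidden layer output matrix H. NumPy's generic pinv stands in for the specialized factorizations compared in the paper (QRGeninv, TPM, Geninv, QRGinv, CGS); the function names and the sigmoid activation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=None):
    """Basic ELM training (sketch).

    X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets.
    Input weights W and biases b are drawn at random; only the output
    weights beta are solved for, via the Moore-Penrose inverse of the
    hidden layer output matrix H.
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # sigmoid hidden layer output
    beta = np.linalg.pinv(H) @ T                             # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

The cost of np.linalg.pinv(H) is exactly the bottleneck the paper targets; the proposed QRGeninv and TPM methods replace this step with cheaper factorization-based computations of the same pseudoinverse.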
Keywords: Extreme learning machine, tensor product matrix, Cholesky factorization of singular matrix, conjugate Gram-Schmidt process, QR factorization
DOI: 10.3233/IDA-150743
Journal: Intelligent Data Analysis, vol. 19, no. 4, pp. 743-760, 2015