Article type: Research Article
Authors: Tharwat, Alaa [a, b, *, **] | Gaber, Tarek [c, *, ***] | Ibrahim, Abdelhameed [d, *] | Hassanien, Aboul Ella [e, *]
Affiliations: [a] Department of Computer Science and Engineering, Frankfurt University of Applied Sciences, Frankfurt am Main, Germany | [b] Faculty of Engineering, Suez Canal University, Egypt. E-mail: engalaatharwat@hotmail.com | [c] Faculty of Computers and Informatics, Suez Canal University, Egypt. E-mail: tmgaber@gmail.com | [d] Faculty of Engineering, Mansoura University, Egypt. E-mail: afai79@yahoo.com | [e] Faculty of Computers and Information, Cairo University, Egypt. E-mail: aboitcairo@gmail.com
Correspondence: [**] Corresponding author. E-mail: engalaatharwat@hotmail.com.
Correspondence: [***] Corresponding author. E-mail: tmgaber@gmail.com.
Note: [*] Scientific Research Group in Egypt, (SRGE), http://www.egyptscience.net.
Abstract: Linear Discriminant Analysis (LDA) is a very common technique for dimensionality reduction problems as a pre-processing step for machine learning and pattern classification applications. At the same time, it is usually used as a black box and is sometimes not well understood. The aim of this paper is to build a solid intuition for what LDA is and how it works, enabling readers of all levels to gain a better understanding of LDA and to know how to apply the technique in different applications. The paper first gives the basic definitions and steps of the LDA technique, supported with visual explanations of these steps. Moreover, the two methods of computing the LDA space, i.e. the class-dependent and class-independent methods, are explained in detail. Then, in a step-by-step approach, two numerical examples demonstrate how the LDA space can be calculated for the class-dependent and class-independent methods. Furthermore, two of the most common LDA problems (i.e. the Small Sample Size (SSS) and non-linearity problems) are highlighted and illustrated, and state-of-the-art solutions to these problems are investigated and explained. Finally, a number of experiments were conducted on different datasets to (1) investigate the effect of the eigenvectors used in the LDA space on the robustness of the extracted features for classification accuracy, and (2) show when the SSS problem occurs and how it can be addressed.
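The class-independent computation the abstract refers to can be sketched in a few lines of NumPy: build the within-class scatter S_W and between-class scatter S_B, then take the leading eigenvectors of inv(S_W) @ S_B as the LDA space. This is a minimal illustration on invented toy data, not the paper's own code; variable names and the two-class example are assumptions.

```python
import numpy as np

# Toy two-class data: 4 samples per class, 2 features (invented for illustration).
X1 = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0], [4.0, 5.0]])
X2 = np.array([[4.0, 2.0], [5.0, 0.0], [5.0, 2.0], [3.0, 2.0]])
mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)   # class means
mu = np.vstack([X1, X2]).mean(axis=0)         # total mean

# Within-class scatter S_W: sum of per-class scatter matrices.
S_W = (X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)

# Between-class scatter S_B: class-size-weighted outer products
# of the class-mean offsets from the total mean.
S_B = (len(X1) * np.outer(mu1 - mu, mu1 - mu)
       + len(X2) * np.outer(mu2 - mu, mu2 - mu))

# Class-independent LDA space: eigenvectors of inv(S_W) @ S_B,
# sorted by decreasing eigenvalue.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
order = np.argsort(eigvals.real)[::-1]
W = eigvecs.real[:, order[:1]]   # keep the leading eigenvector

# Project both classes onto the 1-D LDA space.
Y1, Y2 = X1 @ W, X2 @ W
```

In the class-dependent variant the paper also describes, a separate projection is computed per class using that class's own within-class scatter instead of the pooled S_W. Note that inverting S_W is exactly the step that fails under the SSS problem, when S_W is singular.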
Keywords: Dimensionality reduction, PCA, LDA, kernel functions, class-dependent LDA, class-independent LDA, SSS (Small Sample Size) problem, eigenvectors, artificial intelligence
DOI: 10.3233/AIC-170729
Journal: AI Communications, vol. 30, no. 2, pp. 169-190, 2017