Article type: Research Article
Authors: Wang, Shuo | Yang, Jing* | Yang, Yue
Affiliations: Harbin Engineering University, Harbin, China
Correspondence: [*] Corresponding author. Jing Yang, Harbin Engineering University, Harbin, 150000, China. E-mail: yangjing@hrbeu.edu.cn.
Abstract: Personalized recommendation systems fundamentally assess user preferences as a reflection of users' emotional responses to items. Traditional recommendation algorithms focus primarily on numerical rating data and often overlook emotional factors, which reduces accuracy and limits application scenarios. This paper introduces a collaborative filtering recommendation method that integrates facial information features, from which users' emotions are extracted. Upon user authorization for camera usage, the system captures facial information features. Because facial information is highly diverse, deep learning methods classify these features and use the classification results as emotional labels. The method then calculates the similarity between emotional labels and item labels, reducing the ambiguity inherent in facial information features. The fusion process takes into account the user's emotional state prior to item interaction, which may influence the emotions generated during the interaction; variance is used to measure emotional fluctuation, avoiding misjudgments caused by sustained non-interactive emotions. In selecting nearest-neighbor users, the method considers similarity not only in user ratings but also in emotional responses. Experiments on the MovieLens dataset show that the proposed method, by modeling facial features, aligns recommendations more closely with user preferences and significantly improves the algorithm's performance.
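The full models are not given on this page, but the two ideas the abstract describes — blending rating similarity with emotion-label similarity when selecting nearest-neighbor users, and using variance to detect a sustained non-interactive emotion — can be sketched roughly as follows. All function names, the choice of cosine similarity, and the mixing weight `alpha` are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def emotion_fluctuation(pre_emotions, interaction_emotions):
    """Variance of emotion scores across the pre-interaction and
    interaction windows. Low variance suggests a sustained emotion that
    existed before the item was shown, which (per the abstract) should
    not be attributed to the item itself."""
    window = np.concatenate([pre_emotions, interaction_emotions])
    return float(np.var(window))

def combined_similarity(ratings_u, ratings_v, emotions_u, emotions_v, alpha=0.5):
    """Weighted blend of rating similarity and emotion-label similarity
    between users u and v; alpha is a hypothetical mixing weight."""
    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(np.dot(a, b) / denom) if denom else 0.0
    return alpha * cosine(ratings_u, ratings_v) \
        + (1 - alpha) * cosine(emotions_u, emotions_v)
```

Under this sketch, a user whose emotion scores barely change across the interaction (near-zero `emotion_fluctuation`) would have their emotional signal down-weighted, while neighbor selection ranks users by `combined_similarity` instead of rating similarity alone.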
Keywords: Collaborative filtering algorithm, facial information features, emotional factors, non-interactive emotion
DOI: 10.3233/JIFS-232718
Journal: Journal of Intelligent & Fuzzy Systems, vol. Pre-press, no. Pre-press, pp. 1-20, 2024
Publisher: IOS Press