Issue title: Special Issue on Soft Computing Approaches in Image Analysis
Guest editors: Jude Hemanth, Jacek Zurada and Hemant Kasturiwale
Article type: Research Article
Authors: Theckedath, Dhananjay[a],* | Sedamkar, R.R.[b]
Affiliations: [a] Biomedical Engineering Department, Thadomal Shahani Engineering College, Mumbai, India | [b] Computer Engineering Department, Thakur College of Engineering and Technology, Mumbai, India
Correspondence: [*] Corresponding author: Dhananjay Theckedath, Biomedical Engineering Department, Thadomal Shahani Engineering College, Mumbai, India. E-mail: dhananjay.kishore@gmail.com.
Abstract: Affect detection is a key component in developing Intelligent Human Computer Interface (IHCI) systems. State-of-the-art affect detection systems assume the availability of full, un-occluded face images; however, occlusion is a prominent problem encountered when working with such systems. The challenge is to identify affect states from the portions of the face that are available. This paper proposes a novel method of assessing only a segment of the face, instead of the whole face, for affect detection. It aims to find segments of the face that contain sufficient information to correctly classify the basic affect states. This work uses Convolutional Neural Networks (CNN) with transfer learning to detect 7 basic affect states, viz. Angry, Contempt, Disgust, Fear, Happy, Sad and Surprise, from a few prominent facial segments. Full-face images are partitioned into separate segments, viz. the Right, Left, Lower and Upper segments. Modified VGG-16 and ResNet-50 networks were trained on each of the segments. Experiments were conducted using these facial segments, and the results obtained were compared with those for the full face. Using the VGG-16 network, we achieved validation accuracies of 96.8% for the Full face, 97.3% for the Right segment, 97.3% for the Left segment, 96.6% for the Lower segment and 84.7% for the Upper segment. The validation accuracies are higher with the ResNet-50 network: 99.7% for the Full face, 99.47% for the Right segment, 100% for the Left segment, 99.6% for the Lower segment and 90.8% for the Upper segment. Apart from accuracy, the other performance metrics used in this work are Precision, Recall and f1-score.
Our evaluation based on these performance metrics shows that the results obtained for the Right, Left and Lower segments of the face, using both the VGG-16 and ResNet-50 networks, are comparable with those for the Full face. The experiments clearly indicate that the Right, Left and Lower segments of the face contain sufficient information about the seven affect states, and that CNN with transfer learning can be used to classify them accurately.
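The two mechanical pieces of the pipeline described in the abstract, splitting a full-face image into the four segments and computing per-class Precision/Recall/f1-score, can be sketched as below. This is a minimal illustration: the simple half-image crop boundaries and the toy confusion matrix are assumptions for illustration, not the paper's exact cropping scheme or results.

```python
import numpy as np

def face_segments(img):
    """Split a full-face image array of shape (H, W) or (H, W, C) into
    the four segments used in the paper: Right, Left, Lower and Upper.
    Simple half-image boundaries are assumed here."""
    h, w = img.shape[0], img.shape[1]
    return {
        "right": img[:, w // 2:],
        "left":  img[:, :w // 2],
        "lower": img[h // 2:, :],
        "upper": img[:h // 2, :],
    }

def precision_recall_f1(cm):
    """Per-class Precision, Recall and f1-score from a confusion matrix
    (rows = true class, columns = predicted class)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                       # true positives per class
    precision = tp / cm.sum(axis=0)        # TP / (TP + FP)
    recall = tp / cm.sum(axis=1)           # TP / (TP + FN)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

img = np.zeros((48, 64))                   # toy 48x64 grayscale "face"
segs = face_segments(img)
print(segs["upper"].shape)                 # (24, 64)
print(segs["left"].shape)                  # (48, 32)

# Toy 2-class confusion matrix, purely illustrative.
p, r, f = precision_recall_f1([[5, 0], [1, 4]])
```

The segments produced here would then be fed to the fine-tuned VGG-16 or ResNet-50 classifiers (resized to each network's expected input size) in place of the full-face image.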
Keywords: Convolutional neural network, transfer learning, occlusion, affect states
DOI: 10.3233/IDT-190077
Journal: Intelligent Decision Technologies, vol. 14, no. 1, pp. 35-45, 2020
IOS Press, Inc.
6751 Tepper Drive
Clifton, VA 20124
USA
Tel: +1 703 830 6300
Fax: +1 703 830 2300
sales@iospress.com
For editorial issues, like the status of your submitted paper or proposals, write to editorial@iospress.nl
IOS Press
Nieuwe Hemweg 6B
1013 BG Amsterdam
The Netherlands
Tel: +31 20 688 3355
Fax: +31 20 687 0091
info@iospress.nl
For editorial issues, permissions, book requests, submissions and proceedings, contact the Amsterdam office info@iospress.nl
Inspirees International (China Office)
Ciyunsi Beili 207(CapitaLand), Bld 1, 7-901
100025, Beijing
China
Free service line: 400 661 8717
Fax: +86 10 8446 7947
china@iospress.cn