Article type: Research Article
Authors: Pan, Hongguang; * | Zhang, Huipeng | Lei, Xinyu | Xin, Fangfang | Wang, Zheng
Affiliations: College of Electrical and Control Engineering, Xi’an University of Science and Technology, Xi’an, P. R. China
Correspondence: [*] Corresponding author. Hongguang Pan, College of Electrical and Control Engineering, Xi’an University of Science and Technology, Xi’an 710054, P. R. China. E-mail: hongguangpan@163.com.
Abstract: Object detection is a very important part of computer vision, and one of the most common object detection methods is the Faster region-based convolutional neural network (Faster RCNN), which uses a CNN to extract image features. However, the number of parameters to be learned in the CNN is enormous and may reduce efficiency. In this paper, the hybrid dilated Faster RCNN (HDF-RCNN) is proposed to solve this problem, and the main contributions are: 1) HDF-RCNN is built by replacing the VGG16 backbone in Faster RCNN with an HDC (hybrid dilated CNN) to achieve a fast and accurate object detection algorithm, and the LeakyReLU activation function is used to increase the ability to map input information; 2) the portability of the HDC is verified, namely, the possibility of embedding the HDC into object detection networks with an independent feature extraction part. The Microsoft COCO data set is used to verify the performance of HDF-RCNN, and the experiments show that, compared with the traditional Faster RCNN, the testing accuracy of HDF-RCNN is improved by 7.11% on average, while the training loss and training time are reduced by 40.06% and 34.29% on average, respectively. Therefore, HDF-RCNN can significantly improve the efficiency of object detection, and the HDC can be used as an independent feature extraction network that adapts to many different frameworks.
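As a minimal illustration (not code from the paper), the sketch below shows the receptive-field arithmetic that motivates hybrid dilation rates, together with the LeakyReLU activation the abstract mentions. The function names and the example rates [1, 2, 5] vs. [2, 2, 2] are assumptions for illustration, not values taken from the article.

```python
# Sketch: receptive-field growth for a stack of 3x3 dilated convolutions.
# Hybrid (mixed) dilation rates enlarge the receptive field while avoiding
# the "gridding" gaps that repeating a single rate can leave behind.

def receptive_field(dilations, kernel_size=3):
    """Receptive field of stacked stride-1 convolutions with given dilations."""
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d  # each layer adds (k - 1) * d pixels
    return rf

def leaky_relu(x, alpha=0.01):
    """LeakyReLU keeps a small slope for negative inputs, unlike plain ReLU."""
    return x if x > 0 else alpha * x

# A hybrid schedule such as [1, 2, 5] yields a larger receptive field than
# a constant schedule like [2, 2, 2] with the same number of layers.
print(receptive_field([1, 2, 5]))  # 17
print(receptive_field([2, 2, 2]))  # 13
print(leaky_relu(3.0), leaky_relu(-2.0))
```

This is only a back-of-the-envelope view of why mixed dilation rates help a dilated backbone see more context per parameter; the actual HDC design in the paper specifies its own rate schedule.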
Keywords: object detection, hybrid dilated convolution, faster RCNN
DOI: 10.3233/JIFS-212740
Journal: Journal of Intelligent & Fuzzy Systems, vol. 43, no. 1, pp. 1229-1239, 2022