Article type: Research Article
Authors: Yang, Yu [a, b] | Wang, Xin [a] | Liu, Zhenfang [a] | Huang, Min [a] | Sun, Shangpeng [b, *] | Zhu, Qibing [a, *]
Affiliations: [a] Key Laboratory of Advanced Process Control for Light Industry (Ministry of Education), Jiangnan University, Wuxi, Jiangsu, China | [b] Department of Bioresource Engineering, Macdonald Campus, McGill University, Sainte-Anne-de-Bellevue, QC, Canada
Correspondence: [*] Corresponding authors: Shangpeng Sun, Department of Bioresource Engineering, McGill University, Macdonald Stewart Building, 21111 Lakeshore Road, Sainte-Anne-de-Bellevue, QC, Canada. E-mail: shangpeng.sun@mcgill.ca. Qibing Zhu, Key Laboratory of Advanced Process Control for Light Industry (Ministry of Education), Jiangnan University, 1800 Lihu Avenue, Wuxi, Jiangsu 214122, China. E-mail: zhuqib@163.com.
Abstract: The first major contribution of this paper is an improved DEtection TRansformer network (named R2N-DETR) used with a Kinect-V2 camera for detecting peaches of multiple sizes in orchards with varied illumination and fruit occlusion. The R2N-DETR model first employs Res2Net-50 to extract a fused low-high-level feature map containing fine spatial features and precise semantic information on multi-size peaches from Red-Green-Blue-Depth (RGB-D) images. Second, a transformer encoder-decoder is applied to the feature map to obtain global context. Finally, each object is detected according to its global context (see the sketch after this record). On 1101 RGB-D images (acquired from two orchards over three years), the R2N-DETR model achieves an average precision of 0.944 and an average detection time of 53 ms per image. The developed system could provide precise visual guidance for robotic picking and could improve yield prediction by providing accurate fruit counts.
Keywords: Deep learning, peach detection, RGB-D image, R2N-DETR, open orchard
DOI: 10.3233/IDA-220449
Journal: Intelligent Data Analysis, vol. 27, no. 5, pp. 1539-1554, 2023
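The abstract above outlines the R2N-DETR pipeline: a Res2Net-50 backbone producing a fused low-high-level feature map from RGB-D input, a transformer encoder-decoder aggregating global context, and per-query object predictions. Below is a minimal, hedged sketch of that DETR-style structure in PyTorch. The module names, layer sizes, and the plain CNN stub standing in for Res2Net-50 are illustrative assumptions, not the authors' released implementation; positional encodings and DETR's Hungarian matching loss are omitted for brevity.

```python
# Hedged sketch of a DETR-style detector for 4-channel RGB-D input.
# All names and sizes are assumptions for illustration only.
import torch
import torch.nn as nn

class RGBDDetrSketch(nn.Module):
    def __init__(self, num_classes=1, num_queries=100, d_model=256):
        super().__init__()
        # Backbone stub: the paper uses Res2Net-50 on RGB-D input;
        # a small CNN stands in here so the example stays self-contained.
        self.backbone = nn.Sequential(
            nn.Conv2d(4, 64, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(64, d_model, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Encoder-decoder transformer aggregates global context over the
        # flattened feature-map tokens, as in DETR.
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=8,
            num_encoder_layers=6, num_decoder_layers=6,
            batch_first=True,
        )
        # Learned object queries; each decoder output predicts one object.
        self.query_embed = nn.Embedding(num_queries, d_model)
        self.class_head = nn.Linear(d_model, num_classes + 1)  # +1 for "no object"
        self.box_head = nn.Linear(d_model, 4)                  # (cx, cy, w, h)

    def forward(self, rgbd):                       # rgbd: (B, 4, H, W)
        feat = self.backbone(rgbd)                 # (B, C, H', W')
        b, c, h, w = feat.shape
        src = feat.flatten(2).transpose(1, 2)      # (B, H'*W', C) tokens
        tgt = self.query_embed.weight.unsqueeze(0).expand(b, -1, -1)
        hs = self.transformer(src, tgt)            # (B, num_queries, C)
        return self.class_head(hs), self.box_head(hs).sigmoid()

if __name__ == "__main__":
    model = RGBDDetrSketch()
    logits, boxes = model(torch.randn(2, 4, 128, 128))
    print(logits.shape, boxes.shape)  # (2, 100, 2) and (2, 100, 4)
```

In this sketch each of the 100 object queries attends to the whole image, which is how a DETR-style model detects every object from its global context rather than from local anchor boxes.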