Article type: Research Article
Authors: Zhang, Hao [a, b, c, d] | Hua, Haiyang [a, b, c, *] | Liu, Tianci [a, b, c]
Affiliations: [a] Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang, China | [b] Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China | [c] Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang, China | [d] University of Chinese Academy of Sciences, Beijing, China
Correspondence: [*] Corresponding author. Haiyang Hua. E-mail: c3i11@sia.cn.
Abstract: Most deep learning object detection methods based on multi-modal information fusion cannot directly control the quality of the fused images, because the fusion is driven only by the detection results. This indirect control is, in principle, not conducive to the network's target detection. To address this problem, we propose a multimodal information cross-fusion detection method based on a generative adversarial network (CrossGAN-Detection), which is composed of a GAN and a target detection network; the target detection network acts as the second discriminator of the GAN during training. Through the content loss function and the dual discriminator, directly controllable guidance is provided for the generator, which is designed to learn the relationships between different modalities adaptively through cross fusion. We conduct extensive experiments on the KITTI dataset, the prevalent dataset in the fusion-detection field. The experimental results show that the AP of the proposed method for vehicle detection reaches 96.66%, 87.15%, and 78.46% in the easy, moderate, and hard categories respectively, an improvement of about 7% over state-of-the-art methods.
Keywords: Target detection, multimodal data, GAN, controllable fusion
DOI: 10.3233/JIFS-213074
Journal: Journal of Intelligent & Fuzzy Systems, vol. 43, no. 5, pp. 5771-5782, 2022
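The abstract describes a dual-discriminator design in which the target detection network acts as a second discriminator and a content loss gives the generator directly controllable guidance. The PyTorch-style sketch below is only an illustration of that training idea under assumed module shapes and loss weights; FusionGenerator, ImageDiscriminator, det_loss_fn, lambda_content, and lambda_det are hypothetical names, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code) of the CrossGAN-Detection idea:
# a generator fuses two modalities, an image discriminator judges the fused
# image, and a detection network supplies a second, task-level signal.
import torch
import torch.nn as nn

class FusionGenerator(nn.Module):
    """Toy generator: concatenates two modalities and produces a fused image."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1), nn.Tanh(),
        )

    def forward(self, visible, infrared):
        return self.net(torch.cat([visible, infrared], dim=1))

class ImageDiscriminator(nn.Module):
    """Toy discriminator: scores whether an image looks like a real visible image."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 1, 4, stride=2, padding=1),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

def train_step(gen, disc, detector, det_loss_fn, opt_g, opt_d,
               visible, infrared, targets, lambda_content=10.0, lambda_det=1.0):
    """One hypothetical training step with a content loss and a dual discriminator
    (image discriminator + detection network acting as the second 'discriminator').
    The detector is assumed pre-trained here; its parameters are not updated."""
    bce = nn.BCEWithLogitsLoss()

    # --- update the image discriminator: real visible images vs. fused images ---
    fused = gen(visible, infrared)
    d_real = disc(visible)
    d_fake = disc(fused.detach())
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # --- update the generator: adversarial + content + detection guidance ---
    d_fake = disc(fused)
    loss_adv = bce(d_fake, torch.ones_like(d_fake))
    # content loss keeps the fused image close to both source modalities
    loss_content = (fused - visible).abs().mean() + (fused - infrared).abs().mean()
    # detection loss lets the detector directly guide fusion quality
    loss_det = det_loss_fn(detector(fused), targets)
    loss_g = loss_adv + lambda_content * loss_content + lambda_det * loss_det
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```

The design choice illustrated here is that the generator receives gradients from the detection loss in addition to the image-level adversarial and content terms, so fusion quality is shaped directly by what helps detection rather than only by image realism.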