Article type: Research Article
Authors: Mesquita, R.G.* | Mello, C.A.B.
Affiliations: Centro de Informática, Universidade Federal de Pernambuco, Av. Jornalista Anibal Fernandes, s/n - Cidade Universitária, Recife, Brazil
Correspondence: [*] Corresponding author: R.G. Mesquita, Centro de Informática, Universidade Federal de Pernambuco, Av. Jornalista Anibal Fernandes, s/n - Cidade Universitária, Recife, Brazil. E-mail: rgm@cin.ufpe.br
Abstract: The search for an object in an image commonly occurs by detecting interest points in the whole scene and then matching all descriptors extracted from the selected points against the descriptors of the target. This paper proposes an iterative analysis of the scene that uses saliency maps to guide the search toward more promising regions, instead of processing the whole image at once. To do so, a patch-based version of SURF (Speeded-Up Robust Features) is introduced, along with a saliency detection method, named Background Laplacian Saliency (BLS), which is based on a pixel's distance from the estimated background of the scene combined with a Laplacian-based operator. Using BLS and several state-of-the-art saliency algorithms, the application of saliency maps to guide the recognition of object instances with the patch-based SURF is investigated. Experiments showed that when the search object is present in the scene, the search speed is significantly improved, but when the target is absent, the required processing time increases. Based on this finding, this work also proposes the application of saliency maps to define the level of detail at which each scene location should be processed. Using a novel and practical evaluation that considers both the speed and the accuracy of saliency detection algorithms, it is demonstrated that the processing time of visual search is reduced to about 52% of the time required by SURF when the target's presence is guaranteed, and to between 63% and 79% of that time when no such guarantee exists, in both cases without any significant loss of accuracy.
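To make the saliency-guided, patch-wise search idea concrete, the following is a minimal sketch of the general scheme described in the abstract, not the authors' implementation: ORB stands in for SURF (SURF requires the non-free opencv-contrib module), a mean absolute Laplacian per patch stands in for the BLS map, and the function name find_target and the parameters patch and min_matches are illustrative assumptions. Patches are visited in order of decreasing saliency, and matching stops early once enough target matches are found.

```python
# Hedged sketch: saliency-guided, patch-wise feature matching.
# ORB replaces SURF, and mean |Laplacian| per patch is a crude
# stand-in for the paper's Background Laplacian Saliency (BLS).
import cv2
import numpy as np

def find_target(scene_gray, target_gray, patch=128, min_matches=10):
    orb = cv2.ORB_create()
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    kp_t, des_t = orb.detectAndCompute(target_gray, None)
    if des_t is None:
        return None  # target image yields no descriptors

    # Saliency proxy: mean absolute Laplacian response per patch.
    lap = np.abs(cv2.Laplacian(scene_gray.astype(np.float32), cv2.CV_32F))
    h, w = scene_gray.shape
    tiles = []
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            tiles.append((lap[y:y + patch, x:x + patch].mean(), y, x))
    tiles.sort(reverse=True)          # most salient patches first

    good = []
    for _, y, x in tiles:             # iterate patches until enough matches
        roi = scene_gray[y:y + patch, x:x + patch]
        kp_s, des_s = orb.detectAndCompute(roi, None)
        if des_s is None:
            continue
        good.extend(bf.match(des_t, des_s))
        if len(good) >= min_matches:  # early exit: target likely located here
            return (x, y), good
    return None                       # target probably absent from the scene
```

This illustrates the trade-off reported in the abstract: when the target is present in a salient region, only a few patches are processed before the early exit fires, whereas an absent target forces the loop over every patch, which is slower than processing the whole image once.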
Keywords: Saliency detection, visual search, visual attention, object recognition, feature detection, feature description and matching
DOI: 10.3233/ICA-160528
Journal: Integrated Computer-Aided Engineering, vol. 23, no. 4, pp. 385-400, 2016