Article type: Research Article
Authors: Xu, Shuzhen* | Wang, Jin | Zhu, Qing
Affiliations: Information Department, Beijing University of Technology, Beijing, China
Correspondence: [*] Corresponding author. Shuzhen Xu, Information Department, Beijing University of Technology, Beijing, 100124, China. E-mail: xhbcdjsh@emails.bjut.edu.cn.
Abstract: Motivated by the widely studied computer vision task of image inpainting, we turn to the less explored problem of image outpainting, in which content beyond the image boundaries is extrapolated. In recent years, deep learning methods have achieved remarkable improvements in image inpainting, and these techniques can in principle be applied to image outpainting as well. However, many inpainting methods generate image regions that tend to be blurry or overly smooth. Recently, hallucinating edges for the missing regions before completion has proved to be a state-of-the-art image inpainting strategy. Following this idea, we propose a three-phase outpainting model consisting of an edge generation phase, an image expansion phase, and a refinement phase. To depict edge lines more accurately, we adopt a comparatively effective focal loss for edge prediction. Because large regions outside the image must be inferred, we also add an optimization stage with a refinement network, whose discriminator operates on progressively smaller patch sizes in a coarse-to-fine fashion. In addition, with recursive outpainting, an image can be expanded arbitrarily. Experiments show that our method expands images effectively, and that our outpainting approach of predicting edges and then coloring is generally superior to other methods both quantitatively and qualitatively.
Keywords: Outpainting, edge detector, generative adversarial network, focal loss
DOI: 10.3233/JIFS-191310
Journal: Journal of Intelligent & Fuzzy Systems, vol. 39, no. 1, pp. 371-381, 2020
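The abstract above mentions adopting a focal loss to sharpen edge prediction in the edge generation phase. As a minimal sketch of that idea (not the authors' released code), the PyTorch snippet below applies a pixel-wise binary focal loss to a predicted edge map; the hyperparameter values alpha and gamma are the common defaults from the focal-loss literature, not values reported in the paper.

```python
import torch
import torch.nn.functional as F

def edge_focal_loss(edge_logits, edge_target, alpha=0.25, gamma=2.0):
    """Pixel-wise binary focal loss for edge prediction (illustrative sketch).

    edge_logits: raw network outputs, shape (N, 1, H, W)
    edge_target: ground-truth edge map with values in {0, 1}, same shape
    alpha, gamma: standard focal-loss hyperparameters (assumed defaults,
                  not necessarily those used in the paper)
    """
    prob = torch.sigmoid(edge_logits)
    # p_t: probability assigned to the true class at each pixel
    p_t = prob * edge_target + (1.0 - prob) * (1.0 - edge_target)
    alpha_t = alpha * edge_target + (1.0 - alpha) * (1.0 - edge_target)
    # per-pixel cross-entropy, then down-weight easy (well-classified) pixels
    ce = F.binary_cross_entropy_with_logits(edge_logits, edge_target, reduction="none")
    loss = alpha_t * (1.0 - p_t) ** gamma * ce
    return loss.mean()
```

The (1 - p_t)^gamma factor suppresses the contribution of easily classified background pixels, which helps because edge pixels are a small minority of the image; this is the general motivation for using a focal loss on sparse edge maps.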