Article type: Research Article
Authors: Jiang, Shibo [a, b] | Sun, Yuewen [a, b] | Xu, Shuo [a, b] | Zhang, Huaxia [a, b] | Wu, Zhifang [a, b, *]
Affiliations: [a] Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing, China | [b] Tsinghua University-Beijing Key Laboratory of Nuclear Detection Technology
Correspondence: [*] Corresponding author: Zhifang Wu, E-mail: zhifang.wu@mail.tsinghua.edu.cn.
Abstract: Accurate segmentation of industrial CT images is of great significance in industrial applications such as quality inspection and defect analysis. However, reconstructed industrial CT images often suffer from metal artifacts caused by factors such as beam hardening, scattering, statistical noise, and partial volume effects. Traditional segmentation methods struggle to segment CT images precisely, mainly because of these metal artifacts. Furthermore, acquiring the paired CT image data required by fully supervised networks is extremely challenging. To address these issues, this paper introduces an improved CycleGAN approach for semi-supervised segmentation of industrial CT images. The method not only eliminates the need to remove metal artifacts and noise, but also directly converts metal-artifact-contaminated images into segmented images without requiring paired data. The average quantitative segmentation performance reaches 0.96645 for the Dice Similarity Coefficient (Dice) and 0.93718 for Intersection over Union (IoU). Compared with traditional segmentation methods, the proposed approach shows significant improvements in both quantitative metrics and visual quality, and provides valuable insights for further research.
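For reference, the Dice and IoU metrics reported in the abstract are standard overlap measures for binary segmentation masks. The following is a minimal sketch of how they are typically computed (not the authors' code; the function names and the small epsilon for numerical stability are illustrative assumptions):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2*|A ∩ B| / (|A| + |B|) for two binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def iou_score(pred, target, eps=1e-7):
    """IoU = |A ∩ B| / |A ∪ B| for two binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)
```

Both metrics range from 0 (no overlap) to 1 (perfect overlap); Dice weights the intersection more heavily, which is why the reported Dice (0.96645) exceeds the reported IoU (0.93718) for the same segmentations.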
Keywords: Industrial CT, image segmentation, metal artifact, CycleGAN, dataset acquisition, semi-supervised
DOI: 10.3233/XST-230233
Journal: Journal of X-Ray Science and Technology, vol. 32, no. 2, pp. 271-283, 2024