A user-friendly deep learning application for accurate lung cancer diagnosis
Article type: Research Article
Authors: Tai, Duong Thanh [a],* | Nhu, Nguyen Tan [b, c] | Tuan, Pham Anh [d] | Sulieman, Abdelmoneim [e, f] | Omer, Hiba [g] | Alirezaei, Zahra [h] | Bradley, David [i, j] | Chow, James C.L. [k, l],*
Affiliations: [a] Department of Medical Physics, Faculty of Medicine, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam | [b] School of Biomedical Engineering, Ho Chi Minh City International University (VNU-HCM), Ho Chi Minh City, Vietnam | [c] Vietnam National University Ho Chi Minh City, Vietnam | [d] Nuclear Medicine and Oncology Centre, Bach Mai Hospital, Ha Noi, Vietnam | [e] Radiology and Medical Imaging Department, College of Applied Medical Sciences, Prince Sattam Bin Abdulaziz University, Al-Kharj, Saudi Arabia | [f] Radiological Science Department, College of Applied Medical Sciences, Al Ahsa, King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia | [g] Department of Basic Sciences, Deanship of Preparatory Year and Supporting Studies, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia | [h] Radiology Department, Paramedical School, Bushehr University of Medical Sciences, Bushehr, Iran | [i] Applied Physics and Radiation Technologies Group, CCDCU, Sunway University, Subang Jaya, PJ, Malaysia | [j] School of Mathematics and Physics, University of Surrey, Guildford, UK | [k] Department of Radiation Oncology, University of Toronto, Toronto, ON, Canada | [l] Radiation Medicine Program, Princess Margaret Cancer Centre, University Health Network, Toronto, ON, Canada
Correspondence: [*] Corresponding authors: Dr. Duong Thanh Tai (E-mail: dttai@ntt.edu.vn; ORCID: https://orcid.org/0000-0001-7276-8105) and Dr. James C.L. Chow (E-mail: James.Chow@uhn.ca; ORCID: https://orcid.org/0000-0003-4202-4855).
Abstract: BACKGROUND: Accurate diagnosis and subsequent treatment planning depend heavily on clinicians' experience with comparable cases. Deep learning applied to image processing promises faster, high-quality diagnoses, but the accuracy and precision of 3-D image processing from 2-D data may be limited by factors such as superposition of organs, distortion and magnification, and the detection of new pathologies. The purpose of this research is to use radiomics and deep learning to develop a tool for lung cancer diagnosis. METHODS: This study applies radiomics and deep learning to the diagnosis of lung cancer to help clinicians analyze images accurately and plan appropriate treatment. Images from 86 patients recruited at Bach Mai Hospital and from 1012 patients in an open-source database were used. First, deep learning was applied for segmentation with a U-Net model and for cancer classification with a DenseNet model. Second, radiomics was applied to measure and calculate tumor diameter, surface area, and volume. Finally, hardware was built by connecting an Arduino Nano to an MFRC522 module to read patient data from RFID tags, and a web-based interface was created in Python using Streamlit. RESULTS: The segmentation model yielded a training loss of 0.27 and a validation loss of 0.498; the cancer classification model yielded a training accuracy of 0.98 and a validation loss of 0.78. The system successfully recognized and classified lung cancer from chest CT scans. CONCLUSIONS: The system stores and updates patient data directly through the interface, making results readily available to healthcare providers. It will improve clinical communication and information exchange, and it can streamline care by generating correlated, coherent summaries of cancer diagnoses.
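To make the two deep-learning stages in the METHODS concrete, the following is a minimal Keras sketch of a U-Net segmenter and a DenseNet121 classifier. The layer widths, input shapes, and class count are illustrative assumptions, not the authors' published configuration.

```python
# Sketch of the two deep-learning stages: a U-Net for lung/tumor segmentation
# and a DenseNet backbone for cancer classification. All hyperparameters here
# are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_unet(input_shape=(256, 256, 1)):
    """Small encoder-decoder U-Net producing a binary segmentation mask."""
    inputs = layers.Input(input_shape)
    # Encoder
    c1 = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    p1 = layers.MaxPooling2D()(c1)
    c2 = layers.Conv2D(64, 3, activation="relu", padding="same")(p1)
    p2 = layers.MaxPooling2D()(c2)
    # Bottleneck
    b = layers.Conv2D(128, 3, activation="relu", padding="same")(p2)
    # Decoder with skip connections (the defining U-Net feature)
    u1 = layers.concatenate([layers.UpSampling2D()(b), c2])
    c3 = layers.Conv2D(64, 3, activation="relu", padding="same")(u1)
    u2 = layers.concatenate([layers.UpSampling2D()(c3), c1])
    c4 = layers.Conv2D(32, 3, activation="relu", padding="same")(u2)
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c4)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

def build_classifier(num_classes=2, input_shape=(224, 224, 3)):
    """DenseNet121 backbone with a small classification head."""
    base = tf.keras.applications.DenseNet121(
        include_top=False, weights="imagenet",
        input_shape=input_shape, pooling="avg")
    out = layers.Dense(num_classes, activation="softmax")(base.output)
    model = models.Model(base.input, out)
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

The reported losses and accuracy would correspond to the `loss` and `accuracy` values logged when fitting these two models on the training and validation splits.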
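The radiomic measurements named in the METHODS (diameter, surface area, volume) can be computed directly from a binary segmentation mask. Below is a sketch under the assumption of a 3-D boolean mask with known voxel spacing; the function name and spacing values are hypothetical.

```python
# Sketch of the geometric radiomic measurements: tumor volume, surface area,
# and maximum 3-D diameter from a binary segmentation mask.
import numpy as np
from scipy.spatial.distance import pdist
from skimage import measure

def tumour_metrics(mask, spacing=(1.0, 1.0, 1.0)):
    """mask: 3-D boolean array; spacing: voxel size in mm (z, y, x)."""
    spacing = np.asarray(spacing, dtype=float)
    # Volume: voxel count times the physical volume of one voxel (mm^3).
    volume = mask.sum() * spacing.prod()
    # Surface area: triangulate the mask boundary and sum triangle areas (mm^2).
    verts, faces, _, _ = measure.marching_cubes(
        mask.astype(np.uint8), level=0.5, spacing=tuple(spacing))
    surface_area = measure.mesh_surface_area(verts, faces)
    # Maximum 3-D diameter: largest pairwise distance between surface
    # vertices (mm). pdist is O(n^2); subsample vertices for large lesions.
    diameter = pdist(verts).max()
    return {"volume_mm3": volume,
            "surface_mm2": surface_area,
            "diameter_mm": diameter}
```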
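Finally, a sketch of the host-side glue between the RFID hardware and the Streamlit interface. It assumes the Arduino Nano + MFRC522 prints one tag UID per line over USB serial; the port name, baud rate, and patient-record lookup are assumptions, not the authors' implementation.

```python
# Sketch: read a tag UID sent by the Arduino over serial and display the
# matching patient record in the Streamlit web interface.
import serial            # pyserial
import streamlit as st

PORT, BAUD = "/dev/ttyUSB0", 9600   # adjust to the Arduino's serial port

@st.cache_resource
def open_reader():
    return serial.Serial(PORT, BAUD, timeout=2)

def read_tag_uid(ser):
    """Return the next UID line sent by the Arduino, or None on timeout."""
    line = ser.readline().decode("ascii", errors="ignore").strip()
    return line or None

st.title("Lung cancer diagnosis - patient lookup")
if st.button("Scan RFID tag"):
    uid = read_tag_uid(open_reader())
    if uid:
        st.success(f"Tag read: {uid}")
        # Hypothetical lookup: replace with the real patient database query.
        st.json({"uid": uid, "name": "<patient name>", "last_scan": "<date>"})
    else:
        st.error("No tag detected - try again.")
```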
Keywords: Lung cancer, deep learning-based diagnosis, radiomics, computer-aided diagnosis
DOI: 10.3233/XST-230255
Journal: Journal of X-Ray Science and Technology, vol. 32, no. 3, pp. 611-622, 2024