Article type: Research Article
Authors: Zhang, Zhongpeng [a, b, *] | Wang, Guibao [b]
Affiliations: [a] Trine Engineering Institute, Shaanxi University of Technology, Hanzhong, Shaanxi, China | [b] School of Physics and Telecommunication Engineering, Shaanxi University of Technology, Hanzhong, Shaanxi, China
Correspondence: [*] Corresponding author: Zhongpeng Zhang, Trine Engineering Institute, Shaanxi University of Technology, Hanzhong, Shaanxi 723001, China. E-mail: anthony@snut.edu.com.
Abstract: This work aims to advance the security management of complex networks to better align with evolving societal needs. It employs the Ant Colony Optimization algorithm in conjunction with Long Short-Term Memory (LSTM) neural networks to reconstruct and optimize task networks derived from time series data. Additionally, a trend-based noise smoothing scheme is introduced to mitigate data noise effectively. The approach begins with a thorough analysis of historical data, followed by trend-based noise smoothing, rendering the processed data more robust. Subsequently, the network reconstruction problem for time series data originating from one-dimensional dynamic equations is addressed with an algorithm based on Stochastic Gradient Descent (SGD). This algorithm decomposes the time series into smaller samples and, combined with an adaptive learning rate, yields optimal learning outcomes. Experimental results confirm the high fidelity of the weight matrix reconstructed by this algorithm to the true weight matrix. Moreover, the algorithm converges efficiently as the data volume increases, requiring less time per iteration while still attaining optimal solutions. When the sample size remains constant, the algorithm's execution time is proportional to the square of the number of nodes; as the sample size grows, the SGD algorithm exploits the additional information, improving learning outcomes. Notably, when the noise standard deviation is 0.01, models based on SGD and the Least-Squares Method (LSM) exhibit smaller errors than when the noise standard deviation is 0.1, highlighting the sensitivity of LSM to noise. The proposed methodology offers valuable insights for advancing research in complex network studies.
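The reconstruction step summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes simple linear node dynamics x(t+1) = W x(t) + noise as a stand-in for the paper's one-dimensional dynamic equations, and all variable names, the batch size, and the decaying (adaptive) learning rate schedule are illustrative choices. It shows the core idea of splitting a time series into small samples and recovering the weight matrix by mini-batch SGD on a squared-error objective.

```python
# Hypothetical sketch: weight-matrix reconstruction from time series via
# mini-batch SGD with a decaying learning rate. Assumed dynamics:
# x(t+1) = W x(t) + process noise.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_steps = 5, 2000

# Ground-truth weight matrix, rescaled so the dynamics stay stable.
W_true = rng.normal(scale=0.3, size=(n_nodes, n_nodes))
W_true /= max(1.0, 1.2 * np.max(np.abs(np.linalg.eigvals(W_true))))

# Simulate the time series.
X = np.zeros((n_steps, n_nodes))
X[0] = rng.normal(size=n_nodes)
for t in range(n_steps - 1):
    X[t + 1] = W_true @ X[t] + rng.normal(scale=0.5, size=n_nodes)

# Decompose the (x_t, x_{t+1}) pairs into small mini-batch samples.
W_est = np.zeros_like(W_true)
batch, lr0 = 32, 0.1
pairs = np.arange(n_steps - 1)
for epoch in range(30):
    rng.shuffle(pairs)
    lr = lr0 / (1 + epoch)  # simple decaying (adaptive) learning rate
    for start in range(0, len(pairs), batch):
        idx = pairs[start:start + batch]
        xb, yb = X[idx], X[idx + 1]
        # Gradient of mean squared prediction error w.r.t. W_est.
        grad = (W_est @ xb.T - yb.T) @ xb / len(idx)
        W_est -= lr * grad

print("max reconstruction error:", np.max(np.abs(W_est - W_true)))
```

With more time steps the mini-batches carry more information and the reconstructed matrix tracks the true one more closely, consistent with the scaling behavior the abstract reports for SGD as the sample size grows.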
Keywords: Heuristic algorithm, long short-term memory neural network, optimal task network security, topology time series, stochastic gradient descent
DOI: 10.3233/JCM-237124
Journal: Journal of Computational Methods in Sciences and Engineering, vol. 24, no. 2, pp. 697-714, 2024
Publisher: IOS Press, Inc.