Article type: Research Article
Authors: Cen, Haifeng [a] | Xu, Yuan [a] | Sun, Kaiyuan [a] | Tian, Hao [b, c, d, *] | Chen, Kun [a] | Lin, Lin [a]
Affiliations: [a] Guangzhou Power Supply, GuangDong Power Grid Co., Ltd., Guangzhou, China | [b] State Key Lab of Control and Simulation of Power Systems and Generation Equipments, Department of Electrical Engineering, Tsinghua University, Haidian District, Beijing, China | [c] WuXi University, Xishan District, Wuxi, China | [d] La Consolacion University Philippines, Malolos, Philippines
Correspondence: [*] Corresponding author: Hao Tian, La Consolacion University Philippines, Malolos 3000, Philippines. E-mail: tsinghuath@163.com.
Abstract: This paper examines the use of an Uninterruptible Power Supply (UPS) to enhance the operational efficiency of data centers. It focuses on developing an optimal energy scheduling strategy for a data center equipped with UPS, using the Markov decision process (MDP) framework. The MDP framework models the decision-making process involved in minimizing energy costs. Each unit’s available power output in the data center is treated as a Markov state, taking into account the uncertainty associated with renewable distributed generation. This uncertainty drives the system to transition to other Markov states at subsequent decision times. A recursive optimization model is established for each Markov state at each decision time to guide state-based operations, which includes determining the unit output while considering both current and future costs. The challenge of high dimensionality, arising from the substantial number of states and actions in the model, is effectively addressed by adopting an approximate dynamic programming (ADP) method. This approach incorporates decision-state and forward dynamic algorithms to tackle the complexity of the MDP-based model. By employing ADP, the computational burden is reduced, enabling efficient and practical solutions to be obtained.
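The recursive, state-based structure the abstract describes can be illustrated with a toy backward dynamic program: renewable output is a small Markov chain, the UPS state of charge is discretized, and each (renewable state, charge level) pair carries a cost-to-go that blends the current stage cost with the expected future cost. This is only a minimal sketch under assumed parameters — the renewable levels, transition matrix, prices, load, and UPS capacity below are all hypothetical, and the paper's actual ADP uses decision-state and forward dynamic algorithms precisely because exact recursion like this does not scale to realistic state spaces.

```python
import itertools

# Hypothetical toy instance (illustrative numbers, not from the paper):
# renewable availability is a 3-state Markov chain; the UPS can charge,
# idle, or discharge in 10-kW steps over a 4-period horizon.
RENEW = [0.0, 10.0, 20.0]            # renewable output per Markov state (kW)
P = [[0.6, 0.3, 0.1],                # row-stochastic transition matrix
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]
LOAD = 30.0                          # constant IT load (kW)
PRICE = [0.5, 0.8, 0.5, 0.3]         # grid price at each decision time
CAP = 20.0                           # UPS energy capacity (kWh per step)
SOC_LEVELS = [0.0, 10.0, 20.0]       # discretized UPS state of charge
ACTIONS = [-10.0, 0.0, 10.0]         # discharge / idle / charge (kW)

def stage_cost(t, r, soc, a):
    """Cost of grid energy bought to serve the load plus any UPS charging."""
    if not (0.0 <= soc + a <= CAP):
        return None                  # action would violate UPS limits
    grid = max(LOAD + a - RENEW[r], 0.0)   # discharging (a < 0) offsets grid
    return PRICE[t] * grid

def solve(horizon=4):
    # Backward recursion: V[t][(r, soc)] is the minimal expected cost-to-go,
    # i.e. current stage cost plus the expectation of V[t+1] over the Markov
    # transition from renewable state r -- the recursion the abstract describes.
    V = {horizon: {(r, s): 0.0 for r in range(3) for s in SOC_LEVELS}}
    policy = {}
    for t in reversed(range(horizon)):
        V[t] = {}
        for r, soc in itertools.product(range(3), SOC_LEVELS):
            best, best_a = float("inf"), None
            for a in ACTIONS:
                c = stage_cost(t, r, soc, a)
                if c is None:
                    continue
                future = sum(P[r][r2] * V[t + 1][(r2, soc + a)]
                             for r2 in range(3))
                if c + future < best:
                    best, best_a = c + future, a
            V[t][(r, soc)] = best
            policy[(t, r, soc)] = best_a
    return V, policy

V, policy = solve()
# Query the policy, e.g. the recommended UPS action at time 0 when
# renewables are low and the UPS is full:
action = policy[(0, 0, 20.0)]
```

The exponential growth of this table in the number of units and charge levels is exactly the dimensionality problem the paper's ADP method is designed to sidestep, by approximating the cost-to-go instead of enumerating every state-action pair.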
Keywords: Data center, Markov decision process, reinforcement learning, UPS
DOI: 10.3233/JCM-247149
Journal: Journal of Computational Methods in Sciences and Engineering, vol. 24, no. 3, pp. 1317-1329, 2024
IOS Press, Inc.
6751 Tepper Drive
Clifton, VA 20124
USA
Tel: +1 703 830 6300
Fax: +1 703 830 2300
sales@iospress.com
For editorial issues, like the status of your submitted paper or proposals, write to editorial@iospress.nl
IOS Press
Nieuwe Hemweg 6B
1013 BG Amsterdam
The Netherlands
Tel: +31 20 688 3355
Fax: +31 20 687 0091
info@iospress.nl
For editorial issues, permissions, book requests, submissions and proceedings, contact the Amsterdam office info@iospress.nl
Inspirees International (China Office)
Ciyunsi Beili 207(CapitaLand), Bld 1, 7-901
100025, Beijing
China
Free service line: 400 661 8717
Fax: +86 10 8446 7947
china@iospress.cn