Online Microgrid Energy Management Based on Safe Deep Reinforcement Learning
Document Type
Conference Proceeding
Date of Original Version
1-1-2021
Abstract
Microgrids provide power systems with an effective means to integrate distributed energy resources, increase power supply reliability, and reduce operational cost. However, intermittent renewable energy sources (RESs) make it challenging to operate a microgrid safely and economically based on forecasting alone. To overcome this issue, we develop an online energy management approach for efficient microgrid operation using safe deep reinforcement learning (SDRL). Considering uncertainties and AC power flow constraints, the proposed method formulates online microgrid energy management as a constrained Markov decision process (CMDP). The objective is to find a safety-guaranteed scheduling policy that minimizes the total operational cost. To achieve this, we use an SDRL method to learn a neural network-based policy via constrained policy optimization (CPO). Unlike traditional DRL methods, which allow an agent to explore any behavior freely during training, the proposed method limits exploration to safe policies that satisfy AC power flow constraints throughout training. The proposed method is model-free and requires neither predictive information nor an explicit model of the microgrid. It is trained and tested on a medium-voltage distribution network with real-world power grid data from the California Independent System Operator (CAISO). Simulation results verify the effectiveness and superiority of the proposed method over traditional DRL approaches.
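The core idea behind CPO-style safe policy updates can be illustrated with a toy sketch: take a gradient step on the reward objective, but if the linearized cost constraint would be violated, project the step back onto the constraint boundary. This is a minimal illustration only, not the paper's implementation; the quadratic objective, cost function, and limit `d` below are hypothetical stand-ins for the microgrid's operational cost and AC power flow constraints.

```python
import numpy as np

# Toy CPO-style constrained update (illustrative sketch, not the paper's code).
# Maximize reward J(theta) = -||theta - r_opt||^2
# subject to  cost C(theta) = ||theta - c_ctr||^2 <= d.
r_opt = np.array([2.0, 0.0])   # reward-optimal parameters (hypothetical)
c_ctr = np.array([0.0, 0.0])   # cost-minimizing parameters (hypothetical)
d = 1.0                        # cost limit (hypothetical)

def grad_J(th):
    """Gradient of the reward objective."""
    return -2.0 * (th - r_opt)

def cost(th):
    """Constraint cost at parameters th."""
    return float(np.dot(th - c_ctr, th - c_ctr))

def grad_C(th):
    """Gradient of the constraint cost."""
    return 2.0 * (th - c_ctr)

def cpo_step(th, lr=0.05):
    g, b = grad_J(th), grad_C(th)
    step = lr * g
    # Linearized safety check: would this step push the cost past the limit?
    if cost(th) + np.dot(b, step) > d:
        # Project the step so the linearized cost lands exactly on the limit.
        step = step - (np.dot(b, step) - (d - cost(th))) / np.dot(b, b) * b
    return th + step

theta = np.array([0.0, 0.0])
for _ in range(200):
    theta = cpo_step(theta)
# theta approaches the boundary point closest to r_opt, so the final
# cost sits near the limit d while the reward keeps improving.
```

The projection branch is the essential difference from an unconstrained policy gradient: instead of learning a safe policy only after many unsafe rollouts, every update is restricted (to first order) to the feasible set, mirroring how the paper's agent explores only policies satisfying the AC power flow constraints.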
Publication Title, e.g., Journal
2021 IEEE Symposium Series on Computational Intelligence (SSCI) Proceedings
Citation/Publisher Attribution
Li, Hepeng, Zhenhua Wang, Lusi Li, and Haibo He. "Online Microgrid Energy Management Based on Safe Deep Reinforcement Learning." 2021 IEEE Symposium Series on Computational Intelligence (SSCI) Proceedings (2021). doi: 10.1109/SSCI50451.2021.9659545.