Deep Reinforcement Learning for Economic Energy Scheduling in Data Center Microgrids

Document Type

Conference Proceeding

Date of Original Version

8-1-2019

Abstract

This paper investigates the economic energy scheduling problem for data center microgrids with renewable energy integration via deep reinforcement learning. A multi-factor economic energy scheduling problem is first formulated to enable consumption activities that are aware of both renewable generation and real-time prices. The formulated problem is then transformed into a Markov decision process. On this basis, online optimization is performed to cope with system uncertainties by integrating reinforcement learning with neural networks. Unlike traditional methods that rely on a priori knowledge or require accurate modeling of system dynamics, the employed deep policy gradient-based learning algorithm can learn optimal actions directly from the available data. Simulations with realistic traces verify the effectiveness of the proposed algorithm.
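The abstract's pipeline (formulate scheduling as a Markov decision process, then learn a policy with a policy gradient method) can be illustrated with a minimal sketch. The environment below is entirely a toy assumption, not the paper's model: states are discretized electricity-price levels, the action is to consume a flexible load now or defer it, and the learner is tabular-softmax REINFORCE with a per-episode baseline.

```python
import numpy as np

# Toy MDP: all prices, penalties, and dynamics here are illustrative
# assumptions, not the paper's actual microgrid model.
rng = np.random.default_rng(0)

N_STATES = 3                         # discretized price levels: low / mid / high
N_ACTIONS = 2                        # 0 = defer flexible load, 1 = consume now
PRICES = np.array([0.2, 0.5, 0.9])   # assumed cost per unit at each price level
DEFER_PENALTY = 0.4                  # assumed cost of postponing a job

theta = np.zeros((N_STATES, N_ACTIONS))  # softmax policy parameters

def policy(s):
    """Action probabilities under the softmax policy in state s."""
    z = theta[s] - theta[s].max()
    p = np.exp(z)
    return p / p.sum()

def step(s, a):
    """Pay the current price if consuming, a fixed penalty if deferring."""
    r = -PRICES[s] if a == 1 else -DEFER_PENALTY
    s_next = rng.integers(N_STATES)  # prices assumed i.i.d. for simplicity
    return s_next, r

def run_episode(T=20):
    s = rng.integers(N_STATES)
    traj = []
    for _ in range(T):
        a = rng.choice(N_ACTIONS, p=policy(s))
        s_next, r = step(s, a)
        traj.append((s, a, r))
        s = s_next
    return traj

alpha, gamma = 0.1, 0.9
for _ in range(2000):
    traj = run_episode()
    # Discounted returns-to-go for each step.
    returns, G = [], 0.0
    for _, _, r in reversed(traj):
        G = r + gamma * G
        returns.append(G)
    returns = np.array(returns[::-1])
    # Standardize returns as a simple baseline to reduce gradient variance.
    adv = (returns - returns.mean()) / (returns.std() + 1e-8)
    for (s, a, _), A in zip(traj, adv):
        p = policy(s)
        grad = -p                  # d log pi(a|s) / d theta[s]
        grad[a] += 1.0
        theta[s] += alpha * A * grad  # REINFORCE update

# Expected behavior: consume when the price is low, defer when it is high.
print("P(consume | low price) =", policy(0)[1])
print("P(consume | high price) =", policy(2)[1])
```

The paper's actual algorithm uses neural networks rather than a tabular policy and a far richer state (renewable output, workload, prices), but the gradient update and the price-aware consume/defer structure follow the same pattern.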

Publication Title

IEEE Power and Energy Society General Meeting

Volume

2019-August
