TY - GEN
T1 - Reinforcement learning-based control of residential energy storage systems for electric bill minimization
AU - Guan, Chenxiao
AU - Wang, Yanzhi
AU - Lin, Xue
AU - Nazarian, Shahin
AU - Pedram, Massoud
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/7/14
Y1 - 2015/7/14
N2 - Incorporating residential-level photovoltaic (PV) energy generation and energy storage systems has proved useful in utilizing renewable power and reducing electric bills for the residential energy consumer. This is particularly true under dynamic energy prices, where consumers can use PV-based generation and controllable storage modules for peak shaving of their power demand profile from the grid. In most previous works, the storage control algorithm requires accurate predictions of PV power generation and load power consumption as well as accurate system modeling. In this work, the reinforcement learning technique is adopted to derive the optimal control policy for the residential energy storage module; it does not depend on accurate predictions of future PV power generation and/or load power consumption and requires only partial knowledge of the system model. In order to achieve a higher convergence rate and higher performance in a non-Markovian environment, we employ the TD(λ)-learning algorithm to derive the optimal energy storage system control policy, and we carefully define the state space, action space, and reward function in the TD(λ)-learning algorithm such that the objective of the reinforcement learning algorithm coincides with our goal of electric bill minimization for the residential consumer. Simulation results over real-world PV power generation and load power consumption profiles demonstrate that the proposed reinforcement learning-based storage control algorithm can achieve up to 59.8% improvement in energy cost reduction.
AB - Incorporating residential-level photovoltaic (PV) energy generation and energy storage systems has proved useful in utilizing renewable power and reducing electric bills for the residential energy consumer. This is particularly true under dynamic energy prices, where consumers can use PV-based generation and controllable storage modules for peak shaving of their power demand profile from the grid. In most previous works, the storage control algorithm requires accurate predictions of PV power generation and load power consumption as well as accurate system modeling. In this work, the reinforcement learning technique is adopted to derive the optimal control policy for the residential energy storage module; it does not depend on accurate predictions of future PV power generation and/or load power consumption and requires only partial knowledge of the system model. In order to achieve a higher convergence rate and higher performance in a non-Markovian environment, we employ the TD(λ)-learning algorithm to derive the optimal energy storage system control policy, and we carefully define the state space, action space, and reward function in the TD(λ)-learning algorithm such that the objective of the reinforcement learning algorithm coincides with our goal of electric bill minimization for the residential consumer. Simulation results over real-world PV power generation and load power consumption profiles demonstrate that the proposed reinforcement learning-based storage control algorithm can achieve up to 59.8% improvement in energy cost reduction.
UR - http://www.scopus.com/inward/record.url?scp=84943179777&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84943179777&partnerID=8YFLogxK
U2 - 10.1109/CCNC.2015.7158054
DO - 10.1109/CCNC.2015.7158054
M3 - Conference contribution
AN - SCOPUS:84943179777
T3 - 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015
SP - 637
EP - 642
BT - 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015
Y2 - 9 January 2015 through 12 January 2015
ER -