Reinforcement learning-based control of residential energy storage systems for electric bill minimization

Chenxiao Guan, Yanzhi Wang, Xue Lin, Shahin Nazarian, Massoud Pedram

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

16 Scopus citations

Abstract

Incorporating residential-level photovoltaic (PV) energy generation and energy storage systems has proved useful for utilizing renewable power and reducing the electric bills of residential energy consumers. This is particularly true under dynamic energy prices, where consumers can use PV-based generation and controllable storage modules for peak shaving of their power demand profile from the grid. Most previous works require accurate predictions of PV power generation and load power consumption, as well as accurate system modeling, for the storage control algorithm. In this work, the reinforcement learning technique is adopted to derive the optimal control policy for the residential energy storage module; it does not depend on accurate predictions of future PV power generation and/or load power consumption and requires only partial knowledge of the system model. To achieve a higher convergence rate and better performance in non-Markovian environments, we employ the TD(λ)-learning algorithm to derive the optimal energy storage system control policy, and we carefully define the state space, action space, and reward function in the TD(λ)-learning algorithm so that the objective of the reinforcement learning algorithm coincides with our goal of electric bill minimization for the residential consumer. Simulation results over real-world PV power generation and load power consumption profiles demonstrate that the proposed reinforcement learning-based storage control algorithm achieves up to a 59.8% improvement in energy cost reduction.
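To illustrate the kind of formulation the abstract describes, the sketch below implements SARSA(λ), a standard on-policy TD(λ) control method with eligibility traces, on a hypothetical toy battery-scheduling problem. The state is (hour, discretized state of charge), the actions are charge/idle/discharge, and the reward is the negative hourly electric bill. The price profile, battery sizes, and learning parameters are all illustrative assumptions, not values from the paper.

```python
import random

# Hypothetical toy setting: 24 hourly price periods, a battery with 5
# discretized state-of-charge (SoC) levels, and a fixed household load
# of 1 kWh per hour. All numbers are illustrative.
PRICES = [0.10] * 7 + [0.30] * 5 + [0.20] * 5 + [0.40] * 4 + [0.10] * 3  # $/kWh
N_SOC = 5                 # SoC levels 0..4 (each level = 1 kWh)
ACTIONS = [-1, 0, 1]      # discharge 1 kWh, idle, charge 1 kWh
ALPHA, GAMMA, LAMBDA, EPS = 0.1, 0.95, 0.8, 0.1

def step(hour, soc, a):
    """Apply an action; return next state and reward (negative hourly bill)."""
    a = max(-soc, min(N_SOC - 1 - soc, a))   # clip action to feasible SoC range
    grid = 1 + a                             # grid draw = load plus battery flow
    reward = -grid * PRICES[hour]            # pay for energy drawn from the grid
    return (hour + 1) % 24, soc + a, reward

Q = {}  # tabular action values keyed by (hour, soc, action)
def q(h, s, a):
    return Q.get((h, s, a), 0.0)

def policy(h, s):
    """Epsilon-greedy action selection over the current Q estimates."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q(h, s, a))

random.seed(0)
for episode in range(500):                   # each episode = one simulated day
    E = {}                                   # eligibility traces
    h, s = 0, 2
    a = policy(h, s)
    for _ in range(24):
        h2, s2, r = step(h, s, a)
        a2 = policy(h2, s2)
        delta = r + GAMMA * q(h2, s2, a2) - q(h, s, a)   # TD error
        E[(h, s, a)] = E.get((h, s, a), 0.0) + 1.0       # accumulating trace
        for key, e in E.items():
            Q[key] = Q.get(key, 0.0) + ALPHA * delta * e
            E[key] = GAMMA * LAMBDA * e                  # decay every trace
        h, s, a = h2, s2, a2
```

With this reward definition, the learning objective is exactly daily bill minimization, which mirrors the paper's stated design goal of aligning the RL reward with the electric bill; the eligibility traces propagate credit for an evening peak-hour saving back to the cheap morning hours when the battery was charged.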

Original language: English (US)
Title of host publication: 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 637-642
Number of pages: 6
ISBN (Electronic): 9781479963904
DOIs: https://doi.org/10.1109/CCNC.2015.7158054
State: Published - Jul 14 2015
Event: 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015 - Las Vegas, United States
Duration: Jan 9 2015 - Jan 12 2015

Publication series

Name: 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015

Other

Other: 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015
Country: United States
City: Las Vegas
Period: 1/9/15 - 1/12/15

ASJC Scopus subject areas

  • Computer Networks and Communications


Cite this

Guan, C., Wang, Y., Lin, X., Nazarian, S., & Pedram, M. (2015). Reinforcement learning-based control of residential energy storage systems for electric bill minimization. In 2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015 (pp. 637-642). [7158054] (2015 12th Annual IEEE Consumer Communications and Networking Conference, CCNC 2015). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/CCNC.2015.7158054