Authors: J. Cao, D. Harrold, Z. Fan, T. Morstyn, D. Healey, and K. Li
Published in: IEEE Transactions on Smart Grid (Volume: 11, Issue: 5, Sept. 2020) https://doi.org/10.1109/TSG.2020.2986333
Date Published: 8th April 2020
Abstract:
Accurate estimation of battery degradation cost is one of the main barriers to batteries participating in the energy arbitrage market. This paper addresses this problem by using a model-free deep reinforcement learning (DRL) method to optimize battery energy arbitrage while accounting for an accurate battery degradation model. First, the control problem is formulated as a Markov Decision Process (MDP). A noisy network based deep reinforcement learning approach is then proposed to learn an optimized control policy for the storage charging/discharging strategy. To address the uncertainty of electricity prices, a hybrid Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) model is adopted to predict prices for the next day. Finally, the proposed approach is tested on historical U.K. wholesale electricity market prices. Comparison with model-based Mixed Integer Linear Programming (MILP) demonstrates the effectiveness and performance of the proposed framework.
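For illustration only, the sketch below shows the kind of hybrid CNN + LSTM price forecaster described in the abstract, written in PyTorch. The layer sizes, lookback window, and 48-step half-hourly forecast horizon are assumptions made for this example, not the configuration reported in the paper.

```python
# Hypothetical sketch of a hybrid CNN + LSTM day-ahead price forecaster.
# All sizes and the choice of PyTorch are illustrative assumptions.
import torch
import torch.nn as nn

class CNNLSTMPriceForecaster(nn.Module):
    def __init__(self, n_features=1, horizon=48, hidden=64):
        super().__init__()
        # 1-D convolution extracts local patterns from the recent price window
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # LSTM captures longer-range temporal dependencies in the conv features
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden, batch_first=True)
        # Linear head maps the last hidden state to the next-day price profile
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):                       # x: (batch, window, n_features)
        z = self.conv(x.transpose(1, 2))        # (batch, 32, window)
        out, _ = self.lstm(z.transpose(1, 2))   # (batch, window, hidden)
        return self.head(out[:, -1])            # (batch, horizon)

# Example: forecast 48 half-hourly prices from the previous week of prices
model = CNNLSTMPriceForecaster()
past_week = torch.randn(8, 336, 1)              # 8 windows of 336 half-hours each
next_day = model(past_week)                     # shape (8, 48)
```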
Keywords: Energy storage; energy arbitrage; battery degradation; deep reinforcement learning; noisy networks
Insights for EnergyREV:
This paper proposes a novel AI-based storage management strategy for SLES. Specifically, we propose a DRL-based charging/discharging strategy for energy storage participating in energy arbitrage; the approach is model-free and can therefore handle complex system dynamics without an explicit system model. Within the DRL framework, a hybrid CNN and LSTM network is proposed to predict electricity prices. A noisy network based double deep Q-network (NNDDQN) is then implemented to learn the optimal battery control policy while accounting for price uncertainty and battery degradation.
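To make the "noisy network" idea concrete, below is a minimal sketch of a noisy linear layer of the kind used in NoisyNet-style Q-networks, where exploration comes from learnable parameter noise rather than epsilon-greedy action selection. The factorised-Gaussian parameterisation follows the standard NoisyNet formulation (Fortunato et al., 2018); the names, sizes, and initialisation constants here are illustrative assumptions, not the paper's implementation.

```python
# Illustrative NoisyNet linear layer with factorised Gaussian noise.
# Assumed names/constants for demonstration; not the authors' code.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyLinear(nn.Module):
    def __init__(self, in_features, out_features, sigma0=0.5):
        super().__init__()
        self.in_features, self.out_features = in_features, out_features
        # Learnable means and noise scales for weights and biases
        self.weight_mu = nn.Parameter(torch.empty(out_features, in_features))
        self.weight_sigma = nn.Parameter(torch.empty(out_features, in_features))
        self.bias_mu = nn.Parameter(torch.empty(out_features))
        self.bias_sigma = nn.Parameter(torch.empty(out_features))
        # Noise buffers, resampled during training (not learned)
        self.register_buffer("weight_eps", torch.zeros(out_features, in_features))
        self.register_buffer("bias_eps", torch.zeros(out_features))
        bound = 1.0 / math.sqrt(in_features)
        self.weight_mu.data.uniform_(-bound, bound)
        self.bias_mu.data.uniform_(-bound, bound)
        self.weight_sigma.data.fill_(sigma0 / math.sqrt(in_features))
        self.bias_sigma.data.fill_(sigma0 / math.sqrt(in_features))
        self.reset_noise()

    @staticmethod
    def _f(x):
        # Signed square-root transform used by factorised noise
        return x.sign() * x.abs().sqrt()

    def reset_noise(self):
        eps_in = self._f(torch.randn(self.in_features, device=self.weight_mu.device))
        eps_out = self._f(torch.randn(self.out_features, device=self.weight_mu.device))
        self.weight_eps.copy_(torch.outer(eps_out, eps_in))
        self.bias_eps.copy_(eps_out)

    def forward(self, x):
        if self.training:
            weight = self.weight_mu + self.weight_sigma * self.weight_eps
            bias = self.bias_mu + self.bias_sigma * self.bias_eps
        else:
            # Deterministic behaviour at evaluation time
            weight, bias = self.weight_mu, self.bias_mu
        return F.linear(x, weight, bias)

# Example: a Q-network output head over a few hypothetical charge/discharge levels
q_head = NoisyLinear(128, 5)
q_head.reset_noise()                 # typically resampled once per training step
q_values = q_head(torch.randn(32, 128))
```

In a DQN-style agent, replacing the final linear layers with such noisy layers lets the exploration level be learned per parameter, which is the role the noisy network plays in the control policy described above.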