Show simple item record

dc.contributor.author: Ali, KH
dc.contributor.author: Sigalo, M
dc.contributor.author: Das, S
dc.contributor.author: Anderlini, E
dc.contributor.author: Tahir, AA
dc.contributor.author: Abusara, M
dc.date.accessioned: 2021-09-07T09:38:01Z
dc.date.issued: 2021-09-09
dc.description.abstract: Grid-connected microgrids consisting of renewable energy sources, battery storage, and load require an appropriate energy management system that controls the battery operation. Traditionally, battery operation is optimised offline using 24 hours of forecasted load demand and renewable energy source (RES) generation data, so that the battery actions (charge/discharge/idle) are determined before the start of the day. Reinforcement Learning (RL) has recently been suggested as an alternative to these traditional techniques because of its ability to learn the optimal policy online using real data. Two RL approaches have been suggested in the literature: offline and online. In offline RL, the agent learns the optimum policy using predicted generation and load data; once convergence is achieved, battery commands are dispatched in real time. This method is similar to traditional methods because it relies on forecasted data. In online RL, on the other hand, the agent learns the optimum policy by interacting with the system in real time using real data. This paper investigates the effectiveness of both approaches. To validate the method, white Gaussian noise with different standard deviations was added to real data to create synthetic predicted data (a minimal illustration of this step is sketched after the metadata record below). In the first approach, the predicted data were then used by an offline RL algorithm. In the second approach, the online RL algorithm interacted with streaming real data in real time and the agent was trained on real data. When the energy costs of the two approaches were compared, online RL was found to provide better results than the offline approach when the difference between real and predicted data is greater than 1.6%.
dc.description.sponsorship: Engineering and Physical Sciences Research Council (EPSRC)
dc.identifier.citation: Vol. 14 (18), article 5688
dc.identifier.doi: 10.3390/en14185688
dc.identifier.grantnumber: EP/T025875/1
dc.identifier.uri: http://hdl.handle.net/10871/126995
dc.language.iso: en
dc.publisher: MDPI
dc.rights: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
dc.subject: Reinforcement learning (RL)
dc.subject: microgrid
dc.subject: battery management
dc.subject: offline and online RL
dc.subject: optimisation
dc.title: Reinforcement Learning for Energy Storage Systems in Grid-Connected Microgrids: An Investigation of Online versus Offline Implementation
dc.type: Article
dc.date.available: 2021-09-07T09:38:01Z
dc.identifier.issn: 1996-1073
dc.description: This is the final version. Available on open access from MDPI via the DOI in this record
dc.identifier.journal: Energies
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dcterms.dateAccepted: 2021-09-07
exeter.funder: Engineering and Physical Sciences Research Council (EPSRC)
rioxxterms.version: VoR
rioxxterms.licenseref.startdate: 2021-09-07
rioxxterms.type: Journal Article/Review
refterms.dateFCD: 2021-09-07T08:13:15Z
refterms.versionFCD: AM
refterms.dateFOA: 2021-09-17T15:19:24Z
refterms.panel: B
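
As described in the abstract, the study creates synthetic "predicted" data by adding white Gaussian noise with different standard deviations to real load and generation data, and then compares how far the predicted data deviate from the real data. The Python sketch below illustrates that noise-injection step only; the function names, the 5% noise level, and the error metric (mean absolute difference relative to the mean of the real profile) are illustrative assumptions rather than details taken from the paper.

import numpy as np

def make_synthetic_forecast(real_profile, noise_std_fraction, rng):
    # Add white Gaussian noise whose standard deviation is a fraction of the
    # real profile's mean (illustrative choice, not the paper's exact setup).
    noise = rng.normal(0.0, noise_std_fraction * np.mean(real_profile),
                       size=real_profile.shape)
    # Demand and generation cannot be negative, so clip at zero.
    return np.clip(real_profile + noise, 0.0, None)

def mean_percent_difference(real_profile, predicted_profile):
    # Mean absolute deviation, expressed as a percentage of the real profile's
    # mean (assumed metric for "difference between real and predicted data").
    return 100.0 * np.mean(np.abs(predicted_profile - real_profile)) / np.mean(real_profile)

rng = np.random.default_rng(seed=0)
hours = np.arange(24)
# One day of hourly load data in kW; purely illustrative numbers.
real_load = 3.0 + 2.0 * np.sin(2.0 * np.pi * hours / 24.0)
predicted_load = make_synthetic_forecast(real_load, noise_std_fraction=0.05, rng=rng)
print(f"mean real/predicted difference: {mean_percent_difference(real_load, predicted_load):.2f}%")

With a 5% noise level, the expected mean deviation is roughly 5% × √(2/π) ≈ 4% of the mean load, i.e. above the 1.6% threshold reported in the abstract, which is the regime in which the online RL approach was found to give the lower energy cost.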

