CIRLEM: a synergic integration of Collective Intelligence and Reinforcement learning in Energy Management for enhanced climate resilience and lightweight computation
Peer reviewed, Journal article
Published version
Date
2023
Original version
10.1016/j.apenergy.2023.121785

Abstract
A novel energy management (EM) approach, termed CIRLEM, is introduced, integrating core elements of collective intelligence (CI) and reinforcement learning (RL). It operates by distributing a flexibility signal from the energy supplier to agents within the grid, prompting their responsive actions. The flexibility signal reflects the collective behaviour of the agents in the grid, and each agent learns and decides using a value-based, model-free RL engine. Two modes of running CIRLEM are defined: making all decisions at the edge node alone (Edge Node Control, ENC) or jointly with the cluster (Edge Node and Cluster Control, ECC). CIRLEM's performance is thoroughly investigated in an elderly care building situated in Ålesund, Norway, during extreme warm and cold seasons under future climate conditions. The building is divided into 20 thermal zones, each acting as an agent with three control strategies. CIRLEM undergoes comprehensive testing, evaluating policies with 24 and 48 sets of actions (referred to as L24 and L48) and six different randomness levels. The results demonstrate that CIRLEM converges swiftly to an optimal solution (the optimal set of policies), offering both enhanced indoor comfort and significant energy savings. Among the CIRLEM variants, ENC-L24, the fastest and simplest one, showed outstanding performance. Overall, CIRLEM offers a remarkable improvement in energy flexibility and climate resilience for a group of grid-connected agents, ensuring energy savings without compromising indoor comfort.
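To make the mechanism described above concrete, the following is a minimal sketch, not the authors' implementation, of a value-based, model-free agent of the kind the abstract outlines: a tabular Q-learning controller for one thermal zone that observes a discretized flexibility signal from the supplier, chooses among three hypothetical control strategies, and updates its value estimates from the reward it receives. All names, signal levels, reward terms, and parameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a value-based, model-free zone agent reacting to a
# flexibility signal. All names, signal levels, rewards, and parameters
# below are illustrative assumptions, not the paper's implementation.
import random
from collections import defaultdict

ACTIONS = ["comfort", "balanced", "saving"]   # three hypothetical control strategies
SIGNAL_LEVELS = 3                             # discretized flexibility-signal states


class ZoneAgent:
    """Tabular Q-learning controller for one thermal zone (edge node)."""

    def __init__(self, epsilon=0.1, alpha=0.3, gamma=0.9):
        self.q = defaultdict(float)           # Q[(signal_state, action)]
        self.epsilon = epsilon                # randomness level (exploration rate)
        self.alpha = alpha                    # learning rate
        self.gamma = gamma                    # discount factor

    def act(self, signal_state):
        """Epsilon-greedy choice among the control strategies."""
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[(signal_state, a)])

    def learn(self, s, a, reward, s_next):
        """Standard Q-learning update from the observed reward."""
        best_next = max(self.q[(s_next, a2)] for a2 in ACTIONS)
        td_target = reward + self.gamma * best_next
        self.q[(s, a)] += self.alpha * (td_target - self.q[(s, a)])


def toy_reward(signal_state, action):
    """Hypothetical reward: save energy when the signal requests flexibility,
    favour comfort otherwise."""
    if signal_state == 2:                     # high flexibility request
        return {"saving": 1.0, "balanced": 0.3, "comfort": -0.5}[action]
    return {"comfort": 1.0, "balanced": 0.5, "saving": -0.2}[action]


if __name__ == "__main__":
    agent = ZoneAgent(epsilon=0.1)
    state = random.randrange(SIGNAL_LEVELS)
    for _ in range(5000):
        action = agent.act(state)
        next_state = random.randrange(SIGNAL_LEVELS)   # stand-in for the supplier's signal
        agent.learn(state, action, toy_reward(state, action), next_state)
        state = next_state
    # After training, the agent should prefer "saving" under a high signal
    # and "comfort" under a low one.
    for s in range(SIGNAL_LEVELS):
        print(s, max(ACTIONS, key=lambda a: agent.q[(s, a)]))
```

In this reading, the epsilon parameter plays the role of the abstract's "randomness level", and the ENC mode would correspond to each zone agent running such a learner locally, while the ECC mode would additionally coordinate decisions at the cluster level.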