- Title
Online EVs Vehicle-to-Grid Scheduling Coordinated with Multi-Energy Microgrids: A Deep Reinforcement Learning-Based Approach.
- Authors
Pan, Weiqi; Yu, Xiaorong; Guo, Zishan; Qian, Tao; Li, Yang
- Abstract
The integration of electric vehicles (EVs) into vehicle-to-grid (V2G) scheduling offers a promising opportunity to enhance the profitability of multi-energy microgrid operators (MMOs). MMOs aim to maximize their total profits by coordinating V2G scheduling and the multi-energy flexible loads of end-users while adhering to operational constraints. However, scheduling V2G strategies online is challenging due to uncertainties such as electricity prices and EV arrival/departure patterns. To address this, we propose an online V2G scheduling framework based on deep reinforcement learning (DRL) to optimize EV battery utilization in microgrids with different energy sources. First, we formulate an online scheduling model that integrates the management of V2G and multi-energy flexible demands, modeled as a Markov Decision Process (MDP) with unknown transition probabilities. Second, a DRL-based Soft Actor-Critic (SAC) algorithm is used to efficiently train neural networks and dynamically schedule EV charging and discharging activities in response to real-time grid conditions and energy demand patterns. Extensive simulations are conducted in case studies to verify the effectiveness of the proposed approach. The overall results validate the efficacy of the DRL-based online V2G scheduling framework, highlighting its potential to drive profitability and sustainability in multi-energy microgrid operations.
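To make the MDP formulation described in the abstract concrete, the sketch below shows one possible transition-and-reward step for a single EV battery. This is an illustrative reconstruction, not the paper's actual model: the state variables, efficiency parameter, and reward (discharge revenue minus charging cost) are assumptions chosen for clarity.

```python
def v2g_step(soc, price, action, capacity=60.0, max_power=10.0, dt=1.0, eta=0.95):
    """One hypothetical MDP transition for a single EV battery.

    soc      -- current stored energy (kWh)
    price    -- electricity price ($/kWh), revealed online (the uncertainty)
    action   -- agent output in [-1, 1]; scaled to power, + = charge, - = discharge
    Returns (next_soc, reward); reward is V2G revenue minus charging cost.
    """
    power = max(-max_power, min(max_power, action * max_power))
    if power >= 0:
        # Charging: buy energy from the grid; battery gains eta * energy_in.
        energy_in = min(power * dt, (capacity - soc) / eta)
        next_soc = soc + eta * energy_in
        reward = -price * energy_in
    else:
        # Discharging (V2G): drain the battery; grid receives eta * energy_out.
        energy_out = min(-power * dt, soc)
        next_soc = soc - energy_out
        reward = price * eta * energy_out
    return next_soc, reward
```

In an SAC training loop, the continuous `action` would come from the stochastic policy network, and the tuple `(soc, price)` would be part of the observed state; the microgrid's multi-energy flexible loads would add further state and reward terms not modeled here.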
- Subjects
DEEP reinforcement learning; REINFORCEMENT (Psychology); MICROGRIDS; ELECTRIC vehicle batteries; ELECTRIC vehicle charging stations; ENERGY consumption
- Publication
Energies, 2024, Vol. 17, Issue 11, Article 2491
- ISSN
1996-1073
- Publication type
Article
- DOI
10.3390/en17112491