Energies 16 00146
Review
Application of Artificial Intelligence for EV Charging and
Discharging Scheduling and Dynamic Pricing: A Review
Qin Chen and Komla Agbenyo Folly *
Department of Electrical Engineering, University of Cape Town, Cape Town 7701, South Africa;
chnqin002@myuct.ac.za
* Correspondence: komla.folly@uct.ac.za
Abstract: The high penetration of electric vehicles (EVs) will burden the existing power delivery
infrastructure if their charging and discharging are not adequately coordinated. Dynamic pricing is a
special form of demand response that can encourage EV owners to participate in scheduling programs.
Therefore, EV charging and discharging scheduling and its dynamic pricing model are important
fields of study. Many researchers have focused on artificial intelligence-based EV charging demand
forecasting and scheduling models and suggested that artificial intelligence techniques perform better
than conventional optimization methods such as linear, exponential, and multinomial logit models.
However, only a few research studies focused on EV discharging scheduling (i.e., vehicle-to-grid,
V2G) because the concept of EV discharging electricity back to the power grid is relatively new and
evolving. Therefore, a review of existing EV charging and discharging-related studies is needed
to understand the research gaps and to make some improvements in future studies. This paper
reviews EV charging and discharging-related studies and classifies them into forecasting, scheduling,
and pricing mechanisms. The paper determines the linkage between forecasting, scheduling, and
pricing mechanisms and identifies the research gaps in EV discharging scheduling and dynamic
pricing models.
Keywords: dynamic pricing; electric vehicles; neural networks; reinforcement learning; vehicle-to-grid
However, the penetration of EVs is currently still low, and the concept of V2G is
relatively new and evolving. Many V2G projects remain in the pilot project stages of
development [13,14]. Some of the representative V2G pilot projects with different goals
and services are summarized in Table 1. A complete list of V2G pilot projects around the
world can be found in [14]. As shown in Table 1, most of the V2G pilot projects were only
initiated in recent years, and the scale of these pilot projects is small. Moreover, most of the
existing literature only focuses on the EV charging aspect. For example, the classification
of EVs’ charging techniques is discussed in [10,15,16]. Authors in [17–30] designed EV
charging strategies to minimize charging costs. In [31], Al-Ogaili et al. reviewed EVs' charging scheduling, clustering, and forecasting techniques in the existing literature. Optimal
charging strategy for electric vehicles under dynamic pricing schemes, including Real-Time
Pricing (RTP), Time of Use (ToU), Peak Time Rebates (PTR), and Critical Peak Pricing (CPP),
has been reviewed in [32]. A few studies focused on different aspects of V2G, such as cus-
tomer acceptance of V2G [33], integration of renewable energy into transportation systems
via V2G [11], the economic feasibility of V2G for a university campus [10], coordination of
V2G with energy trade [12], and optimal energy management systems [34]. Encouraging
EV owners to participate in the V2G program without monetary incentives is difficult
because it costs energy and time to feed electricity back to power grids. Therefore, the
economic and operational aspects of V2G will become very important when the penetration
of EVs with V2G capability is high. Many EV charging scheduling approaches have been
effective in following ToU and RTP signals for peak demand reduction [35], alleviating
the impact of load fluctuation [36], and reducing EV charging costs [17–26]. However, the
effectiveness of these pricing policies in reflecting power system conditions has barely been
discussed in the literature.
Moreover, the economic and operational aspects of the V2G program, such as discharg-
ing scheduling and pricing mechanism, have only been proposed in a few papers [27,37–51].
Al-Ogaili et al. [31] suggested that artificial intelligence models perform better than proba-
bilistic models. With the availability of the dataset and increasing computational power,
artificial intelligence algorithms such as neural networks and reinforcement learning have
become very popular and effective for many applications, including forecasting and op-
timization problems. The learning ability from datasets of artificial intelligence models
makes them superior to conventional optimization models, such as linear and exponential
optimization models. Moreover, artificial intelligence models usually do not require expert
knowledge of complex systems, which might be challenging to obtain. As a result, artificial
intelligence models have been used in many EV-related studies ranging from EV battery
design and management to V2G applications [52]. A review of the different roles that artifi-
cial intelligence plays in the mass adoption of EVs can be found in [52]. Shahriar et al. [53]
reviewed machine learning-based EV charging behavior prediction and classification and
pointed out the potential of reinforcement learning for EV scheduling. In addition, many
artificial intelligence models have been implemented to solve EV charging scheduling-
related tasks such as predicting EV charging electricity price [5,20,22,54–57], EV driving pat-
terns [58], aggregated available capacity of batteries [59–62], charging load demand [63–67],
charging pattern [68], and EV charging/discharging scheduling [28,68–73]. However, the
linkage and gaps between each study have not been comprehensively discussed in the
existing literature. Thus, this review aims to explore artificial intelligence-based forecasting,
scheduling, and dynamic pricing models. In addition, the relationship between forecasting,
scheduling, and dynamic pricing is also discussed in this paper. Moreover, the research
gaps found in the existing literature are mentioned, and future research direction related to
EV charging and discharging is discussed.
The main contribution of this paper is to compare, summarize, and analyze the existing
artificial intelligence-based algorithms of three critical EV charging/discharging compo-
nents: forecasting, scheduling, and dynamic pricing. This paper also points out the research
gaps in effective EV discharging scheduling and pricing policy based on the findings of the
existing literature.
Energies 2023, 16, 146 3 of 26
Table 1. Representative V2G pilot projects with different goals and services.

| Project Name | Country | No. of Chargers | Timespan | Service |
|---|---|---|---|---|
| Realising Electric Vehicle to Grid Services | Australia | 51 | 2020–2022 | Frequency response, reserve |
| Parker | Denmark | 50 | 2016–2018 | Arbitrage, distribution services, frequency response |
| Bidirektionales Lademanagement—BDL | Germany | 50 | 2021–2022 | Arbitrage, frequency response, time shifting |
| Fiat-Chrysler V2G | Italy | 600 | 2019–2021 | Load balancing |
| Leaf to home | Japan | 4000 | 2012–ongoing | Emergency backup, time shifting |
| Utrecht V2G charge hubs | Netherlands | 80 | 2018–ongoing | Arbitrage |
| Share the Sun/Deeldezon Project | Netherlands | 80 | 2019–2021 | Distribution services, frequency response, time shifting |
| VGI core comp. dev. and V2G demo. using CC1 | South Korea | 100 | 2018–2022 | Arbitrage, frequency response, reserve, time shifting |
| SunnYparc | Switzerland | 250 | 2022–2025 | Time shifting, pricing scheme testing, reserve |
| Electric Nation Vehicle to Grid | UK | 100 | 2020–2022 | Distribution services, reserve, time shifting |
| OVO Energy V2G | UK | 320 | 2018–2021 | Arbitrage |
| Powerloop: Domestic V2G Demonstrator Project | UK | 135 | 2018–ongoing | Arbitrage, distribution services, emergency backup, time shifting |
| UK Vehicle-2-Grid (V2G) | UK | 100 | 2016–ongoing | Support power grid |
| INVENT—UCSD/Nissan/Nuvve | US | 50 | 2017–2020 | Distribution services, frequency response, time shifting |
| SmartMAUI, Hawaii | US | 80 | 2012–2015 | Distribution services, frequency response, time shifting |
The rest of the paper is structured as follows. Section 2 briefly discusses the EV charg-
ing/discharging control techniques, the concept of V2G, and battery degradation. Section 3
summarizes different artificial intelligence-based models for predicting EV charging-related
tasks such as electricity price and EV charging load demand. Artificial intelligence-based
EV charging/discharging scheduling models are reviewed in Section 4. Section 5 discusses
dynamic pricing and peer-to-peer energy transaction strategies. Discussion of the current
advanced technology and research gaps are presented in Section 6. Finally, conclusions and
future research directions are presented in Section 7.
| Control Technique | Advantages | Disadvantages |
|---|---|---|
| Controlled | System operators have more freedom to make decisions | EV owners have to cede control to the system operators |
back to power grids. Parsons et al. [33] investigated potential EV owners’ willingness to
pay for an EV with V2G capability and contract terms. The survey results suggest that the
V2G concept is most likely to attract more EV buyers if power aggregators provide either
upfront cash payment or pay-as-you-go basis types of contracts.
Lund and Kempton [11] modeled a power system that integrates renewable energy
into the transport and electricity sectors using V2G technology. The simulation results
indicate that adding EVs with V2G capability can enhance renewable energy utilization
(i.e., align EV charging pattern with renewable energy generation pattern) and reduce CO2
emissions. Scott et al. [10] simulated a V2G model for a university campus under various
charging scenarios. The simulation results show that using EVs’ batteries as energy storage
over ten years can reduce electricity costs by 64.7% and 9.79%, respectively, compared to
purchasing electricity from the power grid and using sole battery storage [10]. Al-Awami
and Sortomme [12] have formulated a mixed-integer stochastic linear programming model
to coordinate V2G services with energy trading. The simulation results show that using
V2G services to coordinate short-term energy trading can increase the profit of the load-
serving entity (LSE) by 2.4% and reduce emissions by 5.3% compared to the uncoordinated
one. The advantages and disadvantages of the V2G concept are summarized in Table 3.
Table 3. Advantages and disadvantages of the V2G concept.

| Advantages | Disadvantages |
|---|---|
| Avoid additional investment in a battery storage system | Battery degradation concerns |
| Enhance renewable energy utilization, thus reducing emissions | Charging-discharging efficiency concerns |
| Mobile energy storage with a fast reaction time | Additional upfront investment |
| Provide ancillary service for power grids | |
Although V2G technology can potentially provide many benefits to power systems
and EV owners, the implementation of V2G still faces some challenges, such as high upfront
investment [34], battery degradation [8], and charging-discharging efficiency concerns [80].
In addition, the V2G concept is still relatively new and evolving. Many pilot V2G projects
remain in the development stages [13]. By 2018, only 2 out of 486 (0.41%) utilities in-
vestigated by the Smart Electric Power Alliance (SEPA) had implemented the V2G pilot
project [81]. By 2022, there were only around 100 V2G pilot projects with different scales
and testing phases worldwide [14,82]. Therefore, to accelerate V2G implementation, the
charging and discharging efficiency need to be improved. On top of that, the V2G payment
procedure and contract terms need to be simplified. Batteries’ degradation rate and upfront
investment of V2G need to be quantified to allow EV owners to make informed decisions.
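The payback question raised here can be made concrete with a toy calculation. All figures below (charger cost, annual grid-service revenue, annual degradation cost) are illustrative placeholders, not values from the reviewed studies:

```python
def v2g_payback_years(upfront_cost, annual_revenue, annual_degradation_cost):
    """Years until cumulative net V2G revenue covers the upfront investment."""
    net = annual_revenue - annual_degradation_cost
    if net <= 0:
        return float("inf")  # V2G never pays back under these assumptions
    return upfront_cost / net

# Assumed figures: a 3500-unit bidirectional charger, 600/year grid-service
# revenue, and 250/year of extra battery wear give a 10-year simple payback.
years = v2g_payback_years(3500.0, 600.0, 250.0)
```

Even this crude sketch shows why quantified degradation costs matter: if annual wear costs exceed V2G revenue, the payback period is unbounded regardless of the charger price.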
battery monitoring and observability analysis with an extended equivalent circuit model.
As a result, the necessary observability conditions for battery capacity are clearly indicated,
which can be used to aid battery charging control design. Meng et al. [86] proposed a
Kalman filter and Gaussian process regression-based battery end-of-life (EOL) prediction
model. The simulation results show that the proposed model provides a better battery
EOL prediction than the particle filter, a popular method for battery EOL prediction. The
effectiveness of Gaussian process regression on battery SOH estimation is also shown
in [87]. Chaoui and Ibe-Ekeocha [88] proposed a recurrent neural network (RNN)-based SOC and SOH estimation model which does not require battery modeling or knowledge of battery parameters. The simulation results indicate that the RNN-based
estimation model can make good SOC and SOH estimations based on measured voltage,
current, and ambient temperature. The measured data’s accuracy is vital for empirical
models’ performance.
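As a point of reference for the data-driven SOC estimators discussed above, the classical coulomb-counting baseline simply integrates the measured current over time. The sketch below is a minimal illustration with an assumed constant coulombic efficiency, not the RNN model of [88]:

```python
def coulomb_count_soc(soc0, currents_a, dt_s, capacity_ah, eta=0.99):
    """Update state of charge by integrating measured current.

    soc0: initial SOC in [0, 1]; currents_a: per-step current in amperes
    (positive = charging); dt_s: step length in seconds; capacity_ah: rated
    capacity; eta: coulombic efficiency (assumed constant here).
    """
    soc = soc0
    for i_a in currents_a:
        soc += eta * i_a * dt_s / (capacity_ah * 3600.0)
        soc = min(1.0, max(0.0, soc))  # clamp to the physical range
    return soc

# One hour of 10 A charging on a 40 Ah pack starting at 50% SOC.
soc = coulomb_count_soc(0.5, [10.0] * 3600, 1.0, 40.0)
```

The last sentence of the paragraph applies directly here: any bias in the measured current accumulates in the integral, which is one reason learned models that also use voltage and temperature can outperform this baseline.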
More charging/discharging cycles occur in the V2G service than when there is no V2G
service; thus, battery degradation due to V2G services might be more severe than without
it [84]. Petit et al. [89] assessed the impact of V2G on two types of Lithium-ion batteries,
nickel cobalt aluminium oxides (NCA) and lithium ferro-phosphate (LFP). The simulation
results indicate that the effects of V2G on different batteries are different. For example, NCA
is more sensitive to cycle aging compared to LFP cells. In addition, high SOC can increase
battery capacity loss during storage. Pelletier et al. [90] also indicated that the calendar
aging process occurs faster for the battery stored at high SOC. The authors in [90–92] found
that battery overcharging and over-discharging degradation happen when the battery
operates outside its specified voltage range. Although battery degradation is currently inevitable, the process can be slowed by avoiding overcharging and over-discharging, charging and discharging at an optimal rate within an optimal temperature range, and storing the battery at an optimal SOC. Therefore, battery
degradation estimation and modeling need to be more accurate to support battery charging
and discharging control design.
Besides battery degradation concerns, charging and discharging efficiency is also one
of the primary concerns of the V2G program. Apostolaki-Iosifidou et al. [80] conducted
experimental measurements to determine power loss during EV charging and discharging.
The measurement results indicate that most power losses occur in the power electronics
used for AC-DC conversion [80]. Usually, the highest efficiency of the power electronics
occurs at the top region of their rated power [80]. In addition, the efficiency of the power
electronics is higher during charging than discharging [80]. This is due to the higher voltage
during charging than discharging at a given power. The higher charging voltage results in
a lower charging current, thus lowering internal resistance losses [93]. Therefore, the trade-
off between inevitable battery degradation and power loss during battery discharging and
the benefit of V2G needs to be further investigated. More research on battery design and
management is required to minimize battery degradation during charging/discharging.
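The voltage argument above can be checked numerically: at a fixed transfer power, a higher terminal voltage implies a lower current and hence a lower I²R loss. The pack resistance and voltages below are assumed illustrative values, not measurements from [80]:

```python
def i2r_loss_w(power_w, voltage_v, resistance_ohm):
    """Internal-resistance loss P = I^2 * R at a given transfer power."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

# Same 7 kW transfer through an assumed 0.1-ohm pack resistance: the higher
# charging voltage draws a smaller current, so the resistive loss is smaller.
loss_charge = i2r_loss_w(7000.0, 400.0, 0.1)     # charging at 400 V
loss_discharge = i2r_loss_w(7000.0, 380.0, 0.1)  # discharging at 380 V
```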
by aggregating multiple DTs [95]. RF is proposed in [72] to forecast EV charging load. The
results of the simulations indicate the effectiveness of RF on EV charging load prediction.
Although SVM and KNN are suitable for solving regression problems, they are generally
used for classification tasks. Erol-Kantarci and Mouftah [54] used KNN to predict elec-
tricity prices to reduce PHEVs’ charging costs and CO2 emissions. The simulation results
show that the prediction-based charging scheme reduces the PHEVs’ operating costs and
CO2 emissions. SVM is used in [103] to forecast EV charging demand with a 3.69% mean
absolute percentage error (MAPE), which is lower than the 8.99% MAPE of the Monte
Carlo technique.
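MAPE, the metric quoted above, can be computed directly; the hourly load series and forecasts below are toy values for illustration only:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

# Toy hourly EV charging load (kW) and two candidate forecasts: the closer
# forecast should score the lower MAPE.
actual = [120.0, 150.0, 90.0, 200.0]
close_forecast = [118.0, 155.0, 92.0, 195.0]
rough_forecast = [100.0, 180.0, 70.0, 240.0]
```

Note that MAPE is undefined when any actual value is zero, which matters for EV load series with idle hours; studies reporting MAPE typically aggregate over non-zero demand periods.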
model proposed in [62] is further enhanced by continually refining the prediction model
based on the latest observed behavior [109]. Zhong and Xiong [66] proposed a load de-
mand and renewable energy output prediction model that combines CNN with a deep
belief network (DBN) to form the CNN-DBN prediction model. The forecasted results are
then used to schedule EVs’ charging to minimize the operating cost of the distribution
network. Sun et al. [68] proposed an artificial intelligence-based hybrid model that consists
of K-means clustering, KNN classification, and LSTM prediction to predict EV driver charg-
ing behavior. Zhang et al. [65] proposed a CNN-based ensemble model to forecast traffic
flow and EV load demand. The forecasted load demand can be used to make electricity
trading decisions.
Artificial intelligence-based forecasting models are usually supervised learning-based
models which use labeled data to train the models for predictions. The forecasting models
are used to predict electricity price, EV load demand, driving pattern, and availability of
state of charge, which can then provide EV charging and discharging schedules and pricing.
However, the performance of these charging scheduling models is highly dependent on
the accuracy of the prediction models. Uncertainty is an inherent property of forecasting
models. The stochastic and unpredictable EV charging, discharging, and driving behaviors
make forecasting harder. Therefore, forecasting results alone will not result in optimal
charging scheduling strategies. Other than the suggested hybrid and ensemble techniques,
online-based prediction models [110–113] that take the latest available information to
update the prediction models might reduce the forecasting uncertainty.
Moreover, probabilistic learning algorithms, such as Gaussian processes (GP) [87,114],
can provide prediction with an uncertainty interval that could also be applied to EV
charging/discharging-related studies.
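A full Gaussian process is beyond a short sketch, but the two ideas in this passage, online updating with the latest observation and prediction with an uncertainty interval, can be illustrated with exponential smoothing plus a residual-based band. This is a simplified stand-in for illustration, not the GP of [87,114]:

```python
import statistics

class OnlineForecaster:
    """Exponential smoothing updated with each new observation, reporting a
    point forecast together with a residual-based uncertainty interval."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha      # smoothing weight on the newest observation
        self.level = None       # current point forecast
        self.residuals = []     # one-step-ahead forecast errors seen so far

    def update(self, observed):
        if self.level is None:
            self.level = observed
            return
        self.residuals.append(observed - self.level)
        self.level += self.alpha * (observed - self.level)

    def predict(self):
        """Point forecast plus a +/- 2-sigma band over past residuals."""
        if len(self.residuals) < 2:
            return self.level, None
        sigma = statistics.stdev(self.residuals)
        return self.level, (self.level - 2 * sigma, self.level + 2 * sigma)

# Feed toy hourly EV charging load observations (kW), then read off a
# banded forecast for the next hour.
f = OnlineForecaster()
for load_kw in [100.0, 110.0, 90.0, 105.0, 95.0]:
    f.update(load_kw)
mean_kw, band_kw = f.predict()
```

The interval lets a downstream scheduler hedge: reserving capacity for the upper band rather than the point forecast is one simple way to absorb the forecasting uncertainty discussed above.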
models mentioned in [119–121] perform well, the performance comparison of these models
is lacking in [119–121]. Dogan et al. [122] tested the performance of four different heuristic
algorithms on EV charging/discharging scheduling. The simulation results show that GA
can provide the lowest EV station operating cost and the most convenience for EV owners,
followed by PSO, DE, and ABC. Although heuristic algorithms are easy to implement and
can provide a fast solution, they do not always provide optimal solutions because they
might become trapped in local optima and provide unstable performance [123]. In addition,
the parameters of these algorithms can largely impact their performance. Thus, the ranking
of these algorithms should only be used as guidance.
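A minimal genetic algorithm for the kind of charging-slot scheduling benchmarked in [122] can be sketched as follows. The tariff values, population size, mutation rate, and penalty weight are all illustrative assumptions, and, as the paragraph above warns, such parameters strongly affect the result:

```python
import random

def ga_charging_schedule(prices, hours_needed, pop=30, gens=60, seed=0):
    """Evolve a binary on/off charging schedule that meets the required
    number of charging hours at minimum energy cost."""
    rng = random.Random(seed)
    n = len(prices)

    def cost(sched):
        c = sum(p for p, on in zip(prices, sched) if on)
        return c + 100.0 * abs(sum(sched) - hours_needed)  # feasibility penalty

    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)
        survivors = population[: pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]               # one-point crossover
            if rng.random() < 0.2:                  # bit-flip mutation
                j = rng.randrange(n)
                child[j] = 1 - child[j]
            children.append(child)
        population = survivors + children
    return min(population, key=cost)

# 8-hour horizon with an assumed cheap overnight valley in hours 1-2.
prices = [2.0, 1.0, 1.0, 2.0, 5.0, 6.0, 6.0, 5.0]
best = ga_charging_schedule(prices, hours_needed=3)
```

Because the search is stochastic, repeated runs with different seeds can return different schedules, which is exactly the instability the paragraph above attributes to heuristic methods.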
used in [20,21] to reduce charging costs and increase power grid reliability. Wang et al. [71]
proposed a DQN-based battery-swapping scheduling strategy to provide fast frequency
regulation services. The simulation results indicate that the proposed model can maximize
the battery swapping station operators’ revenue by autonomously scheduling the hourly
regulation capacity in real-time. Wang et al. [55] combined DQN and Dyna Q learning [126]
to allow the model to learn from both real-time experiences and simulated experiences,
which speeds up training. The proposed algorithm can reduce the long-term charging cost
while avoiding battery depletion during trips. Double-DQN was introduced in [131] to
reduce the overestimation of action values of DQN, and it is applied in [28] to control EVs’
charging/discharging based on the hourly electricity price. The simulation results indicate
that the Double-DQN algorithm reduces the correlation of the action values with the target
and provides better profit for EV owners than other state-of-the-art models such as RNN,
CDNN, and LSTM. However, the training time of DQN for problems with complex and
large state spaces is usually long due to its value iteration nature. Asynchronous Advantage
Actor Critic (A3C), which requires less training time than DQN [132], was introduced in
2016 by Google’s DeepMind [133]. Although A3C has not been applied to EV charging
and discharging-related studies in the reviewed literature, it is a good algorithm for a
complex problem with many states like EV charging and discharging and should be used
in the future.
In addition, both Q-learning and DQN are only suitable for discrete action spaces,
significantly reducing the effectiveness of the EV charging control algorithm due to the
limitation of the action space exploration. The deep deterministic policy gradient (DDPG)
is suitable for tasks with continuous state and action spaces [134]. It has been proposed
in [22–26] to maximize system operators’ financial benefit or reduce the charging cost
of EV owners. The simulation results in [24] show that the higher pricing frequency
can better reflect power system demand and supply situations and shift EV charging
load. The simulation results in [23] suggest that a DDPG-based EV charging strategy
can strictly guarantee voltage security by scheduling EV charging and discharging at
suitable time periods. Sun and Qiu [129] proposed DDPG to solve the voltage control
strategy problem to mitigate voltage fluctuation caused by stochastic EV and load demand.
Qiu et al. [30] proposed a prioritized deep deterministic policy gradient (PDDPG), a
combination of DDPG and prioritized experience replay, to optimize EV owners’ and
aggregators’ profits. PDDPG can solve the problem in multi-dimensional continuous
state and action spaces, which is preferable for practical application and requires lower
computational resources than DDPG with uniform sampling. The simulation results of [30]
show that PDDPG can provide 31%, 13%, and 5% higher profit than Q-learning, DQN,
and DDPG, respectively. Another off-policy algorithm, soft-actor-critic (SAC) [135], is used
by Yan et al. [125] to balance charging costs and drivers’ anxiety. The maximum entropy
framework of SAC makes the proposed model more sample efficient and robust [125]. Lee
and Choi [27] proposed a SAC model to provide dynamic EV charging and discharging
pricing to maximize the profits of multiple EVCSs.
proposed in [44] is more effective than hourly-based RTP in that it can more closely reflect
the actual conditions of the power system and electricity market. Therefore, the price
updating frequency of RTP is a crucial factor when designing the RTP mechanism.
charging and discharging price. Co-decided charging price can effectively improve the
EVCS operator’s income and efficiency of the EV charging system while reducing the EV
users’ charging costs [50]. A more transparent and open real-time pricing policy might
attract more EV owners to participate in the V2G program.
6. Discussion
The development of V2G can be classified into three phases [146]. The schema of V2G
development phases and corresponding EV charging/discharging techniques is shown in
Figure 1. First, the EV charging load only accounts for a small load demand proportion
of power grids in the preliminary phase. The development of V2G is currently in the first
phase. Therefore, the main charging control strategies presently used are uncontrolled
charging and controlled charging. Many V2G pilot projects are still in the planning and
testing phases. In addition, only a tiny portion of the existing commercial EVs is built
with V2G capability. The existing ToU tariff that is applied to other load demands can also
influence EV charging behaviors. However, the traditional ToU tariff will not be able to
handle the high penetration of EVs if stochastic EV charging loads are not considered in
the design of the ToU tariff.
In the second phase, the EV charging load will account for a higher proportion of
power grids’ load due to the high penetration of EVs. A large number of uncoordinated and
stochastic EV charging requests might burden power grids during peak demand periods,
which could endanger power grid stability. Therefore, aggregators must play an important
role in coordinating EVs to provide DSM services, such as enhancing renewable energy
utilization rate, peak shaving, valley filling, and alleviating congestion of distribution
networks [146]. The smart charging/discharging control strategies should be considered
in this phase. The main focus of the existing EV-related literature is on EV smart charg-
can be used for reward function and tuning hyperparameters, respectively. Moreover, RL
methods are evolving and improving regularly. Therefore, the latest advanced RL methods
should be applied to EV charging/discharging scheduling and dynamic pricing to see if
performance can be further enhanced. Different parameters of RL-based models, such as
reward functions, learning rate, exploration and exploitation trade-off, and model structures,
also need to be tested and compared to find optimal combinations to suit the tasks.
Although simulation results of most EV charging/discharging scheduling show good
performance, most of them were tested on historical or artificial data. The historical or
artificial electricity price cannot reflect real-time operation conditions such as EV charging
load demand, V2G discharging requests, and renewable energy generation. Therefore, a
suitable dynamic pricing model that accurately reflects power system operating conditions
is essential.
Finally, in the third phase, adequate technology and management will allow many EVs
to provide ancillary services to power grids via bidirectional power flow to achieve an ideal
state of smart grids [146]. By then, substantial communication resources will be required
for EV owners and power system operators to interact with each other. The event-triggered
communication mechanism can save bandwidth and signal-processing resources compared
with the periodic communication mechanism [153]. In addition, battery degradation
and charging/discharging losses should be minimized to accelerate the adoption of V2G
technology. Therefore, artificial intelligence-based battery design and management are also
important research fields. Although V2G has many benefits, it is challenging to encourage
EV owners with different driving and charging preferences to participate in the same V2G
program. Moreover, designing different V2G programs to satisfy all the participants is
not feasible. Thus, dynamic pricing can be used as a special form of demand response to
influence EV owners' charging and discharging behaviors.
Dynamic pricing has been applied in many fields, such as airline ticket pricing, e-
commerce product pricing, and advertising bid pricing. The strategy is to increase the
selling price when demand is higher than supply and decrease the selling price when
supply is higher than demand. More factors, such as power system constraints, renewable
energy generation, and EV charging/discharging preferences, need to be considered when
designing a dynamic pricing model for power systems with high EV penetration. Therefore,
a properly designed dynamic pricing strategy that can accurately reflect the conditions
of the power grid is needed to correctly guide EV charging and discharging to optimize
the energy use of power grids. However, there is limited literature focusing on the
design of EV charging/discharging dynamic pricing schemes. Most researchers focus
on designing scheduling models that consider dynamic pricing to minimize charging
costs [27,38,40,42,48,49,51]. In addition, most of the existing dynamic pricing schemes
provided by power utility companies might not consider the effect of high EV penetration
and EV owners’ preferences [27,38–40,42,45,46,48,49,51]. Although the P2P electricity
transaction mechanism allows sellers and buyers to co-decide a tariff, the auction-based
mechanism does not always converge. Moreover, it is difficult for EV owners to make
pricing decisions without having the whole picture of power system conditions and the
electricity market. For example, EV owners might undervalue their stored battery power
during power system constraints and overvalue their power when renewable energy
production is high. Therefore, more research should be done on dynamic pricing schedule
design that can solve the above-mentioned issues.
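The supply-demand rule described at the start of this passage can be sketched as a simple multiplicative price update. The gain and the regulatory price bounds are illustrative assumptions, and a real scheme would also fold in the grid constraints and owner preferences discussed above:

```python
def update_price(price, demand_kw, supply_kw, k=0.5, floor=0.02, cap=2.0):
    """Raise the tariff when charging demand exceeds available supply and
    lower it when supply exceeds demand, within assumed regulatory bounds."""
    imbalance = (demand_kw - supply_kw) / max(supply_kw, 1e-9)
    new_price = price * (1.0 + k * imbalance)
    return min(cap, max(floor, new_price))

# Demand 20% above supply raises the tariff by 10% (with gain k = 0.5);
# a later renewable surplus pushes it back down.
p1 = update_price(0.10, demand_kw=1200.0, supply_kw=1000.0)
p2 = update_price(p1, demand_kw=800.0, supply_kw=1000.0)
```

Even this one-line rule exposes the design questions raised in this section: the gain k sets how aggressively prices react, and the update frequency determines how closely the signal tracks real-time system conditions.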
Table 4. Summary of the relevant literature on EV charging and discharging forecasting, scheduling, and pricing strategies.
7. Conclusions
This paper reviews three crucial aspects of EV charging and discharging: forecasting,
scheduling, and dynamic pricing. The interconnected relationship between forecasting,
scheduling, and dynamic pricing is identified. Scheduling models’ performance mainly
depends on the accuracy of forecasting results and pricing strategies. On the other hand,
forecasting accuracy and scheduling performance largely influence the effectiveness of
dynamic pricing strategies in reflecting real-time power system conditions.
Most forecasting models mentioned in this paper are supervised learning-based
models. Among them, LSTM and GRU are the most popular methods due to their ability to
handle nonlinear and long-term dependency. However, uncertainty is one of the inherent
properties of forecasting models. Therefore, the performance of forecasting models needs
to continue improving. Besides hybrid and ensemble techniques, using the latest available
data to update forecasting models and adding uncertainty intervals are other options to
assist decision-making.
Reinforcement learning-based optimization models that can take many variables as
state spaces have been applied by many researchers to make optimal EV charging and
discharging decisions based on the forecasted results, including charging and discharging
prices. DQN, DDPG, and SAC are some of the most popular reinforcement learning
models. Each of them has its advantages and disadvantages. DQN can overcome the curse
of dimensionality faced by conventional Q-learning. However, overestimation of action
values and long training time requirements are common issues faced by these methods.
Double-DQN and A3C can solve action value overestimation and reduce training time,
respectively. Improving reinforcement learning performance is also a key field of research.
Scheduling models cannot make effective charging/discharging decisions to optimize
power grids if the information they use to make decisions cannot accurately reflect the
power grid’s real-time conditions. Therefore, both forecasting results and dynamic pricing
signals that can reflect the real-time conditions of the power grid are important.
Many studies have explored the forecasting and scheduling aspects of EV charging. However, only limited literature has focused on EV discharging and dynamic pricing
design. Most of the existing dynamic pricing is designed by system operators without
considering EV owners’ preferences. In addition, not all the key factors related to EV charg-
ing/discharging are used to design dynamic pricing models. Dynamic pricing strategies are
very important for indirectly controlled charging/discharging. Moreover, dynamic pricing
can incentivize more EV owners to participate in V2G programs. Therefore, researchers
should pay more attention to designing dynamic pricing schemes that can reflect real-time
power system conditions and provide a balance between system operators and EV owners.
In addition to the technical aspects of EV charging/discharging scheduling and dynamic
pricing discussed in this paper, research on the social and economic aspects of EV charg-
ing/discharging and dynamic pricing is required. On the social aspect, the EV owners’
and system operators’ response to dynamic pricing needs to be surveyed and analyzed.
The opinions of all the involved parties can be used to enhance the dynamic pricing policy
design. On economic aspects, the feasibility and profitability of the dynamic pricing model
for the overall system, including individual EV owners, system operators, and power
systems, need to be investigated.
Author Contributions: Q.C. was responsible for conceptualization, formal analysis, and original
draft preparation; K.A.F. was responsible for the supervision, funding acquisition, review, and
editing of the final draft of the paper. All authors have read and agreed to the published version of
the manuscript.
Funding: This research was funded in part by the National Research Foundation (NRF) of South Africa, grant number UID 118550.
Data Availability Statement: Not applicable.
Energies 2023, 16, 146 20 of 26
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.