Electric Vehicle Charging Load Forecasting Method Based on Improved Long Short-Term Memory Model with Particle Swarm Optimization

1 School of Transportation and Logistics Engineering, Shandong Jiaotong University, Jinan 250023, China
2 Department of Traffic Management and Engineering, Chongqing Police College, Chongqing 401331, China; [email protected]
* Correspondence: [email protected]; Tel.: +86-13791038708
Abstract: With the rapid global proliferation of electric vehicles (EVs), their integration as a significant load component within power systems increasingly influences the stable operation and planning of electrical grids. However, the high uncertainty and randomness inherent in EV users' charging behaviors render accurate load forecasting a challenging task. In this context, the present study proposes a Particle Swarm Optimization (PSO)-enhanced Long Short-Term Memory (LSTM) network forecasting model. By combining the global search capability of the PSO algorithm with the advantages of LSTM networks in time-series modeling, a PSO-LSTM hybrid framework optimized for seasonal variations is developed. The results confirm that the PSO-LSTM model effectively captures seasonal load variations, providing a high-precision, adaptive solution for dynamic grid scheduling and charging infrastructure planning. This model supports the optimization of power resource allocation and the enhancement of energy storage efficiency. Specifically, during winter, the Mean Absolute Error (MAE) is 3.896, a reduction of 6.57% compared to the LSTM model and 10.13% compared to the Gated Recurrent Unit (GRU) model. During the winter–spring transition, the MAE is 3.806, which is 6.03% lower than that of the LSTM model and 12.81% lower than that of the GRU model. In the spring, the MAE is 3.910, showing a 2.71% improvement over the LSTM model and a 7.32% reduction compared to the GRU model.

Keywords: electric vehicles (EVs); load forecasting; long short-term memory network; particle swarm optimization; deep learning

Academic Editor: Grzegorz Sierpiński
Received: 11 February 2025; Revised: 26 February 2025; Accepted: 3 March 2025; Published: 5 March 2025
Citation: Yang, X.; Zhang, L.; Han, X. Electric Vehicle Charging Load Forecasting Method Based on Improved Long Short-Term Memory Model with Particle Swarm Optimization. World Electr. Veh. J. 2025, 16, 150. https://doi.org/10.3390/wevj16030150
Copyright: © 2025 by the authors. Published by MDPI on behalf of the World Electric Vehicle Association. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

Amid the escalating global energy crisis and increasing environmental pollution concerns, electric vehicles (EVs) have rapidly emerged as a sustainable transportation solution. However, the widespread adoption of EVs has posed new challenges for power grids. The charging load of EVs is inherently random and volatile, potentially disrupting the stable operation of power grids. Therefore, accurately forecasting fluctuations in EV load [1] is essential for effective grid planning and operation.

Traditional forecasting techniques encompass various methods, including Monte Carlo simulation and Kalman filtering. Reference [2] developed a Monte Carlo-based model that incorporates EV types, using the predicted number of EVs in a region as a basis while considering other factors as model parameters. Reference [3] proposed a hybrid algorithm combining time-series analysis and Kalman filtering, which improved accuracy and facilitated the derivation of state and observation equations, effectively underscoring the benefits of hybrid algorithms in improving short-term load forecasting for power systems.
Electric load forecasting is inherently nonlinear and influenced by multiple factors.
Machine learning, with its robust nonlinear mapping capabilities, has demonstrated efficacy
in addressing nonlinear problems in load forecasting. Traditional machine learning tech-
niques, such as support vector machines, decision trees, and random forests, are generally
suited for smaller datasets and are effective in addressing nonlinear problems. Reference [4]
addressed issues of low accuracy and inadequate consideration of seasonality in traditional
EV load forecasting by proposing a seasonal EV charging load prediction model based on
random forests. Reference [5] introduced an adaptive improvement method for Particle
Swarm Optimization to solve high-dimensional EV charging load prediction models.
Deep learning-based forecasting methods utilize neural networks as parameter struc-
tures for optimization. Neural networks, also known as artificial neural networks, are
mathematical models inspired by the way biological neural networks process and trans-
mit information. Commonly used neural networks for power load forecasting include
backpropagation (BP) networks, convolutional neural networks (CNNs), recurrent neural
networks (RNNs), and emerging Transformer models. For example, Reference [6] proposed
a spatiotemporal graph convolutional network (GCN+LSTM) that integrates graph convo-
lutional networks (GCNs) with Long Short-Term Memory (LSTM) networks to improve
the accuracy of electric vehicle (EV) charging demand predictions and alleviate traffic
congestion in high-demand areas. Reference [7] integrated Kolmogorov-Arnold Networks
(KANs) into traditional machine learning frameworks, specifically Convolutional Neural
Networks (CNNs). Reference [8] developed a short-term forecasting method based on a
bidirectional LSTM (BiLSTM) neural network optimized by a Sparrow Search Algorithm
(SSA) and variational mode decomposition (VMD).
Moreover, the accuracy of EV charging load forecasting is significantly influenced by
external factors, among which weather conditions play an essential role. Weather variables,
such as temperature, humidity, wind speed, and solar radiation, have a significant impact
on EV charging demand. Therefore, incorporating weather factors is crucial for enhancing
the accuracy and reliability of forecasting models. Reference [9] proposed an EV charging
load forecasting method that considers multiple influencing factors, including weather.
Reference [10] explored the effects of meteorological conditions on the spatiotemporal
distribution of EV charging loads at highway service areas, presenting a weather-inclusive
forecasting model for these areas. Reference [11] proposed a hybrid forecasting frame-
work that integrates Partial Least Squares Regression (PLSR) with a lightweight Gradient
Boosting Machine (LightGBM), combined with Bayesian hyperparameter optimization and
a second-order cone optimal scheduling scheme. This approach significantly improves
short-term load forecasting accuracy, with the Mean Absolute Error (MAE) reduced by
12.7%. Reference [12] addressed the high randomness and spatial heterogeneity of elec-
tric vehicle charging load by proposing a machine learning forecasting framework based
on a dual perspective of industrial parks and charging stations. By combining the MLP
and LSTM algorithms, the framework achieves high-precision forecasting for weekdays
(LSTM R2 = 0.9283), holidays (R2 = 0.9154), and weekends (MLP R2 = 0.9586).
As the world’s largest automobile market and leading producer of electric vehicles
(EVs), China is experiencing rapid growth in the number of EVs in use. In China, there are
significant regional differences in electric vehicle usage patterns and charging demands.
Taking first-tier cities such as Beijing, Shanghai, and Shenzhen as examples, due to their
dense populations and developed public transportation systems, electric vehicles are
primarily used for urban commuting, and charging demand is concentrated during the
morning and evening peak hours on weekdays, as well as at residential charging stations
at night. In some second- and third-tier cities, the use of electric vehicles is more diverse,
including operational vehicles such as taxis and ride-hailing services, resulting in more
scattered and uncertain charging demand.
In order to accurately predict the electric vehicle charging load for rational grid capac-
ity planning, optimized charging facility layout, and maintaining power system stability,
this study proposes a PSO-LSTM model that combines Particle Swarm Optimization (PSO)
with the LSTM network to improve the accuracy of EV charging load prediction. Com-
parative experimental case studies between the PSO-LSTM model and traditional models
demonstrate that the PSO-LSTM model significantly outperforms others in prediction accu-
racy, providing effective data support for grid and power system operations. This research
not only validates the correctness and effectiveness of the PSO-LSTM model but also offers
new perspectives and solutions for the field of electric vehicle charging load forecasting.
2. Model Description
2.1. Parameters and Data
To ensure the scientific rigor and reproducibility of this study, this research systemati-
cally set and rigorously validated key parameters during the construction and optimization
of the electric vehicle charging load prediction model. These parameters cover core aspects
such as data preprocessing, model architecture design, and optimization algorithm config-
uration. Table 1 provides a complete list of the definitions and settings of all parameters,
allowing readers to fully understand the implementation details and key design logic of
this study.
To ensure the academic rigor of the expressions and consistency in reader compre-
hension, this study provides systematic definitions and standardized explanations of the
technical term abbreviations used. Table 2 provides a complete list of all abbreviations
used in the paper, along with their corresponding full forms, definitions, and application
contexts, in order to eliminate potential ambiguities.
Traffic flow studies have shown that travelers’ travel patterns exhibit distinct regular-
ities, which, in turn, determine the patterns of urban road traffic conditions. As a result,
the traffic flow fluctuations generated by travelers on the same road segment also follow
a regular pattern, including periodic trends with yearly, quarterly, monthly, weekly, and
daily cycles. Collecting data with these time spans will form long-term, medium-term, and
short-term traffic flow patterns with strong regularity and high similarity in the fluctuation
curves. Based on the basic characteristics of urban road traffic flow, it is known that traffic
flow data are time-series data, making time similarity analysis essential.
The modeling data used in this study were sourced from a single charging pile. The
dataset includes the start and end times for each charging event, as well as the total energy
consumed [13]. The raw dataset was converted into a corresponding dataset containing the hourly average charging load, P (in kWh). The selected data span from 1 January 2023 to 29 April 2023 at one-hour intervals, totaling 2856 data points, with a prediction time period of one week.
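The conversion from per-session records to an hourly load series can be sketched as follows; the tuple format and the constant-power assumption are illustrative choices, not properties of the dataset in [13]:

```python
from datetime import datetime, timedelta

def sessions_to_hourly_load(sessions):
    """Spread each charging session's energy uniformly over the hours it
    spans, yielding the average hourly charging load P (kWh per hour)."""
    hourly = {}
    for start, end, energy_kwh in sessions:
        duration_h = (end - start).total_seconds() / 3600.0
        power_kw = energy_kwh / duration_h  # assume constant charging power
        t = start
        while t < end:
            bucket = t.replace(minute=0, second=0, microsecond=0)
            edge = min(bucket + timedelta(hours=1), end)
            frac_h = (edge - t).total_seconds() / 3600.0
            hourly[bucket] = hourly.get(bucket, 0.0) + power_kw * frac_h
            t = edge
    return dict(sorted(hourly.items()))
```

For example, a 10 kWh session from 10:30 to 12:30 contributes 2.5, 5.0, and 2.5 kWh to the 10:00, 11:00, and 12:00 buckets, respectively.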
To thoroughly analyze the seasonal variation characteristics of the charging load, this
study performed initial data preprocessing, including missing value imputation, outlier
removal, and normalization, to ensure data completeness and consistency. Subsequently, the
data were divided into the following three periods based on common seasonal classifications:
• Winter Period (1 January–3 February): During this period, low temperatures may lead
to reduced battery performance and changes in user charging behavior, potentially
affecting the charging load.
• Winter–Spring Transition Period (4 February–4 March): As temperatures gradually
rise, the charging load may exhibit transitional characteristics.
• Spring Period (5 March–29 April): In this period, temperatures are moderate, and the
charging load is likely to stabilize, reflecting typical spring user behavior patterns.
To assess the predictive performance of the model, the data for each period were split
into training and testing sets in an 80:20 ratio. The training set was used for model training
and hyperparameter optimization, while the testing set was employed to evaluate the
final prediction accuracy and generalization capability of the model. This study, through seasonal partitioning and visualization analysis, comprehensively captured the seasonal variation patterns of the charging load. By training the model independently for each season, the influence of seasonal factors on the model was effectively reduced, allowing the seasonal characteristics to be more accurately modeled, thereby enhancing the accuracy of the prediction results.

Figures 1–3 present the three-dimensional visualization results of the charging load data, segmented by season. In the figures, the x-axis represents 24 h of a day, the y-axis represents the days, and the z-axis represents the charging power (in kWh). The three-dimensional plots provide an intuitive observation of the spatial–temporal distribution characteristics of the charging load across different seasons.

Figure 1. The three-dimensional spatial–temporal distribution of charging load in the winter season.

Figure 2. The three-dimensional spatial–temporal distribution of charging load during the winter–spring transition period.

Figure 3. The three-dimensional spatial–temporal distribution of charging load in the spring season.

Figure 4. A structural diagram of the LSTM network.
1. Forget Gate:
It determines which information is discarded from the previous cell state.

f_t = σ(W_f · [h_{t−1}, x_t] + b_f)  (1)

where f_t represents the output of the forget gate, σ is the sigmoid function, W_f denotes the weight matrix, h_{t−1} is the hidden state from the previous time step, x_t is the current input, and b_f represents the bias term.

2. Input Gate:
It controls the storage of new information.

i_t = σ(W_i · [h_{t−1}, x_t] + b_i)  (2)
C̃_t = tanh(W_C · [h_{t−1}, x_t] + b_C)  (3)
C_t = f_t · C_{t−1} + i_t · C̃_t  (4)

where i_t represents the output of the input gate, C̃_t is the candidate cell state, and C_t is the updated cell state.

3. Output Gate:
It determines the output of the hidden state.

o_t = σ(W_o · [h_{t−1}, x_t] + b_o)  (5)
h_t = o_t · tanh(C_t)  (6)

where o_t represents the output of the output gate, and h_t is the hidden state at the current time step.
The information propagation process in an LSTM neural network is described as
follows [15]:
• Forgetting and Memory: The input information and stored information are multiplied
by weight matrices, and after adding the bias term, they pass through a sigmoid
function for normalization to obtain the final input information.
• New Information Input: During the input phase, the data are processed by passing
them through the weight matrix and multiplying them with the activation matrix,
producing the information that will be transferred to the memory unit.
• Cell State Update and Information Output: The results of the first two steps are
combined to compute the current cell state. This cell state is then multiplied by the
output matrix to generate the final output.
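The three steps above can be traced in a minimal NumPy sketch of a single LSTM time step; the dictionary-of-matrices weight layout is an illustrative choice, not that of any particular library:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step; W and b hold one weight matrix and bias per
    gate, each applied to the concatenation [h_prev, x_t]."""
    z = np.concatenate([h_prev, x_t])
    f_t = sigmoid(W["f"] @ z + b["f"])      # forget gate
    i_t = sigmoid(W["i"] @ z + b["i"])      # input gate
    c_tilde = np.tanh(W["C"] @ z + b["C"])  # candidate cell state
    c_t = f_t * c_prev + i_t * c_tilde      # cell state update
    o_t = sigmoid(W["o"] @ z + b["o"])      # output gate
    h_t = o_t * np.tanh(c_t)                # hidden state output
    return h_t, c_t
```

Because the output gate lies in (0, 1) and tanh in (−1, 1), every component of the hidden state stays strictly inside (−1, 1).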
In contrast to a standard LSTM, which draws only on information from past time steps, the bidirectional LSTM processes sequence data in both forward and backward directions simultaneously, enabling it to capture comprehensive contextual information from both past and future time steps [16]. This design is particularly suitable for scenarios where there is a strong correlation between future and past time steps, such as in periodic load data. By integrating information from both time directions, bidirectional LSTM significantly improves the model's ability to understand the latent patterns in the sequence [17]. Stacking multiple LSTM layers allows the model to gradually extract higher-dimensional patterns, from shallow features to deeper ones.

This stacking approach mimics the structure of deep neural networks, progressively extracting more abstract features. During implementation, to ensure smooth information transmission across multiple layers, the first LSTM layer is set with "return_sequences = True" to output the complete time series for the next layer. The final LSTM layer is configured with "return_sequences = False" to produce a fixed-length vector as output, which is then connected to a dense layer for subsequent stages of prediction tasks.

The architecture of the improved LSTM model is shown in Figure 5. In this design, the bidirectional LSTM processes both forward and backward sequence data simultaneously, enabling the model to capture global contextual information between past and future time steps. The stacking of LSTM layers is as follows: The first layer consists of a bidirectional LSTM layer, with a dropout layer inserted in between to randomly deactivate neurons and reduce overfitting. The final LSTM layer outputs a fixed-length vector, which is then connected to a fully connected layer for the final load prediction. The input layer receives sequential data, with "x0, x1, x2, ..., xi" representing the sequence elements. "A" and "A′" represent LSTM units processing in opposite directions, with red arrows indicating forward time steps and blue arrows denoting backward time steps. The sequence "h0, h1, h2, ..., hi" represents the output hidden states.
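A compact NumPy sketch of this forward/backward scheme is given below; the weight shapes, initialization, and hidden size are illustrative assumptions rather than the configuration used in the paper. Returning every hidden state mirrors "return_sequences = True", while taking only the last row mirrors "return_sequences = False":

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(n_in, n_hid, rng):
    # Stacked weights for the forget, input, candidate, and output transforms.
    return {"W": rng.normal(0.0, 0.1, (4 * n_hid, n_hid + n_in)),
            "b": np.zeros(4 * n_hid)}

def lstm_sequence(xs, n_hid, p):
    """Run an LSTM over xs (shape T x D) and return all T hidden states."""
    h, c, out = np.zeros(n_hid), np.zeros(n_hid), []
    for x_t in xs:
        z = p["W"] @ np.concatenate([h, x_t]) + p["b"]
        f, i, g, o = np.split(z, 4)
        f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        out.append(h)
    return np.stack(out)

def bidirectional_lstm(xs, n_hid, p_fwd, p_bwd):
    """Concatenate a forward pass with a time-reversed backward pass so
    every step carries both past and future context."""
    h_f = lstm_sequence(xs, n_hid, p_fwd)
    h_b = lstm_sequence(xs[::-1], n_hid, p_bwd)[::-1]
    return np.concatenate([h_f, h_b], axis=1)
```

For a sequence of length T, the output has shape (T, 2·n_hid); feeding only the last row to a dense layer corresponds to the fixed-length vector described above.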
Figure 5. A diagram of the improved model structure.
V_{id}^{t+1} = ω · V_{id}^t + c1 · r1 · (Pbest_{id}^t − X_{id}^t) + c2 · r2 · (Gbest_d^t − X_{id}^t)  (7)

X_{id}^{t+1} = X_{id}^t + V_{id}^{t+1}  (8)

Here, X_{id}^t and V_{id}^t represent the position and velocity of particle i in dimension d during iteration t, respectively. Pbest_{id}^t is the best position found by particle i in dimension d up to iteration t, while Gbest_d^t is the best position found by the entire swarm in dimension d during iteration t. t and t + 1 represent the current and next iterations, respectively [21]. ω denotes the inertia weight, and c1 and c2 are the acceleration coefficients for the cognitive and social components, respectively. r1 and r2 are random numbers uniformly distributed in the range [0, 1]. The personal best position and the global best position of each particle are updated in each iteration using the following equations:

Pbest_{id}^{t+1} = X_{id}^{t+1},  if f(X_{id}^{t+1}) ≤ f(Pbest_{id}^t);  Pbest_{id}^t,  if f(X_{id}^{t+1}) > f(Pbest_{id}^t)  (9)

Gbest_d^{t+1} = Gbest_d^t,  if f(Gbest_d^t) ≤ min_i f(Pbest_{id}^{t+1});  Pbest_{id}^{t+1},  if f(Gbest_d^t) > min_i f(Pbest_{id}^{t+1})  (10)
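The velocity/position updates and the best-position bookkeeping described in this section can be sketched directly in NumPy. The sphere objective, swarm size, and coefficient values (ω = 0.7, c1 = c2 = 1.5) below are illustrative choices, not the settings used in this study:

```python
import numpy as np

def pso(fitness, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=0):
    """Minimal PSO: velocity/position updates plus personal- and
    global-best bookkeeping, minimizing `fitness`."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_particles, dim))   # positions
    V = np.zeros((n_particles, dim))              # velocities
    pbest, pbest_f = X.copy(), np.array([fitness(x) for x in X])
    gbest = pbest[np.argmin(pbest_f)].copy()      # global best position
    gbest_f = pbest_f.min()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive + social components of the velocity update
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = X + V
        f = np.array([fitness(x) for x in X])
        improved = f < pbest_f                    # personal-best update
        pbest[improved], pbest_f[improved] = X[improved], f[improved]
        if pbest_f.min() < gbest_f:               # global-best update
            gbest_f = pbest_f.min()
            gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, gbest_f
```

Minimizing the two-dimensional sphere function f(x) = x1² + x2² drives the swarm toward the origin within a few dozen iterations.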
Figure 6 illustrates the concept of particle optimization, depicting the Particle Swarm Optimization (PSO) algorithm. The particle velocity update equation in the PSO algorithm consists of three components. The first component represents the particle's inertia, reflecting its "memory" of the previous velocity, and is controlled by the inertia weight (ω). The second component captures the particle's "self-awareness", driving it to move toward its own historical best position (Pbest^t). Here, X^{t+1} denotes the adjusted position, and V^{t+1} represents the velocity increment. The third component represents the exchange of information and cooperation between particles, referred to as "socialization" [22], guiding particles toward the global best position (Gbest^t) and facilitating group information sharing.

Figure 6. Diagram of particle global and historical optimal solutions, velocity, and position.
In applied research, the effectiveness of a model is primarily determined by the careful selection of key hyperparameters. However, traditional manual tuning methods are not only time-consuming and labor-intensive but also prone to getting stuck in local optima, preventing the model from fully realizing its potential. To address this issue, this study uses the Particle Swarm Optimization (PSO) algorithm to optimize five key parameters in the LSTM model: the number of neurons in the first hidden layer (Hidden Units1), the number of neurons in the second hidden layer (Hidden Units2), the dropout rate (Dropout), the batch size (Batch Size), and the number of training epochs (Epochs). Through the global search capabilities and efficient parameter adjustment strategy of the PSO algorithm, this study aims to identify the optimal combination of parameters to maximize the predictive performance of the LSTM model while minimizing resource consumption during the training process.
2.4. Development of the Electric Vehicle Load Forecasting Model

Identifying the optimal parameters for a predictive model is inherently challenging, and the training process is both time-consuming and computationally intensive. The optimal values obtained through manual tuning are often only locally optimal rather than globally optimal. To mitigate the errors and randomness introduced by manual tuning and to achieve automatic parameter optimization, this study employs a swarm intelligence optimization algorithm for parameter adjustment and optimization.
The predictive model in this study is based on the LSTM architecture, where key
hyperparameters—such as the number of neurons, dropout rate, batch size, and number of
training epochs—significantly influence both prediction accuracy and convergence speed.
To enhance the model’s performance, the Particle Swarm Optimization (PSO) algorithm is
used to optimize these hyperparameters [23].
Figure 7 presents the overall framework of the PSO-LSTM model. Starting from the data preprocessing stage, the purple box section performs cleaning, missing value imputation, and normalization on the raw charging pile load data to ensure the quality and consistency of the input data. Hyperparameter optimization plays a critical role in enhancing model performance. In the blue box section, the PSO algorithm is used to optimize the hyperparameters of the LSTM model, ensuring that the model identifies the globally optimal parameter combination within complex data patterns, thereby improving prediction accuracy and stability. The model construction phase is represented within the green box, where, based on the optimized parameters, a prediction model is built that includes a bidirectional LSTM layer, a dropout layer, and a fully connected layer. The bidirectional LSTM layer captures both forward and backward temporal dependencies in the data, while the dropout layer effectively prevents overfitting by randomly deactivating neurons. The fully connected layer integrates the extracted high-dimensional features and outputs the final prediction results. This framework not only significantly enhances the model's predictive performance but also provides reliable technical support for accurate load forecasting of electric vehicle charging stations.
The dataset used in this study, after undergoing missing value imputation, reorganiza-
tion, and standardization, consists of a total of 2857 samples. Among them, 2352 samples
are used in the validation set and fed into the model for parameter optimization. Once
the optimization process is complete, the position of the global best particle in the five-
dimensional space represents the optimal hyperparameters for the LSTM model.
The specific process for optimizing LSTM parameters using PSO is as follows:
Step 1: Initialize the parameters of the LSTM algorithm. Define the five hyperpa-
rameters to be optimized as the five-dimensional particle space. Initialize the position X
and velocity V of each particle; set the number of particles, N; and specify the number of
iterations, num.
Step 2: Define the fitness function as the mean squared error (MSE) of the LSTM model
on the validation set. Compute the fitness values of all particles and identify the positions
corresponding to the initial personal best and global best, with the lowest MSE values
serving as the criteria.
Step 3: Update the velocity and position of each particle according to Equations (7) and (8), and then recalculate the fitness value for each particle.
Step 4: Update the individual best position and the global best position.
Step 5: Check whether the termination condition is met. If yes, output the global best
position; if not, return to Step 3.
The final position of the global best particle in the five-dimensional space at the
conclusion of the optimization process is considered the optimal set of hyperparameters for
the LSTM model. Based on these optimized parameters, the PSO-LSTM predictive model is
constructed. The initialization parameters for the PSO-LSTM model are shown in Table 3.
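Steps 1–5 above can be sketched as a generic PSO loop. The sketch below is a minimal illustration, not the authors' implementation: the fitness function here is a placeholder quadratic, whereas in the paper each particle is a five-dimensional vector of LSTM hyperparameters and the fitness is the model's validation-set MSE; the particle count, iteration count, and inertia/acceleration coefficients are likewise illustrative assumptions.

```python
import numpy as np

def pso_optimize(fitness, bounds, n_particles=10, num_iters=30,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `fitness` over a box `bounds` (dim x 2) with basic PSO."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    # Step 1: initialize positions X and velocities V of N particles.
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, dim))
    V = np.zeros((n_particles, dim))
    # Step 2: evaluate initial fitness; set personal and global bests.
    pbest, pbest_val = X.copy(), np.array([fitness(x) for x in X])
    g = pbest[np.argmin(pbest_val)].copy()
    g_val = pbest_val.min()
    for _ in range(num_iters):
        # Step 3: velocity and position updates (the role of Equations (9)-(10)).
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, bounds[:, 0], bounds[:, 1])
        # Step 4: update personal and global best positions.
        vals = np.array([fitness(x) for x in X])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = X[improved], vals[improved]
        if vals.min() < g_val:
            g, g_val = X[np.argmin(vals)].copy(), vals.min()
    # Step 5: termination reached; return the global best position.
    return g, g_val

# Placeholder fitness: a convex quadratic with its minimum at (3, 3).
best, best_val = pso_optimize(lambda x: np.sum((x - 3.0) ** 2),
                              bounds=[(0, 10), (0, 10)])
```

In the actual pipeline, the lambda would be replaced by a function that builds an LSTM from the particle's hyperparameters, trains it, and returns the validation MSE.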
The experiments were implemented in an integrated development environment designed specifically for Python programming, which offers code editing, debugging, testing, and version control, as well as powerful code autocompletion, code inspection, and one-click code navigation.
The specific experimental environment is shown in Table 4.
1. MAE is the average of the absolute errors:

$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left| y_i - \hat{y}_i \right| \tag{11}$$
where n is the number of samples, yi represents the actual values, and ŷi denotes the
predicted values.
2. MSE is the average of the squared errors:
$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left( y_i - \hat{y}_i \right)^2 \tag{12}$$
where n is the number of samples, yi represents the actual values, and ŷi denotes the
predicted values.
3. RMSE is the square root of the Mean Squared Error (MSE):
$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left( y_i - \hat{y}_i \right)^2} \tag{13}$$
where n is the number of samples, yi represents the actual values, and ŷi denotes the
predicted values.
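Equations (11)–(13) translate directly into code. The following NumPy version is a minimal illustration (not the authors' implementation); the sample arrays are made up for demonstration.

```python
import numpy as np

def mae(y, y_hat):
    """Equation (11): mean absolute error."""
    return np.mean(np.abs(y - y_hat))

def mse(y, y_hat):
    """Equation (12): mean squared error."""
    return np.mean((y - y_hat) ** 2)

def rmse(y, y_hat):
    """Equation (13): root mean squared error."""
    return np.sqrt(mse(y, y_hat))

y = np.array([10.0, 12.0, 14.0])      # actual load values (illustrative)
y_hat = np.array([11.0, 11.0, 16.0])  # predicted load values (illustrative)
print(mae(y, y_hat), mse(y, y_hat), rmse(y, y_hat))  # 1.333..., 2.0, 1.414...
```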
During the experiment, we independently trained and tested three different time-series
forecasting models—GRU, LSTM, and PSO-LSTM—on the same dataset. This process was
crucial for capturing the models’ performance under different random initialization condi-
tions, allowing for an accurate assessment of their stability [26]. Specifically, the training
process for the GRU and LSTM models involved data normalization, dataset creation, net-
work structure definition, and training loops. In contrast, the LSTM model optimized using
PSO incorporated an additional phase, where the Particle Swarm Optimization algorithm
was applied to fine-tune the parameters.
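The preprocessing steps mentioned above (data normalization and dataset creation) can be sketched as follows. This is a generic illustration under stated assumptions: the paper does not specify the scaling method or window length here, so min–max scaling and a 24-step look-back window are hypothetical choices, and the toy series stands in for the real charging-load data.

```python
import numpy as np

def min_max_scale(series):
    """Scale a 1-D load series to [0, 1] (one common normalization choice)."""
    lo, hi = series.min(), series.max()
    return (series - lo) / (hi - lo), (lo, hi)

def make_windows(series, window=24):
    """Build supervised (X, y) pairs: `window` past steps -> next value."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

load = np.sin(np.linspace(0, 20, 300)) + 1.5  # toy charging-load series
scaled, (lo, hi) = min_max_scale(load)
X, y = make_windows(scaled, window=24)        # X: (276, 24), y: (276,)
```

The resulting (X, y) pairs would then be split into training and validation sets and fed to the recurrent models.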
This study conducted three independent experiments using charging load data from
different seasons, namely, winter, winter–spring transition, and spring, for model training
and testing. Figures 8–10 present comparison curves between the real load values and the
predicted results from the different models (GRU, LSTM, PSO-LSTM) for each season. In the figures, the black solid line represents the real load values, the green dashed line represents the predicted results from the GRU model, the blue dashed line represents the predicted results from the original LSTM model, and the red dotted line represents the predicted results from the PSO-optimized LSTM model.

World Electr. Veh. J. 2025, 16, 150

Figure 8. Winter charging load prediction comparison (1 January 2023–3 February 2023).
The model output data were organized and summarized for evaluation metric analysis,
with MAE, MSE, and RMSE calculated for each model. MAE provides a straightforward
measure of the average absolute deviation between the model’s predicted values and
the true values. A smaller MAE indicates that the model’s predictions have a smaller
average error, meaning the predictions are closer to the actual values. MSE averages the
squared prediction errors, amplifying the effect of larger errors and offering a more sensitive
reflection of how well the model handles outliers. A smaller MSE suggests better overall
prediction accuracy. RMSE, the square root of MSE, has the same units as the original data,
making it easier to compare with the actual data and providing a more accurate measure of
the average deviation between predicted and true values.
Figure 9. Winter–spring transition charging load prediction comparison (4 February 2023–4 March 2023).

Table 6 summarizes all the evaluation metrics, helping us gain insights from multiple dimensions into the strengths and weaknesses of each model under different seasonal conditions. This provides a solid and reliable foundation for further model optimization and the development of charging load forecasting strategies.

Table 6. Comparison of model results.

Model      Season                      MAE     MSE      RMSE
GRU        Winter                      4.335   33.171   5.760
           Winter–spring transition    4.365   33.382   5.778
           Spring                      4.219   31.046   5.572
LSTM       Winter                      4.170   29.701   5.450
           Winter–spring transition    4.050   30.625   5.534
           Spring                      4.019   31.292   5.594
PSO-LSTM   Winter                      3.896   28.717   5.359
           Winter–spring transition    3.806   29.012   5.386
           Spring                      3.910   29.796   5.458
As shown in Table 6, the original LSTM model generally outperforms the GRU model in terms of prediction accuracy but still falls short compared to PSO-LSTM in most cases,
suggesting that although the LSTM model has some capacity for handling sequential data,
it has limitations when facing complex seasonal charging load patterns. The PSO-LSTM
model, however, maintains a low error level across all three seasons (winter, winter–
spring transition, and spring), reflecting excellent generalization ability and adaptability to
different seasonal data characteristics. Furthermore, it outperforms both GRU and original
LSTM models in key evaluation metrics—MAE, MSE, and RMSE—demonstrating that
the LSTM model optimized by the PSO algorithm offers superior performance and higher
stability in load forecasting.
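The relative improvements reported in the text follow directly from the winter row of Table 6; a quick arithmetic check:

```python
# Winter metrics from Table 6: (MAE, MSE, RMSE) per model.
gru = (4.335, 33.171, 5.760)
lstm = (4.170, 29.701, 5.450)
pso_lstm = (3.896, 28.717, 5.359)

def reduction(base, new):
    """Relative error reduction of `new` versus `base`, in percent."""
    return round(100 * (base - new) / base, 2)

vs_gru = [reduction(b, n) for b, n in zip(gru, pso_lstm)]
vs_lstm = [reduction(b, n) for b, n in zip(lstm, pso_lstm)]
print(vs_gru)   # [10.13, 13.43, 6.96] versus GRU
print(vs_lstm)  # [6.57, 3.31, 1.67] versus the original LSTM
```

These match the percentage reductions cited for the winter scenario in the Conclusions.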
In conclusion, the LSTM model’s powerful sequence modeling capability gives it an
advantage in time-series forecasting. When combined with PSO algorithm-based parameter
optimization, the model’s performance is further enhanced, improving both prediction
accuracy and computational efficiency. These findings not only validate the effectiveness
of the LSTM model and PSO algorithm in time-series forecasting but also provide new
directions for future research [27]. This involves examining the integration of various opti-
mization algorithms with deep learning models and exploring their potential applications
across a broader range of fields.
4. Conclusions
This study proposes a PSO-LSTM network model to enhance the accuracy and ro-
bustness of electric vehicle charging load forecasting. The LSTM model is trained using
historical load data, and the PSO algorithm dynamically optimizes its key hyperparame-
ters, such as the number of hidden layer neurons, dropout rate, batch size, and number
of training epochs. Experimental results show that the PSO-LSTM model significantly
outperforms traditional methods across different seasonal scenarios. For example, in winter,
its MAE (3.896), MSE (28.717), and RMSE (5.359) are reduced by 10.13%, 13.43%, and 6.96%
compared to the GRU model and by 6.57%, 3.31%, and 1.67% compared to the original
LSTM model, validating the effectiveness of the PSO algorithm in parameter tuning. Addi-
tionally, the model maintains stable prediction performance in the winter–spring transition
period and spring, demonstrating its ability to effectively capture seasonal load variation
patterns and reduce the impact of climate and user behavior differences on forecast re-
sults, providing a highly adaptable solution for power system scheduling and charging
facility planning.
Future research can expand in the following directions: First, by integrating me-
teorological data (e.g., temperature, humidity, wind speed) and traffic information, a
multi-variable input model could be constructed to more comprehensively characterize the
coupling relationship between charging behaviors and external environmental factors [28].
Second, a hybrid model architecture combining BiLSTM and a CNN could be developed,
leveraging the bidirectional temporal modeling capability of BiLSTM and the spatial feature
extraction advantages of CNN to further enhance the model’s ability to analyze complex
spatiotemporal patterns [29].
Author Contributions: Study conception and design: L.Z.; data collection: X.Y. and X.H.; analysis and
interpretation of results: X.H. and L.Z.; draft manuscript preparation: L.Z. and X.Y.; draft manuscript
editing and reviewing: X.H. and L.Z. All authors have read and agreed to the published version of
the manuscript.
Funding: This research was funded by the National Natural Science Foundation of China, grant number 61773243, and the Science and Technology Project of Chongqing Municipal Education Commission, grant number KJZD-K202101702.
Data Availability Statement: The raw data supporting the conclusions of this article will be made
available by the authors upon reasonable request.
Conflicts of Interest: The authors declare that they have no conflicts of interest.
References
1. Lee, Z.J. Large-Scale Adaptive Electric Vehicle Charging. In Proceedings of the 2018 IEEE Global Conference on Signal and
Information Processing (GlobalSIP), Anaheim, CA, USA, 26–28 November 2018; pp. 863–864. [CrossRef]
2. Wei, J.Z.; Ma, Z.P. Monte-Carlo-algorithm-based Load Prediction of Electric Vehicles Large-scale Charging. Electr. Eng. 2024,
3, 49–53. [CrossRef]
3. Shi, W.Q.; Wu, K.Y.; Wang, D.X. Electric Power System Short-Term Load Forecasting Model Based on Time Series Analysis and
Kalman Filter Algorithm. Control Theory Appl. 2018, 37, 9–12+23.
4. Zhang, X.; Li, L. Seasonal Electric Vehicle Charging Load Prediction Based on Random Forest. Softw. Eng. 2024, 27, 11–14+37.
5. Song, M.S.; Li, Z.W.; Song, S. Research on the Optimization Strategy of Electric Vehicle Orderly Charge and Discharge in Intelligent
Community. Tech. Autom. Appl. 2022, 41, 17–22+27. [CrossRef]
6. Geng, P.; Yang, H.J.; Shi, Z.X. Electric Vehicle Forecasting Charging Demand Based on Spatiotemporal Graph Convolutional
Networks. J. Transp. Eng. 2024, 24, 37–45.
7. Pei, Z.; Zhang, Z.; Chen, J. KAN-CNN: A Novel Framework for Electric Vehicle Load Forecasting with Enhanced Engineering
Applicability and Simplified Neural Network Tuning. Electronics 2025, 14, 414. [CrossRef]
8. Liu, Y.X.; Gao, H. Load Prediction Method of Charging Station Based on SSA-VMD-BiLSTM Model. Guangdong Electr. Power 2024,
37, 53–61.
9. Lin, X.; Zhang, H.; Ma, Y.L. Electric vehicle charging load prediction based on improved LSTM neural network. Mod. Electron.
Tech. 2024, 47, 97–101.
10. Huang, Y.X.; Xiao, S.W. Forecasting of electric vehicle charging load in highway service areas considering meteorological factors.
Appl. Energy 2025, 383, 125337.
11. Yin, W.; Ji, J. Research on EV charging load forecasting and orderly charging scheduling based on model fusion. Energy 2024,
290, 130126. [CrossRef]
12. Ma, S.; Ning, J.; Mao, N.; Liu, J.; Shi, R. Research on Machine Learning-Based Method for Predicting Industrial Park Electric
Vehicle Charging Load. Sustainability 2024, 16, 7258. [CrossRef]
13. Ge, Q.; Guo, C.; Jiang, H. Industrial power load forecasting method based on reinforcement learning and PSO-LSSVM. IEEE
Trans. Cybern. 2020, 52, 1112–1124. [CrossRef] [PubMed]
14. Jin, Y.; Guo, H.; Wang, J. A hybrid system based on LSTM for short-term power load forecasting. Energies 2020, 13, 6241. [CrossRef]
15. Saoud, A.; Recioui, A. Load Energy Forecasting based on a Hybrid PSO LSTM-AE Model. Alger. J. Environ. Sci. Technol. 2023,
9, 2938–2946.
16. Liu, X.; Ma, Z.; Guo, H. Short-term power load forecasting based on DE-IHHO optimized BiLSTM. IEEE Access 2024, 12, 145341–145349.
[CrossRef]
17. Lai, Y.; Wang, Q.; Chen, G. Short-term Power Load Prediction Method based on VMD and EDE-BiLSTM. IEEE Access 2024,
13, 10481–10488. [CrossRef]
18. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural
Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948.
19. Liu, Z.; Chen, X.; Liang, X.; Huang, S.; Zhao, Y. Research on Sustainable Form Design of NEV Vehicle Based on Particle Swarm
Algorithm Optimized Support Vector Regression. Sustainability 2024, 16, 7812. [CrossRef]
20. Dai, X.; Sheng, K.; Shu, F. Ship power load forecasting based on PSO-SVM. Math. Biosci. Eng. 2022, 19, 4547–4567. [CrossRef]
21. Geng, G.; He, Y.; Zhang, J. Short-term power load forecasting based on PSO-optimized VMD-TCN-attention mechanism. Energies
2023, 16, 4616. [CrossRef]
22. Fan, W.; Hu, Z.; Veerasamy, V. PSO-based model predictive control for load frequency regulation with wind turbines. Energies
2022, 15, 8219. [CrossRef]
23. Jain, M.; Saihjpal, V.; Singh, N. An overview of variants and advancements of PSO algorithm. Appl. Sci. 2022, 12, 8392. [CrossRef]
24. Kim, H.J.; Kim, M.K. Spatial-Temporal Graph Convolutional-Based Recurrent Network for Electric Vehicle Charging Stations
Demand Forecasting in Energy Market. IEEE Trans. Smart Grid 2024, 15, 3979–3993. [CrossRef]
25. Güven, A.F. Integrating electric vehicles into hybrid microgrids: A stochastic approach to future-ready renewable energy solutions
and management. Energy 2024, 303, 131968. [CrossRef]
26. Ding, L.; Ke, S.; Zhang, F. Forecasting of electric-vehicle charging load considering travel demand and guidance strategy. Electr.
Power Constr. 2024, 45, 10–26.
27. Zhang, Q.; Lu, J.; Kuang, W.; Wu, L.; Wang, Z. Short-Term Charging Load Prediction of Electric Vehicles with Dynamic Traffic
Information Based on a Support Vector Machine. World Electr. Veh. J. 2024, 15, 189. [CrossRef]
28. Qian, Y.; Kong, Y.; Huang, C. Review of Power Load Forecasting. Sichuan Electr. Power Technol. 2023, 46, 37–43+58. [CrossRef]
29. Zhang, X.W.; Liang, J.; Wang, Y.G.; Han, J. Overview of Research on Spatiotemporal Distribution Prediction of Electric Vehicle
Charging. Electr. Power Constr. 2023, 44, 161–173.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.