Article

An Improved Sparrow Search Algorithm for the Optimization of Variational Modal Decomposition Parameters

1 Key Laboratory of CNC Equipment Reliability, Ministry of Education, Jilin University, Changchun 130025, China
2 School of Mechanical and Aerospace Engineering, Jilin University, Changchun 130025, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(5), 2174; https://doi.org/10.3390/app14052174
Submission received: 19 January 2024 / Revised: 20 February 2024 / Accepted: 23 February 2024 / Published: 5 March 2024

Abstract
Variational mode decomposition (VMD) is frequently employed for signal decomposition and feature extraction; however, the decomposition outcome is influenced by the number of intrinsic mode functions (IMFs) and the value of the penalty factor. To tackle this issue, we propose a sparrow search algorithm based on the Halton sequence and the Laplace crossover operator (HLSSA) to fine-tune the parameters of VMD (HLSSA-VMD). First, population initialization with the Halton sequence yields higher-quality initial solutions, which addresses the algorithm's sluggish convergence caused by overlapping and insufficiently diverse initial solutions. Second, the Laplace crossover operator (LX) perturbs the position of the best individual in each iteration, which helps prevent the algorithm from becoming ensnared in a local optimum and improves its convergence speed. Finally, in simulations on 17 benchmark test functions, the HLSSA exhibited superior convergence accuracy, a faster convergence speed, and better robustness than the particle swarm optimization (PSO) algorithm, the whale optimization algorithm (WOA), the multiverse optimization (MVO) algorithm, and the traditional sparrow search algorithm (SSA). In addition, we verified the effectiveness of the HLSSA-VMD algorithm on two simulated signals and compared it with PSO-VMD, WOA-VMD, MVO-VMD, and SSA-VMD. The experimental findings indicate that HLSSA-VMD obtains better parameters, confirming the superiority of the algorithm.

1. Introduction

As science and technology advance swiftly, and with the continuous evolution of social needs, optimization algorithms are becoming increasingly crucial in solving complex problems in various fields. From mechanical design and manufacturing to artificial intelligence, the rapid progress in these fields not only promotes the development of industry but also triggers the demand for optimization algorithms to be more efficient and innovative [1].
Within signal decomposition, numerous techniques have been developed for extracting fault features. Common algorithms include the wavelet transform (WT) [2], empirical mode decomposition (EMD) [3], the empirical wavelet transform (EWT) [4], and local mean decomposition (LMD) [5], among others. Although these algorithms are widely applied and play a positive role in many fields, they also exhibit certain issues that impact the effectiveness of signal decomposition. When using the WT for signal processing, the fixed choice of mother wavelet leads to a lack of flexibility and reduced applicability when dealing with different types of signals. EMD suffers from severe mode mixing and endpoint effects when decomposing signals. Although LMD efficiently mitigates the over- and under-envelope phenomena of EMD decomposition, it still exhibits endpoint effects. The limited adaptability and robustness of the EWT restrict its capability for signal decomposition.
In 2014, Dragomiretskiy and Zosso proposed VMD [6], a method with both a strong theoretical basis and notable robustness in extracting fault features in complex environments, such as strong noise. This algorithm has found application across diverse fields and has shown excellent performance [7,8,9]. However, the VMD algorithm itself has limitations, and the effectiveness of its decomposition may be affected by both the number of intrinsic mode functions (IMFs), denoted as K, and the penalty factor α. With inappropriate parameter settings, issues such as mode mixing may arise when employing the VMD algorithm in signal processing. To address these shortcomings, many researchers employ swarm intelligence optimization algorithms for parameter tuning. Li et al. [10] utilized a genetic algorithm for the adaptive search of optimal VMD parameters, while Liu et al. [11] utilized the grey wolf algorithm to acquire the optimal parameters for VMD. Other swarm intelligence algorithms used for optimizing VMD parameters include the particle swarm optimization (PSO) algorithm [12], the whale optimization algorithm (WOA) [13], the multiverse optimization (MVO) algorithm [14], the sailfish optimization (SFO) algorithm [15], and the Archimedes optimization algorithm (AOA) [16].
In 2020, Xue et al. [17] introduced the SSA, known for its computational efficiency, simplicity of implementation, and ease of scalability. According to Xue et al.'s analysis, the SSA demonstrates superior performance compared to several established algorithms, such as the PSO, the grey wolf optimizer (GWO), and the gravitational search algorithm (GSA), regarding convergence accuracy, convergence speed, and robustness. Nevertheless, akin to other swarm intelligence algorithms, the SSA may suffer from low initial solution quality, and in solving complex optimization problems, it may experience a decrease in population diversity and become trapped in local optima toward the later stages of the solution process. Researchers have addressed these issues and achieved positive results through various studies. Gao et al. [18] proposed a multistrategy enhanced evolutionary sparrow search algorithm (ESSA), integrating tent map chaos during the initialization phase to expedite convergence and improve convergence accuracy. Geng et al. [19] introduced chaotic back-propagation learning and dynamic weights to prevent the SSA from being entrapped in local optima. Wu et al. [20] combined Levy flights and nonlinear inertia weight to present an improved SSA. Xiong et al. [21] introduced a fractional-order chaotic improved SSA, demonstrating higher convergence accuracy than the SSA. Sun et al. [22] implemented Cat chaotic mapping to initialize a population and explored the initial population's randomness and searchability. Han et al. [23] used sin chaotic mapping to initialize the sparrow population, aiming to improve the quality of initial solutions. Liu et al. [24] introduced circular chaotic mapping into the SSA, enhancing the algorithm's capacity for global exploration during population initialization. They additionally introduced the t-distribution in the formula for updating the positions of sparrows across various iteration cycles, aiding the algorithm in avoiding local optima.
The current research places significant emphasis on the population initialization process in the SSA since the population’s quality distribution significantly influences the efficacy of the search for the global optimum. In the early stages of the algorithm, a well-designed population initialization process can effectively improve the algorithm’s capability for global exploration. Chaos mapping is a frequently used approach in the existing literature on initialization, as mentioned above. However, chaotic systems are highly sensitive to starting conditions and display unpredictable randomness. Additionally, chaotic mapping algorithms involve complex mathematical operations, leading to increased computational costs.
In this paper, we propose the Halton sequence for population initialization. In comparison to the aforementioned methods, the Halton sequence ensures a more uniformly distributed initial population across the entire space and lacks random components, making it more robust. Moreover, its lower sensitivity to initial conditions and parameters enhances its reliability in algorithm applications. The crossover operator plays a crucial role in optimization algorithms by promoting population diversity and exploring the search space. The Laplace crossover operator (LX) is a particularly unique crossover operator that has been applied in various fields [25,26]. Deep et al. [27] first introduced the LX in 2007 and applied it to genetic algorithms (GAs). Experimental results demonstrated that the LX-GA outperformed other types of GAs. Consequently, in many studies related to GAs, researchers tend to utilize the LX for performing crossover operations [28,29]. The LX is suitable for optimization problems involving continuous parameters and generates random numbers based on the Laplace distribution, exhibiting good randomness and perturbation properties. Additionally, the basic operation of the LX is based on the Laplace distribution, requiring minimal parameter tuning. Considering the numerous advantages of the LX, in this study, we utilized the LX to disturb the position of the best individual at each iteration with the goal of improving the algorithm’s ability for local exploration and preventing the population from becoming ensnared in local optima. The primary contributions of this paper are outlined as follows:
(1) In the sparrow initialization stage, the Halton sequence is used to generate the initial solutions, which effectively improves their quality and thereby enhances the algorithm's robustness. This initialization strategy not only distributes the initial solutions more evenly across the entire solution space but also reduces the algorithm's sensitivity to initial conditions.
(2) The Laplace crossover operator is introduced to perturb the position of the best individual in each iteration, which mitigates the risk of the algorithm becoming ensnared in local optima while significantly improving the convergence speed. Incorporating this local search strategy during the optimization process effectively expands the algorithm's exploration of the solution space.
(3) We extensively validated the effectiveness and performance of the algorithm on 17 benchmark functions. Moreover, the enhanced algorithm was employed to optimize the parameters of the VMD algorithm, ultimately achieving satisfactory results and highlighting its practical value.
The remainder of the paper is organized as follows: Section 2 outlines the pertinent theoretical approaches contributing to our understanding. Section 3 provides a comprehensive description of the method proposed in this paper. Section 4 validates the proposed methodology. Finally, Section 5 presents the conclusions of this study, along with suggestions for future work.

2. The Principles of SSA

In the SSA, a population consisting of n sparrows can be represented in the following form:
$$X = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,d} \\ x_{2,1} & x_{2,2} & \cdots & x_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n,1} & x_{n,2} & \cdots & x_{n,d} \end{bmatrix} \tag{1}$$
where d represents the dimensionality of the problem to be solved. The fitness values of all sparrows can be represented in the following form:
$$F_X = \begin{bmatrix} f([x_{1,1}\; x_{1,2}\; \cdots\; x_{1,d}]) \\ f([x_{2,1}\; x_{2,2}\; \cdots\; x_{2,d}]) \\ \vdots \\ f([x_{n,1}\; x_{n,2}\; \cdots\; x_{n,d}]) \end{bmatrix} \tag{2}$$
where f represents an individual's fitness value, and $F_X$ represents the fitness values of all sparrows.
The primary task of the founders in the sparrow population is to search for food in the environment, providing the entire population with the location and direction of the discovered food. Since founders are more likely to find food, their fitness values are superior, and their positions in the solution space are close to the location of the optimal solution. During each iteration of the search process, the founder updates its position while searching for food, as expressed by the following equation:
$$X_{i,j}^{t+1} = \begin{cases} X_{i,j}^{t} \cdot \exp\left( \dfrac{-i}{\alpha \cdot iter_{\max}} \right) & \text{if } R_2 < ST \\ X_{i,j}^{t} + Q \cdot L & \text{if } R_2 \ge ST \end{cases} \tag{3}$$
where t denotes the current iteration number; j = 1, 2, 3, ..., d; $iter_{\max}$ represents the maximum number of iterations; $X_{i,j}$ represents the position of the i-th sparrow along the j-th dimension; and $\alpha \in (0, 1]$ is a uniform random number. $R_2$ ($R_2 \in [0, 1]$) and $ST$ ($ST \in [0.5, 1]$) denote the warning value and the safety value, respectively. Q is a randomly generated number following a normal distribution, and L represents a $1 \times d$ matrix in which every element is equal to 1.
$R_2 < ST$ indicates that, in the foraging environment, there are no predators present, allowing the founder to explore freely and extensively. However, $R_2 \ge ST$ indicates that certain sparrows within the population have detected predators and signaled warnings to other members. Consequently, all sparrows must promptly relocate to alternative safe locations for foraging.
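The founder update above can be sketched in NumPy as follows. This is a minimal illustration of the two cases, not the authors' implementation; the per-iteration draw of $R_2$ and the choice of default ST are assumptions consistent with the description above.

```python
import numpy as np

def update_founders(X, iter_max, ST=0.8, rng=np.random.default_rng()):
    """One founder update step: exponential decay toward the origin when
    R2 < ST, otherwise a shared normally distributed jump Q per sparrow."""
    n, d = X.shape
    R2 = rng.random()  # warning value, drawn once per iteration
    for i in range(n):
        if R2 < ST:
            alpha = rng.uniform(1e-12, 1.0)  # uniform in (0, 1]; offset avoids /0
            X[i] = X[i] * np.exp(-(i + 1) / (alpha * iter_max))
        else:
            Q = rng.standard_normal()        # N(0, 1) scalar
            X[i] = X[i] + Q * np.ones(d)     # L is a 1-by-d all-ones matrix
    return X
```

In practice, the caller clips the updated positions back into the search bounds before evaluating fitness.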
While foraging, certain scroungers consistently observe the founder. Upon realizing that the founder has discovered superior food, they promptly abandon their current position to vie for the food. The position update process for scroungers is outlined as follows:
$$X_{i,j}^{t+1} = \begin{cases} Q \cdot \exp\left( \dfrac{X_{worst} - X_{i,j}^{t}}{i^2} \right) & \text{if } i > n/2 \\ X_P^{t+1} + \left| X_{i,j}^{t} - X_P^{t+1} \right| \cdot A^{+} \cdot L & \text{otherwise} \end{cases} \tag{4}$$
where $X_P$ represents the current optimal position held by the founder; $X_{worst}$ represents the current globally worst position; A is a $1 \times d$ matrix whose elements are randomly assigned values of either 1 or −1; and $A^{+} = A^{T}(AA^{T})^{-1}$. When $i > n/2$, the i-th scrounger with a lower fitness value has not yet secured food and is in a highly hungry state. Consequently, it must relocate to another area to forage and replenish its energy.
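The scrounger update above can be sketched as follows (a hedged illustration, assuming the founder's best position and the global worst position are supplied by the caller):

```python
import numpy as np

def update_scroungers(X, X_p, X_worst, rng=np.random.default_rng()):
    """One scrounger update step: hungry sparrows (i > n/2) jump near the
    worst position; the rest move toward the founder's best position X_p."""
    n, d = X.shape
    A = rng.choice([-1.0, 1.0], size=(1, d))   # random +/-1 row vector
    A_plus = A.T @ np.linalg.inv(A @ A.T)      # A+ = A^T (A A^T)^-1
    L = np.ones((1, d))
    for i in range(n):
        if i + 1 > n / 2:
            Q = rng.standard_normal()
            X[i] = Q * np.exp((X_worst - X[i]) / (i + 1) ** 2)
        else:
            X[i] = X_p + (np.abs(X[i] - X_p) @ A_plus) @ L
    return X
```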
In the simulation experiments, the proportion of sparrows exhibiting danger perception was determined to range between 10% and 20% of the total population. Their initial positions are randomly distributed throughout the entire population, and their mathematical representation is as follows:
$$X_{i,j}^{t+1} = \begin{cases} X_{best}^{t} + \beta \cdot \left| X_{i,j}^{t} - X_{best}^{t} \right| & \text{if } f_i > f_g \\ X_{i,j}^{t} + K \cdot \left( \dfrac{\left| X_{i,j}^{t} - X_{worst}^{t} \right|}{(f_i - f_w) + \varepsilon} \right) & \text{if } f_i = f_g \end{cases} \tag{5}$$
where $X_{best}$ represents the current global optimal position; β is the step-size control parameter, a randomly generated number following a normal distribution with a mean of 0 and a variance of 1; $K \in [-1, 1]$ is a uniform random number; $f_i$ is the fitness value of the current sparrow; $f_g$ and $f_w$ represent the current global best and worst fitness values, respectively; and ε is a small constant that prevents division by zero. When $f_i > f_g$, the sparrow is at the population boundary and susceptible to predator attacks, so it moves toward $X_{best}$, the safest position. When $f_i = f_g$, sparrows in the center of the population have detected danger, so they need to converge with other sparrows to minimize the threat of predation.
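The danger-aware update above can be sketched as follows (again an illustrative rendering of the two cases, with the fitness vector and reference positions supplied by the caller):

```python
import numpy as np

def update_scouts(X, fit, X_best, X_worst, f_g, f_w,
                  eps=1e-10, rng=np.random.default_rng()):
    """Danger-aware update: boundary sparrows (f_i > f_g) jump toward the
    best position; central sparrows (f_i == f_g) step away from the worst."""
    for i in range(X.shape[0]):
        if fit[i] > f_g:
            beta = rng.standard_normal()             # N(0, 1) step size
            X[i] = X_best + beta * np.abs(X[i] - X_best)
        elif fit[i] == f_g:
            K = rng.uniform(-1.0, 1.0)
            X[i] = X[i] + K * (np.abs(X[i] - X_worst) / ((fit[i] - f_w) + eps))
    return X
```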

3. Improved SSA Based on Halton Sequence and LX

3.1. Population Initialization Based on the Halton Sequence

The Halton sequence [30,31,32] is a common multidimensional low-discrepancy sequence, with a discrepancy of optimal order $N^{-1}(\log N)^{d}$, where d is the dimension. As shown in Figure 1, the solutions generated with the Halton sequence are more uniformly distributed in space than those generated using random initialization.
The definition of the Halton sequence is based on the inverse radical function, and its defining function is expressed as follows:
$$\phi_p(n) = \frac{b_0}{p} + \frac{b_1}{p^2} + \cdots + \frac{b_m}{p^{m+1}} \tag{6}$$
where p is a prime number, and $b_k$ ($0 \le k \le m$) is the k-th digit in the base-p expansion of n:
$$n = b_0 + b_1 p + \cdots + b_m p^m \tag{7}$$
The Halton sequence in d dimensions can be represented with the following formula:
$$HaltonSet(n) = \left( \phi_{p_1}(n), \phi_{p_2}(n), \ldots, \phi_{p_d}(n) \right) \tag{8}$$
where $p_1, p_2, \ldots, p_d$ are distinct primes.
The initial positions for the population can be generated using the following formula:
$$x_{i,j} = lb_j + HaltonSet_j(n) \times (ub_j - lb_j) \tag{9}$$
where $lb_j$ and $ub_j$ denote the lower and upper limits of the positions of the sparrow population in the j-th dimension ($1 \le i \le n$, $1 \le j \le d$).
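The definitions above can be implemented directly. The sketch below assumes the first d primes as the bases, which is the conventional choice:

```python
import numpy as np

PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]  # bases for up to 10 dimensions

def radical_inverse(n, p):
    """phi_p(n): mirror the base-p digits of n about the radix point."""
    phi, inv_p = 0.0, 1.0 / p
    while n > 0:
        phi += (n % p) * inv_p
        n //= p
        inv_p /= p
    return phi

def halton_init(pop_size, dim, lb, ub):
    """Initial sparrow positions x_{i,j} = lb_j + Halton_{i,j} * (ub_j - lb_j)."""
    H = np.array([[radical_inverse(i + 1, PRIMES[j]) for j in range(dim)]
                  for i in range(pop_size)])
    return lb + H * (ub - lb)
```

For instance, the base-2 subsequence begins 1/2, 1/4, 3/4, 1/8, ..., filling the unit interval evenly rather than randomly.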

3.2. Optimal Position Perturbation Based on LX

The LX generates a pair of offspring, $y_1 = (y_1^1, y_1^2, \ldots, y_1^a)$ and $y_2 = (y_2^1, y_2^2, \ldots, y_2^a)$, from a pair of parents, $x_1 = (x_1^1, x_1^2, \ldots, x_1^a)$ and $x_2 = (x_2^1, x_2^2, \ldots, x_2^a)$. The offspring produced by the LX are symmetrically positioned relative to the parents. LX random numbers are determined according to the following rule [27]:
$$l_s = \begin{cases} p - q \log_e(u_s) & v_s \le 1/2 \\ p + q \log_e(u_s) & v_s > 1/2 \end{cases} \tag{10}$$
where $u_s$ and $v_s$ are two random numbers uniformly distributed within the range [0, 1]; $p \in \mathbb{R}$ is the location parameter, which controls the distribution of the offspring positions in the search space; and $q > 0$ is the scale parameter. A smaller q results in offspring positions closer to the parents, while a larger q places the offspring farther from the parents. In this study, q was set to 1 and p to 0. The offspring are generated as follows:
$$y_1^s = x_1^s + l_s \left| x_1^s - x_2^s \right|, \qquad y_2^s = x_2^s + l_s \left| x_1^s - x_2^s \right| \tag{11}$$
If an offspring component falls outside the search space, i.e., $y^j < lb_j$ or $y^j > ub_j$, then $y^j$ is reset to a random number from the interval $[lb_j, ub_j]$.
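The crossover and bounds handling above can be sketched as follows (a minimal NumPy rendering; the small offset on the uniform draw, which avoids log(0), is an implementation choice):

```python
import numpy as np

def laplace_crossover(x1, x2, lb, ub, p=0.0, q=1.0,
                      rng=np.random.default_rng()):
    """Laplace crossover: offspring lie symmetrically about the parents;
    out-of-range components are resampled uniformly within [lb, ub]."""
    d = x1.size
    u = rng.uniform(1e-12, 1.0, d)  # offset avoids log(0)
    v = rng.random(d)
    l = np.where(v <= 0.5, p - q * np.log(u), p + q * np.log(u))
    diff = np.abs(x1 - x2)
    y1, y2 = x1 + l * diff, x2 + l * diff
    for y in (y1, y2):
        bad = (y < lb) | (y > ub)
        y[bad] = lb[bad] + rng.random(bad.sum()) * (ub[bad] - lb[bad])
    return y1, y2
```

In the HLSSA, one of the two offspring would be generated around the current best individual, and the better of the positions before and after the perturbation is kept.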
The pseudocode of HLSSA is shown in Algorithm 1.
Algorithm 1 Pseudocode of HLSSA
1. Set t = 0.
2. Initialize the population using Equation (9).
3. Compute the fitness value of each sparrow, sort the population, and record the best position X_best and the best fitness value f_g.
4. While (t < iter_max)
   Determine the proportion of founders PD1, the proportion of scroungers PD2, and the proportion of danger-aware sparrows PD3.
   For i = 1 : PD1
      Update the founders' positions using Equation (3).
   End for
   For i = 1 : PD2
      Update the scroungers' positions using Equation (4).
   End for
   For i = 1 : PD3
      Update the danger-aware sparrows' positions using Equation (5).
   End for
   Calculate the fitness value of each sparrow and select the best one.
   Perturb the optimal position using Equation (11), compute its fitness value, and keep the better of the positions before and after the perturbation.
   t = t + 1
   End while
5. Return X_best, f_g

4. Method Validation

To assess the effectiveness of the enhanced algorithm, two sets of comparative experiments were devised in this study. In the first set of experiments, we selected seven unimodal test functions, characterized by having only one extremum, to verify the algorithm's convergence speed, optimization accuracy, and local search capability. The second set of experiments involved five multimodal test functions, characterized by having multiple local extremum points, to examine the algorithm's ability to escape local optima and its global exploration capability. To reduce the bias of single-run results, we conducted 30 independent runs for each function and recorded the optimal values, mean values, and standard deviations. Simultaneously, we compared the improved algorithm with the PSO, MVO, WOA, and SSA. Table 1 provides the detailed parameter settings of each algorithm. All experiments were executed on a computer featuring an Intel i7 processor operating at 2.30 GHz and 16 GB of RAM, utilizing the MATLAB 2023a environment. This experimental design and environment setup aimed to comprehensively and reliably evaluate algorithm performance and facilitate an objective comparison.

4.1. Unimodal Test Function Experiments

Evaluating algorithm performance using test functions with known global optimal values is a common approach in this field. Unimodal test functions are particularly useful for validating algorithm performance in local search, as these functions contain only one global optimum without other local optima. Table 2 provides the formulas, dimensions, search space, and global optimal values for unimodal test functions. The optimal value represents the global minimum value, and the search space refers to the range of x i .
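Table 2 lists the concrete functions used. As an illustration of the unimodal class (the specific suite in Table 2 is assumed, not reproduced here), the sphere function, typically labeled F1 in such suites, has a single global minimum of 0 at the origin:

```python
import numpy as np

def sphere(x):
    """Typical unimodal benchmark: f(x) = sum(x_i^2), minimum 0 at x = 0."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))
```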

4.1.1. Experimental Results

Table 3 presents the optimization outcomes for the five algorithms; the “Best” column lists the best fitness values. Figure 2, Figure 3, Figure 4, Figure 5, Figure 6, Figure 7 and Figure 8 present three-dimensional plots and convergence curves for the seven benchmark functions. In Figure 2a, Figure 3a, Figure 4a, Figure 5a, Figure 6a, Figure 7a and Figure 8a, n = 2 for the purpose of displaying the two-dimensional surface of the functions. In the results shown in Figure 2b, Figure 3b, Figure 4b, Figure 5b, Figure 6b, Figure 7b and Figure 8b, the value of n is 30.

4.1.2. Analysis of the Results

Convergence accuracy analysis: As depicted in Table 3, using the HLSSA, we successfully determined the optimal values for test functions F1 to F4. Although they did not reach the optimal values for F5 to F7, the algorithm’s performance in terms of the obtained optimal values, averages, and standard deviations was significantly better than the PSO, MVO, WOA, and SSA. This indicates that the HLSSA has stronger optimization capabilities when solving unimodal test functions.
Stability analysis: According to the standard deviation (STD) test data in Table 3, the HLSSA had STD values of 0 for F1 to F4 and the lowest STD values for F5 to F7. This suggests that the HLSSA has better stability on unimodal test functions than other algorithms.
Convergence speed analysis: As shown in Figure 2, Figure 3, Figure 4, Figure 5, Figure 6, Figure 7 and Figure 8, the HLSSA had an absolute advantage in convergence speed on test functions F1 to F7. On test functions F1 to F4 and F6, HLSSA yielded the optimal values at the beginning of the iterations, indicating that initialization based on the Halton sequence helps the algorithm obtain better initial solutions.
In summary, the proposed HLSSA exhibited stronger optimization capability, faster convergence, and greater stability on unimodal functions, making it an efficient and reliable choice for such optimization problems.

4.2. Multimodal Test Function Experiments

Multimodal test functions typically contain many local optima, making the search for the global optimum more challenging. Testing on such functions therefore allows a more comprehensive evaluation of the algorithm's exploration capability and its ability to escape local optima. Table 4 provides the formulas, dimensions, search spaces, and global optimal values for the multimodal test functions.
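Table 4 lists the actual functions. As a representative of the multimodal class (chosen here for illustration; it may or may not appear in Table 4), the Rastrigin function has a dense grid of local minima surrounding a global minimum of 0 at the origin:

```python
import numpy as np

def rastrigin(x):
    """Multimodal benchmark: f(x) = 10 n + sum(x_i^2 - 10 cos(2 pi x_i));
    its many local minima trap purely greedy searches."""
    x = np.asarray(x, dtype=float)
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))
```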

4.2.1. Experimental Results

Table 5 presents the optimization results for the five algorithms, and Figure 9, Figure 10, Figure 11, Figure 12 and Figure 13 depict the three-dimensional plots and convergence plots for the five test functions. In Figure 9a, Figure 10a, Figure 11a, Figure 12a and Figure 13a, n = 2 for the purpose of displaying the two-dimensional surface of the functions. In the results shown in Figure 9b, Figure 10b, Figure 11b, Figure 12b and Figure 13b, the value of n is 30.

4.2.2. Analysis of the Results

Convergence accuracy analysis: As shown in Table 5, the HLSSA successfully yielded the optimal values for test functions F8, F9, and F11, and its optimization performance was significantly better than the PSO, MVO, and WOA. Although the HLSSA did not yield the optimal values for F10 and F12, its performance in terms of the obtained optimal values, averages, and standard deviations was still better than other algorithms. For F9, F10, and F11, the SSA performed similarly to the HLSSA, but regarding the other two functions, the SSA was more prone to becoming entrapped in local optima. Overall, the HLSSA exhibited stronger global search capabilities when solving multimodal test functions, confirming the effectiveness of LX perturbation in helping the SSA escape local optima.
Stability analysis: According to the standard deviation (STD) data in Table 5, the HLSSA exhibited comparable performance to the SSA considering F9, F10, and F11. However, in terms of the remaining multimodal test functions, the HLSSA revealed consistently lower STD values, indicating better stability on multimodal test functions.
Convergence speed analysis: As shown in Figure 9, Figure 10, Figure 11, Figure 12 and Figure 13, the HLSSA had an absolute advantage in convergence speed on the five multimodal test functions, and the initial solutions obtained were very close to the optimal values. This suggests that the HLSSA demonstrates accelerated convergence when addressing high-dimensional complex problems.
In summary, the HLSSA exhibited enhanced convergence accuracy, accelerated convergence speed, and improved stability when dealing with high-dimensional multimodal problems. Compared to the basic SSA, the HLSSA achieved significant improvements in performance.

4.3. HLSSA-VMD Algorithm Validation

Variational mode decomposition (VMD) is one of the most commonly used methods in the field of signal decomposition, and its decomposition performance is often heavily influenced by the number of modes (K) and the quadratic penalty term (α). Improper parameter settings can lead to information loss and frequency aliasing. Researchers typically employ optimization algorithms for parameter tuning, yet different optimization algorithms yield varying degrees of effectiveness in optimization. This section presents two sets of simulated signals for validating the effectiveness of the HLSSA-VMD algorithm. A comparative analysis was conducted with the PSO-VMD, WOA-VMD, MVO-VMD, and SSA-VMD to further highlight the superior performance of the HLSSA-VMD algorithm.
The first set of simulated signals we constructed is the simulated signal representing an outer race fault in a bearing, denoted as x(t). The construction formula is as follows [33]:
$$\begin{aligned} x(t) &= x_1(t) + x_2(t) + x_3(t) + n(t) \\ x_1(t) &= A e^{-\xi \cdot 2\pi f_n t} \sin\left( 2\pi f_n \sqrt{1-\xi^2}\, t \right) \\ x_2(t) &= \sum_{i=1}^{M} l_i \sin(2\pi f_{i1} t + \theta_i) \\ x_3(t) &= \sum_{i=1}^{N} k_i \sin(2\pi f_{i2} t + \varphi_i) \end{aligned} \tag{12}$$
where $f_n$ is the natural frequency; ξ is the damping coefficient; A is the amplitude of the fault impulse; and n(t) is Gaussian white noise with an SNR of −10 dB.
The detailed information of the simulated signal is shown in Table 6; the sampling frequency is 20 kHz, and the signal contains 4096 sampling points. Figure 14a,b display the simulated bearing fault signal before and after adding Gaussian white noise, respectively.
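The construction above can be sketched as follows. The numerical values (natural frequency, damping, fault repetition rate, harmonic terms) are illustrative placeholders; the actual parameters are those in Table 6.

```python
import numpy as np

fs, N = 20_000, 4096                        # 20 kHz, 4096 points (as in the text)
t = np.arange(N) / fs
fn, xi, A, f_rep = 3000.0, 0.1, 1.0, 100.0  # hypothetical values; see Table 6

# x1: exponentially damped impulses repeating at the fault frequency f_rep
tau = t % (1.0 / f_rep)
x1 = A * np.exp(-xi * 2 * np.pi * fn * tau) \
       * np.sin(2 * np.pi * fn * np.sqrt(1 - xi ** 2) * tau)

# x2, x3: low-frequency harmonic interference (single terms of the two sums)
x2 = 0.5 * np.sin(2 * np.pi * 30 * t)
x3 = 0.3 * np.sin(2 * np.pi * 50 * t)

sig = x1 + x2 + x3

# n(t): Gaussian white noise scaled so the overall SNR is -10 dB
p_noise = np.mean(sig ** 2) / 10 ** (-10.0 / 10)
noise = np.sqrt(p_noise) * np.random.default_rng(0).standard_normal(N)
x = sig + noise
```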
We employed envelope entropy minimization as the fitness function for optimizing VMD parameters through the HLSSA. Envelope entropy reflects the sparsity traits of the original signal. When the obtained intrinsic mode functions (IMFs) after decomposition have more noise and less feature information, the envelope entropy value is larger. Conversely, when there is less noise and more feature information in the decomposed IMFs, the envelope entropy value is smaller. Currently, many studies use envelope entropy as the objective function for optimizing VMD parameters.
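A minimal NumPy sketch of the envelope entropy described above is given below; the normalization and the small log offset are implementation choices, not taken from the paper.

```python
import numpy as np

def analytic_envelope(x):
    """Hilbert envelope via the FFT-based analytic signal (NumPy only)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def envelope_entropy(imf):
    """Shannon entropy of the normalized envelope: large for noisy IMFs,
    small when the envelope is dominated by sparse fault impulses."""
    env = analytic_envelope(imf)
    p = env / env.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))
```

In HLSSA-VMD, the fitness of a candidate (K, α) pair would then be an aggregate (commonly the minimum) of the envelope entropies of the K IMFs produced by VMD.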
The search range for K was [1, 10], and the search range for α was [100, 3000]. Figure 15 illustrates the iterative optimization process for the five algorithms, and Table 7 presents the optimal solutions obtained with these algorithms.
From Table 7, it is evident that both the SSA and HLSSA demonstrated superior performance in terms of convergence accuracy. However, as shown in Figure 15, the HLSSA outperformed the SSA regarding initialization and convergence speed. This indicates that the proposed improvement to the SSA is effective, and the HLSSA demonstrates higher-quality initialization, faster convergence speed, and superior convergence accuracy.
The second set of simulated signals consisted of three signals with different frequencies that were superimposed at different time intervals. To simulate real-world signals more accurately, Gaussian white noise was added to the signal s. The construction process of signal s is as follows:
$$\begin{aligned} s_1 &= 1.5 \sin(100\pi t), \quad s_2 = 2 \sin(300\pi t), \quad s_3 = \sin(500\pi t), \quad s_4 = 2.5 \sin(700\pi t) \\ s &= s_1 + s_2 + s_3 + s_4 + s_5 \end{aligned} \tag{13}$$
where $s_5$ is a Gaussian white noise signal with a mean of 0 and a standard deviation of 1.
Figure 16a is a visual representation of each simulated signal. Signal $s_1$ had an amplitude of 1.5, a frequency of 50 Hz, and a duration of 0–1 s; signal $s_2$ had an amplitude of 2, a frequency of 150 Hz, and a duration of 0.35–0.65 s; signal $s_3$ had an amplitude of 1, a frequency of 250 Hz, and a duration of 0.15–0.5 s; and signal $s_4$ had an amplitude of 2.5, a frequency of 350 Hz, and a duration of 0.4–0.8 s. Figure 16b shows the amplitude–frequency plot obtained by applying a Fourier transform to the synthesized signal s; it confirms that s contains the frequencies of all four component signals $s_1$–$s_4$.
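The construction of s can be sketched as follows. The sampling rate is an assumption, since the text does not state one; the amplitudes and frequencies follow the construction formula and the time windows follow the description above.

```python
import numpy as np

fs = 2000                        # hypothetical sampling rate, 1 s of data
t = np.arange(0, 1, 1 / fs)

def tone(amp, freq, t0, t1):
    """A sinusoid that is active only on the interval [t0, t1]."""
    return amp * np.sin(2 * np.pi * freq * t) * ((t >= t0) & (t <= t1))

s1 = tone(1.5,  50, 0.00, 1.00)
s2 = tone(2.0, 150, 0.35, 0.65)
s3 = tone(1.0, 250, 0.15, 0.50)
s4 = tone(2.5, 350, 0.40, 0.80)
s5 = np.random.default_rng(0).standard_normal(t.size)  # white noise, mean 0, std 1
s = s1 + s2 + s3 + s4 + s5
```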
The optimal parameters K and α for the VMD algorithm in decomposing signal s were obtained using the HLSSA. The search ranges for K and α were set to [1, 10] and [100, 6000], respectively. Figure 17 illustrates the iterative optimization process for the five algorithms, and Table 8 presents the optimal solutions obtained with each algorithm.
As shown in Figure 17 and Table 8, the MVO and PSO exhibited relatively poorer convergence accuracy, while the WOA, SSA, and HLSSA yielded the same optimization results. However, the HLSSA demonstrated a faster convergence speed, and it had the lowest initial fitness value, indicating that the HLSSA provides a higher quality initial solution.
In Figure 18a, the decomposition results of MVO-VMD are displayed, revealing that the algorithm only decomposed three signals of different frequencies, while the 250 Hz signal was absent. Figure 18b shows the decomposition results of PSO-VMD, and although it successfully decomposed the signals of four different frequencies, each frequency component exhibited significant jitter, resulting in less smooth decomposition. Figure 18c–e demonstrate the decomposition results of WOA-VMD, SSA-VMD, and HLSSA-VMD, respectively. In contrast to PSO-VMD, these three algorithms not only successfully decomposed all frequencies but also provided smoother decomposition results, indicating superior performance.
This comparative analysis further validates the outstanding performance of the HLSSA in optimizing VMD parameters, showcasing its faster convergence speed and higher-quality decomposition results.

5. Conclusions

In this study, we proposed the HLSSA-VMD algorithm for optimizing VMD parameters. Firstly, we validated the performance of the HLSSA across 17 benchmark functions, demonstrating its superiority in convergence accuracy, convergence speed, and robustness compared to the PSO, WOA, MVO, and SSA. Secondly, we evaluated the performance of the HLSSA-VMD algorithm on two simulated signals, revealing that, in comparison with other algorithms, HLSSA-VMD effectively enhances VMD’s parameter optimization capabilities and improves signal decomposition quality.

Author Contributions

Conceptualization, H.D.; methodology, H.D.; software, H.D.; validation, H.D.; formal analysis, H.D.; investigation, H.D.; resources, H.D.; data curation, H.D.; writing—original draft preparation, H.D.; writing—review and editing, W.Q. and X.Z.; visualization, W.Q. and X.Z.; supervision, J.W.; project administration, J.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. (a) Halton sequence initialization; (b) random sequence initialization.
Figure 2. (a) Three-dimensional plot of F1; (b) convergence curve.
Figure 3. (a) Three-dimensional plot of F2; (b) convergence curve.
Figure 4. (a) Three-dimensional plot of F3; (b) convergence curve.
Figure 5. (a) Three-dimensional plot of F4; (b) convergence curve.
Figure 6. (a) Three-dimensional plot of F5; (b) convergence curve.
Figure 7. (a) Three-dimensional plot of F6; (b) convergence curve.
Figure 8. (a) Three-dimensional plot of F7; (b) convergence curve.
Figure 9. (a) Three-dimensional plot of F8; (b) convergence curve.
Figure 10. (a) Three-dimensional plot of F9; (b) convergence curve.
Figure 11. (a) Three-dimensional plot of F10; (b) convergence curve.
Figure 12. (a) Three-dimensional plot of F11; (b) convergence curve.
Figure 13. (a) Three-dimensional plot of F12; (b) convergence curve.
Figure 14. Two states of the bearing outer ring fault signal: (a) original state; (b) state after adding noise.
Figure 15. The iterative optimization process of the five algorithms.
Figure 16. (a) Simulated signal; (b) amplitude–frequency plot of the synthesized signal s.
Figure 17. The iterative optimization process of the five algorithms.
Figure 18. Hilbert spectra corresponding to different algorithms: (a) MVO-VMD; (b) PSO-VMD; (c) WOA-VMD; (d) SSA-VMD; (e) HLSSA-VMD.
Table 1. Details of algorithm parameter settings.

Algorithm | Parameters
PSO | Particle count: 30; max iterations: 500; learning factors: c1 = c2 = 1.5; inertia weight: w = 0.7
MVO | Number of universes: 30; max iterations: 500; wormhole existence probability: WEP ∈ [0.2, 1]
WOA | Number of whales: 30; max iterations: 500
SSA | Population size: 30; max iterations: 500; warning value: ST = 0.6; proportion of founders: PD = 0.7 (the rest are joiners); proportion of sparrows that sense danger and sound the alarm: SD = 0.2
HLSSA | Population size: 30; max iterations: 500; warning value: ST = 0.6; proportion of founders: PD = 0.7 (the rest are joiners); proportion of sparrows that sense danger and sound the alarm: SD = 0.2; Halton sequence parameters: Skip = 0, Leap = 1; LX parameters: p = 0.5, q = 1
Table 2. Details of unimodal benchmark functions.

Benchmark Function | n (Dimension) | Search Space | Optimal Value
$F_1(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [−100, 100] | 0
$F_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | 30 | [−10, 10] | 0
$F_3(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$ | 30 | [−100, 100] | 0
$F_4(x) = \max_i \{ |x_i|,\ 1 \le i \le n \}$ | 30 | [−100, 100] | 0
$F_5(x) = \sum_{i=1}^{n-1} \left[ 100 \left( x_{i+1} - x_i^2 \right)^2 + \left( x_i - 1 \right)^2 \right]$ | 30 | [−30, 30] | 0
$F_6(x) = \sum_{i=1}^{n} \left( \lfloor x_i + 0.5 \rfloor \right)^2$ | 30 | [−100, 100] | 0
$F_7(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1)$ | 30 | [−1.28, 1.28] | 0
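A few of the unimodal benchmarks in Table 2 are simple enough to sketch directly. The minimal stand-alone sketch below implements F1, F2, and F6 and checks their optimal value of 0 at the origin.

```python
import math

# Stand-alone sketches of three unimodal benchmarks from Table 2.
def f1_sphere(x):
    """F1: sum of squares; global minimum 0 at the origin."""
    return sum(v * v for v in x)

def f2_schwefel_222(x):
    """F2: sum plus product of absolute values; minimum 0 at the origin."""
    return sum(abs(v) for v in x) + math.prod(abs(v) for v in x)

def f6_step(x):
    """F6: step function (floored, shifted sphere); minimum 0 at the origin."""
    return sum(math.floor(v + 0.5) ** 2 for v in x)

x0 = [0.0] * 30
print(f1_sphere(x0), f2_schwefel_222(x0), f6_step(x0))  # -> 0.0 0.0 0
```

Because every candidate solution is scored by calls like these, the benchmark suite isolates the optimizers' search behavior from any application-specific signal processing.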
Table 3. Results of optimizing seven unimodal benchmark functions using the five algorithms.

Function | Algorithm | Best | Average | STD | Variance
F1 | PSO | 0.00033 | 0.002926391 | 0.00251 | 6.302 × 10−6
F1 | MVO | 0.0060158 | 0.013780395 | 0.006517521 | 4.24781 × 10−5
F1 | WOA | 1.88 × 10−85 | 1.47 × 10−74 | 2.98625 × 10−74 | 8.9177 × 10−148
F1 | SSA | 0 | 5.54 × 10−86 | 1.9214 × 10−85 | 3.6918 × 10−170
F1 | HLSSA | 0 | 0 | 0 | 0
F2 | PSO | 0 | 0 | 0 | 0
F2 | MVO | 0.016637 | 0.03681265 | 0.011406764 | 0.000130114
F2 | WOA | 9.31 × 10−57 | 4.58 × 10−51 | 1.31735 × 10−50 | 1.7354 × 10−100
F2 | SSA | 0 | 1.26 × 10−45 | 3.79338 × 10−45 | 1.43898 × 10−89
F2 | HLSSA | 0 | 0 | 0 | 0
F3 | PSO | 0.01202 | 0.12741265 | 0.1465195 | 0.021467964
F3 | MVO | 0.010296 | 0.12150025 | 0.089212352 | 0.007958844
F3 | WOA | 1.51 | 4.33 × 104 | 12,839.5126 | 164,853,083.9
F3 | SSA | 0 | 1.89 × 10−62 | 8.23118 × 10−62 | 6.7752 × 10−123
F3 | HLSSA | 0 | 0 | 0 | 0
F4 | PSO | 0.01247 | 0.18308305 | 0.124116834 | 0.015404989
F4 | MVO | 0.042594 | 0.09646565 | 0.035631193 | 0.001269582
F4 | WOA | 0.132006517 | 36.21277267 | 30.75549409 | 945.9004169
F4 | SSA | 0 | 2.10 × 10−48 | 8.89079 × 10−48 | 7.90461 × 10−95
F4 | HLSSA | 0 | 0 | 0 | 0
F5 | PSO | 6.3088 | 33.794385 | 41.36261671 | 1710.866061
F5 | MVO | 6.8941 | 137.409115 | 203.9184648 | 41,582.74029
F5 | WOA | 27.05372992 | 27.97991864 | 0.501056388 | 0.251057504
F5 | SSA | 9.40 × 10−8 | 6.21 × 10−5 | 9.98646 × 10−5 | 9.97294 × 10−9
F5 | HLSSA | 2.75 × 10−9 | 1.75 × 10−6 | 2.57505 × 10−6 | 6.63087 × 10−12
F6 | PSO | 0.00020952 | 0.00419352 | 0.004367111 | 1.90717 × 10−5
F6 | MVO | 0.0037676 | 0.01302205 | 0.005310819 | 2.82048 × 10−5
F6 | WOA | 0.141565425 | 0.423901093 | 0.233104106 | 0.054337524
F6 | SSA | 6.86 × 10−10 | 2.96 × 10−7 | 3.99442 × 10−7 | 1.59554 × 10−13
F6 | HLSSA | 2.67 × 10−13 | 3.25 × 10−9 | 2.99447 × 10−9 | 8.96684 × 10−18
F7 | PSO | 0.00015349 | 0.000624792 | 0.000462511 | 2.13917 × 10−7
F7 | MVO | 0.0013722 | 0.003321475 | 0.001558795 | 2.42984 × 10−6
F7 | WOA | 3.21249 × 10−5 | 0.005392808 | 0.005106315 | 2.60745 × 10−5
F7 | SSA | 0.000066649 | 0.000400385 | 0.000294376 | 8.66569 × 10−8
F7 | HLSSA | 2.09 × 10−6 | 4.91 × 10−5 | 2.81604 × 10−5 | 7.93008 × 10−10
Table 4. Details of multimodal benchmark functions.

Benchmark Function | n (Dimension) | Search Space | Optimal Value
$F_8(x) = \sum_{i=1}^{n} -x_i \sin\left( \sqrt{|x_i|} \right)$ | 30 | [−500, 500] | −12,569.5
$F_9(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10 \cos(2 \pi x_i) + 10 \right]$ | 30 | [−5.12, 5.12] | 0
$F_{10}(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \frac{1}{n} \sum_{i=1}^{n} \cos(2 \pi x_i) \right) + 20 + e$ | 30 | [−32, 32] | 0
$F_{11}(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1$ | 30 | [−600, 600] | 0
$F_{12}(x) = \frac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m, & x_i > a \\ 0, & -a \le x_i \le a \\ k (-x_i - a)^m, & x_i < -a \end{cases}$ | 30 | [−30, 30] | 0
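The multimodal benchmarks can be sketched the same way. Below are stand-alone versions of F9 (Rastrigin) and F10 (Ackley) from Table 4, checked at their known global optimum at the origin.

```python
import math

# Stand-alone sketches of two multimodal benchmarks from Table 4.
def f9_rastrigin(x):
    """F9: Rastrigin; many local minima, global minimum 0 at the origin."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def f10_ackley(x):
    """F10: Ackley; global minimum 0 (up to round-off) at the origin."""
    n = len(x)
    term1 = -20.0 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    term2 = -math.exp(sum(math.cos(2.0 * math.pi * v) for v in x) / n)
    return term1 + term2 + 20.0 + math.e

x0 = [0.0] * 30
print(f9_rastrigin(x0))             # -> 0.0
print(abs(f10_ackley(x0)) < 1e-12)  # -> True
```

The dense fields of local minima in these functions are what make them a sharper test of premature convergence than the unimodal set.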
Table 5. Optimization results of the five algorithms on five multimodal benchmark functions.

Function | Algorithm | Best | Average | STD | Variance
F8 | PSO | −1880.0986 | −2524.558865 | 377.4638744 | 142,478.9765
F8 | MVO | −2154.8354 | −3008.179335 | 380.7394914 | 144,962.5603
F8 | WOA | −1.26 × 104 | −1.03 × 104 | 1727.512209 | 2,984,298.431
F8 | SSA | −12,569.0584 | −10,417.30763 | 2201.824506 | 4,848,031.157
F8 | HLSSA | −12,569.5 | −12,569.5 | 1.81899 × 10−12 | 3.30872 × 10−24
F9 | PSO | 6.0018 | 23.206275 | 14.66438476 | 215.0441805
F9 | MVO | 6.9718 | 15.52885 | 5.804935452 | 33.6972756
F9 | WOA | 0 | 0 | 0 | 0
F9 | SSA | 0 | 0 | 0 | 0
F9 | HLSSA | 0 | 0 | 0 | 0
F10 | PSO | 0.015674 | 1.260136 | 0.819697672 | 0.671904273
F10 | MVO | 0.028273 | 0.20785755 | 0.48086254 | 0.231228782
F10 | WOA | 4.44 × 10−16 | 4.53 × 10−15 | 2.58031 × 10−15 | 6.65799 × 10−30
F10 | SSA | 4.44 × 10−16 | 4.44 × 10−16 | 9.86076 × 10−32 | 9.72346 × 10−63
F10 | HLSSA | 4.44 × 10−16 | 4.44 × 10−16 | 9.86076 × 10−32 | 9.36772 × 10−33
F11 | PSO | 0.024742 | 0.12481485 | 0.068084759 | 0.004635534
F11 | MVO | 0.17882 | 0.312224 | 0.124271598 | 0.01544343
F11 | WOA | 0 | 0.02551 | 0.07007949 | 0.004911135
F11 | SSA | 0 | 0 | 0 | 0
F11 | HLSSA | 0 | 0 | 0 | 0
F12 | PSO | 0.00068858 | 0.126635751 | 0.326288108 | 0.10646393
F12 | MVO | 0.00029349 | 0.107082826 | 0.28195128 | 0.079496524
F12 | WOA | 0.0052 | 0.021755 | 0.022362323 | 0.000500073
F12 | SSA | 1.14 × 10−10 | 7.83 × 10−8 | 1.24041 × 10−7 | 1.53862 × 10−14
F12 | HLSSA | 1.44 × 10−13 | 2.57 × 10−9 | 2.7137 × 10−9 | 7.36418 × 10−18
Table 6. The detailed parameters of the simulated signal.
SignalParameters
x 1 ( t ) A ξ f n (Hz)
30.022600
x 2 ( t ) Ml f 1 1 (Hz) θ 1
10.025700
x 3 ( t ) Nk f 1 2 (Hz) φ 1
10.0351000
Table 7. The optimization results obtained with the five algorithms.

Method | Fitness | [K, α]
MVO-VMD | 8.15943 | [6, 1033]
PSO-VMD | 8.15474 | [6, 1097]
WOA-VMD | 8.15474 | [6, 1097]
SSA-VMD | 8.14811 | [6, 1236]
HLSSA-VMD | 8.14811 | [6, 1236]
Table 8. The optimization results obtained with the five algorithms.

Method | Fitness | [K, α]
MVO-VMD | 6.4152 | [3, 5803]
PSO-VMD | 6.4139 | [4, 3361]
WOA-VMD | 6.410 | [4, 6000]
SSA-VMD | 6.410 | [4, 6000]
HLSSA-VMD | 6.410 | [4, 6000]
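The fitness column in Tables 7 and 8 is the objective each optimizer minimizes while searching over [K, α]. This excerpt does not restate that objective, so the sketch below assumes the minimum-envelope-entropy criterion commonly used in parameter-optimized VMD work, and applies it to hand-built envelopes standing in for actual VMD modes.

```python
import math

def envelope_entropy(env, eps=1e-12):
    """Shannon entropy of a normalized envelope. Parameter-optimized VMD
    schemes commonly minimize this over the decomposed modes (an assumed
    objective here; the paper's exact fitness definition is not restated)."""
    total = sum(env) + eps
    p = [e / total for e in env]
    return -sum(pi * math.log(pi + eps) for pi in p)

# A flat envelope (e.g., a pure tone) spreads probability evenly and gives
# high entropy; a sparse, impulsive envelope (fault-like bursts) concentrates
# it and gives low entropy, which is why minimizing it favors informative modes.
n = 1000
flat = [1.0] * n
impulsive = [1.0 if i % 100 == 0 else 0.0 for i in range(n)]
print(envelope_entropy(flat) > envelope_entropy(impulsive))  # -> True
```

In a full pipeline, each candidate [K, α] would be scored by decomposing the signal with VMD and evaluating this entropy on the Hilbert envelopes of the resulting modes.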
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Du, H.; Wang, J.; Qian, W.; Zhang, X. An Improved Sparrow Search Algorithm for the Optimization of Variational Modal Decomposition Parameters. Appl. Sci. 2024, 14, 2174. https://doi.org/10.3390/app14052174
