Review

Initialisation Approaches for Population-Based Metaheuristic Algorithms: A Comprehensive Review

by Jeffrey O. Agushaka and Absalom E. Ezugwu *

School of Mathematics, Statistics and Computer Science, University of KwaZulu-Natal, King Edward Road, Pietermaritzburg 3201, KwaZulu-Natal, South Africa

* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(2), 896; https://doi.org/10.3390/app12020896
Submission received: 10 November 2021 / Revised: 4 January 2022 / Accepted: 10 January 2022 / Published: 17 January 2022
(This article belongs to the Special Issue Evolutionary Algorithms and Large-Scale Real-World Applications)

Abstract

A situation where the set of initial solutions happens to lie near the position of the true optimum (the most favourable or desirable solution) can increase the probability of finding that optimum and significantly reduce the search effort. In optimisation problems, the location of the global optimum is unknown a priori, and initialisation is a stochastic process. In addition, the population size is equally important: for problems with high dimensions, a small population may lie sparsely in unpromising regions and may return suboptimal solutions with bias. Moreover, the different distributions used as position vectors for the initial population may have different sampling emphases and, hence, different degrees of diversity. The initialisation control parameters of population-based metaheuristic algorithms therefore play a significant role in improving the performance of the algorithms. Researchers have identified this significance and have put much effort into finding distribution schemes that enhance the diversity of the initial populations of the algorithms, and into obtaining the correct balance of population size and number of iterations that will guarantee optimal solutions for a given problem set. Despite the affirmed role of initialisation, to our knowledge few studies or surveys have been conducted in this subject area. Therefore, this paper presents a comprehensive survey of the different initialisation schemes used to improve the quality of solutions obtained by most metaheuristic optimisers for a given problem set. Popular schemes used to improve the diversity of the population can be categorised into random numbers, quasirandom sequences, chaos theory, probability distributions, hybrids with other heuristic or metaheuristic algorithms, Lévy flights, and others. We discuss the different levels of success of these schemes and identify their limitations. Similarly, we identify gaps and present useful insights for future research directions. Finally, we present a comparison of the effect of population size, the maximum number of iterations, and ten (10) different initialisation methods on the performance of three (3) population-based metaheuristic optimizers: the bat algorithm (BA), Grey Wolf Optimizer (GWO), and butterfly optimization algorithm (BOA).

1. Introduction

The primary concern of optimisation is finding either the minima or maxima of the objective function, subject to some given constraints. Optimisation problems naturally occur in machine learning, artificial intelligence, computer science, and operations research. Optimisation has been used to improve processes in all human endeavours. A wide variety of techniques for optimisation exist. These techniques include linear programming, quadratic programming, convex optimization, interior-point method, trust-region method, conjugate-gradient methods, evolutionary algorithms, heuristics, and metaheuristics [1]. The era of artificial intelligence ushered in techniques for optimisation that are capable of finding near-optimal solutions to challenging and complex real-world optimisation problems. Then came the nature-inspired and bio-inspired metaheuristic optimization era, with huge successes recorded and increasing popularity over the past four decades.
Many researchers attribute the popularity of nature-inspired and bio-inspired metaheuristic optimization algorithms to their ability to find near-optimal solutions [2]. This success can be attributed to how these optimizers mimic natural phenomena [3], which have inspired the development of almost all metaheuristic algorithms. Evolutionary techniques are gaining popularity in the same vein, with many novel techniques developed regularly, and their performance matches that of the nature-inspired and bio-inspired algorithms [4].
The successes of these metaheuristic optimizers on real-world problems come with tremendous challenges, which arise because real-world optimisation problems are complex and have multiple nonlinear constraints. The ability of optimisers to navigate these challenges and achieve optimality depends heavily on how the initial population is distributed, especially for gradient-based optimizers [5]. Although metaheuristic optimizers are gradient-free, they must also be initialised; thus, they are greatly influenced by the nature of the initial population, especially for large-scale multimodal problems. Population-based metaheuristic algorithms show varying abilities to reach a global optimum when the initialisation scheme is varied [6].
Interestingly, in the last decade, there has been an exponential growth in the number of proposed nature-inspired optimisation algorithms, with corresponding claims of novelty and of solid capability as powerful optimisation tools. Unfortunately, most of these algorithms do not seem to draw inspiration from nature or incorporate any successful methodology that mimics natural phenomena or systems [7]. From the perspective of the No Free Lunch theorem, many real-world optimisation problems still require new approaches or methods before they can be solved optimally: the theorem proves that although some method may solve a given problem efficiently, no single method can effectively solve all problems. Studies have shown the popularity and success of algorithms that mimic the behaviours of animals in solving optimisation problems with reasonable accuracy.
The initialisation scheme most commonly used by metaheuristic algorithms is the random number generator, which generates sequences (used as position vectors) that follow the uniform probability distribution. However, these sequences do not have low discrepancy, are not equidistributed in a given search area, and do not efficiently cover the search space [8]. On the other hand, quasirandom sequences can be generated with provably low discrepancy; they tend to cover the search space better and are therefore helpful in optimisation [9]. Low-discrepancy sequences, such as Van der Corput, Sobol, Faure, and Halton, are potent computational tools and have been used to improve the performance of optimisation algorithms. Many other approaches exist in the literature, and these are presented in this paper.
A situation where the set of initial solutions happens to lie near the position of the true optimum can increase the probability of finding that optimum and significantly reduce the search effort. In optimisation problems, the location of the true optimum is unknown a priori, and initialisation is a stochastic process. Additionally, the population size is equally important: for problems with high dimensions, a small population may lie sparsely in unpromising regions and can return suboptimal solutions with bias. In addition, the different distributions used as position vectors for the initial population may have different sampling emphases and, hence, different degrees of diversity.
To demonstrate the importance of initialisation, consider the Bukin N. 6 function shown in Figure 1. We assumed a search space of [−20, 0] × [−8, 4]. The advanced arithmetic optimization algorithm (nAOA) [10] was initialised with the beta distribution, and the distribution of the population after the first iteration is shown in Figure 2. The blue dots represent the current location of the population, the red asterisk (*) represents the current best solution, and the red star (★) denotes the global optimal solution of the Bukin function. The nAOA converged towards the optimal solution after a few iterations, as shown in Figure 3. Similarly, when the nAOA was initialised with the random number generator, the distribution of the population after the first iteration was as shown in Figure 4; in this case, the population quickly fell into a local optimum after a few iterations, as shown in Figure 5.
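For reference, the Bukin N. 6 function used in this illustration has the standard definition f(x₁, x₂) = 100·√|x₂ − 0.01·x₁²| + 0.01·|x₁ + 10|, with global minimum f(−10, 1) = 0; its narrow, curved valley makes the global optimum hard to locate, which is why the quality of the initial population is so visible in this example.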
Although initialisation plays a significant role in the performance of most metaheuristic optimizers, few studies or surveys have been conducted on the subject area. A search using the keywords survey OR review, initialisation (initialization), and metaheuristics yielded no comprehensive review or survey articles in the literature. However, in discussing PSO variants, ref. [11] provides a paragraph on attempts to improve PSO performance using different initialisation schemes; the authors discuss how low-discrepancy sequences and variants of opposition-based learning enhance the initial swarm population. Another attempt, using GA, was presented by [12], where the effect of three initialisation functions, namely nearest neighbour (NN), insertion (In), and Solomon's heuristic, was studied. Li, Liu, and Yang [13] evaluated the effect of 22 different probability distribution initialisation methods on the convergence and accuracy of five optimisation algorithms. In this regard, we formulate the research question given below to accomplish our work:
What literature has modified the initialisation control parameters, comprising the size and diversity of the population and the maximum number of iterations, to improve the algorithms' performance?
The following questions are formulated to answer the main research question:
  • What research exists that used distributions other than the random number for initialisation of the population to improve the performance of metaheuristic algorithms?
  • What study exists that fine-tuned the population size and the number of iterations of different algorithms?
  • What are the major initialisation distributions used by population-based algorithms?
  • What problems were solved by the modified algorithms?
  • What are other challenges yet to be explored by researchers in the research area?
To the best of our knowledge, no survey or review article in the literature focuses on general efforts to improve the performance of different metaheuristic optimizers using different initialisation schemes, which motivates the current research contribution. Therefore, this study presents a comprehensive survey of the different initialisation methods employed by metaheuristic algorithm designers and optimisation enthusiasts to improve the performance of the different metaheuristic optimizers available in the literature. The study covers articles published between 2000 and 2021, and the specific contributions of this paper are summarised as follows:
  • We present a comprehensive review of the different distributions used to improve the diversity of the initial population of population-based metaheuristic algorithms.
  • We categorise the schemes into random numbers, quasirandom sequences, chaos theory, probability distributions, hybrids of other heuristic or metaheuristic algorithms, Lévy, and others.
  • We also discuss the different levels of success of these schemes and identify their limitations.
  • An in-depth glossary of efforts to improve the performance of metaheuristic algorithms using several initialisation schemes is presented, which metaheuristic research enthusiasts can easily reference.
  • Finally, we provide the research gaps, useful insights, and future directions.
The rest of the paper is organised as follows. In Section 2, we provide the methodology used for collecting papers. The major initialisation methods used to improve the performance of the algorithms are presented in Section 3. In Section 4, we discuss the various application areas of the present study. Results and discussion of findings from our experiment are presented in Section 5. Finally, Section 6 presents the concluding remarks.

2. Methodology and Paper Collection Technique

This section discusses the procedure used for paper selection, collection, and review. Search keywords, search techniques, data sources, databases, and inclusion and exclusion criteria are explained. We followed the systematic literature review procedure provided in the work of [14], and we were guided by the work of [15].

2.1. Keywords

In order to retrieve relevant articles and achieve our review goal, we carefully selected some useful keywords that we used to search the databases: initialization (initialisation), metaheuristic, optimization (optimisation), OR algorithm. The initial search for these articles was carried out between 20 and 24 September 2020, and the final search was carried out between 25 and 30 October 2021. Articles retrieved based on the searched keywords were perused during each search in order to collect more related articles from their citations and reference sections.

2.2. Academic Databases

The selected keywords were used to search for and retrieve relevant works from the body of literature. We targeted only articles published in reputable peer-reviewed journals, edited books, and conference proceedings indexed in two (2) academic databases: the Web of Science (WoS) and Scopus repositories. These are repositories of high-quality articles published in SCI-indexed journals and ranked international conferences. We performed a search based on the above keywords in these repositories up to 2021.

2.3. Inclusion/Exclusion Criteria

We formulated some inclusion and exclusion criteria in order to collect only relevant literature. The collected articles were either included or excluded based on these criteria after perusing their titles, abstracts, conclusions and, in some cases, the complete content. The selected criteria are given in Table 1.

2.4. Eligibility

We applied the inclusion and exclusion criteria to determine the eligibility of the selected articles. A total of 99 articles were returned by the WoS repository and 58 by Scopus. Figure 6 shows the distribution of document types from the WoS repository: 83 articles, 16 conference proceedings, five (5) early access papers, and one book chapter. Similarly, Figure 7 shows the distribution of document types from Scopus: 39 articles, 17 conference papers, one book, and one conference review. Both figures show that more articles are published in journals than in conference proceedings and book chapters.
After cross-referencing the two repositories, we found many papers that appear in both, and we excluded such duplicates from one of the repositories. In addition, we found articles that were included in the search because they contained the keyword “initialization” but did not relate to our research; we also excluded these. A total of 52 articles were selected for this survey after applying the inclusion and exclusion criteria.

3. Major Initialisation Methods

This section discusses the efforts made to improve the initial population of metaheuristic algorithms. It provides an answer to our research question: what research exists that used distributions other than the random number generator to initialise the population and thereby improve the performance of metaheuristic algorithms? The different initialisation schemes identified in the literature were categorised into pseudo-random number or Monte Carlo methods, quasirandom methods, probability distributions, hybrids with other metaheuristic algorithms, chaos theory, Lévy flights, ad hoc knowledge of the domain, and others. This categorisation was performed to aid our discussion of the identified schemes.

3.1. Pseudo-Random Number or Monte Carlo Methods

By default, random number generation (the Monte Carlo method) is the most used initialisation scheme for metaheuristic algorithms. It uses the uniform probability distribution to generate pseudo-random number sequences that serve as location vectors for the population. Many population-based metaheuristic algorithms use this scheme, and interested readers can refer to the respective optimisers for details. The role of random number generation as an essential part of the initialisation process has been greatly emphasised [16,17]. Despite its popularity, the random number sequence suffers from a discrepancy that is not low, and it does not efficiently cover the search space [8]. This discrepancy strongly influences how evenly the randomly generated solutions fall within the solution search space [18]. Research works, such as those by [19,20], have shown that random numbers do not achieve an optimal discrepancy that would aid the convergence of the algorithms. Figure 8 shows how the random numbers tend to form clusters after several iterations instead of filling up the search space, which is a significant disadvantage of using random number generators to initialise the population of metaheuristic algorithms.
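For concreteness, the default scheme amounts to the following minimal MATLAB sketch, where the population size N, dimension D, and bounds lb and ub are illustrative assumptions rather than values prescribed by any particular optimiser:

    % Default uniform pseudo-random initialisation (illustrative sketch).
    N = 50; D = 30;                       % assumed population size and dimension
    lb = -100 * ones(1, D);               % assumed lower bounds
    ub =  100 * ones(1, D);               % assumed upper bounds
    pop = lb + rand(N, D) .* (ub - lb);   % each row is one candidate position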
We did not include a table for this category because most existing metaheuristic algorithms belong here; such a table would be huge, and there is no application area to which this scheme has not been applied.

3.2. Quasirandom Methods

Quasirandom number generators produce sequences that are proven to have low discrepancy [9]. Low-discrepancy sequences, such as Van der Corput, Sobol, Faure, and Halton, are potent computational tools and have been used to improve the performance of optimisation algorithms. Quasirandom numbers are effective initialisation mechanisms for metaheuristic algorithms because they cover the search space uniformly, which helps in obtaining the optimal solution. The particle swarm population in the work of [21] was initialised using the randomized low-discrepancy sequences of Halton, Sobol, and Faure. The three modified PSO variants were applied to benchmark test functions, and the results were compared with the global best PSO: PSO was significantly improved with Sobol, while Faure and Halton showed varying degrees of improvement. Similarly, the Van der Corput and Sobol sequences were used to initialise PSO, which was then applied to the benchmark functions [8]; the results obtained were promising compared to the original PSO.
The krill population in the KH algorithm was initialised using the Faure, Sobol, and Van der Corput sequences [22]. Benchmark test functions were used to test the efficacy of the modified KH, and the findings revealed significant improvements in its performance when initialised with these low-discrepancy sequences. A similar outcome was reported for the guaranteed convergence particle swarm optimization (GCPSO) algorithm, where Niching methods based on the Faure low-discrepancy sequence were used to initialise the swarm population [23]; benchmark test functions were used to evaluate the performance of GCPSO, with promising results.
Initialisation schemes implemented using low-discrepancy sequences are, however, known to perform poorly as the problem dimension or graph size scales up. Figure 9 shows how the Halton sequence spreads and fills the search space by the 1000th iteration, improving the convergence of algorithms. The authors of [24] used the Halton sequence to initialise the search agents of the Wingsuit Flying Search (WFS) algorithm.
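As an illustration of this category, a scrambled Sobol (or Halton) initialisation can be sketched in MATLAB using the point-set classes of the Statistics and Machine Learning Toolbox; the skip and scramble settings below are our assumptions, not values taken from the cited works:

    % Low-discrepancy (quasirandom) initialisation sketch.
    N = 50; D = 30;
    lb = -100 * ones(1, D); ub = 100 * ones(1, D);
    ps = sobolset(D, 'Skip', 1000);            % haltonset(D) gives a Halton set
    ps = scramble(ps, 'MatousekAffineOwen');   % randomise (Halton uses 'RR2')
    u  = net(ps, N);                           % N points in the unit cube [0,1]^D
    pop = lb + u .* (ub - lb);                 % map onto the search space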
Table 2 summarises the glossary of efforts that used low discrepancy sequences (quasirandom numbers) to initialise the population of some metaheuristic optimizers. Interested readers can refer to the references for more details about the efforts. In all the papers reviewed in this section, the authors claimed that fine-tuning the initialisation control parameters (population size and diversity and maximum number of iterations or function evaluations) improved the performance of the algorithm.

3.3. Probability Distributions

A probability distribution describes the possible values of a random variable and their likelihoods within a defined interval. Different probability distributions, with their rigorous statistical properties, can be used to initialise the population of metaheuristic algorithms. Li, Liu, and Yang [13] used variants of the Beta distribution, uniform distribution, normal distribution, logarithmic normal distribution, exponential distribution, Rayleigh distribution, Weibull distribution, and Latin hypercube sampling [31] to form 22 different initialisation schemes in order to evaluate PSO, CS, DE, ABC, and GA (a minimal initialisation sketch is given after the list below). The variants of the probability distributions are as follows:
  • Beta distribution
The Beta distribution is a continuous probability distribution over the interval (0,1). It can be written as X ~ Be(a, b). Varying the values of a and b results in variants of the Beta distribution that generate sequences with different behaviours in the search space. Three variants of the Beta distribution were used.
  • Uniform distribution
A uniform distribution is defined over the interval [a, b] and is usually written as X ~ U(a, b). One variant of the uniform distribution was used.
  • Normal distribution
The Gaussian normal distribution is usually written as X ~ N(μ, σ²). Varying the values of μ and σ² resulted in three (3) variants of the normal distribution, which generate sequences with different behaviours in the search space.
  • Logarithmic normal distribution
The logarithmic normal distribution is often written as ln X ~ N(μ, σ²). Four (4) variants of the logarithmic normal distribution were created by varying the values of μ and σ².
  • Exponential distribution
An exponential distribution is asymmetric with a long tail and can be written as X ~ Exp(λ). Varying λ resulted in three variants of the distribution, which were used to initialise the population of the five algorithms.
  • Rayleigh distribution
The Rayleigh distribution can be written as X ~ Rayleigh(σ). Three (3) variants of the distribution were created by varying the value of σ.
  • Weibull distribution
This distribution can be considered a generalisation of several other distributions. It can be written as X ~ Weibull(λ, k): for example, k = 1 corresponds to the exponential distribution, while k = 2 leads to the Rayleigh distribution. In the same vein, three variants of the distribution were created.
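As a minimal sketch of this category (our own illustration, with assumed bounds and population settings), a Beta(3,2) initialisation in MATLAB maps directly onto the search space, because betarnd samples already lie in (0,1):

    % Beta-distributed initialisation sketch.
    N = 50; D = 30;
    lb = -100 * ones(1, D); ub = 100 * ones(1, D);
    u = betarnd(3, 2, N, D);        % Beta(3,2) samples in (0,1)
    pop = lb + u .* (ub - lb);      % position vectors for the population
    % Unbounded distributions (normal, lognormal, exponential, Rayleigh,
    % Weibull) need a normalisation or clipping step before the same mapping.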
The convergence and accuracy of the five metaheuristic optimizers were evaluated on the benchmark test functions and the CEC2020 test functions, with the optimisers initialised using the 22 different initialisation schemes [13]. The findings showed that PSO and CS are more sensitive to the initialisation scheme used, whereas DE was less susceptible. In addition, PSO relies on a larger population size, whereas CS requires a smaller one, and DE does well with an increased number of iterations. The Beta, Rayleigh, and exponential distributions were the strongest performers, as the results showed that they greatly influence the convergence of the optimisers used.
Georgioudakis, Lagaros, and Papadrakakis [31] incorporated Latin hypercube sampling (LHS) to initialise four (4) optimisers, namely evolution strategies (ES), covariance matrix adaptation (CMA), elitist covariance matrix adaptation (ECMA), and differential evolution (DE). They used these optimisers to investigate the relation between the geometry of structural components and their service life, aiming to improve the service life of structural components under fatigue. Their choice of LHS instead of random Monte Carlo simulation optimised the number of samples needed to calculate the statistical quantities in the problem formulation.
The stochastic fractal search (SFS) technique was used in the work of [32] to improve the performance of the multi-layer perceptron (MLP) neural network by obtaining the optimal set of weights and threshold parameters. The hybrid approach was tested on the IEEE 14- and 118-bus systems, and the results were compared with the non-optimized MLP and with MLPs optimized using the genetic algorithm (MLP-GA) and particle swarm optimization (MLP-PSO). Precision was up by 20–50%, and computational time was down by 30–50%. However, SFS tends to neglect local search, and the correct balance between global and local search is desired. Similarly, the Lévy flight was replaced by stochastic random sampling of simpler fat-tailed distributions, enhanced with scaled chaotic sequences, to boost cuckoo search (CS) performance in solving the complex wellbore trajectory problem [33].
Probability distributions generally suffer from issues such as equiprobable disjunct intervals and errors in correlations between variables. We summarise efforts in this category in Table 3.

3.4. Hybrid with Other Metaheuristic Algorithms

In this approach, researchers use another metaheuristic algorithm to find an optimal set of initial positions for the population. Metaheuristic algorithms with a high convergence rate in a specific problem domain are often used to find an initial solution, which is then fed into the other metaheuristic algorithm as its initial condition. A hybridization of ABC and TS was proposed in the work of [39], where the bee population was initialised using a randomized breadth-first search (BFS); the performance of their hybrid was better than that of the algorithms they compared it with, but it suffers from the time complexity of BFS. The authors of [40] initialised the monarch butterfly algorithm by equally partitioning the search space and using the F and t random distributions to mutate the divided population; the results showed significant improvements. The krill in the work of [41] were initialised using pairwise linear optimisation, which uses fuzzy rules to create clusters that serve as the initial points for KH; however, the results showed that this improvement only suits systems based on fuzzy approximators. The CRO was improved using the VNS algorithm with a new processor selection model for initialisation; the results are promising, but parameter sensitivity still needs to be resolved [42].
The cuckoo population was initialised using quasi-opposition-based learning (QOBL) [43], where reaching the optimal solution is enhanced by considering a guess together with its quasi-opposite guess. The initialisation scheme of BA was improved using a low-discrepancy quasirandom sequence called Torus [25]; the results were good but were not evaluated on higher-dimensional problems. Four (4) different dispatching-rule (DR)-based initialisation strategies were used by [44], with varying advantages and disadvantages; the best result was obtained when all of the strategies were used together, which means that the diversity of the population contributed less to the algorithm's overall performance. In [45], a scheme inspired by SAM was developed; it is a simplified heuristic model that begins the swarm search with an initial set of high-quality solutions.
ABC was used to find the optimal cluster centres of the FCM [46]. An improved ABC was also proposed to solve the vehicle routing problem (VRP) [47]; among other improvements, the bees were initialised using push-forward insertion. An improved DE, named the enhanced differential evolution algorithm (EDE), used opposition-based learning for initialisation, along with other improvements, to enhance the performance of DE [48]. The optimised stream clustering algorithm (OpStream) used the optimal solution of a metaheuristic algorithm to initialise the first set of clusters [49]. The optimal solution of the optimal shortening of covering arrays (OSCAR) problem was used as the initialisation function of a metaheuristic algorithm [50].
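Several of the efforts above build on opposition-based learning. As a hedged illustration (generic OBL rather than the exact QOBL of [43] or the EDE variant of [48]), the idea is to evaluate each random point together with its opposite, lb + ub − x, and keep the better of the pair:

    % Opposition-based initialisation sketch (generic OBL).
    N = 50; D = 30;
    lb = -100 * ones(1, D); ub = 100 * ones(1, D);
    f   = @(x) sum(x.^2, 2);             % placeholder objective (sphere)
    pop = lb + rand(N, D) .* (ub - lb);  % random population
    opp = lb + ub - pop;                 % opposite population
    better = f(opp) < f(pop);            % rows where the opposite is fitter
    pop(better, :) = opp(better, :);     % keep the better of each pair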
Mandal, Chatterjee, and Maitra [51] used PSO to address the weakness that hampers the Chan and Vese algorithm for image segmentation, namely its low performance when the contours are not well initialised; the contours were initialised simultaneously with the population. Their hybrid solution made contour initialisation irrelevant to the performance of the algorithm. Another effort was presented by [52], where a scheme to initialise the fuzzy c-means (FCM) clustering algorithm using PSO was proposed; finding the optimal cluster centres was set as the objective function of the PSO.
A memetic algorithm (MA) that uses the greedy randomized adaptive search procedure (GRASP) metaheuristic and path relinking to initialise and mutate the population was proposed [53]; however, the scalability of the MA was untested. The authors of [54] proposed an initialisation scheme that used both Metropolis-Hastings (MH) and the function domain contraction technique (FDCT). MH is helpful when generating a sequence directly from a probability distribution is difficult, and it is best suited to highly multidimensional complex optimisations, as these are problem-dependent; in other situations, the FDCT is employed. The FDCT is a sequential three-step procedure starting with a random solution generator; if this is not feasible, the GBEST PSO generator is applied, and if the previous two fail, the search space reduction technique (SSRT) is applied. These steps ensure that the initialised population leads to a better solution.
The competitive swarm optimizer (CSO) is a variant of PSO used by [55] to improve the extreme learning machine (ELM) network by relying on the individual competition of the particles, which optimises the network's weights and structure. Although the results show great promise, more training time was needed to generate effective models. Sawant, Prabukumar, and Samiappan [56] proposed an approach to initialise the cuckoo nests based on the correlation between the spectral bands of the nests; the goal is to ensure convergence by making sure the locations of the nests do not repeat. The k-means clustering algorithm is used to select specific clusters on the band based on their correlation coefficients. Another approach was presented to resolve the lack of diversity of PSO and its sensitivity to initialisation, which quickly leads to premature convergence: the crown jewel defence (CJD) is used to escape being stuck in local optima by relocating and reinitialising the global and local best positions. However, the performance of this improvement was not tested in higher dimensions [57].
The DE and local search were combined to enhance the chances of an optimal solution to the hybrid flow-shop scheduling problem [58]. The brainstorm optimisation (BSO) was improved in the work of [59] by implementing a scheme that allows reinitialisation to be triggered based on the current population. In the work of [60], the authors used FA to detect the maxima and the number of image clusters through histogram-based segmentation; the maxima are then used to initialise the parameter estimates of the Gaussian mixture model (GMM). In the work of [61], the authors proposed a scheme that enhances the initial conditions of an algorithm by treating them as a sub-optimisation problem in which the initial conditions are the parameters to be optimised by the MLA; their results showed improvements compared to the other algorithms used. The FA was also used in the work of [62] as an optimiser to obtain the initial locations of the translation parameters for WNNs, which reduced the number of hidden nodes of the WNN and significantly increased its generalisation capability.
However, time and computational complexity may be a problem for this approach. In addition, the lack of a proven way to hybridise these algorithms means that success depends greatly on the experience of the researcher. A summary of research efforts in this category is given in Table 4.

3.5. Chaos Theory

Chaos theory describes the unpredictability of systems, and over the years, many advances have been made in this area. Chaotic sequences exhibit the following properties: sensitivity to initial conditions, ergodicity, and randomicity. This type of sequence has the advantages of introducing chaos, or unpredictability, into the optimisation, increasing the range of chaotic motion, and using the chaos-induced variables to search the space effectively [69].
Using the logistic chaotic function, ref. [70] proposed novel improvements to CS, one of which is the use of the logistic chaotic function to initialise the population. While the results are promising, they suffer from high computational complexity. The same scheme was used in the work of [71] to improve BA, where the bat population was initialised using chaotic sequences instead of the random number generator. In addition, the bacterial population of BFO was initialised using chaotic sequences generated by logistic mapping [72]. Similarly, the butterflies in the work of [73] were initialised using a homogeneous chaotic sequence adapted to the ultraviolet changes. Among other improvements proposed in the work of [74], a chaotic initialisation strategy was used to initialise the whales in the multi-strategy ensemble whale optimization algorithm (MSWOA).
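A minimal sketch of this logistic-map initialisation follows (our illustration; the control parameter mu = 4 keeps the map in its fully chaotic regime, and the per-dimension seeds are assumptions chosen to avoid the map's fixed points):

    % Chaotic (logistic-map) initialisation sketch.
    N = 50; D = 30; mu = 4;
    lb = -100 * ones(1, D); ub = 100 * ones(1, D);
    x = 0.7 + 0.001 * (0:D-1);           % seeds in (0,1), away from fixed points
    pop = zeros(N, D);
    for i = 1:N
        x = mu .* x .* (1 - x);          % logistic map: x(k+1) = mu*x(k)*(1-x(k))
        pop(i, :) = lb + x .* (ub - lb); % map each chaotic state to the space
    end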
Chaos theory has also been used to initialise the moth-flame optimization (MFO) [75], firefly algorithm (FA) [76], artificial bee colony (ABC) [77], biogeography-based optimization (BBO) [78], krill herd (KH) [79], water cycle algorithm (WCA) [80], and grey wolf optimizer (GWO) [81]. In all cases, the authors claimed superiority of their results over other algorithms; however, high computational complexity remains an issue for this category. We provide a summary of efforts in Table 5.

3.6. Ad Hoc Knowledge of the Domain

In the ad hoc knowledge of the domain approach, authors use background knowledge of the problem domain to design the initialisation scheme of an algorithm; the nature of the problem determines the diversity and spread of the initial population. The scheme proposed in the work of [86] used this approach to generate initial solutions serving as the starting point for the metaheuristic method. Their results were better and, in some cases, competitive; however, we believe this method is excessively problem-dependent, making generalisation impossible. In the same vein, ref. [87] proposed initialising the bats based on ad hoc knowledge of the PV domain; precisely, they used the peaks with similar duty ratios that occur on the power-versus-duty-ratio curve of the boost converter. Yao et al. [88] used the objective function to minimise the wear and tear of the actuators when initialising the population.
The clans in EHO [89] were initialised by considering the acoustic decay model used to obtain the distance between the sensor and the noise source. Depending on the noise level, the intersection of the source coordinates will be at the radii, which is unlikely to be a single point; the clans are initialised based at the centre of the intersection. The technique suffers from being problem-dependent and requires much adaptation before it can be used in other domains. Finally, a scheme to help PSO avoid reinitialisation when capturing the global peaks, as PSO changes its position and value on the P-V curve, was developed by [90]: particles are sent to areas of anticipated peaks and, once a peak is located, assigned to cater for it. Table 6 gives a summary of this approach.

3.7. Lévy Flights

A two-way approach to improving the initialisation scheme of the bees algorithm was also proposed [93]: the patch environment and the Lévy motion imitate the natural food environment and the foraging motion of the bees, respectively. Although the patch concept is used in the original bees algorithm for the neighbourhood search, its use for initialisation, together with the Lévy motion, greatly improved performance. In addition, the performance of the GWO algorithm [94] was enhanced using Lévy flight (LF) and greedy selection, and an improved modified GWO algorithm was proposed to solve global and real-world optimisation problems; to boost the efficacy of GWO, these strategies were integrated with the modified hunting phases. However, no test was carried out on a specific optimisation domain; hence, no comparison was made. A glossary of efforts using this approach is given in Table 7; the authors claimed superiority of their results over other algorithms.
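A common way to realise such Lévy steps is Mantegna's algorithm. The sketch below (our illustration; beta = 1.5 and the 0.01 step scale are conventional choices in the Lévy-flight literature, not values from the cited papers) perturbs a uniform population with Lévy-distributed steps:

    % Levy-flight perturbation of an initial population (Mantegna's algorithm).
    N = 50; D = 30; beta = 1.5;
    lb = -100 * ones(1, D); ub = 100 * ones(1, D);
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) / ...
            (gamma((1 + beta) / 2) * beta * 2^((beta - 1) / 2)))^(1 / beta);
    u = sigma * randn(N, D);
    v = randn(N, D);
    step = u ./ abs(v).^(1 / beta);             % heavy-tailed Levy step lengths
    pop = lb + rand(N, D) .* (ub - lb);         % base uniform population
    pop = min(max(pop + 0.01 * step .* (ub - lb), lb), ub);  % perturb and clamp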

3.8. Others

Other approaches to improve the diversity, spread, and optimality of the initial population of metaheuristic algorithms exist in the literature. This category includes approaches that used mathematical and statistical functions to aid the initial population in an exhaustive search.
A nonlinear simplex method was used to initialise the swarms [102]; the results showed that the particles gravitated better towards good-quality solutions. An approach where one particle is placed at the centre and the rest are spread around it in the search space was considered by [103]; the result is promising but not entirely without bias. The use of complex-valued encoding in metaheuristic optimization research is gaining attention from researchers, and a comprehensive and extensive overview of this approach is presented in [104].
Complex-valued encoding metaheuristic algorithms have been applied extensively in function optimization, engineering design optimization, and combinatorial optimization. Regular metaheuristic algorithms are based on continuous or discrete encoding; the advantage of the complex-valued encoding is that it expands the search region and efficiently avoids falling into local minima. Finally, eight metaheuristic algorithms were enhanced using complex-valued encoding and tested on 29 benchmark test functions and five engineering design optimisation problems. The superiority of complex-valued encoding was demonstrated by analysing and comparing the results for statistical significance, with the complex-valued encoding metaheuristic algorithms returning the best performances. We present a summary of the work in this category in Table 8.

4. Areas of Application

Much of the research that improved the performance of metaheuristic algorithms by improving the nature and diversity of the initial population has been applied in different areas of human endeavour, with significant successes recorded. Figure 10 gives the various application areas of the articles found in the literature.

4.1. Computer Science

Figure 10 shows that the computer science subcategory has the highest number of publications, which can be attributed to the fact that optimisation problems occur naturally in this area: over 43 articles were published in journals indexed in Scopus and over 60 in journals indexed in WoS. This means that the vast majority of these improvements are applied to solve optimisation problems in computer science, particularly in artificial intelligence. Artificial intelligence alone accounts for about 40 articles indexed in WoS, making it the most researched area in computer science. The most cited paper in this area is that of [67], who proposed a hybrid of differential evolution and a greedy algorithm to exploit the advantages of both methods to improve initialisation, among other improvements; it was used to solve the multi-skill resource-constrained project scheduling problem and has been cited 30 times. Hybridisation with other metaheuristic algorithms is the most common initialisation approach used apart from the random number generator; chaos theory and low-discrepancy sequences are also popular in this area of application.

4.2. Engineering

Optimisation problems occur naturally in engineering, and many metaheuristic algorithms are used to solve problems in this domain; this area has the second-highest number of publications. WoS subdivides this category into electrical, electronic, multidisciplinary, industrial, manufacturing, telecommunication, and mechanical categories, whereas Scopus combines them into one. The sub-area of electrical and electronic engineering is the most researched, with over 25 articles indexed in WoS; in total, 12 articles are indexed in Scopus and over 30 in WoS. The most cited article in this category is by [65], in which the authors developed a multi-objective evolutionary algorithm (MOEA)-based proactive-reactive method that introduced a stability objective and heuristic initialisation strategies for the initial solution; the decision-making approach was also validated. The article has been cited 105 times. The area of telecommunications is also well researched, with nine articles indexed in the repository.

4.3. Mathematics

The area of mathematics has provided the foundation for optimisation techniques used by metaheuristic algorithms. Over 28 articles indexed in Scopus are related to this area, and WoS further divided this category into multidisciplinary, computational biology, interdisciplinary, and applied mathematics. This area intersects with engineering and computer science, and many articles in this category are also classified under these other categories.

4.4. Others

We categorise all areas with five or fewer publications into the category “others”. This category comprises automation control systems, remote sensing, robotics, acoustics, chemistry, environmental sciences, management, transportation science and technology, energy, neuroscience, and social sciences. Clearly, 90% of the articles published in this subject area and indexed in WoS or Scopus apply to or solve problems in computer science and engineering. The pockets of research that fall under the other categories, each with few publications, nevertheless point to great diversity in the application areas.

5. Experiment, Result, and Discussion

5.1. Experimental Setup

This section presents the three (3) metaheuristic algorithms and ten (10) initialisation schemes used in our work. The choice of these algorithms and initialisation methods is based on their performance in solving optimisation problems, the availability of their codes online, and the fact that they are part of the many other algorithms and initialisation methods used in our current research projects. Table 9 summarises these algorithms, including the year each article was first published, the authors, and the application area of the first publication. Table 10 and Table 11 summarise the ten initialisation schemes and the control parameters of the algorithms as used in the experiments, respectively.
The variations of the population size and number of iterations are given in Table 12. The variation is such that a large population size goes with a small number of iterations and vice versa; we also included settings where the two are relatively even.
We also conducted a series of experiments to evaluate the effect of the initialisation schemes presented in Table 10 on the three metaheuristic algorithms. We carried out the experiments on ten classical test functions, namely sphere, quartic, Zakharov, Schwefel 1.2, Booth, Michalewicz, Rastrigin, Rosenbrock, Griewank, and Ackley, covering separable and non-separable, unimodal and multimodal, and multi-dimensional problems with varying numbers of local optima. F1 and F2 are unimodal, separable benchmark functions with the dimension (D) set at 30. F3 and F4 are unimodal, non-separable benchmark functions with dimensions set at 30D. Similarly, F5, F6, and F7 are multimodal, separable benchmark functions with dimensions set at 2D, 10D, and 30D, respectively. The multimodal, non-separable benchmark functions are F8, F9, and F10, with dimensions set to 30D.
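For reference, three of these functions are shown below in their standard textbook forms (our restatement; per-function bounds and any shifts are not reproduced here):

    % Standard definitions of three of the classical test functions;
    % each row of x is one candidate solution.
    sphere    = @(x) sum(x.^2, 2);                              % unimodal, separable
    rastrigin = @(x) sum(x.^2 - 10*cos(2*pi*x) + 10, 2);        % multimodal, separable
    ackley    = @(x) -20*exp(-0.2*sqrt(mean(x.^2, 2))) ...
                     - exp(mean(cos(2*pi*x), 2)) + 20 + exp(1); % multimodal, non-separable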
All algorithms were implemented in MATLAB R2019a, and the experiments were conducted under Windows 10 on an Intel Core i7-8550U CPU with 16 GB RAM. The maximum number of iterations was set at 1000, and the number of independent runs was set at 20. We rounded any solution value less than 10^−8 to zero, and the results are reported using the following performance indicators: best, worst, mean, standard deviation, and mean runtime. We then statistically compared the results of the experiments using Friedman's test and post hoc analysis based on the Wilcoxon signed ranks test.
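The statistical pipeline can be sketched as follows (an illustration of the described procedure rather than the exact analysis script; the matrix `results` is assumed to hold one row per test function and one column per compared configuration):

    % Friedman omnibus test followed by Bonferroni-corrected Wilcoxon tests.
    results = rand(10, 10);                 % placeholder: functions x schemes
    [p, tbl, stats] = friedman(results, 1, 'off');
    if p < 0.05                             % significant overall difference
        pPair = signrank(results(:, 1), results(:, 2));   % e.g. scheme 1 vs 2
        nPairs = nchoosek(size(results, 2), 2);           % 45 pairs for 10 schemes
        isSig = pPair < 0.05 / nPairs;      % roughly the p < 0.001 level used above
    end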

5.2. Results and Discussion

The experimental results on the effect of population size and the maximum number of iterations on the metaheuristic algorithms considered are presented in Table 13, Table 14 and Table 15. The Friedman test results for all of the results are given in Table 16; these show a statistically significant difference in the effect of population size and number of iterations for all algorithms tested. The chi-square statistics and p-values are shown in Table 16, and all p-values are less than the tolerance level of 0.05. Post hoc analysis with Wilcoxon signed-rank tests was conducted with a Bonferroni correction applied, resulting in a significance level set at p < 0.001.
The test results for BA are shown in Table 13. We note that the best results are returned when the population size is 1000 and the number of iterations is 10; this setting has the lowest mean rank, as shown in the corresponding column of Table 16, and further post hoc results confirmed that the difference is significant. The implication is that BA performs better with larger population sizes. Similarly, the results for GWO are given in Table 14; GWO failed to return the optimal solution for Rosenbrock but performed optimally when the population size was 50 and the number of iterations was 600, where the lowest mean rank was recorded. A further post hoc test confirmed that GWO performed better with a greater number of iterations. The results for BOA are presented in Table 15, where excellent results are returned for small population sizes; the lowest mean rank is obtained when the population size is 30 and the number of iterations is 800. The post hoc test confirmed that BOA performs optimally with a greater number of iterations.
The results of the experiments conducted to show the effect of the ten different initialisation schemes on the algorithms are presented next, and the findings are discussed. The best, worst, mean, standard deviation, and mean runtime results obtained from the experiments are shown in Table 17, Table 18 and Table 19. The results show that the ten initialisation schemes affect the performance of the algorithms differently, and for some functions the results are better than for others. For some functions, the results appear inconsistent because, while the best value is accurate, the mean value is not. This inconsistency could mean that the initial population was close to the global optimum when the best value was returned; it could also mean that the diversity was well suited to the function, hence its ability to yield a good result. In other cases, more iterations, or a differently diversified population, might be needed to achieve the desired result.
The results of the experiments conducted on BA are given in Table 17. The betarnd(3,2), betarnd(2.5,2.5), raylrnd(0.4), and sobol schemes outperformed rand for most functions. To obtain the general effect of the initialisation schemes on BA, we treated the ten initialisation schemes as observations for Friedman's test; the summary is given in the corresponding column of Table 20. The p-value is 0.000, which is less than α = 0.05; hence, we rejected the null hypothesis, meaning that the performance of BA is sensitive to the initialisation scheme. After a post hoc test based on the Wilcoxon signed ranks test of all the initialisation schemes, using a Bonferroni correction with the significance level set at p < 0.001, betarnd(2.5,2.5) returned the lowest mean rank and is ranked first; we therefore recommend betarnd(2.5,2.5) for BA.
The results for BOA are shown in Table 18, where betarnd(3,2) and unifrnd(0,1) are the best performing initialisation schemes. As shown in Table 20, BOA has a p-value of 0.050, which is equal to α = 0.05; hence, we retained the null hypothesis, meaning that BOA is not sensitive to the initialisation schemes. Similarly, the results for GWO (Table 19) show that lognrnd(0,0.5) and betarnd(3,2) are the best performing initialisation schemes. The Friedman's test result shows a p-value of 0.287, which is greater than α = 0.05; hence, we retained the null hypothesis, meaning that GWO is not sensitive to the initialisation schemes.

6. Conclusions

Many works in the literature clearly outline the role of the initial population in the overall performance of metaheuristic algorithms. However, despite this role and the efforts put forward by researchers in this area, to our knowledge no comprehensive survey of articles on the subject exists. Therefore, the present study presented a comprehensive survey of the different approaches to improving the performance of metaheuristic optimizers through their initialisation schemes. We also showed the publication trends for research in this area and the numbers of citations. Finally, we provided a glossary of efforts made to improve the performance of metaheuristic algorithms using their initialisation schemes, including the areas of application of these improvements, for easy reference by metaheuristic research enthusiasts.
The number of articles published to date in the repositories discussed earlier shows that the area focusing on the initialisation of the population of metaheuristic algorithms is relatively uncharted. Many metaheuristic algorithms have been proposed; however, less effort has been devoted to their initialisation schemes. Most researchers opt for the commonly used random number generator, whose disadvantages have been studied extensively; its ease of implementation may have contributed to its widespread use. The hybridisation of metaheuristic algorithms, on the other hand, has yielded great results in the literature, and authors have had a great degree of success using different initialisation schemes for the algorithms. We see a promising avenue whereby researchers can explore these high-performing initialisation schemes to assess their efficacy; the population size and the number of iterations can be varied along with these schemes, which can help to increase the performance of the algorithms.
Our experiments demonstrate that, for the classical functions under consideration, BA is sensitive to the initialisation scheme, whereas GWO and BOA are not. The sensitivity of the algorithms is also problem-dependent, meaning that some functions were insensitive to the initialisation scheme. The population size and number of iterations play a role in the performance of the algorithms: we found that BA performed better with larger population sizes, while GWO and BOA performed better with greater numbers of iterations. This conclusion depends heavily on the problem dimension; however, we believe that good population diversity and a sufficient number of iterations will most likely lead to optimal solutions.
We also identified the need for initialisation methods that are best suited to a specific problem domain, with statistical backing, to yield optimal solutions for that set of problems. Unfortunately, most papers on metaheuristics perform very little statistical validation, and when they do, it is often only on a single problem that the researchers describe; benchmarking metaheuristics with systematic and sound statistical techniques is usually lacking in published works. In addition, a tuning/adaptive scheme could be developed that is capable of choosing, from a suite of initialisation schemes, the method that will lead to better solutions depending on the nature of the problem encountered. This approach would also improve the diversity of the population.

Author Contributions

Conceptualization, J.O.A. and A.E.E.; methodology, J.O.A. and A.E.E.; software, J.O.A. and A.E.E.; validation, J.O.A. and A.E.E.; formal analysis, J.O.A. and A.E.E.; investigation, J.O.A. and A.E.E.; writing—J.O.A.; writing—A.E.E.; supervision, A.E.E.; project administration, A.E.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yang, X.S. Social Algorithms. In Encyclopedia of Complexity and Systems Science; Meyers, R.A., Ed.; Springer: Berlin/Heidelberg, Germany, 2017.
  2. Ezugwu, A.E.; Shukla, A.K.; Nath, R.; Akinyelu, A.A.; Agushaka, J.O.; Chiroma, H.; Muhuri, P.K. Metaheuristics: A comprehensive overview and classification along with bibliometric analysis. Artif. Intell. Rev. 2021, 54, 4237–4316.
  3. Ezugwu, A.E.; Adeleke, O.J.; Akinyelu, A.A.; Viriri, S. A conceptual comparison of several metaheuristic algorithms on continuous optimization problems. Neural Comput. Appl. 2020, 32, 6207–6251.
  4. Dokeroglu, T.; Sevinc, E.; Kucukyilmaz, T.; Cosar, A. A survey on new generation metaheuristic algorithms. Comput. Ind. Eng. 2019, 137, 106040.
  5. Kondamadugula, S.; Naidu, S.R. Accelerated evolutionary algorithms with parameter importance based population initialization for variation-aware analog yield optimization. In Proceedings of the 2016 IEEE 59th International Midwest Symposium on Circuits and Systems (MWSCAS), Abu Dhabi, United Arab Emirates, 16–19 October 2016.
  6. Elsayed, S.; Sarker, R.; Coello, C.A.C. Sequence-based deterministic initialization for evolutionary algorithms. IEEE Trans. Cybern. 2016, 47, 2911–2923.
  7. Tzanetos, A.; Dounias, G. Nature inspired optimization algorithms or simply variations of metaheuristics. Artif. Intell. Rev. 2021, 54, 1841–1862.
  8. Pant, M.; Thangaraj, R.; Grosan, C.; Abraham, A. Improved particle swarm optimization with low-discrepancy sequences. In Proceedings of the 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, China, 1–6 June 2008.
  9. Gentle, J. Random Number Generation and Monte Carlo Methods; Springer Science & Business Media: Boston, MA, USA, 2006.
  10. Agushaka, J.O.; Ezugwu, A.E. Advanced Arithmetic Optimization Algorithm for solving mechanical engineering design problems. PLoS ONE 2021, 16, e0255703.
  11. Imran, M.; Hashima, R.; Khalid, N.E.A. An overview of particle swarm optimization variants. Procedia Eng. 2013, 53, 491–496.
  12. Osaba, E.; Carballedo, R.; Diaz, F.; Onieva, E.; Lopez, P.; Perallos, A. On the influence of using initialization functions on genetic algorithms solving combinatorial optimization problems: A first study on the TSP. In Proceedings of the 2014 IEEE Conference on Evolving and Adaptive Intelligent Systems (EAIS), Linz, Austria, 2–4 June 2014.
  13. Li, Q.; Liu, S.Y.; Yang, X.S. Influence of initialization on the performance of metaheuristic optimizers. Appl. Soft Comput. 2020, 91, 106193.
  14. Weidt Neiva, F.; de Souza da Silva, R.L. Systematic Literature Review in Computer Science—A Practical Guide; Technical Report of Computer Science Department DCC/UFJF RelaTeDCC 002/2016; Federal University of Juiz de Fora: Juiz de Fora, Brazil, 2016.
  15. Jauro, F.; Chiroma, H.; Gital, A.; Almutairi, M.; Shafi'i, M.; Abawajy, J. Deep learning architectures in emerging cloud computing architectures: Recent development, challenges and next research trend. Appl. Soft Comput. 2020, 96, 106582.
  16. Cantú-Paz, E. On random numbers and the performance of genetic algorithms. Comput. Sci. Prepr. Arch. 2002, 2002, 203–210.
  17. Daida, J.; Ross, S.; McClain, J.; Ampy, D.; Holczer, M. Challenges with verification, repeatability, and meaningful comparisons in genetic programming. In Genetic Programming 1997: Proceedings of the Second Annual Conference; Morgan Kaufmann Publishers: San Francisco, CA, USA, 1997.
  18. Wang, X.; Hickernell, F. Randomized halton sequences. Math. Comput. Model. 2000, 32, 887–899.
  19. Niederreiter, H. Random Number Generation and Quasi-Monte Carlo Methods; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 1992.
  20. Morokoff, W.; Caflisch, R. Quasirandom sequences and their discrepancies. SIAM J. Sci. Comput. 1994, 15, 1251–1279.
  21. Uy, N.Q.; Hoai, N.; McKay, R.; Tuan, P. Initialising PSO with randomized low-discrepancy sequences: The comparative results. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007.
  22. Agushaka, J.; Ezugwu, A. Influence of Initializing Krill Herd Algorithm with Low-Discrepancy Sequences. IEEE Access 2020, 8, 210886–210909.
  23. Brits, R.; Engelbrecht, A.; van den Bergh, F. A niching particle swarm optimizer. In Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 18–22 November 2002; Volume 2.
  24. Covic, N.; Lacevic, B. Wingsuit flying search—A novel global optimization algorithm. IEEE Access 2020, 8, 53883–53900.
  25. Bangyal, W.H.; Ahmad, J.; Rauf, H.T.; Pervaiz, S. An improved bat algorithm based on novel initialization technique for global optimization problem. Int. J. Adv. Comput. Sci. Appl. 2018, 9, 158–166.
  26. Kimura, S.; Matsumura, K. Genetic algorithms using low-discrepancy sequences. In Proceedings of the 7th Annual Conference on Genetic and Evolutionary Computation, Washington, DC, USA, 25–29 June 2005.
  27. Kucherenko, S.; Sytsko, Y. Application of deterministic low-discrepancy sequences in global optimization. Comput. Optim. Appl. 2005, 30, 297–318.
  28. Thangaraj, R.; Pant, M.; Abraham, A.; Badr, Y. Hybrid evolutionary algorithm for solving global optimization problems. In Proceedings of the International Conference on Hybrid Artificial Intelligence Systems, Salamanca, Spain, 10–12 June 2009; Springer: Berlin/Heidelberg, Germany, 2009.
  29. Bangyal, W.H.; Ahmad, J.; Rauf, H.T. Comparison of Different Bat Initialization Techniques for Global Optimization Problems. Int. J. Appl. Metaheuristic Comput. 2021, 12, 157–184.
  30. Nakib, A.; Daachi, B.; Siarry, P. Hybrid differential evolution using low-discrepancy sequences for image segmentation. In Proceedings of the 2012 IEEE 26th International Parallel and Distributed Processing Symposium Workshops & PhD Forum, Shanghai, China, 21–25 May 2012.
  31. Georgioudakis, M.; Lagaros, N.D.; Papadrakakis, M. Probabilistic shape design optimization of structural components under fatigue. Comput. Struct. 2017, 182, 252–266.
  32. Mosbah, H.; El-Hawary, M.E. Optimization of neural network parameters by Stochastic Fractal Search for dynamic state estimation under communication failure. Electr. Power Syst. Res. 2017, 147, 288–301.
  33. Wood, D.-A. Hybrid cuckoo search optimization algorithms applied to complex wellbore trajectories aided by dynamic, chaos-enhanced, fat-tailed distribution sampling and metaheuristic profiling. J. Nat. Gas Sci. Eng. 2016, 34, 236–252.
  34. Shanmugam, G.; Ganesan, P.; Vanathi, D.P. Meta heuristic algorithms for vehicle routing problem with stochastic demands. J. Comput. Sci. 2011, 7, 533.
  35. de Melo, V.V.; Delbem, A.C.B. Investigating smart sampling as a population initialization method for differential evolution in continuous problems. Inf. Sci. 2012, 193, 36–53.
  36. Rauf, H.T.; Bangyal, W.H.; Ahmad, J.; Bangyal, S.A. Training of artificial neural network using pso with novel initialization technique. In Proceedings of the 2018 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), Zallaq, Bahrain, 18–20 November 2018.
  37. Bui, D.T.; Pradhan, B.; Nampak, H.; Bui, Q.T.; Tran, Q.A.; Nguyen, Q.P. Hybrid artificial intelligence approach based on neural fuzzy inference model and metaheuristic optimization for flood susceptibility modeling in a high-frequency tropical cyclone area using GIS. J. Hydrol. 2016, 540, 317–330.
  38. Termeh, S.V.R.; Khosravi, K.; Sartaj, M.; Keesstra, S.D.; Tsai, F.T.C.; Dijksma, R.; Pham, B.T. Optimization of an adaptive neuro-fuzzy inference system for groundwater potential mapping. Hydrogeol. J. 2019, 27, 2511–2534.
  39. Lozano, M.; Duarte, A.; Gortázar, F.; Martí, R. A hybrid metaheuristic for the cyclic antibandwidth problem. Knowl.-Based Syst. 2013, 54, 103–113.
  40. Wang, G.G.; Hao, G.S.; Cheng, S.; Cui, Z. An improved monarch butterfly optimization with equal partition and f/t mutation. In Proceedings of the International Conference on Swarm Intelligence, Hong Kong, China, 25–27 March 2017.
  41. Hodashinsky, I.A.; Filimonenko, I.V.; Sarin, K.S. Krill herd and piecewise-linear initialization algorithms for designing Takagi–Sugeno systems. Optoelectron. Instrum. Data Process. 2017, 53, 379–387.
  42. Jiang, Y.; Shao, Z.; Guo, Y.; Zhang, H.; Niu, K. Drscro: A metaheuristic algorithm for task scheduling on heterogeneous systems. Math. Probl. Eng. 2015, 2015, 396582.
  43. Kang, T.; Yao, J.; Jin, M.; Yang, S.; Duong, T. A novel improved cuckoo search algorithm for parameter estimation of photovoltaic (PV) models. Energies 2018, 11, 1060.
  44. Vlašić, I.; Ðurasević, M.; Jakobović, D. Improving genetic algorithm performance by population initialization with dispatching rules. Comput. Ind. Eng. 2019, 137, 106030.
  45. Aminbakhsh, S.; Sonmez, R. Pareto front particle swarm optimizer for discrete time-cost trade-off problem. J. Comput. Civ. Eng. 2017, 31, 04016040.
  46. Wijayanto, A.W.; Purwarianti, A. Improvement design of fuzzy geo-demographic clustering using Artificial Bee Colony optimization. In Proceedings of the 2014 International Conference on Cyber and IT Service Management (CITSM), Bali, Indonesia, 8–10 August 2014.
  47. Han, Y.Q.; Li, J.Q.; Liu, Z.; Liu, C.; Tian, J. Metaheuristic algorithm for solving the multiobjective vehicle routing problem with time window and drones. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420920031.
  48. Xiang, W.L.; Meng, X.L.; An, M.Q.; Li, Y.Z.; Gao, M.X. An enhanced differential evolution algorithm based on multiple mutation strategies. Comput. Intell. Neurosci. 2015, 2015, 285730.
  49. Yeoh, J.M.; Caraffini, F.; Homapour, E.; Santucci, V.; Milani, A. A clustering system for dynamic data streams based on metaheuristic optimization. Mathematics 2019, 7, 1229.
  50. Carrizales-Turrubiates, O.; Rangel-Valdez, N.; Torres-Jiménez, J. Optimal shortening of covering arrays. In Proceedings of the Mexican International Conference on Artificial Intelligence, Puebla, Mexico, 26 November–4 December 2011; Springer: Berlin/Heidelberg, Germany, 2011.
  51. Mandal, D.; Chatterjee, A.; Maitra, M. Robust medical image segmentation using particle swarm optimization aided level set based global fitting energy active contour approach. Eng. Appl. Artif. Intell. 2014, 35, 199–214.
  52. Benaichouche, A.N.; Oulhadj, H.; Siarry, P. Improved spatial fuzzy c-means clustering for image segmentation using PSO initialization, Mahalanobis distance and post-segmentation correction. Digit. Signal Process. 2013, 23, 1390–1400.
  53. Gallardo, J.E.; Cotta, C. A GRASP-based memetic algorithm with path relinking for the far from most string problem. Eng. Appl. Artif. Intell. 2015, 41, 183–194.
  54. Kohler, M.; Vellasco, M.M.; Tanscheit, R. PSO+: A new particle swarm optimization algorithm for constrained problems. Appl. Soft Comput. 2019, 85, 105865.
  55. Eshtay, M.; Faris, H.; Obeid, N. A competitive swarm optimizer with hybrid encoding for simultaneously optimizing the weights and structure of Extreme Learning Machines for classification problems. Int. J. Mach. Learn. Cybern. 2020, 11, 1801–1823.
  56. Sawant, S.S.; Prabukumar, M.; Samiappan, S. A band selection method for hyperspectral image classification based on cuckoo search algorithm with correlation based initialization. In Proceedings of the 2019 10th Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS), Amsterdam, The Netherlands, 24–26 September 2019.
  57. Lin, L.; Ji, Z.; He, S.; Zhu, Z. A crown jewel defense strategy based particle swarm optimization. In Proceedings of the 2012 IEEE Congress on Evolutionary Computation, Brisbane, Australia, 10–15 June 2012.
  58. Sun, Y.; Qi, X. A DE-LS Metaheuristic Algorithm for Hybrid Flow-Shop Scheduling Problem considering Multiple Requirements of Customers. Sci. Program. 2020, 2020, 8811391.
  59. El-Abd, M. Global-best brain storm optimization algorithm. Swarm Evol. Comput. 2017, 37, 27–44.
  60. Giuliani, D. A Grayscale Segmentation Approach Using the Firefly Algorithm and the Gaussian Mixture Model. Int. J. Swarm Intell. Res. 2018, 9, 39–57.
  61. Ivorra, B.; Mohammadi, B.; Ramos, A.M. A multi-layer line search method to improve the initialization of optimization algorithms. Eur. J. Oper. Res. 2015, 247, 711–720.
  62. Zainuddin, Z.; Ong, P. Optimization of wavelet neural networks with the firefly algorithm for approximation problems. Neural Comput. Appl. 2017, 28, 1715–1728.
  63. Li, W.; Özcan, E.; John, R. A learning automata-based multiobjective hyper-heuristic. IEEE Trans. Evol. Comput. 2017, 23, 59–73.
  64. Mehrmolaei, S.; Keyvanpour, M.R.; Savargiv, M. Metaheuristics on time series clustering problem: Theoretical and empirical evaluation. Evol. Intell. 2020.
  65. Shen, X.N.; Yao, X. Mathematical modeling and multiobjective evolutionary algorithms applied to dynamic flexible job shop scheduling problems. Inf. Sci. 2015, 298, 198–224.
  66. Xiang, W.L.; An, M.Q.; Li, Y.Z.; He, R.C.; Zhang, J.F. An improved global-best harmony search algorithm for faster optimization. Expert Syst. Appl. 2014, 41, 5788–5803.
  67. Myszkowski, P.B.; Olech, Ł.P.; Laszczyk, M.; Skowroński, M.E. Hybrid differential evolution and greedy algorithm (DEGR) for solving multi-skill resource-constrained project scheduling problem. Appl. Soft Comput. 2018, 62, 1–14.
  68. Aqil, S.; Allali, K. Local search metaheuristic for solving hybrid flow shop problem in slabs and beams manufacturing. Expert Syst. Appl. 2020, 162, 113716.
  69. Tavazoei, M.S.; Haeri, M. Comparison of different one-dimensional maps as chaotic search pattern in chaos optimization algorithms. Appl. Math. Comput. 2007, 187, 1076–1085.
  70. Suresh, S.; Lal, S.; Reddy, C.S.; Kiran, M.S. A novel adaptive cuckoo search algorithm for contrast enhancement of satellite images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 3665–3676.
  71. Afrabandpey, H.; Ghaffari, M.; Mirzaei, A.; Safayani, M. A novel bat algorithm based on chaos for optimization tasks. In Proceedings of the 2014 Iranian Conference on Intelligent Systems (ICIS), Bam, Iran, 4–6 February 2014.
  72. Zhang, Q.; Chen, H.; Luo, J.; Xu, Y.; Wu, C.; Li, C. Chaos enhanced bacterial foraging optimization for global optimization. IEEE Access 2018, 6, 64905–64919.
  73. Li, B.; Liu, C.; Wu, H.; Zhao, Y.; Dong, Y. Chaotic adaptive butterfly mating optimization and its applications in synthesis and structure optimization of antenna arrays. Int. J. Antennas Propag. 2019, 2019, 1730868.
  74. Yuan, X.; Miao, Z.; Liu, Z.; Yan, Z.; Zhou, F. Multi-Strategy Ensemble Whale Optimization Algorithm and Its Application to Analog Circuits Intelligent Fault Diagnosis. Appl. Sci. 2020, 10, 3667.
  75. Wang, M.; Chen, H.; Yang, B.; Zhao, X.; Hu, L.; Cai, Z.; Huang, H.; Tong, C. Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses. Neurocomputing 2017, 267, 69–84.
  76. Gandomi, A.H.; Yang, X.S.; Talatahari, S.; Alavi, A.H. Firefly algorithm with chaos. Commun. Nonlinear Sci. Numer. Simul. 2013, 18, 89–98.
  77. Wu, B.; Fan, S.H. Improved artificial bee colony algorithm with chaos. In Proceedings of the International Workshop on Computer Science for Environmental Engineering and EcoInformatics, Kunming, China, 29–31 July 2011; Springer: Berlin/Heidelberg, Germany, 2011.
  78. Saremi, S.; Mirjalili, S.; Lewis, A. Biogeography-based optimization with chaos. Neural Comput. Appl. 2014, 25, 1077–1097.
  79. Wang, G.G.; Guo, L.; Gandomi, A.H.; Hao, G.S.; Wang, H. Chaotic krill herd algorithm. Inf. Sci. 2014, 274, 17–34.
  80. Heidari, A.A.; Abbaspour, R.A.; Jordehi, A.R. An efficient chaotic water cycle algorithm for optimization tasks. Neural Comput. Appl. 2017, 28, 57–85.
  81. Kohli, M.; Arora, S. Chaotic grey wolf optimization algorithm for constrained optimization problems. J. Comput. Des. Eng. 2018, 5, 458–472.
  82. Sayed, G.I.; Khoriba, G.; Haggag, M.H. A novel chaotic salp swarm algorithm for global optimization and feature selection. Appl. Intell. 2018, 48, 3462–3481.
  83. Anter, A.M.; Ali, M. Feature selection strategy based on hybrid crow search optimization algorithm integrated with chaos theory and fuzzy c-means algorithm for medical diagnosis problems. Soft Comput. 2020, 24, 1565–1584.
  84. Kaveh, A.; Sheikholeslami, R.; Talatahari, S.; Keshvari-Ilkhichi, M. Chaotic swarming of particles: A new method for size optimization of truss structures. Adv. Eng. Softw. 2014, 67, 136–147.
  85. Liu, F.; Duan, H.; Deng, Y. A chaotic quantum-behaved particle swarm optimization based on lateral inhibition for image matching. Optik 2012, 123, 1955–1960.
  86. Faia, R.; Pinto, T.; Vale, Z.; Corchado, J.M. An ad-hoc initial solution heuristic for metaheuristic optimization of energy market participation portfolios. Energies 2017, 10, 883.
  87. Eltamaly, A.M.; Al-Saud, M.S.; Abokhalil, A.G. A Novel Bat Algorithm Strategy for Maximum Power Point Tracker of Photovoltaic Energy Systems under Dynamic Partial Shading. IEEE Access 2020, 8, 10048–10060.
  88. Lu, Y.; Sun, Y.; Liu, X.; Gao, B. Control allocation for a class of morphing aircraft with integer constraints based on Lévy flight. J. Syst. Eng. Electron. 2020, 31, 826–840.
  89. Correia, S.D.; Beko, M.; Tomic, S.; Cruz, L.A.D.S. Energy-Based Acoustic Localization by Improved Elephant Herding Optimization. IEEE Access 2020, 8, 28548–28559.
  90. Eltamaly, A.M.; Al-Saud, M.S.; Abo-Khalil, A.G. Performance Improvement of PV Systems' Maximum Power Point Tracker Based on a Scanning PSO Particle Strategy. Sustainability 2020, 12, 1185.
  91. Abbas, A.; Hewahi, N.M. Imaging the search space: A nature-inspired metaheuristic extension. Evol. Intell. 2020, 13, 463–474.
  92. El-Sayed, W.T.; El-Saadany, E.F.; Zeineldin, H.H.; Al-Sumaiti, A.S. Fast initialization methods for the nonconvex economic dispatch problem. Energy 2020, 201, 117635.
  93. Hussein, W.A.; Sahran, S.; Abdullah, S.N.H.S. A new initialization algorithm for bees algorithm. In Proceedings of the International Multi-Conference on Artificial Intelligence Technology, Shah Alam, Malaysia, 28–29 August 2013; Springer: Berlin/Heidelberg, Germany, 2013.
  94. Heidari, A.A.; Pahlavani, P. An efficient modified grey wolf optimizer with Lévy flight for optimization tasks. Appl. Soft Comput. 2017, 60, 115–134.
  95. Lin, J.H.; Chou, C.W.; Yang, C.H.; Tsai, H.L. A chaotic Levy flight bat algorithm for parameter estimation in nonlinear dynamic biological systems. Comput. Inf. Technol. 2012, 2, 56–63.
  96. Aydoğdu, İ.; Akın, A.; Saka, M.P. Design optimization of real world steel space frames using artificial bee colony algorithm with Levy flight distribution. Adv. Eng. Softw. 2016, 92, 1–14.
  97. Amirsadri, S.; Mousavirad, S.J.; Ebrahimpour-Komleh, H. A Levy flight-based grey wolf optimizer combined with back-propagation algorithm for neural network training. Neural Comput. Appl. 2018, 30, 3707–3720.
  98. Barshandeh, S.; Haghzadeh, M. A new hybrid chaotic atom search optimization based on tree-seed algorithm and Levy flight for solving optimization problems. Eng. Comput. 2020, 37, 3079–3122.
  99. Chegini, S.N.; Bagheri, A.; Najafi, F. PSOSCALF: A new hybrid PSO based on Sine Cosine Algorithm and Levy flight for solving optimization problems. Appl. Soft Comput. 2018, 73, 697–726.
  100. Abdulwahab, H.A.; Noraziah, A.; Alsewari, A.A.; Salih, S.Q. An enhanced version of black hole algorithm via levy flight for optimization and data clustering problems. IEEE Access 2019, 7, 142085–142096.
  101. Jensi, R.; Jiji, G.W. An enhanced particle swarm optimization with levy flight for global optimization. Appl. Soft Comput. 2016, 43, 248–261.
  102. Parsopoulos, K.; Vrahatis, M. Initializing the particle swarm optimizer using the nonlinear simplex method. Adv. Intell. Syst. Fuzzy Syst. Evol. Comput. 2002, 216, 1–6.
  103. Richards, M.; Ventura, D. Choosing a starting configuration for particle swarm optimization. Neural Netw. 2004, 3, 2309–2312.
  104. Wang, P.; Zhou, Y.; Luo, Q.; Han, C.; Niu, Y.; Lei, M. Complex-valued encoding metaheuristic optimization algorithm: A comprehensive survey. Neurocomputing 2020, 407, 313–342.
  105. De Lima Corrêa, L.; Dorn, M. A multi-population memetic algorithm for the 3-D protein structure prediction problem. Swarm Evol. Comput. 2020, 55, 100677.
  106. Ahmed, Z.H. Genetic algorithm for the traveling salesman problem using sequential constructive crossover operator. Int. J. Biom. Bioinform. 2010, 3, 96.
  107. Talbi, E.G. Combining metaheuristics with mathematical programming, constraint programming and machine learning. Ann. Oper. Res. 2016, 240, 171–215.
  108. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (Nicso 2010); Studies in Computational Intelligence; Gonzalez, J.R., Ed.; Springer: Berlin/Heidelberg, Germany, 2010.
  109. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  110. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734.
Figure 1. Bukin N. 6 landscape.
Figure 2. The beta distribution of the population after the first iteration.
Figure 3. The beta distribution of the population after a few iterations.
Figure 4. The random number distribution of the population after the first iteration.
Figure 5. The random number distribution of the population after a few iterations.
Figure 6. Document Type Distribution from WoS.
Figure 7. Document Type Distribution from Scopus.
Figure 8. Random numbers after 1000 iterations.
Figure 9. Halton sequence after the 1000th iteration.
Figure 10. Application areas reported in the literature.
Table 1. Inclusion/Exclusion Criteria.
Inclusion | Exclusion
Articles that used different initialisation schemes to improve the performance of metaheuristic algorithms | While we discussed the commonly used pseudo-random number initialisation scheme, we excluded algorithms that used the scheme; including these articles would mean reviewing all metaheuristic algorithms, which is outside the scope of this work
Articles published in reputable peer-reviewed journals, conference proceedings, and edited books | Articles published as part of textbooks, abstracts, editorials, and keynote speeches
Articles that are written in the English language | Articles written in languages other than English
Table 2. Summary of quasirandom methods.
Reference | Year | Initialisation Scheme | Optimisation Problem
[21] | 2007 | Halton, Sobol, and Faure | Benchmark test functions
[8] | 2008 | Van der Corput and Sobol sequences | Benchmark test functions
[25] | 2018 | A quasirandom sequence (Torus) | Benchmark test functions
[22] | 2020 | Van der Corput, Faure, and Sobol sequences | Benchmark test functions
[26] | 2005 | Halton low-discrepancy sequence | Benchmark functions
[27] | 2005 | Sobol and Halton sequences | Benchmark functions
[23] | 2002 | Faure sequences | Benchmark functions
[28] | 2009 | Sobol sequence | Benchmark functions
[29] | 2021 | Halton, Sobol, and Torus | Benchmark functions
[30] | 2012 | Sobol sequences | Image segmentation
[31] | 2017 | Latin hypercube sampling (LHS) | Optimisation of structural components under fatigue
Table 3. Summary of PDF category.
Reference | Year | Initialisation Scheme | Optimisation Problem
[32] | 2017 | Stochastic fractal search technique | Dynamic state estimation (DSE) problem at the filtering stage
[13] | 2020 | Variants of beta distribution, uniform distribution, normal distribution, logarithmic normal distribution, exponential distribution, Rayleigh distribution, Weibull distribution, and Latin hypercube sampling | Benchmark functions
[31] | 2017 | Latin hypercube sampling | Structural components under fatigue
[34] | 2011 | Stochastic demands | Vehicle routing problem
[35] | 2012 | Smart sampling | Benchmark functions
[36] | 2018 | Log-logistic | Training of artificial neural networks
[37] | 2016 | Neural fuzzy inference | Flood susceptibility modeling
[38] | 2019 | Adaptive neuro-fuzzy inference | Groundwater potential mapping
[33] | 2016 | Stochastic random sampling of simpler fat-tailed distributions enhanced with scaled chaotic sequences | Complex wellbore trajectories
Table 4. Summary of hybrid methods.
Reference | Year | Initialisation Scheme | Optimisation Problem
[51] | 2014 | PSO | Image segmentation
[53] | 2015 | GRASP | Far from most string problem (FFMSP)
[55] | 2020 | Competitive swarm optimizer (CSO) | Training single hidden layer feedforward networks (SLFN)
[56] | 2019 | K-means clustering algorithm | Band selection of hyperspectral images
[60] | 2018 | Firefly algorithm | Greyscale image segmentation
[52] | 2013 | PSO | Image segmentation with Mahalanobis distance and post-segmentation correction
[46] | 2014 | ABC | Geo-demographic analysis
[62] | 2017 | Firefly algorithm | Wavelet neural networks (WNNs)
[49] | 2019 | Metaheuristic algorithm | Dynamic data streams
[63] | 2017 | Hyper-heuristic | Benchmark test functions
[64] | 2020 | Hybrid of fuzzy metaheuristics (e.g., FATPSO) and the base TsC algorithms | Different areas
[65] | 2015 | Heuristic initialisation strategies | Dynamic flexible job shop scheduling problems
[66] | 2014 | Opposition-based learning | Benchmark test functions
[42] | 2015 | VNS | Task scheduling
[48] | 2015 | Opposition-based learning | Benchmark functions
[43] | 2018 | Quasi-opposition-based learning (QOBL) | Parameter estimation of photovoltaic (PV) models
[57] | 2012 | Crown jewel defense (CJD) | Benchmark test functions
[58] | 2020 | DE combined with local search | Hybrid flow-shop scheduling problem
[41] | 2017 | Piecewise-linear initialisation | Takagi–Sugeno fuzzy systems
[67] | 2018 | Greedy algorithm | Multi-skill resource-constrained project scheduling problem
[68] | 2020 | Nawaz–Enscore–Ham (NEH) algorithm | Slabs and beams manufacturing
Table 5. Summary of chaos-based methods.
Reference | Year | Initialisation Scheme | Optimisation Problem
[71] | 2014 | Chaotic sequence | Benchmark functions
[70] | 2017 | Logistic chaotic function | Enhancement of satellite images
[72] | 2018 | Chaotic sequence | Benchmark test functions
[73] | 2019 | Chaotic sequence | Synthesis and structure optimisation of antenna arrays
[74] | 2020 | Chaos theory | Analog circuit intelligent fault diagnosis
[75] | 2017 | Chaos theory | Medical diagnoses
[76] | 2013 | Chaos theory | Benchmark test functions
[77] | 2011 | Chaos theory | Benchmark test functions
[78] | 2014 | Chaos theory | Benchmark test functions
[79] | 2014 | Chaos theory | Benchmark test functions
[80] | 2017 | Chaos theory | Benchmark test functions
[82] | 2018 | Chaos theory | Global optimisation and feature selection
[83] | 2020 | Chaos theory | Medical diagnosis problems
[84] | 2014 | Chaos theory | Truss structures
[85] | 2012 | Chaos theory | Image matching
[81] | 2018 | Chaos theory | Benchmark test functions
Table 6. Summary of methods based on ad hoc knowledge.
Reference | Year | Initialisation Scheme | Optimisation Problem
[86] | 2017 | Ad hoc knowledge | Energy market participation portfolios
[88] | 2020 | Ad hoc knowledge | Morphing aircraft
[54] | 2019 | Metropolis–Hastings (MH) and function domain contraction technique (FDCT) | Reservoir drainage plan optimisation problem
[91] | 2020 | Imaging the search space | Black-box setting called COCO
[87] | 2020 | Ad hoc knowledge | Photovoltaic energy systems under dynamic partial shading
[89] | 2020 | Ad hoc knowledge | Energy-based acoustic localization
[92] | 2020 | Ad hoc knowledge | Nonconvex economic dispatch problem
[90] | 2020 | Ad hoc knowledge | PV systems
Table 7. Summary of Lévy flight methods.
Reference | Year | Initialisation Scheme | Optimisation Problem
[94] | 2017 | Lévy flight (LF) and greedy selection | Benchmark test functions
[95] | 2012 | Chaotic Lévy motion | Nonlinear dynamic biological systems
[96] | 2016 | Lévy motion | Steel space frames
[97] | 2018 | Lévy motion | Neural network training
[98] | 2020 | Lévy motion | Benchmark functions
[99] | 2018 | Lévy motion | Engineering design problems
[100] | 2019 | Lévy motion | Data clustering problems
[101] | 2016 | Lévy motion | Global optimisation
[93] | 2013 | Lévy motion | Benchmark test functions
Table 8. Other approaches to improving initialisation schemes.
Reference | Year | Initialisation Scheme | Optimisation Problem
[61] | 2015 | Multi-layer line search methods | Benchmark test functions
[40] | 2017 | Equal partition and F/T mutation | Benchmark test functions
[44] | 2019 | Dispatching rules | Benchmark test functions
[47] | 2020 | Push-forward insertion heuristic | Vehicle routing problem
[45] | 2017 | Siemens approximation method (SAM) | Discrete time-cost trade-off problem
[105] | 2020 | Angle probability list strategy | 3-D protein structure prediction problem
[106] | 2010 | Sequential constructive crossover operator | Travelling salesman problem
[107] | 2016 | Mathematical programming, constraint programming, and machine learning | Benchmark functions
[39] | 2013 | Randomized breadth-first search | Cyclic antibandwidth problem
Table 9. Summary of algorithms used.
S/N | Algorithm | Authors and Year of Publication | Application Area (When the Algorithm Was First Published/Proposed)
1 | BA | [108] | Benchmark test functions
2 | GWO | [109] | Benchmark test functions, tension/compression spring, welded beam, and pressure vessel designs, and optical engineering
3 | BOA | [110] | Benchmark test functions, spring design, welded beam design, and gear train design
Table 10. Initialisation schemes.
S/N | Initialisation Scheme | Function
1 | Random | rand
2 | Beta | betarnd(3, 2)
3 | Beta | betarnd(2.5, 2.5)
4 | Uniform | unifrnd(0, 1)
5 | Logarithmic normal | lognrnd(0, 0.5)
6 | Exponential | exprnd(0.5)
7 | Rayleigh | raylrnd(0.4)
8 | Weibull | wblrnd(1, 1)
9 | Latin hypercube sampling | lhsdesign
10 | Sobol | sobol
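To make the mapping in Table 10 concrete, the sketch below generates candidate initial populations with NumPy/SciPy analogues of these MATLAB-style samplers. It is a minimal illustration under our own conventions (e.g., clipping samples from unbounded distributions to [0, 1] before scaling), not the exact code used in the experiments.

```python
# Hedged sketch: an N-by-dim initial population from the samplers of Table 10
# (NumPy/SciPy analogues of the listed MATLAB functions).
import numpy as np
from scipy.stats import qmc  # Latin hypercube and Sobol samplers

N, dim, lb, ub = 30, 10, -100.0, 100.0
rng = np.random.default_rng(1)

samples01 = {
    "rand":             rng.random((N, dim)),
    "betarnd(3,2)":     rng.beta(3.0, 2.0, (N, dim)),
    "betarnd(2.5,2.5)": rng.beta(2.5, 2.5, (N, dim)),
    "unifrnd(0,1)":     rng.uniform(0.0, 1.0, (N, dim)),
    "lognrnd(0,0.5)":   np.clip(rng.lognormal(0.0, 0.5, (N, dim)), 0.0, 1.0),
    "exprnd(0.5)":      np.clip(rng.exponential(0.5, (N, dim)), 0.0, 1.0),
    "raylrnd(0.4)":     np.clip(rng.rayleigh(0.4, (N, dim)), 0.0, 1.0),
    "wblrnd(1,1)":      np.clip(rng.weibull(1.0, (N, dim)), 0.0, 1.0),
    "lhsdesign":        qmc.LatinHypercube(d=dim, seed=1).random(N),
    # Sobol prefers power-of-two sample counts; SciPy warns but still runs.
    "Sobol":            qmc.Sobol(d=dim, seed=1).random(N),
}

# Map every unit-cube sample to the actual search domain [lb, ub].
populations = {k: lb + (ub - lb) * v for k, v in samples01.items()}
```

How unbounded samples are folded back into the unit cube (clipping, rescaling by the sample maximum, or modular wrapping) is itself a design choice that changes the sampling emphasis, so any reproduction should state its convention explicitly.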
Table 11. Algorithm-specific parameters.
S/N | Algorithm | Parameters
1 | BA | A = rand(N,1), r = rand(N,1), alpha = 0.5, gamma = 0.5, and ro = 0.001
2 | GWO | Alpha_pos = zeros(1, dim), Alpha_score = inf, Beta_pos = zeros(1, dim), Beta_score = inf, Delta_pos = zeros(1, dim), and Delta_score = inf
3 | BOA | Probability switch (p) = 0.8, power_exponent = 0.1, and sensory_modality = 0.01
Table 12. Population size and number of iterations.
Initialisation Parameters | Values
Population size | 10 | 20 | 30 | 50 | 100 | 300 | 500 | 1000
Number of iterations | 1000 | 900 | 800 | 600 | 500 | 300 | 100 | 10
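The pairing in Table 12 couples larger populations with smaller iteration budgets. A small sketch of how such a grid might be driven is shown below; run_optimizer is a hypothetical stand-in for BA, GWO, or BOA, not an API from the paper.

```python
# Hedged sketch of the experimental grid implied by Table 12.
pop_sizes  = [10, 20, 30, 50, 100, 300, 500, 1000]
iterations = [1000, 900, 800, 600, 500, 300, 100, 10]

for pop, iters in zip(pop_sizes, iterations):
    # best = run_optimizer(objective, pop_size=pop, max_iter=iters)  # hypothetical
    print(f"pop = {pop:4d}, iterations = {iters:4d}, "
          f"approx. function evaluations = {pop * iters}")
```

Note that the total evaluation budget is not constant across the grid (it ranges from 10,000 at both extremes to 90,000 at Pop = 300), so the configurations differ in overall budget as well as in how it is split between population size and iterations.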
Table 13. Result for Bat Algorithm.
Function | Value | Pop = 10, Iter = 1000 | Pop = 20, Iter = 900 | Pop = 30, Iter = 800 | Pop = 50, Iter = 600 | Pop = 100, Iter = 500 | Pop = 300, Iter = 300 | Pop = 500, Iter = 100 | Pop = 1000, Iter = 10
Sphere | Mean | 1.94E+04 | 1.68E+04 | 1.59E+04 | 1.55E+04 | 1.43E+04 | 1.1882E+04 | 1.1242E+04 | 1.0300E+04
Sphere | Stand.Div | 1.98E+04 | 1.72E+04 | 1.64E+04 | 1.61E+04 | 1.46E+04 | 1.2227E+04 | 1.1441E+04 | 1.0369E+04
Sphere | Best | 1.15E+04 | 7.26E+03 | 1.05E+04 | 9.01E+03 | 8.02E+03 | 5.5988E+03 | 7.3909E+03 | 7.0207E+03
Sphere | Worst | 2.78E+04 | 2.38E+04 | 2.46E+04 | 2.42E+04 | 1.98E+04 | 1.6507E+04 | 1.6244E+04 | 1.1869E+04
Sphere | MeanRunTimes | 1.53E+00 | 2.86E+00 | 7.23E+00 | 8.54E+00 | 1.64E+01 | 1.1428E+01 | 12.8373 | 6.8391
Rastrigin | Mean | 3.16E+02 | 3.04E+02 | 2.90E+02 | 2.83E+02 | 2.77E+02 | 2.7082E+02 | 264.1494 | 252.9087
Rastrigin | Stand.Div | 3.18E+02 | 3.05E+02 | 2.91E+02 | 2.84E+02 | 2.78E+02 | 2.7126E+02 | 264.4724 | 253.7484
Rastrigin | Best | 2.15E+02 | 2.73E+02 | 2.43E+02 | 2.43E+02 | 2.16E+02 | 2.3388E+02 | 238.5740 | 206.2512
Rastrigin | Worst | 3.73E+02 | 3.59E+02 | 3.36E+02 | 3.14E+02 | 3.29E+02 | 2.9137E+02 | 285.5669 | 289.4242
Rastrigin | MeanRunTimes | 1.85E+00 | 3.22E+00 | 4.54E+00 | 6.18E+00 | 9.79E+00 | 1.1708E+01 | 13.1386 | 6.9371
Rosenbrock | Mean | 2.96E+07 | 2.26E+07 | 1.76E+07 | 1.40E+07 | 1.52E+07 | 9.0472E+06 | 7.6228E+06 | 6.7477E+06
Rosenbrock | Stand.Div | 3.40E+07 | 2.42E+07 | 2.00E+07 | 1.51E+07 | 1.59E+07 | 9.8445E+06 | 8.0408E+06 | 6.9390E+06
Rosenbrock | Best | 1.09E+07 | 6.00E+06 | 6.26E+06 | 2.39E+06 | 5.92E+06 | 2.5159E+06 | 3.8750E+06 | 3.8616E+06
Rosenbrock | Worst | 7.99E+07 | 4.30E+07 | 4.11E+07 | 2.49E+07 | 2.55E+07 | 1.6890E+07 | 1.3239E+07 | 9.3581E+06
Rosenbrock | MeanRunTimes | 1.45E+00 | 3.09E+00 | 6.45E+00 | 4.81E+00 | 1.14E+01 | 1.1363E+01 | 12.6469 | 6.7982
Griewank | Mean | 1.94E+02 | 1.67E+02 | 1.54E+02 | 1.43E+02 | 1.18E+02 | 1.1163E+02 | 93.7006 | 94.5310
Griewank | Stand.Div | 2.01E+02 | 1.71E+02 | 1.57E+02 | 1.46E+02 | 1.21E+02 | 1.1527E+02 | 95.0456 | 97.1846
Griewank | Best | 1.09E+02 | 1.09E+02 | 7.05E+01 | 9.88E+01 | 7.94E+01 | 4.0860E+01 | 65.6311 | 49.3315
Griewank | Worst | 3.16E+02 | 2.51E+02 | 2.09E+02 | 2.01E+02 | 1.76E+02 | 1.6464E+02 | 134.4593 | 139.8530
Griewank | MeanRunTimes | 1.94E+00 | 3.54E+00 | 7.19E+00 | 8.81E+00 | 1.66E+01 | 1.6149E+01 | 13.4474 | 7.1280
Table 14. Result for Grey Wolf Optimizer.
Function | Value | Pop = 10, Iter = 1000 | Pop = 20, Iter = 900 | Pop = 30, Iter = 800 | Pop = 50, Iter = 600 | Pop = 100, Iter = 500 | Pop = 300, Iter = 300 | Pop = 500, Iter = 100 | Pop = 1000, Iter = 10
Sphere | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 2.3000E-06 | 50.3358
Sphere | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 2.7289E-06 | 53.0968
Sphere | Best | 0 | 0 | 0 | 0 | 0 | 0 | 5.6828E-07 | 21.0005
Sphere | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 5.6884E-06 | 91.1243
Sphere | MeanRunTimes | 1.59E+00 | 2.91E+00 | 6.88E+00 | 6.50E+00 | 9.91E+00 | 4.5197E+00 | 3.6921 | 1.4936
Rastrigin | Mean | 0 | 0 | 0 | 0 | 0 | 5.0086E+00 | 14.6950 | 84.8322
Rastrigin | Stand.Div | 0 | 0 | 0 | 0 | 0 | 6.7312E+00 | 15.1616 | 86.5337
Rastrigin | Best | 0 | 0 | 0 | 0 | 0 | 2.6057E-08 | 8.6052 | 53.7818
Rastrigin | Worst | 0 | 0 | 0 | 0 | 0 | 1.4721E+01 | 20.8869 | 115.8360
Rastrigin | MeanRunTimes | 1.73E+00 | 2.87E+00 | 3.88E+00 | 4.31E+00 | 5.75E+00 | 4.6616E+00 | 3.9513 | 1.6358
Rosenbrock | Mean | 2.56E+01 | 2.56E+01 | 2.56E+01 | 2.52E+01 | 2.54E+01 | 2.5108E+01 | 26.5188 | 1.3708E+03
Rosenbrock | Stand.Div | 2.56E+01 | 2.56E+01 | 2.56E+01 | 2.52E+01 | 2.54E+01 | 2.5115E+01 | 26.5425 | 1.6446E+03
Rosenbrock | Best | 2.41E+01 | 2.42E+01 | 2.41E+01 | 2.40E+01 | 2.42E+01 | 2.4025E+01 | 25.0393 | 429.0925
Rosenbrock | Worst | 2.62E+01 | 2.62E+01 | 2.70E+01 | 2.61E+01 | 2.62E+01 | 2.6104E+01 | 28.7730 | 4.0747E+03
Rosenbrock | MeanRunTimes | 1.52E+00 | 3.23E+00 | 5.52E+00 | 3.80E+00 | 6.96E+00 | 4.4560E+00 | 3.5834 | 1.4551
Griewank | Mean | 3.76E-04 | 6.21E-04 | 9.90E-04 | 0 | 0 | 6.0142E-03 | 0.0056 | 1.4594
Griewank | Stand.Div | 1.68E-03 | 2.78E-03 | 3.13E-03 | 0 | 0 | 1.4804E-02 | 0.0092 | 1.4767
Griewank | Best | 0 | 0 | 0 | 0 | 0 | 0 | 1.6031E-06 | 1.1595
Griewank | Worst | 7.52E-03 | 1.24E-02 | 9.91E-03 | 0 | 0 | 6.0169E-02 | 0.0211 | 1.9960
Griewank | MeanRunTimes | 1.73E+00 | 3.02E+00 | 5.34E+00 | 6.25E+00 | 9.52E+00 | 6.3599E+00 | 3.8414 | 1.7894
Table 15. Result for Butterfly Optimization Algorithm.
Function | Value | Pop = 10, Iter = 1000 | Pop = 20, Iter = 900 | Pop = 30, Iter = 800 | Pop = 50, Iter = 600 | Pop = 100, Iter = 500 | Pop = 300, Iter = 300 | Pop = 500, Iter = 100 | Pop = 1000, Iter = 10
Sphere | Mean | 0 | 0 | 0 | 0 | 0 | 1.6922E-08 | 4.9084E-05 | 7.6957E-06
Sphere | Stand.Div | 0 | 0 | 0 | 0 | 0 | 1.7860E-08 | 5.0361E-05 | 1.3852E-05
Sphere | Best | 0 | 0 | 0 | 0 | 0 | 1.0376E-08 | 3.0436E-05 | 3.0197E-07
Sphere | Worst | 0 | 0 | 0 | 0 | 0 | 3.2348E-08 | 7.5007E-05 | 5.0646E-05
Sphere | MeanRunTimes | 1.06E+00 | 2.71E+00 | 7.82E+00 | 1.11E+01 | 2.68E+01 | 2.2878E+01 | 29.6371 | 22.1779
Rastrigin | Mean | 0 | 0 | 0 | 0 | 0 | 4.7966E-06 | 0.0028 | 1.5485
Rastrigin | Stand.Div | 0 | 0 | 0 | 0 | 0 | 5.3872E-06 | 0.0029 | 1.5518
Rastrigin | Best | 0 | 0 | 0 | 0 | 0 | 1.5685E-06 | 0.0013 | 1.3340
Rastrigin | Worst | 0 | 0 | 0 | 0 | 0 | 1.1829E-05 | 0.0041 | 1.7193
Rastrigin | MeanRunTimes | 1.33E+00 | 3.02E+00 | 5.16E+00 | 7.31E+00 | 1.78E+01 | 2.3616E+01 | 29.8871 | 22.4889
Rosenbrock | Mean | 2.89E+01 | 2.89E+01 | 2.88E+01 | 2.88E+01 | 2.88E+01 | 2.8787E+01 | 28.7735 | 28.9139
Rosenbrock | Stand.Div | 2.89E+01 | 2.89E+01 | 2.88E+01 | 2.88E+01 | 2.88E+01 | 2.8787E+01 | 28.7735 | 28.9139
Rosenbrock | Best | 2.88E+01 | 2.88E+01 | 2.88E+01 | 2.88E+01 | 2.87E+01 | 2.8750E+01 | 28.7340 | 28.8728
Rosenbrock | Worst | 2.89E+01 | 2.89E+01 | 2.89E+01 | 2.89E+01 | 2.89E+01 | 2.8832E+01 | 28.8062 | 28.9504
Rosenbrock | MeanRunTimes | 9.82E-01 | 2.85E+00 | 6.86E+00 | 6.82E+00 | 1.91E+01 | 2.3152E+01 | 31.1730 | 22.1920
Griewank | Mean | 0 | 0 | 0 | 0 | 0 | 5.7050E-06 | 0.0068 | 1.1715
Griewank | Stand.Div | 0 | 0 | 0 | 0 | 0 | 5.7511E-06 | 0.0068 | 1.1715
Griewank | Best | 0 | 0 | 0 | 0 | 0 | 4.0656E-06 | 0.0059 | 1.1497
Griewank | Worst | 0 | 0 | 0 | 0 | 0 | 6.9312E-06 | 0.0079 | 1.1881
Griewank | MeanRunTimes | 1.29E+00 | 3.06E+00 | 7.07E+00 | 1.09E+01 | 2.58E+01 | 3.8891E+01 | 31.5429 | 22.5601
Table 16. Friedman test result.
Configuration | BA | GWO | BOA
Pop = 10, Iter = 1000 | 7.11 | 5.04 | 4.89
Pop = 20, Iter = 900 | 6.30 | 4.07 | 4.22
Pop = 30, Iter = 800 | 5.59 | 4.63 | 3.26
Pop = 50, Iter = 600 | 5.19 | 3.65 | 3.67
Pop = 100, Iter = 500 | 4.30 | 3.91 | 3.35
Pop = 300, Iter = 300 | 3.30 | 3.70 | 4.43
Pop = 500, Iter = 100 | 2.70 | 5.13 | 5.67
Pop = 1000, Iter = 10 | 1.52 | 5.87 | 6.52
N | 10 | 10 | 10
Chi-Square | 113.914 | 29.445 | 49.249
df | 7 | 7 | 7
Asymp. Sig. | 0.000 | 0.000 | 0.000
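For readers who want to reproduce this kind of analysis, a minimal sketch of a Friedman test in Python/SciPy follows; the score vectors below are made-up placeholders, not the data behind Table 16.

```python
# Hedged sketch: a Friedman test across matched configurations, analogous in
# spirit to Table 16. Each list holds one score per problem/run for a
# configuration; the numbers are illustrative placeholders only.
from scipy.stats import friedmanchisquare

cfg_a = [1.2, 0.9, 1.1, 1.0, 1.3]
cfg_b = [0.8, 0.7, 0.9, 0.6, 0.8]
cfg_c = [1.5, 1.4, 1.6, 1.3, 1.5]

stat, p = friedmanchisquare(cfg_a, cfg_b, cfg_c)
print(f"chi-square = {stat:.3f}, p-value = {p:.4f}")
```

A low p-value indicates that at least one configuration ranks systematically differently from the others, after which a post hoc pairwise test would normally be applied.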
Table 17. Results for BA.
Function | Value | Rand | Betarnd(3,2) | Betarnd(2.5,2.5) | Unifrnd(0,1) | Lognrnd(0,0.5) | Exprnd(0.5) | Raylrnd(0.4) | Wblrnd(1,1) | Lhsdesign() | Sobol()
F1 | Mean | 9.79E+03 | 4.94E+03 | 4.57E+03 | 9.51E+03 | 2.53E+04 | 1.78E+04 | 6.78E+03 | 3.43E+04 | 9.35E+03 | 3.02E+03
F1 | Stand.Div | 9.93E+03 | 5.04E+03 | 4.65E+03 | 9.75E+03 | 2.60E+04 | 1.80E+04 | 6.91E+03 | 3.47E+04 | 9.56E+03 | 3.12E+03
F1 | Best | 6.89E+03 | 3.44E+03 | 3.23E+03 | 5.93E+03 | 1.28E+04 | 1.26E+04 | 4.05E+03 | 2.34E+04 | 4.85E+03 | 9.86E+02
F1 | Worst | 1.43E+04 | 6.85E+03 | 6.47E+03 | 1.35E+04 | 3.82E+04 | 2.14E+04 | 9.18E+03 | 4.34E+04 | 1.14E+04 | 4.45E+03
F1 | MeanRunTimes | 6.79E+00 | 6.86E+00 | 6.87E+00 | 6.82E+00 | 8.20E+00 | 7.64E+00 | 6.84E+00 | 8.16E+00 | 9.98E+00 | 6.79E+00
F2 | Mean | 4.93E+00 | 1.20E+00 | 1.26E+00 | 4.65E+00 | 1.37E+01 | 1.18E+01 | 2.43E+00 | 4.22E+01 | 5.65E+00 | 8.74E-01
F2 | Stand.Div | 5.26E+00 | 1.28E+00 | 1.34E+00 | 5.14E+00 | 1.49E+01 | 1.30E+01 | 2.58E+00 | 4.59E+01 | 6.21E+00 | 1.04E+00
F2 | Best | 1.33E+00 | 3.75E-01 | 3.22E-01 | 1.46E+00 | 5.48E+00 | 5.24E+00 | 9.30E-01 | 1.26E+01 | 2.27E+00 | 1.61E-01
F2 | Worst | 8.00E+00 | 1.88E+00 | 2.04E+00 | 9.69E+00 | 2.88E+01 | 2.41E+01 | 3.73E+00 | 7.73E+01 | 1.10E+01 | 2.39E+00
F2 | MeanRunTimes | 7.89E+00 | 7.97E+00 | 8.04E+00 | 7.95E+00 | 9.37E+00 | 9.15E+00 | 8.07E+00 | 9.29E+00 | 1.12E+01 | 7.88E+00
F3 | Mean | 2.03E+01 | 4.50E+01 | 1.23E+01 | 2.11E+01 | 1.35E+02 | 2.05E+01 | 1.18E+01 | 6.27E+01 | 2.20E+01 | 5.54E+01
F3 | Stand.Div | 2.12E+01 | 4.93E+01 | 1.29E+01 | 2.22E+01 | 1.44E+02 | 2.18E+01 | 1.27E+01 | 6.59E+01 | 2.37E+01 | 6.63E+01
F3 | Best | 7.76E+00 | 1.65E+01 | 4.65E+00 | 1.10E+01 | 5.61E+01 | 7.22E+00 | 3.91E+00 | 3.13E+01 | 1.04E+01 | 1.72E+01
F3 | Worst | 3.41E+01 | 8.45E+01 | 2.32E+01 | 3.61E+01 | 2.49E+02 | 3.24E+01 | 2.27E+01 | 1.22E+02 | 4.43E+01 | 1.71E+02
F3 | MeanRunTimes | 9.68E-01 | 9.75E-01 | 9.83E-01 | 9.91E-01 | 1.13E+00 | 1.05E+00 | 9.77E-01 | 1.11E+00 | 1.28E+00 | 9.94E-01
F4 | Mean | 1.61E+04 | 1.64E+04 | 7.44E+03 | 1.77E+04 | 5.91E+05 | 3.53E+04 | 1.08E+04 | 6.20E+04 | 1.83E+04 | 1.56E+04
F4 | Stand.Div | 1.65E+04 | 1.69E+04 | 7.61E+03 | 1.79E+04 | 6.68E+05 | 3.61E+04 | 1.10E+04 | 6.57E+04 | 1.89E+04 | 1.98E+04
F4 | Best | 6.73E+03 | 6.15E+03 | 5.21E+03 | 1.21E+04 | 1.83E+05 | 2.19E+04 | 6.90E+03 | 2.15E+04 | 9.53E+03 | 6.28E+03
F4 | Worst | 2.26E+04 | 2.33E+04 | 1.05E+04 | 2.33E+04 | 1.69E+06 | 4.86E+04 | 1.59E+04 | 1.02E+05 | 2.66E+04 | 5.95E+04
F4 | MeanRunTimes | 1.13E+01 | 1.13E+01 | 1.13E+01 | 1.13E+01 | 1.26E+01 | 1.17E+01 | 1.14E+01 | 1.25E+01 | 1.49E+01 | 1.14E+01
F5 | Mean | 1.99E-02 | 1.80E-02 | 2.85E-02 | 5.89E-02 | 8.43E-02 | 5.46E-02 | 2.62E-02 | 8.63E-02 | 5.15E-02 | 4.24E-01
F5 | Stand.Div | 2.86E-02 | 2.74E-02 | 5.32E-02 | 9.10E-02 | 1.28E-01 | 8.41E-02 | 6.90E-02 | 1.58E-01 | 8.19E-02 | 8.22E-01
F5 | Best | 1.21E-04 | 2.95E-04 | 3.61E-06 | 9.74E-04 | 6.15E-04 | 2.58E-04 | 1.41E-04 | 2.31E-05 | 1.45E-04 | 1.62E-04
F5 | Worst | 8.47E-02 | 9.53E-02 | 1.70E-01 | 2.66E-01 | 3.59E-01 | 2.49E-01 | 2.96E-01 | 5.17E-01 | 1.89E-01 | 2.50E+00
F5 | MeanRunTimes | 8.48E-02 | 8.51E-02 | 8.72E-02 | 8.84E-02 | 9.44E-02 | 8.82E-02 | 8.59E-02 | 9.13E-02 | 1.02E-01 | 8.77E-02
F6 | Mean | 4.62E+00 | 4.03E+00 | 3.86E+00 | 4.47E+00 | 5.49E+00 | 4.74E+00 | 4.26E+00 | 5.15E+00 | 4.61E+00 | 4.92E+00
F6 | Stand.Div | 4.62E+00 | 4.08E+00 | 3.88E+00 | 4.49E+00 | 5.50E+00 | 4.76E+00 | 4.27E+00 | 5.17E+00 | 4.63E+00 | 4.94E+00
F6 | Best | 3.85E+00 | 2.35E+00 | 3.09E+00 | 3.41E+00 | 4.94E+00 | 3.80E+00 | 3.61E+00 | 3.68E+00 | 3.69E+00 | 4.06E+00
F6 | Worst | 5.15E+00 | 4.81E+00 | 4.46E+00 | 5.15E+00 | 5.90E+00 | 5.41E+00 | 4.84E+00 | 5.80E+00 | 5.12E+00 | 5.94E+00
F6 | MeanRunTimes | 1.07E+00 | 1.08E+00 | 1.07E+00 | 1.08E+00 | 1.18E+00 | 1.11E+00 | 1.07E+00 | 1.13E+00 | 1.38E+00 | 1.09E+00
F7 | Mean | 2.62E+02 | 2.14E+02 | 2.13E+02 | 2.58E+02 | 4.06E+02 | 2.98E+02 | 2.29E+02 | 3.72E+02 | 2.53E+02 | 2.22E+02
F7 | Stand.Div | 2.62E+02 | 2.15E+02 | 2.14E+02 | 2.58E+02 | 4.08E+02 | 2.99E+02 | 2.29E+02 | 3.72E+02 | 2.54E+02 | 2.22E+02
F7 | Best | 2.35E+02 | 1.76E+02 | 1.88E+02 | 2.23E+02 | 3.26E+02 | 2.60E+02 | 2.02E+02 | 3.27E+02 | 1.94E+02 | 1.93E+02
F7 | Worst | 2.98E+02 | 2.40E+02 | 2.50E+02 | 2.94E+02 | 4.65E+02 | 3.31E+02 | 2.55E+02 | 4.37E+02 | 2.83E+02 | 2.45E+02
F7 | MeanRunTimes | 6.95E+00 | 6.92E+00 | 6.92E+00 | 7.00E+00 | 8.38E+00 | 8.08E+00 | 7.03E+00 | 8.32E+00 | 1.01E+01 | 7.04E+00
F8 | Mean | 6.69E+06 | 1.76E+06 | 1.29E+06 | 5.86E+06 | 5.90E+07 | 1.49E+07 | 3.33E+06 | 7.78E+07 | 5.77E+06 | 7.60E+05
F8 | Stand.Div | 7.39E+06 | 1.89E+06 | 1.39E+06 | 6.25E+06 | 6.65E+07 | 1.54E+07 | 3.56E+06 | 8.29E+07 | 6.26E+06 | 8.66E+05
F8 | Best | 3.00E+06 | 8.85E+05 | 3.77E+05 | 2.36E+06 | 1.49E+07 | 6.32E+06 | 1.10E+06 | 3.32E+07 | 2.24E+06 | 1.89E+05
F8 | Worst | 1.25E+07 | 3.17E+06 | 2.31E+06 | 1.07E+07 | 1.31E+08 | 2.21E+07 | 5.41E+06 | 1.37E+08 | 1.12E+07 | 1.70E+06
F8 | MeanRunTimes | 6.75E+00 | 6.86E+00 | 6.86E+00 | 6.79E+00 | 8.16E+00 | 7.97E+00 | 6.91E+00 | 8.11E+00 | 9.98E+00 | 6.81E+00
F9 | Mean | 8.88E+01 | 4.60E+01 | 3.74E+01 | 9.50E+01 | 2.42E+02 | 1.60E+02 | 6.30E+01 | 2.92E+02 | 9.00E+01 | 3.06E+01
F9 | Stand.Div | 9.06E+01 | 4.67E+01 | 3.79E+01 | 9.73E+01 | 2.48E+02 | 1.63E+02 | 6.45E+01 | 2.95E+02 | 9.23E+01 | 3.29E+01
F9 | Best | 3.99E+01 | 3.10E+01 | 2.80E+01 | 5.21E+01 | 1.48E+02 | 1.12E+02 | 3.82E+01 | 2.19E+02 | 5.93E+01 | 1.77E+01
F9 | Worst | 1.18E+02 | 6.18E+01 | 4.67E+01 | 1.36E+02 | 3.64E+02 | 2.38E+02 | 8.79E+01 | 3.70E+02 | 1.31E+02 | 5.72E+01
F9 | MeanRunTimes | 7.08E+00 | 7.14E+00 | 7.17E+00 | 7.09E+00 | 8.47E+00 | 7.94E+00 | 7.14E+00 | 8.47E+00 | 1.03E+01 | 7.13E+00
F10 | Mean | 1.53E+01 | 1.36E+01 | 1.27E+01 | 1.56E+01 | 2.00E+01 | 1.86E+01 | 1.39E+01 | 1.99E+01 | 1.57E+01 | 1.04E+01
F10 | Stand.Div | 1.53E+01 | 1.36E+01 | 1.27E+01 | 1.56E+01 | 2.00E+01 | 1.86E+01 | 1.40E+01 | 1.99E+01 | 1.58E+01 | 1.05E+01
F10 | Best | 1.38E+01 | 1.16E+01 | 1.17E+01 | 1.38E+01 | 2.00E+01 | 1.60E+01 | 1.18E+01 | 1.96E+01 | 1.44E+01 | 8.63E+00
F10 | Worst | 1.63E+01 | 1.48E+01 | 1.40E+01 | 1.66E+01 | 2.00E+01 | 2.00E+01 | 1.51E+01 | 2.00E+01 | 1.69E+01 | 1.31E+01
F10 | MeanRunTimes | 8.22E+00 | 7.84E+00 | 7.61E+00 | 8.26E+00 | 8.24E+00 | 8.25E+00 | 8.07E+00 | 8.26E+00 | 1.14E+01 | 8.24E+00
Table 18. Results for BOA.
Function | Value | Rand | Betarnd(3,2) | Betarnd(2.5,2.5) | Unifrnd(0,1) | Lognrnd(0,0.5) | Exprnd(0.5) | Raylrnd(0.4) | Wblrnd(1,1) | Lhsdesign() | Sobol()
F1 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | MeanRunTimes | 4.01E+00 | 4.42E+00 | 4.23E+00 | 4.05E+00 | 4.24E+00 | 3.97E+00 | 4.03E+00 | 4.01E+00 | 4.26E+00 | 4.06E+00
F2 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | MeanRunTimes | 7.35E+00 | 7.78E+00 | 7.66E+00 | 7.38E+00 | 7.65E+00 | 7.35E+00 | 7.36E+00 | 7.38E+00 | 7.59E+00 | 7.45E+00
F3 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | MeanRunTimes | 7.31E-01 | 7.84E-01 | 7.78E-01 | 7.47E-01 | 7.84E-01 | 7.33E-01 | 7.42E-01 | 7.36E-01 | 7.69E-01 | 7.53E-01
F4 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | MeanRunTimes | 1.63E+01 | 1.67E+01 | 1.65E+01 | 1.62E+01 | 1.66E+01 | 1.63E+01 | 1.63E+01 | 1.65E+01 | 1.72E+01 | 1.64E+01
F5 | Mean | 1.04E-07 | 8.27E-08 | 1.03E-07 | 9.58E-08 | 5.76E-08 | 1.18E-07 | 1.30E-07 | 9.02E-08 | 1.20E-07 | 1.02E-07
F5 | Stand.Div | 1.41E-07 | 1.22E-07 | 1.42E-07 | 1.14E-07 | 7.62E-08 | 1.71E-07 | 1.97E-07 | 1.23E-07 | 1.71E-07 | 1.40E-07
F5 | Best | 1.33E-08 | 0 | 0 | 0 | 0 | 1.00E-08 | 1.50E-08 | 0 | 0 | 0
F5 | Worst | 3.58E-07 | 4.13E-07 | 3.11E-07 | 2.25E-07 | 1.57E-07 | 5.57E-07 | 7.05E-07 | 3.28E-07 | 5.12E-07 | 3.76E-07
F5 | MeanRunTimes | 8.82E-02 | 9.21E-02 | 9.15E-02 | 9.03E-02 | 9.10E-02 | 8.95E-02 | 8.96E-02 | 8.96E-02 | 9.08E-02 | 9.02E-02
F6 | Mean | 1.40E+00 | 1.01E+00 | 1.33E+00 | 1.22E+00 | 1.33E+00 | 1.27E+00 | 1.24E+00 | 1.23E+00 | 1.44E+00 | 1.48E+00
F6 | Stand.Div | 1.63E+00 | 1.35E+00 | 1.56E+00 | 1.36E+00 | 1.55E+00 | 1.47E+00 | 1.36E+00 | 1.41E+00 | 1.63E+00 | 1.66E+00
F6 | Best | 2.64E-01 | 1.23E-01 | 5.07E-01 | 1.48E-01 | 4.09E-01 | 3.69E-01 | 2.35E-01 | 1.48E-01 | 5.29E-01 | 4.74E-01
F6 | Worst | 3.15E+00 | 3.59E+00 | 3.38E+00 | 2.52E+00 | 3.26E+00 | 2.99E+00 | 2.31E+00 | 2.69E+00 | 3.73E+00 | 3.14E+00
F6 | MeanRunTimes | 1.11E+00 | 1.18E+00 | 1.15E+00 | 1.12E+00 | 1.15E+00 | 1.12E+00 | 1.12E+00 | 1.12E+00 | 1.14E+00 | 1.13E+00
F7 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | MeanRunTimes | 4.11E+00 | 4.51E+00 | 4.35E+00 | 4.16E+00 | 4.35E+00 | 4.09E+00 | 4.15E+00 | 4.13E+00 | 4.29E+00 | 4.20E+00
F8 | Mean | 2.52E+01 | 2.51E+01 | 2.53E+01 | 2.51E+01 | 2.50E+01 | 2.51E+01 | 2.52E+01 | 2.51E+01 | 2.51E+01 | 2.53E+01
F8 | Stand.Div | 2.53E+01 | 2.51E+01 | 2.53E+01 | 2.51E+01 | 2.50E+01 | 2.51E+01 | 2.52E+01 | 2.52E+01 | 2.51E+01 | 2.53E+01
F8 | Best | 2.37E+01 | 2.38E+01 | 2.39E+01 | 2.33E+01 | 2.36E+01 | 2.42E+01 | 2.39E+01 | 2.40E+01 | 2.42E+01 | 2.41E+01
F8 | Worst | 2.62E+01 | 2.70E+01 | 2.62E+01 | 2.62E+01 | 2.62E+01 | 2.62E+01 | 2.62E+01 | 2.62E+01 | 2.60E+01 | 2.61E+01
F8 | MeanRunTimes | 3.95E+00 | 4.35E+00 | 4.14E+00 | 3.96E+00 | 4.14E+00 | 3.92E+00 | 3.97E+00 | 3.95E+00 | 4.20E+00 | 4.00E+00
F9 | Mean | 0 | 0 | 0 | 0 | 0 | 3.73E-04 | 0 | 0 | 3.73E-04 | 0
F9 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 1.67E-03 | 0 | 0 | 1.67E-03 | 0
F9 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F9 | Worst | 0 | 0 | 0 | 0 | 0 | 7.46E-03 | 0 | 0 | 7.46E-03 | 0
F9 | MeanRunTimes | 4.27E+00 | 4.67E+00 | 4.48E+00 | 4.28E+00 | 4.49E+00 | 4.27E+00 | 4.27E+00 | 4.33E+00 | 4.42E+00 | 4.32E+00
F10 | Mean | 0 | 0 | 0 | 0 | 2.02E+01 | 0 | 0 | 2.04E+00 | 0 | 0
F10 | Stand.Div | 0 | 0 | 0 | 0 | 2.02E+01 | 0 | 0 | 6.44E+00 | 0 | 0
F10 | Best | 0 | 0 | 0 | 0 | 2.02E+01 | 0 | 0 | 0 | 0 | 0
F10 | Worst | 0 | 0 | 0 | 0 | 2.03E+01 | 0 | 0 | 2.04E+01 | 0 | 0
F10 | MeanRunTimes | 4.11E+00 | 4.51E+00 | 4.33E+00 | 4.12E+00 | 4.89E+00 | 4.06E+00 | 4.14E+00 | 4.25E+00 | 4.25E+00 | 4.18E+00
Table 19. Results for GWO.
Function | Value | Rand | Betarnd(3,2) | Betarnd(2.5,2.5) | Unifrnd(0,1) | Lognrnd(0,0.5) | Exprnd(0.5) | Raylrnd(0.4) | Wblrnd(1,1) | Lhsdesign() | Sobol()
F1 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F1 | MeanRunTimes | 3.76E+00 | 3.77E+00 | 3.77E+00 | 3.77E+00 | 3.78E+00 | 3.74E+00 | 3.71E+00 | 3.74E+00 | 3.75E+00 | 3.74E+00
F2 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F2 | MeanRunTimes | 6.49E+00 | 6.47E+00 | 6.53E+00 | 6.53E+00 | 6.52E+00 | 6.49E+00 | 6.42E+00 | 6.45E+00 | 6.50E+00 | 6.48E+00
F3 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F3 | MeanRunTimes | 6.24E-01 | 6.24E-01 | 6.31E-01 | 6.33E-01 | 6.37E-01 | 6.28E-01 | 6.25E-01 | 6.22E-01 | 6.31E-01 | 6.32E-01
F4 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F4 | MeanRunTimes | 1.36E+01 | 1.37E+01 | 1.37E+01 | 1.35E+01 | 1.38E+01 | 1.37E+01 | 1.36E+01 | 1.37E+01 | 1.43E+01 | 1.37E+01
F5 | Mean | 1.28E-03 | 7.20E-04 | 8.16E-04 | 8.13E-04 | 8.42E-04 | 9.36E-04 | 1.09E-03 | 1.57E-03 | 1.13E-03 | 5.43E-04
F5 | Stand.Div | 2.02E-03 | 9.37E-04 | 1.09E-03 | 1.07E-03 | 1.12E-03 | 1.71E-03 | 1.67E-03 | 2.34E-03 | 1.74E-03 | 8.19E-04
F5 | Best | 1.29E-04 | 5.74E-05 | 3.47E-05 | 8.95E-05 | 4.32E-05 | 2.07E-05 | 2.22E-05 | 5.83E-05 | 5.98E-06 | 2.78E-06
F5 | Worst | 7.04E-03 | 2.35E-03 | 3.25E-03 | 3.39E-03 | 2.83E-03 | 4.76E-03 | 4.36E-03 | 6.90E-03 | 3.88E-03 | 2.68E-03
F5 | MeanRunTimes | 7.42E-02 | 7.41E-02 | 7.48E-02 | 7.47E-02 | 7.48E-02 | 7.51E-02 | 7.40E-02 | 7.39E-02 | 7.58E-02 | 7.38E-02
F6 | Mean | 4.41E+00 | 3.89E+00 | 3.91E+00 | 4.14E+00 | 3.80E+00 | 4.67E+00 | 4.28E+00 | 4.33E+00 | 4.22E+00 | 4.25E+00
F6 | Stand.Div | 4.42E+00 | 3.90E+00 | 3.92E+00 | 4.16E+00 | 3.82E+00 | 4.68E+00 | 4.29E+00 | 4.35E+00 | 4.25E+00 | 4.27E+00
F6 | Best | 3.73E+00 | 3.36E+00 | 3.29E+00 | 3.34E+00 | 2.55E+00 | 3.83E+00 | 3.55E+00 | 3.60E+00 | 2.67E+00 | 3.29E+00
F6 | Worst | 4.76E+00 | 4.53E+00 | 4.40E+00 | 4.77E+00 | 4.37E+00 | 5.37E+00 | 4.63E+00 | 5.01E+00 | 4.89E+00 | 4.84E+00
F6 | MeanRunTimes | 9.23E-01 | 9.31E-01 | 9.30E-01 | 9.18E-01 | 9.44E-01 | 9.16E-01 | 9.21E-01 | 9.24E-01 | 9.27E-01 | 9.16E-01
F7 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F7 | MeanRunTimes | 4.14E+00 | 4.12E+00 | 4.17E+00 | 4.14E+00 | 4.15E+00 | 4.15E+00 | 4.15E+00 | 4.13E+00 | 4.15E+00 | 4.14E+00
F8 | Mean | 2.88E+01 | 2.87E+01 | 2.88E+01 | 2.88E+01 | 2.87E+01 | 2.89E+01 | 2.89E+01 | 2.87E+01 | 2.88E+01 | 2.88E+01
F8 | Stand.Div | 2.88E+01 | 2.87E+01 | 2.88E+01 | 2.88E+01 | 2.87E+01 | 2.89E+01 | 2.89E+01 | 2.87E+01 | 2.88E+01 | 2.88E+01
F8 | Best | 2.88E+01 | 2.87E+01 | 2.88E+01 | 2.88E+01 | 2.86E+01 | 2.89E+01 | 2.88E+01 | 2.87E+01 | 2.88E+01 | 2.88E+01
F8 | Worst | 2.89E+01 | 2.87E+01 | 2.89E+01 | 2.89E+01 | 2.87E+01 | 2.89E+01 | 2.89E+01 | 2.87E+01 | 2.89E+01 | 2.89E+01
F8 | MeanRunTimes | 3.69E+00 | 3.69E+00 | 3.68E+00 | 3.67E+00 | 3.70E+00 | 3.68E+00 | 3.65E+00 | 3.68E+00 | 3.69E+00 | 3.67E+00
F9 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F9 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F9 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F9 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F9 | MeanRunTimes | 4.20E+00 | 4.19E+00 | 4.20E+00 | 4.18E+00 | 4.19E+00 | 4.16E+00 | 4.18E+00 | 4.15E+00 | 4.19E+00 | 4.16E+00
F10 | Mean | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F10 | Stand.Div | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F10 | Best | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F10 | Worst | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
F10 | MeanRunTimes | 3.97E+00 | 3.97E+00 | 3.97E+00 | 3.96E+00 | 4.04E+00 | 3.99E+00 | 3.95E+00 | 4.07E+00 | 3.97E+00 | 3.97E+00
Table 20. Friedman's test for classical functions.
Initialisation Scheme | BA (Mean Rank) | BOA (Mean Rank) | GWO (Mean Rank)
rand | 7.10 | 12.35 | 12.30
betarnd(3,2) | 4.20 | 9.25 | 9.90
betarnd(2.5,2.5) | 2.70 | 12.05 | 10.90
unifrnd(0,1) | 8.00 | 9.65 | 11.20
lognrnd(0,0.5) | 17.70 | 10.60 | 9.50
exprnd(0.5) | 10.00 | 12.40 | 13.10
raylrnd(0.4) | 4.90 | 12.25 | 12.40
wblrnd(1,1) | 16.20 | 10.90 | 12.30
lhsdesign() | 7.90 | 12.90 | 12.20
Sobol() | 5.70 | 12.65 | 11.20
Test Statistics a
N | 10 | 10 | 10
Chi-Square | 160.917 | 33.892 | 25.217
df | 22 | 22 | 22
Asymp. Sig. | 0.000 | 0.050 | 0.287
a Friedman's Test.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
