A Survey on the Stochastic Fractal Search Algorithm
Mohammed ElKomy, Computer Engineering, [email protected]
Abstract
Evolutionary algorithms are nature-inspired approximate optimisation algorithms that are usually applied to scientific problems when common mathematical methods are unable to provide a good solution, or when finding the exact solution requires an unreasonable amount of time using traditional exhaustive search algorithms. The success of these population-based frameworks is mainly due to their flexibility and ease of adaptation to the most diverse and complex optimisation problems. This paper presents a metaheuristic algorithm called Stochastic Fractal Search, inspired by the natural phenomenon of growth and based on a mathematical concept called the fractal, which is shown to be able to explore the search space efficiently. The paper also covers the algorithm's steps and some example applications to engineering design optimisation problems commonly used in the literature.
Keywords
Optimisation, Exploration, Exploitation, Diffusion process, Fractal, Random fractals, Benchmark functions, Visual tracking, PID controller tuning.
Introduction
Throughout history, problem-solving has been a major concern for science and industry; thousands of challenging problems have been proposed with varying levels of complexity and granularity. Industrial and economic organizations are always looking for better solutions to the problems they encounter, from minimizing time, risk, and cost to maximizing profit and efficiency. Engineering and scientific research likewise contain plenty of problems framed as optimisation problems, involving many variables acting under complex constraints such as maximum stress, maximum deflection, minimum load capacity, or geometrical configuration. Another major concern is that the size of the search space increases dramatically when solving high-dimensional optimisation problems, which hinders classical optimisation algorithms such as exhaustive search [8] by consuming an unreasonable amount of time. That is why metaheuristic and nature-inspired evolutionary algorithms are used as an approximate optimisation framework: relying on stochasticity, they are able to produce robust solutions in a reasonable amount of time without exhaustively searching the whole search space. These algorithms must satisfy two important characteristics, namely intensification (or exploitation) and diversification (or exploration). Intensification represents the ability of the search algorithm to find the best candidates around the current best solutions, while diversification considers the efficiency of the algorithm in exploring the search space, often using randomization strategies.

Recently, the scientific community has witnessed cutting-edge breakthroughs in computational intelligence, including computer vision and natural language processing using numerical approaches, mostly driven by advances in the integrated-circuit industry and graphical processing units (GPUs) for distributed computing.
With this great development, metaheuristic algorithms inspired by the behavior of natural phenomena have received great attention. Among them are the Genetic Algorithm (GA) [7], based on Darwin's theory of evolution; Particle Swarm Optimization (PSO) [4], which mimics flocks of birds searching for food; and Ant Colony (AC) optimization [3], inspired by the foraging behavior of ant colonies.

This paper presents two algorithms from [9]. In the first algorithm, each candidate solution simulates the branching property of a dielectric breakdown. The second algorithm is a developed version of the first that tackles its disadvantages; it is divided into two processes, called the diffusing and updating processes. The first process is adapted from the first algorithm using particle diffusion, while the second process is incorporated as an improvement to explore the search space more efficiently by introducing random perturbations to candidate solutions, which in turn reduces the possibility of getting stuck in a local minimum.

Fractals
A fractal is the property of an object or quantity that exhibits self-similarity on all scales [9]; the term comes from the Latin word fractus, which means 'broken' or 'fractured'. Mathematically generated fractals are visually appealing recursive structures, for example the Mandelbrot set (fig 1) [5] and the Sierpinski triangle (fig 4). There are several common methods to generate fractals, but in this work we are interested in generating random fractals as the core of the optimisation algorithm. Fractals are also common in nature, such as the dielectric breakdown (fig 2), the narrow discharge branchings caused by lightning.

Figure 1: Mandelbrot Set
Figure 2: Dielectric Breakdown
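To make the recursive construction concrete, the Mandelbrot set can be sampled by iterating z → z² + c and testing for divergence. The following is a minimal sketch; the resolution, iteration limit, and ASCII rendering are arbitrary illustrative choices, not part of [5]:

```python
def in_mandelbrot(c: complex, max_iter: int = 50) -> bool:
    """Return True if c appears to stay bounded under z -> z^2 + c."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:  # provably diverges once |z| exceeds 2
            return False
    return True

# Render a coarse ASCII view of the set on [-2, 1] x [-1, 1].
for im in range(10, -11, -2):
    row = "".join(
        "#" if in_mandelbrot(complex(re / 20.0, im / 10.0)) else "."
        for re in range(-40, 21, 2)
    )
    print(row)
```

Even at this coarse resolution, the familiar cardioid-and-bulb outline of the set is recognisable, and zooming in anywhere on its boundary would reveal the same self-similar structure.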
Random Fractals
Random fractals are generated by modifying the iteration process according to a stochastic rule such as a Levy flight, a Gaussian walk, or a self-avoiding walk. In the proposed search algorithm, the Diffusion Limited Aggregation (DLA) method is the key operation used for the diffusion phase (fig 3) ([9] fig 1).

Figure 3: A simple fractal growth by the DLA method.
Figure 4: Sierpinski triangle
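The difference between these stochastic rules can be illustrated with a short sketch: a Gaussian walk takes many small steps, while a Levy-style flight occasionally takes very long jumps. The step distributions below (a normal distribution, and a Pareto distribution for the heavy tail) are illustrative approximations, not the exact generators used in [9]:

```python
import math
import random

def gaussian_walk(n_steps, sigma=1.0, seed=0):
    """2-D random walk with normally distributed step components."""
    rng = random.Random(seed)
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        y += rng.gauss(0.0, sigma)
        path.append((x, y))
    return path

def levy_like_walk(n_steps, alpha=1.5, seed=0):
    """2-D walk with heavy-tailed (Pareto) step lengths and isotropic
    headings, a common way to approximate Levy-flight behaviour."""
    rng = random.Random(seed)
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        step = rng.paretovariate(alpha)          # heavy-tailed length
        theta = rng.uniform(0.0, 2.0 * math.pi)  # random direction
        x += step * math.cos(theta)
        y += step * math.sin(theta)
        path.append((x, y))
    return path
```

Plotting both trajectories would show the Gaussian walk diffusing locally while the Levy-like walk alternates dense local clusters with rare long relocations, which is exactly the exploration/exploitation trade-off the search algorithm exploits.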
Fractal Search Algorithm
Based on our previous discussion, the first proposed optimisation algorithm incorporates both fractal growth through the DLA method and potential theory. Particles are generated according to the following rules:

1. Each particle has its own electrical potential energy.
2. Each particle is subject to the diffusion process, generating other random particles, where the energy of the seed particle is distributed among the generated particles.
3. Only a few of the best particles survive through successive generations, and the rest of the unsuccessful particles are dropped from the population.

As detailed in ([9] section 3), through generations each particle is diffused based on a Levy flight, a random walk based on the Levy distribution used to model foraging in nature. For a particle with initial energy E_i, the diffusion process generates p particles where the total potential energy of the generated particles is equal to that of the original seed particle (fig 5) ([9] fig 2).

Figure 5: Diffusing a particle.

The diffusion process switches randomly between the Levy flight and the Gaussian distribution, incorporating the advantages of both: the Levy flight provides fast convergence, while the Gaussian distribution is used for improved exploitation. As shown in ([9] section 4), the Fractal Search algorithm suffers from some disadvantages, mainly the number of tunable parameters and the lack of information exchange between particles; for that reason, Stochastic Fractal Search (the second algorithm) addresses those drawbacks.

Stochastic Fractal Search

As mentioned in the previous section, the disadvantages of Fractal Search have been tackled in Stochastic Fractal Search (SFS) ([9] section 4) by introducing a phase called the updating process. There are two main phases in SFS, namely the diffusing process and the updating process. In the first process, as in Fractal Search, each particle diffuses around its current position to satisfy intensification (exploitation) of the search
space, increasing the chance of finding the global minimum and also preventing being trapped in local minima. In the second process, each particle updates its location in the search space based on the positions of other particles, addressing the information exchange that Fractal Search lacked and leading to better diversification (exploration) of the search space. Another important modification is the use of a static diffusion process, which means only the best particle generated by the diffusing process is considered, instead of the exponential growth of the population through generations. In addition, the Gaussian distribution is the only random walk considered, rather than both the Gaussian distribution and the Levy flight as in Fractal Search; although the Levy flight shows fast convergence, Gaussians are more promising at finding the global optimum.

Figure 6: right: Levy flight, left: Gaussian walk ([6])
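The static-diffusion idea described above can be sketched in a few lines: perturb a point several times, then keep only the best candidate, so the population never grows. The function name, step distribution, and defaults are our own illustrative choices:

```python
import random

rng = random.Random(3)

def static_diffusion(point, fitness_fn, q=5, sigma=0.1):
    """Static diffusion: generate q Gaussian-perturbed copies of a point and
    keep only the best one (the original included), so the population size
    stays constant across generations."""
    candidates = [point] + [
        [x + rng.gauss(0.0, sigma) for x in point] for _ in range(q)
    ]
    return min(candidates, key=fitness_fn)

# One diffusion step can only improve (or keep) the fitness of a point.
sphere = lambda v: sum(x * x for x in v)
p = [1.0, -1.0]
p_new = static_diffusion(p, sphere)
```

Because the original point is always among the candidates, this step is monotone: the selected point is never worse than its seed.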
Algorithm Steps And Pseudocode
Stochastic Fractal Search is fully presented in (algorithm 1). The first phase of the algorithm is the diffusion step, which uses two different Gaussian walks and switches between them randomly according to a tunable parameter given to the optimizer.

The first Gaussian walk, where µ_BP = BP is the best point found by the optimizer, P_i is the point from which the diffusion is done, and ε and ε′ are uniformly distributed random numbers in [0, 1]:

GW_1 = Gaussian(µ_BP, σ) + (ε × BP − ε′ × P_i)    (1)

The second Gaussian walk, where µ_P = P_i is the point from which the diffusion is done:

GW_2 = Gaussian(µ_P, σ)    (2)

For a more localized search near the good candidates, the step size of the Gaussian walks in (1) and (2) is reduced over the generations g:

σ = | (log(g) / g) × (P_i − BP) |    (3)

During initialization, and for every newly created particle, the boundaries of the search space have to be checked. For dimension j of a particle P, where UB is the upper-bound vector, LB is the lower-bound vector, and ε is a uniformly distributed random number in [0, 1]:

P_j = LB + ε × (UB − LB)    (4)

A linear ranking function assigns probabilities to particles according to the fitness of each point; it is used twice in (algorithm 1):

Pa_i = rank(P_i) / N    (5)

The first updating process introduces elementwise updates. For component j of point i, where P′_i is the newly updated point from the first updating step and P_r and P_t are randomly selected points:

P′_i(j) = P_r(j) − ε × (P_t(j) − P_i(j))    (6)

In the second updating process, the point P′_i is updated using two randomly selected points P′_t and P′_r, combining them to perform a whole-vector replacement; ε̂ is a uniformly distributed random number in [0, 1] used to switch between those two
updating rules, and P″_i is the newly updated point from the second updating step, as described in the following equations:

P″_i = P′_i − ε̂ × (P′_t − BP)    if ε′ ≤ 0.5    (7)
P″_i = P′_i + ε̂ × (P′_t − P′_r)    if ε′ > 0.5    (8)

Recent SFS-Related Publications
PID controller design
The PID control algorithm is the most widely accepted approach for automatic voltage regulator systems (fig 7). An automatic voltage regulator (AVR) is equipment installed in power generation stations that sustains the output voltage at a desired level under varying system conditions by controlling the excitation voltage of a synchronous generator (fig 8). Despite this, tuning its parameters remains a challenging task for researchers. A recent work [1] incorporated the SFS algorithm to tune the parameters of the PID controller (fig 9); the motivation was that around 80% of such controllers in service suffer from poorly tuned controller gains. As shown in ([1] table 2), the SFS-optimized PID controller renders a better dynamic response profile for the concerned power system than the existing alternative algorithms.
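To make the tuning objective concrete, the sketch below simulates a discrete PID controller on a simple first-order plant and returns an ITAE (integral of time-weighted absolute error) cost that an optimizer such as SFS could minimize over (K_p, K_i, K_d). The plant model, defaults, and cost are our own illustrative assumptions; the AVR model used in [1] is higher order:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=1000, tau=0.5):
    """Step response of a first-order plant dy/dt = (u - y)/tau under a
    discrete PID controller; returns the ITAE cost of the response.
    (Illustrative plant and cost, not the AVR model of [1].)"""
    y, integral, prev_err = 0.0, 0.0, setpoint
    itae = 0.0
    for k in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv  # PID control law
        y += dt * (u - y) / tau                    # Euler step of the plant
        prev_err = err
        itae += (k * dt) * abs(err) * dt           # time-weighted |error|
    return itae
```

An optimizer would call `simulate_pid` as its fitness function: well-chosen gains drive the error to zero quickly and yield a small ITAE, while poorly tuned gains leave a persistent offset or oscillation and a large cost.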
Algorithm 1: Stochastic Fractal Search Pseudo Algorithm [9, sec 4]

Initialize a population of N points;
while g < maximum generation or (stop criterion) do
    for each point P_i in the system do
        Call Diffusion Process:
            q = (maximum considered number of diffusions);
            for j = 1 to q do
                if user applies the first Gaussian walk to solve the problem then
                    Create a new point based on Eq. (1);
                end
                if user applies the second Gaussian walk to solve the problem then
                    Create a new point based on Eq. (2);
                end
            end
    end
    Call Updating Process:
        First Updating Process:
            First, all points are ranked based on Eq. (5);
            for each point P_i in the system do
                for each component j in P_i do
                    if rand[0,1] > Pa_i then
                        Update the component based on Eq. (6);
                    end
                end
            end
        Second Updating Process:
            Once again, all points obtained by the first updating process are ranked based on Eq. (5);
            for each point P′_i in the system do
                if rand[0,1] > Pa′_i then
                    Update the position based on Eqs. (7) and (8);
                end
            end
end
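Algorithm 1 and Eqs. (1)–(8) can be condensed into a runnable sketch. This is our own minimal reading of the algorithm on a box-constrained minimization problem: the greedy acceptance rule, the number of diffusions, and all defaults are illustrative assumptions, not the reference implementation:

```python
import math
import random

def rank_pa(fit):
    """Eq. (5): linear ranking; the best (lowest-fitness) point gets Pa = 1
    and is therefore least likely to be replaced."""
    n = len(fit)
    order = sorted(range(n), key=lambda i: fit[i], reverse=True)  # worst first
    pa = [0.0] * n
    for rank, i in enumerate(order, start=1):
        pa[i] = rank / n
    return pa

def sfs_minimize(f, lb, ub, n_points=30, max_gen=200, walk_prob=0.75, seed=1):
    """Minimal Stochastic Fractal Search sketch following Algorithm 1,
    using static diffusion (only the best diffused point is kept)."""
    rng = random.Random(seed)
    d = len(lb)
    clip = lambda x: [min(max(v, l), u) for v, l, u in zip(x, lb, ub)]
    pop = [[l + rng.random() * (u - l) for l, u in zip(lb, ub)]  # Eq. (4)
           for _ in range(n_points)]
    fit = [f(p) for p in pop]

    for g in range(1, max_gen + 1):
        best = pop[min(range(n_points), key=lambda i: fit[i])]
        # --- diffusion process: Gaussian walks around each point
        for i, p in enumerate(pop):
            for _ in range(3):  # q diffusions per point
                sig = [abs(math.log(g) / g * (pj - bj))  # Eq. (3)
                       for pj, bj in zip(p, best)]
                if rng.random() < walk_prob:  # first walk, Eq. (1)
                    e, e2 = rng.random(), rng.random()
                    cand = [rng.gauss(bj, s) + e * bj - e2 * pj
                            for pj, bj, s in zip(p, best, sig)]
                else:                          # second walk, Eq. (2)
                    cand = [rng.gauss(pj, s) for pj, s in zip(p, sig)]
                cand = clip(cand)
                fc = f(cand)
                if fc < fit[i]:  # keep only the best generated point
                    pop[i], fit[i] = cand, fc
        # --- first updating process: componentwise, Eq. (6)
        pa = rank_pa(fit)
        for i in range(n_points):
            cand = list(pop[i])
            for j in range(d):
                if rng.random() > pa[i]:
                    r, t = rng.sample(range(n_points), 2)
                    cand[j] = pop[r][j] - rng.random() * (pop[t][j] - pop[i][j])
            cand = clip(cand)
            fc = f(cand)
            if fc < fit[i]:
                pop[i], fit[i] = cand, fc
        # --- second updating process: whole vector, Eqs. (7)/(8)
        pa = rank_pa(fit)
        best = pop[min(range(n_points), key=lambda i: fit[i])]
        for i in range(n_points):
            if rng.random() > pa[i]:
                t, r = rng.sample(range(n_points), 2)
                eh = rng.random()
                if rng.random() <= 0.5:  # Eq. (7)
                    cand = [pi - eh * (pt - b)
                            for pi, pt, b in zip(pop[i], pop[t], best)]
                else:                    # Eq. (8)
                    cand = [pi + eh * (pt - pr)
                            for pi, pt, pr in zip(pop[i], pop[t], pop[r])]
                cand = clip(cand)
                fc = f(cand)
                if fc < fit[i]:
                    pop[i], fit[i] = cand, fc
    i = min(range(n_points), key=lambda k: fit[k])
    return pop[i], fit[i]

# Usage: minimize the 2-D sphere function over [-5, 5]^2.
x, fx = sfs_minimize(lambda p: sum(v * v for v in p), [-5.0, -5.0], [5.0, 5.0])
```

Note the shrinking σ of Eq. (3): as g grows, log(g)/g decreases, so the Gaussian walks become progressively more local around the best point, which is exactly the intensification behaviour described above.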
Figure 7: Transfer function model of an AVR system with PID controller ([1] fig 2)
Figure 8: Schematic diagram of an AVR system ([1] fig 1)
Figure 9: Block diagram of an AVR system combined with an SFS–PID controller ([1] fig 4)
Visual Object Tracking
Visual tracking is a challenging computer vision problem for many reasons, such as partial occlusion, fast motion, motion blur, object deformation, sudden feature changes, rotation, and scale variation (example dataset in fig 11). As in ([2] section 3), the target localization in every frame is considered an optimisation problem. The pseudocode is shown in (fig 10). The object template is first pinpointed inside a bounding box (BB) defined by a vector V_i = (X_i, Y_i, W_i, H_i), where (X_i, Y_i) denotes the Cartesian coordinates of the BB and (W_i, H_i) are its fixed width and height, defined from the ground truth of the first frame. Assuming that the target's displacement cannot exceed the target dimensions between adjacent frames, a reasonable size for the search window is defined by the lower boundary L_b and the upper boundary U_b, as depicted in ([2] section 3).

Figure 10: Proposed SFS tracking algorithm ([2] fig 3)
Figure 11: The Boy sequence suffers from scale variation, motion blur, fast motion, in-plane rotation, and out-of-plane rotation ([2] fig 4)

Related work
Optimisation algorithms have witnessed a dramatic increase of interest from scientists and engineers in recent years, because they provide acceptable solutions to many engineering and scientific problems in a reasonable amount of time. The development of optimisation algorithms has focused on evolutionary and metaheuristic algorithms in general, because they are modeled on natural phenomena that have provably survived natural challenges. This dates back to the early stages of development, which incorporated the principles of evolution into numerical problem-solving techniques: the Genetic Algorithm (GA) [7], based on Darwin's theory of evolution; Ant Colony (AC) optimization [3], mimicking the foraging behavior of ant colonies; Differential Evolution (DE) [10], another heuristic based on simple yet powerful mathematical expressions; and Particle Swarm Optimization (PSO) [4], which simulates flocks of birds searching for food.

This paper is an introduction to the algorithms presented in Salimi [9], namely fractal search ([9] section 3) and stochastic fractal search ([9] section 4). It also covers applications in engineering and science that use stochastic fractal search as an optimisation framework: optimizing PID controller parameters for an automatic voltage regulator [1], and another application [2] which used SFS as the localization backend of a visual tracking algorithm.
Source Code
The source code for stochastic fractal search is already published by the original author on MATLAB Central File Exchange (here). In this paper we offer a Python re-implementation (here) of SFS on GitHub under the GPL-3.0 License, where almost all of the well-known benchmark functions can be integrated into the pipeline.
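For reference, a few of the standard benchmark functions such a pipeline typically includes can be defined in a few lines each. These are the textbook definitions; the Python function names are conventional choices:

```python
import math

def sphere(x):
    """Sphere: f(x) = sum(x_i^2); unimodal, global minimum 0 at the origin."""
    return sum(v * v for v in x)

def rosenbrock(x):
    """Rosenbrock: narrow curved valley, global minimum 0 at (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    """Rastrigin: highly multimodal, global minimum 0 at the origin."""
    return 10.0 * len(x) + sum(v * v - 10.0 * math.cos(2.0 * math.pi * v)
                               for v in x)
```

Sphere tests basic convergence, Rosenbrock tests the ability to follow a narrow curved valley, and Rastrigin tests resistance to the many local minima that make diversification essential.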
Conclusion
In this paper, we provided an introduction to fractal-based metaheuristic optimisation algorithms, which have proved their ability to solve global optimisation problems both qualitatively and quantitatively. The first algorithm incorporated the DLA method to generate new points based on the Levy flight and the Gaussian distribution, switching between them randomly; the second algorithm employed an additional phase for better convergence to the global optimum, called the update step, which introduced random perturbations to candidate solutions. We also provided example applications that used SFS as an optimisation engine, outperforming alternative numerical optimisation approaches. The first application addressed a control engineering problem, where SFS was used to optimise the parameters of a PID controller. The second application used SFS for object tracking, where in each frame SFS localizes the target object based on knowledge from the previous frames.
References

[1] E. Çelik. Incorporation of stochastic fractal search algorithm into efficient design of PID controller for an automatic voltage regulator system. Neural Computing and Applications, 30(6):1991–2002, Sep 2018. ISSN 1433-3058. doi: 10.1007/s00521-017-3335-7. URL https://doi.org/10.1007/s00521-017-3335-7.

[2] D. Charef-Khodja, A. Toumi, S. Medouakh, and S. Sbaa. A novel visual tracking method using stochastic fractal search algorithm. Signal, Image and Video Processing, Aug 2020. ISSN 1863-1711. doi: 10.1007/s11760-020-01748-7. URL https://doi.org/10.1007/s11760-020-01748-7.

[3] M. Dorigo and G. Di Caro. Ant colony optimization: a new meta-heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), volume 2, pages 1470–1477, 1999. doi: 10.1109/CEC.1999.782657.

[4] J. Kennedy. Particle Swarm Optimization, pages 760–766. Springer US, Boston, MA, 2010. ISBN 978-0-387-30164-8. doi: 10.1007/978-0-387-30164-8_630. URL https://doi.org/10.1007/978-0-387-30164-8_630.

[5] B. B. Mandelbrot. The Fractal Geometry of Nature. Freeman, San Francisco, CA, 1982. URL https://cds.cern.ch/record/98509.

[6] R. Metzler and J. Klafter. The restaurant at the end of the random walk: Recent developments in the description of anomalous transport by fractional dynamics. J. Phys. A: Math. Gen., 37, Aug 2004. doi: 10.1088/0305-4470/37/31/R01.

[7] M. Mitchell. An Introduction to Genetic Algorithms. MIT Press, Cambridge, MA, USA, 1998. ISBN 0262631857.

[8] S. J. Russell and P. Norvig. Artificial Intelligence: A Modern Approach. Prentice Hall, 2nd edition, Jul 1999. ISBN 0137903952.

[9] H. Salimi. Stochastic fractal search: A powerful metaheuristic algorithm. Knowledge-Based Systems, 75:1–18, 2015. ISSN 0950-7051. doi: 10.1016/j.knosys.2014.07.025.

[10] R. Storn and K. Price. Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341–359, Dec 1997. ISSN 1573-2916. doi: 10.1023/A:1008202821328. URL https://doi.org/10.1023/A:1008202821328.