Julio Barrera
CINVESTAV
Publications
Featured research published by Julio Barrera.
Innovations in Swarm Intelligence | 2009
Julio Barrera; Carlos A. Coello Coello
Particle swarm optimization (PSO) is a metaheuristic inspired by the flight of a flock of birds seeking food, and it has been widely used for a variety of optimization tasks [1,2]. However, its use in multimodal optimization (i.e., single-objective optimization problems having multiple optima) has been relatively scarce.
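The canonical PSO update this abstract builds on can be sketched as follows: a minimal global-best variant with illustrative parameter values, not the authors' multimodal extension.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO for minimizing f over a box-bounded space."""
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]                 # each particle's best position so far
    pbest_f = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # best position found by the swarm
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive (pbest) + social (gbest) components
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            fx = f(x[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = x[i][:], fx
    return gbest, gbest_f
```

On a unimodal function this converges to the single global optimum; the multimodal variants discussed in these papers modify the social term so that subpopulations can settle on different optima.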
Biometrical Journal | 1998
J.Andrés Christen; José-Leonel Torres; Julio Barrera
We consider the available genomes (total DNA content) of autonomous organisms as strings of symbols from a four-letter alphabet corresponding to their molecular components (or bases), and define words as contiguous sequences of letters with a given length. With the aim of distinguishing between genomes and non-functional DNA chains, we devise a description of random letter substitutions along a genetic string, and show each of the extant genomes to be more robust against point mutations than chains obtained by randomly reordering its words while preserving word frequencies.
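The word-frequency bookkeeping described above can be sketched as follows, under the assumption that words tile the string at non-overlapping positions; function names are illustrative, not from the paper.

```python
import random
from collections import Counter

def word_frequencies(seq, k):
    """Count length-k words at successive non-overlapping positions of a DNA string."""
    return Counter(seq[i:i + k] for i in range(0, len(seq) - k + 1, k))

def shuffled_chain(seq, k, rng=random):
    """Build a random chain with the same length-k word frequencies as seq,
    by shuffling the order of its words."""
    words = [seq[i:i + k] for i in range(0, len(seq) - k + 1, k)]
    rng.shuffle(words)
    return "".join(words)
```

A shuffled chain is the comparison object used in the abstract: it has exactly the same word content as the genome, so any extra robustness of the genome against point mutations must come from the ordering of its words.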
Archive | 2011
Julio Barrera; Carlos A. Coello Coello
The particle swarm optimization (PSO) algorithm has gained increasing popularity in the last few years, mainly because of its relative simplicity and its good overall performance, particularly in continuous optimization problems. As PSO is adopted in more types of application domains, it becomes more important to have well-established methodologies to assess its performance. For that purpose, several test problems have been proposed. In this chapter, we review several state-of-the-art test function generators that have been used for assessing the performance of PSO variants. As we will see, such test problems sometimes have regularities which can be easily exploited by PSO (or any other algorithm, for that matter), resulting in an outstanding performance. In order to avoid such regularities, we describe here several basic design principles that should be followed when creating a test function generator for single-objective continuous optimization.
Mexican International Conference on Artificial Intelligence | 2009
Julio Barrera; Carlos A. Coello Coello
The problem of finding more than one optimum of a fitness function has been addressed in evolutionary computation using a wide variety of algorithms, including particle swarm optimization (PSO). Several variants of the PSO algorithm have been developed to deal with this sort of problem with different degrees of success, but a common drawback of such approaches is that they normally add new parameters that need to be properly tuned, and whose values usually rely on previous knowledge of the fitness function being analyzed. In this paper, we present a PSO algorithm based on electrostatic interaction, which does not need any additional parameters besides those of the original PSO. We show that our proposed approach is able to converge to all the optima of several test functions commonly adopted in the specialized literature, requiring fewer evaluations of the fitness function than other previously reported PSO methods.
Learning and Intelligent Optimization | 2011
Juan J. Flores; Rodrigo López; Julio Barrera
Evolutionary computation draws inspiration from nature to formulate metaheuristics capable of optimizing several kinds of problems. A family of algorithms has emerged based on this idea, e.g. genetic algorithms, evolutionary strategies, particle swarm optimization (PSO), ant colony optimization (ACO), etc. In this paper we present a population-based metaheuristic inspired by the gravitational forces produced by the interaction of the masses of a set of bodies. We explored physics for useful analogies on which to base the design of an optimization metaheuristic. The proposed algorithm is capable of finding the optima of unimodal and multimodal functions commonly used to benchmark evolutionary algorithms. We show that the proposed algorithm (Gravitational Interactions Optimization, GIO) works and outperforms PSO with niches in both cases. Our algorithm does not depend on a radius parameter and does not need niches to solve multimodal problems. We compare GIO with other metaheuristics with respect to the mean number of evaluations needed to find the optima.
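The paper's exact update equations are not reproduced here; a generic gravitationally-inspired position update, in which fitter bodies are assigned larger masses and attract the rest, might look like the following sketch. All names, the mass assignment, and the inverse-square form are assumptions, not GIO's actual rules.

```python
import math

def gravity_step(positions, fitness, step=0.1, eps=1e-9):
    """One gravitationally-inspired move for a minimization problem:
    lower-fitness (better) bodies get larger mass and pull the others
    toward them. Hypothetical update rule, not the one from the paper."""
    worst, best = max(fitness), min(fitness)
    span = (worst - best) or 1.0
    masses = [(worst - f) / span + eps for f in fitness]   # masses in (0, 1]
    new_positions = []
    for i, xi in enumerate(positions):
        force = [0.0] * len(xi)
        for j, xj in enumerate(positions):
            if i == j:
                continue
            dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj))) + eps
            for d in range(len(xi)):
                # inverse-square attraction along the unit vector toward body j
                force[d] += masses[j] * (xj[d] - xi[d]) / dist ** 3
        new_positions.append([a + step * g for a, g in zip(xi, force)])
    return new_positions
```

Because attraction falls off with distance, bodies near different optima can cluster separately, which is the intuition behind solving multimodal problems without an explicit niche radius.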
Mexican International Conference on Artificial Intelligence | 2010
Juan J. Flores; Rodrigo López; Julio Barrera
Evolutionary computation draws inspiration from nature to formulate metaheuristics capable of optimizing several kinds of problems. A family of algorithms has emerged based on this idea, e.g. genetic algorithms, evolutionary strategies, particle swarm optimization (PSO), ant colony optimization (ACO), etc. In this paper we present a population-based metaheuristic inspired by the gravitational forces produced by the interaction of the masses of a set of bodies. We explored physics for useful analogies on which to base the design of an optimization metaheuristic. The proposed algorithm is capable of finding the optima of unimodal and multimodal functions commonly used to benchmark evolutionary algorithms. We show that the proposed algorithm works and outperforms PSO with niches in both cases. Our algorithm does not depend on a radius parameter and does not need niches to solve multimodal problems. We compare the proposed algorithm with other metaheuristics with respect to the mean number of evaluations needed to find the optima.
Genetic and Evolutionary Computation Conference | 2009
Julio Barrera; Carlos A. Coello Coello
Since the introduction of the particle swarm optimization (PSO) algorithm, a considerable amount of research has been devoted to devising mechanisms that control its possible premature convergence. The most common approach to deal with premature convergence in PSO consists of controlling (e.g., by limiting) the velocity of a particle. In this paper, we present a method that limits the velocity of a particle using the elements of a geometric series. This approach is not only simpler than the currently available methods, but also presents competitive results, and even better convergence in some cases, than two other PSO-based approaches. Additionally, the proposed approach provides more flexibility to balance between exploration and exploitation through the tuning of a single parameter.
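A sketch of the velocity-limiting idea: the cap on particle speed follows the terms of a geometric series, so the swarm gradually shifts from exploration to exploitation. Parameter names are illustrative and the paper's exact schedule may differ.

```python
def velocity_limits(v0, r, iters):
    """Velocity caps taken from a geometric series: at iteration t the
    limit is v0 * r**t. With 0 < r < 1 the cap shrinks over time, so the
    single parameter r trades exploration against exploitation.
    (Illustrative schedule, not necessarily the paper's exact one.)"""
    return [v0 * r ** t for t in range(iters)]

def clamp_velocity(v, vmax):
    """Clamp each velocity component to the interval [-vmax, vmax]."""
    return [max(-vmax, min(vmax, vd)) for vd in v]
```

Inside a PSO loop, the cap for the current iteration would be applied to each particle's velocity right after the velocity update and before the position update.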
Journal of Artificial Evolution and Applications | 2008
Julio Barrera; Juan J. Flores; Claudio R. Fuerte-Esquivel
A dynamic system is represented as a set of equations that specify how variables change over time. The equations in the system specify how to compute the new values of the state variables as a function of their current values and the values of the control parameters. If those parameters change beyond certain values, the system exhibits qualitative changes in its behavior. Those qualitative changes are called bifurcations, and the values for the parameters where those changes occur are called bifurcation points. In this contribution, we present an application of particle swarm optimization methods for dynamic environments for plotting bifurcation diagrams used in the analysis of dynamical systems. The use of particle swarm optimization methods presents various advantages over traditional methods.
Intelligent Systems Design and Applications | 2007
Juan J. Flores; Julio Barrera; Felix Calderon
A multimodal function is a function with more than one optimum. This work proposes a method for the automatic determination of regions of the search space of a given function, each guaranteed to enclose one optimum. Based on an initial sampling of the search space, an incremental convex hull algorithm is used to grow regions that each enclose one optimum. After applying the method, we can use any optimization algorithm on each of the determined regions; compared with genetic algorithms for multimodal functions, the proposed method eliminates parameters such as the radius used in fitness sharing.
Electronics, Robotics and Automotive Mechanics Conference | 2007
Julio Barrera; Juan J. Flores
In this contribution we propose the use of intelligent optimization methods in the search for initial conditions for the analysis of dynamic systems. The use of intelligent optimization methods provides a search tool that does not depend on the researcher's experience with the particular system to be analyzed. An example of a dynamic system that models an electrical power system is provided. Three intelligent optimization methods are compared: genetic algorithms, multimodal genetic algorithms, and particle swarm optimization. An analysis of precision and error is presented, contrasting the three methods.