
Publication


Featured research published by Petr Pošík.


Genetic and Evolutionary Computation Conference | 2010

Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009

Nikolaus Hansen; Anne Auger; Raymond Ros; Steffen Finck; Petr Pošík

This paper presents the results of the BBOB-2009 benchmarking of 31 search algorithms on 24 noiseless functions in a black-box optimization scenario in a continuous domain. The runtime of the algorithms, measured in the number of function evaluations, is investigated, and a connection between a single convergence graph and the runtime distribution is uncovered. Performance is investigated for dimensions up to 40-D, for different target precision values, and in different subgroups of functions. Searching in larger dimensions and on multi-modal functions appears to be more difficult. The choice of the best algorithm also depends remarkably on the available budget of function evaluations.
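
The runtime-distribution view used in BBOB can be illustrated with a small sketch: given the runtimes (numbers of function evaluations) at which independent runs reached a target precision, an empirical cumulative distribution over budgets is computed. The data below are hypothetical:

```python
import numpy as np

def runtime_ecdf(runtimes, budgets):
    """Fraction of runs that reached the target within each budget;
    runtimes are in numbers of function evaluations, and inf marks
    an unsuccessful run."""
    runtimes = np.asarray(runtimes, dtype=float)
    return [float((runtimes <= b).mean()) for b in budgets]

# Hypothetical runtimes of five runs of one algorithm on one function.
rts = [120, 450, 800, float("inf"), 300]
print(runtime_ecdf(rts, budgets=[100, 500, 1000]))  # [0.0, 0.6, 0.8]
```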


Congress on Evolutionary Computation | 2005

Real-parameter optimization using the mutation step co-evolution

Petr Pošík

An evolutionary algorithm for the optimization of a function with real parameters is described in this paper. It uses cooperative co-evolution to breed and reproduce successful mutation steps. The algorithm is then tested on a suite of 10-D and 30-D reference optimization problems collected for the special session on real-parameter optimization at the IEEE Congress on Evolutionary Computation 2005. The results are of mixed quality (as expected), but they reveal several interesting aspects of this simple algorithm.
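
The paper's co-evolutionary treatment of mutation steps is more elaborate than anything shown here; as a simpler, well-known illustration of adapting mutation step sizes online, a (1+1)-ES with the 1/5-success rule can be sketched (all names and parameters below are illustrative, not the paper's method):

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, gens=500, seed=0):
    """(1+1)-ES with a 1/5-success-style rule for adapting the mutation
    step size sigma -- a much simpler step-adaptation scheme than the
    co-evolution in the paper, shown only to illustrate the problem of
    tuning mutation steps during the run."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(gens):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
            sigma *= 1.5   # success: enlarge the step
        else:
            sigma *= 0.9   # failure: shrink the step
    return fx

sphere = lambda x: sum(v * v for v in x)
print(one_plus_one_es(sphere, [3.0] * 10))
```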


Evolutionary Computation | 2012

A comparison of global search algorithms for continuous black box optimization

Petr Pošík; Waltraud Huyer; László Pál

Four methods for global numerical black-box optimization with origins in the mathematical programming community are described and experimentally compared with the state-of-the-art evolutionary method BIPOP-CMA-ES. The methods chosen for the comparison exhibit various features that are potentially interesting for the evolutionary computation community: systematic sampling of the search space (DIRECT, MCS), possibly combined with a local search method (MCS), or a multi-start approach (NEWUOA, GLOBAL), possibly equipped with a careful selection of points from which to run a local optimizer (GLOBAL). The recently proposed Comparing Continuous Optimizers (COCO) methodology was adopted as the basis for the comparison. Based on the results, we draw suggestions about which algorithm should be used depending on the available budget of function evaluations, and we propose several possibilities for hybridizing evolutionary algorithms (EAs) with features of the other compared algorithms.


Parallel Problem Solving from Nature | 2008

Preventing Premature Convergence in a Simple EDA Via Global Step Size Setting

Petr Pošík

When a simple real-valued estimation of distribution algorithm (EDA) with a Gaussian model and maximum-likelihood estimation of its parameters is used, it converges prematurely even on the slope of the fitness function. The simplest way of preventing premature convergence, multiplying the variance estimate by a constant factor k each generation, is studied. Recent works have shown that as the dimensionality of the search space increases, such an algorithm very quickly becomes unable to traverse the slope and focus on the optimum at the same time. In this paper it is shown that when isotropic distributions with Gaussian or Cauchy distributed norms are used, a simple constant setting of k is able to ensure reasonable behaviour of the EDA both on the slope and in the valley of the fitness function.
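
The core idea can be sketched as a minimal Gaussian EDA in which the standard-deviation estimate of the selected individuals is multiplied by a constant factor k each generation. This is an illustrative toy version with made-up parameter values, not the exact algorithm studied in the paper:

```python
import numpy as np

def gaussian_eda(f, dim, k=1.5, pop=50, sel=0.3, gens=200, seed=0):
    """Simple real-valued EDA: fit a Gaussian to the selected
    individuals and multiply the std estimate by a constant factor k
    each generation to counteract premature convergence."""
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.ones(dim) * 5.0
    best = np.inf
    for _ in range(gens):
        X = rng.normal(mean, std, size=(pop, dim))
        fit = np.apply_along_axis(f, 1, X)
        elite = X[np.argsort(fit)[: int(sel * pop)]]   # truncation selection
        mean = elite.mean(axis=0)
        std = elite.std(axis=0) * k                    # variance enlargement
        best = min(best, float(fit.min()))
    return best

sphere = lambda x: float(np.sum(x ** 2))
print(gaussian_eda(sphere, dim=5))
```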


Genetic and Evolutionary Computation Conference | 2009

BBOB-benchmarking a simple estimation of distribution algorithm with Cauchy distribution

Petr Pošík

The restarted estimation of distribution algorithm (EDA) with the Cauchy distribution as the probabilistic model is tested on the BBOB 2009 testbed. These tests show that with the Cauchy distribution and a suitably chosen variance enlargement factor, the algorithm is usable for a broad range of fitness landscapes, which is not the case for the EDA with a Gaussian distribution, which converges prematurely. The results of the algorithm are of mixed quality, and its scaling is at least quadratic.
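
The key ingredient is the heavy-tailed search distribution. Drawing offspring from an isotropic distribution whose radius is Cauchy-distributed can be sketched as follows (a hypothetical helper with illustrative parameters, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_isotropic_cauchy(mean, scale, n):
    """Draw n points from an isotropic distribution whose norm is
    (half-)Cauchy distributed: a uniformly random direction times a
    Cauchy-distributed radius. The heavy tail occasionally produces
    very long steps, which helps a simple EDA avoid premature
    convergence."""
    dim = len(mean)
    dirs = rng.normal(size=(n, dim))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # unit directions
    radii = np.abs(rng.standard_cauchy(n)) * scale        # heavy-tailed radii
    return np.asarray(mean) + dirs * radii[:, None]

X = sample_isotropic_cauchy(np.zeros(3), 1.0, 1000)
print(X.shape)  # (1000, 3)
```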


Genetic and Evolutionary Computation Conference | 2009

BBOB-benchmarking the DIRECT global optimization algorithm

Petr Pošík

The DIRECT global optimization algorithm is tested on the BBOB 2009 testbed. The algorithm is rather time- and space-consuming, since it does not forget any point it samples during the optimization; furthermore, all the sampled points are considered when deciding where to sample next. The results suggest that the algorithm is a viable alternative only for low-dimensional search spaces (5-D at most).


Genetic and Evolutionary Computation Conference | 2007

Estimation of fitness landscape contours in EAs

Petr Pošík; Vojtĕch Franc

Evolutionary algorithms applied in the real domain should profit from information about the local curvature of the fitness function. This paper presents an initial study of an evolution strategy with a novel approach to learning the covariance matrix of a Gaussian distribution. The learning method is based on estimation of the fitness landscape contour line between the selected and discarded individuals. The distribution learned this way is then used to generate new population members. The algorithm presented here is the first attempt to construct the Gaussian distribution this way and should be considered only a proof of concept; nevertheless, the empirical comparison on low-dimensional quadratic functions shows that our approach is viable and, with respect to the number of evaluations needed to find a solution of a certain quality, is comparable to the state-of-the-art CMA-ES in the case of the sphere function and outperforms CMA-ES in the case of the elliptical function.


Parallel Problem Solving from Nature | 2014

Online Black-Box Algorithm Portfolios for Continuous Optimization

Petr Baudiš; Petr Pošík

In black-box function optimization, we can choose from a wide variety of heuristic algorithms that are suited to different functions and computation budgets. Given a particular function to be optimized, the problem we consider in this paper is how to select the appropriate algorithm. In general, this problem is studied in the field of algorithm portfolios; we treat the algorithms as black boxes themselves and consider online selection, i.e., without learning a mapping from problem features to the best algorithms a priori, and with dynamic switching between algorithms during the optimization run.
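
One simple online selection scheme, an epsilon-greedy bandit over the portfolio, is sketched below. It is only an illustration of the general idea, not the paper's actual selection rule; the solver names and parameters are made up:

```python
import random

def online_portfolio(algos, f, budget, epsilon=0.2, seed=0):
    """Online algorithm selection sketch: epsilon-greedy choice among
    black-box optimizers, rewarding each by the improvement of the
    best-so-far value it produced. `algos` maps names to callables
    returning (best value found, evaluations spent)."""
    rng = random.Random(seed)
    reward = {name: 0.0 for name in algos}
    best, used = float("inf"), 0
    while used < budget:
        if rng.random() < epsilon:
            name = rng.choice(sorted(algos))       # explore
        else:
            name = max(reward, key=reward.get)     # exploit
        value, cost = algos[name](f)               # one short run
        if best < float("inf"):
            reward[name] += max(0.0, best - value)
        best = min(best, value)
        used += cost
    return best

sphere = lambda x: sum(v * v for v in x)
_rng = random.Random(1)

def wide_search(f):    # hypothetical solver: one sample in [-5, 5]^3
    return f([_rng.uniform(-5, 5) for _ in range(3)]), 1

def narrow_search(f):  # hypothetical solver: one sample in [-1, 1]^3
    return f([_rng.uniform(-1, 1) for _ in range(3)]), 1

print(online_portfolio({"wide": wide_search, "narrow": narrow_search}, sphere, 200))
```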


Evolutionary Computation | 2012

Restarted local search algorithms for continuous black box optimization

Petr Pošík; Waltraud Huyer

Several local search algorithms for real-valued domains (axis-parallel line search, the Nelder-Mead simplex search, Rosenbrock's algorithm, a quasi-Newton method, NEWUOA, and VXQR) are described and thoroughly compared in this article, embedded in a multi-start method. Their comparison aims (1) to help researchers from the evolutionary community choose the right opponent for their algorithm (an opponent that would constitute a hard-to-beat baseline), (2) to describe the individual features of these algorithms and show how they influence the algorithms' behaviour on different problems, and (3) to provide inspiration for the hybridization of evolutionary algorithms with these local optimizers. The recently proposed Comparing Continuous Optimizers (COCO) methodology was adopted as the basis for the comparison. The results show that in low-dimensional spaces, the old method of Nelder and Mead is still the most successful among those compared, while in spaces of higher dimensions, it is better to choose an algorithm based on quadratic modeling, such as NEWUOA or a quasi-Newton method.
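
The multi-start scheme, here wrapped around an axis-parallel line search (one of the compared local optimizers), can be sketched as follows. The implementation details are illustrative, not those used in the article:

```python
import random

def axis_line_search(f, x, step=1.0, shrink=0.5, tol=1e-8, max_iter=200):
    """Axis-parallel line search: try +/- step along each coordinate,
    keep improving moves, and halve the step when a full sweep fails."""
    x = list(x)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= shrink
            if step < tol:
                break
    return x, fx

def multi_start(f, dim, starts=10, seed=0):
    """Restarted local search: run the local optimizer from several
    random initial points and return the best result found."""
    rng = random.Random(seed)
    best = None
    for _ in range(starts):
        x0 = [rng.uniform(-5, 5) for _ in range(dim)]
        x, fx = axis_line_search(f, x0)
        if best is None or fx < best[1]:
            best = (x, fx)
    return best

sphere = lambda x: sum(v * v for v in x)
x, fx = multi_start(sphere, dim=4)
print(fx)  # the sphere optimum is 0
```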


Genetic and Evolutionary Computation Conference | 2012

Benchmarking the differential evolution with adaptive encoding on noiseless functions

Petr Pošík; Václav Klemš

The differential evolution (DE) algorithm is equipped with the recently proposed adaptive encoding (AE), which makes the algorithm rotationally invariant. The resulting algorithm, DEAE, should exhibit better performance on non-separable functions. The aim of this article is to assess the benefits of AE and its effect on the other function groups. DEAE is compared against pure DE, an adaptive version of DE (JADE), and an evolution strategy with covariance matrix adaptation (CMA-ES). The results suggest that AE indeed improves the performance of DE, particularly on the group of unimodal non-separable functions, but the parameter adaptation used in JADE is more profitable on average. The use of AE inside JADE is envisioned.
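
Plain DE/rand/1/bin, the baseline that adaptive encoding builds on, can be sketched as follows (a textbook version with illustrative parameters, without AE):

```python
import numpy as np

def de_rand_1_bin(f, dim, pop=30, F=0.8, CR=0.9, gens=150, seed=0):
    """Classic DE/rand/1/bin: mutate with a scaled difference of two
    random population members, crossover coordinate-wise, and keep
    the better of parent and trial."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, size=(pop, dim))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.choice([j for j in range(pop) if j != i],
                                 3, replace=False)
            mutant = X[a] + F * (X[b] - X[c])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True    # at least one coordinate
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft < fit[i]:
                X[i], fit[i] = trial, ft
    return float(fit.min())

sphere = lambda x: float(np.sum(x ** 2))
print(de_rand_1_bin(sphere, dim=5))
```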

Collaboration


Top co-authors of Petr Pošík, all affiliated with the Czech Technical University in Prague:

- Jan Žegklitz
- Petr Baudiš
- Gustav Šourek
- Jaroslav Moravec
- Jiří Kubalík
- Václav Klemš
- Aleš Pilný
- Daniel Novák
- Július Bemš