Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Hartmut Schmeck is active.

Publication


Featured research published by Hartmut Schmeck.


Archive | 2000

A Multi-population Approach to Dynamic Optimization Problems

Jürgen Branke; Christian Smidt; Hartmut Schmeck

Time-dependent optimization problems pose a new challenge to evolutionary algorithms, since they not only require a search for the optimum, but also a continuous tracking of the optimum over time. In this paper, we use concepts from the "forking GA" (a multi-population evolutionary algorithm proposed to find multiple peaks in a multi-modal landscape) to enhance search in a dynamic landscape. The algorithm uses a number of smaller populations to track the most promising peaks over time, while a larger parent population continuously searches for new peaks. We show that this approach is indeed suitable for dynamic optimization problems by testing it on the recently proposed Moving Peaks Benchmark.
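The two-level scheme described in the abstract can be illustrated in a few lines. The following is a minimal, hypothetical sketch (not the authors' forking-GA implementation): a small child population exploits the currently known peak with fine-grained mutations, while a parent population keeps exploring with coarse mutations, on a toy drifting-peak function standing in for the Moving Peaks Benchmark.

```python
import random

def moving_peak(x, t):
    """Toy time-dependent fitness: one peak whose centre drifts with time t
    (an illustrative stand-in, not the actual Moving Peaks Benchmark)."""
    centre = 2.0 + 0.1 * t
    return -(x - centre) ** 2

def evolve(pop, fitness, sigma, offspring=10):
    """(1+lambda)-style step: mutate the best individual, keep the fittest."""
    best = max(pop, key=fitness)
    cand = [best] + [best + random.gauss(0, sigma) for _ in range(offspring)]
    return sorted(cand, key=fitness, reverse=True)[:len(pop)]

random.seed(1)
child = [2.0] * 5                                     # tracks the known peak
parent = [random.uniform(-10, 10) for _ in range(5)]  # keeps exploring
for t in range(50):
    f = lambda x, t=t: moving_peak(x, t)
    child = evolve(child, f, sigma=0.1)    # fine-grained tracking
    parent = evolve(parent, f, sigma=1.0)  # coarse exploration
best = max(child, key=lambda x: moving_peak(x, 49))
```

Because the child population repeatedly re-selects against the current (shifted) fitness, it follows the drifting peak instead of converging to its initial position.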


Advances in Engineering Software | 2001

Guidance in evolutionary multi-objective optimization

Jürgen Branke; T Kaußler; Hartmut Schmeck

Many real-world design problems involve multiple, usually conflicting optimization criteria. Often, it is very difficult to weight the criteria exactly before alternatives are known. Multi-objective evolutionary algorithms based on the principle of Pareto optimality are designed to explore the complete set of non-dominated solutions, which then allows the user to choose among many alternatives. However, although it is very difficult to define the weighting of different optimization criteria exactly, the user usually has some notion as to what range of weightings might be reasonable. In this paper, we present a novel, simple, and intuitive way to integrate the user's preferences into the evolutionary algorithm by allowing the user to define linear maximum and minimum trade-off functions. On a number of test problems, we show that the proposed algorithm efficiently guides the population towards the interesting region, allowing faster convergence and better coverage of this area of the Pareto-optimal front.
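The guidance idea can be sketched as a modified dominance test. This is a hedged illustration, not the paper's exact formulation: for two objectives to be minimised, a solution counts as dominating another if it is no worse under both extreme linear weightings, where `w_min` and `w_max` stand for slopes derived from the user's minimum and maximum acceptable trade-offs (the values below are purely illustrative).

```python
def guided_dominates(a, b, w_min=0.2, w_max=5.0):
    """Guided dominance sketch for two minimisation objectives a=(f1,f2).
    a dominates b iff a is no worse under both extreme weightings and
    strictly better under at least one."""
    weighted = lambda s, w: s[0] + w * s[1]
    no_worse = (weighted(a, w_min) <= weighted(b, w_min)
                and weighted(a, w_max) <= weighted(b, w_max))
    strictly_better = (weighted(a, w_min) < weighted(b, w_min)
                       or weighted(a, w_max) < weighted(b, w_max))
    return no_worse and strictly_better
```

Note that this relation is stronger than plain Pareto dominance: some Pareto-incomparable pairs become comparable, which is what concentrates the population on the region of the front the user considers interesting.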


Journal of Heuristics | 2002

Multi Colony Ant Algorithms

Martin Middendorf; Frank Reischle; Hartmut Schmeck

In multi colony ant algorithms, several colonies of ants cooperate in finding good solutions to an optimization problem. At certain time steps, the colonies exchange information about good solutions. If the amount of exchanged information is not too large, multi colony ant algorithms can easily be parallelized in a natural way by placing the colonies on different processors. In this paper, we study the behaviour of multi colony ant algorithms with different kinds of information exchange between the colonies. Moreover, we compare the behaviour of different numbers of colonies with that of a multi-start single colony ant algorithm. As test problems, we use the Traveling Salesperson Problem and the Quadratic Assignment Problem.
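One of the exchange topologies the abstract alludes to can be sketched independently of the underlying ACO machinery. The following is an illustrative sketch (names and data layout are assumptions, not the paper's code): colonies arranged in a directed ring, where each colony receives the best-so-far solution of its predecessor and keeps the better of the two.

```python
def ring_exchange(colonies_best):
    """One information-exchange step between colonies in a directed ring.
    `colonies_best` is a list of (cost, solution) pairs, one per colony;
    lower cost is better.  Each colony compares its own best-so-far with
    the one received from its predecessor and keeps the cheaper one."""
    n = len(colonies_best)
    received = [colonies_best[(i - 1) % n] for i in range(n)]
    return [min(own, got) for own, got in zip(colonies_best, received)]
```

Since only one solution per colony is sent per exchange step, communication cost stays small, which is exactly why such schemes parallelize naturally across processors.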


Advances in Evolutionary Computing | 2003

Designing evolutionary algorithms for dynamic optimization problems

Jürgen Branke; Hartmut Schmeck

Most research in evolutionary computation focuses on optimization of static, non-changing problems. Many real-world optimization problems, however, are dynamic, and optimization methods are needed that are capable of continuously adapting the solution to a changing environment. If the optimization problem is dynamic, the goal is no longer to find the extrema, but to track their progression through the space as closely as possible. In this chapter, we suggest a classification of dynamic optimization problems, and survey and classify a number of the most widespread techniques that have been published in the literature so far to make evolutionary algorithms suitable for changing optimization problems. After this introduction to the basics, we will discuss in more detail two specific approaches, pointing out their deficiencies and potential. The first approach is based on memorization, the other one uses a novel multi-population structure.


International Symposium on Object/Component/Service-Oriented Real-Time Distributed Computing | 2005

Organic computing - a new vision for distributed embedded systems

Hartmut Schmeck

Organic computing is becoming the new vision for the design of complex systems, satisfying human needs for trustworthy systems that behave life-like by adapting autonomously to dynamic changes of the environment and that have self-x properties as postulated for autonomic computing. Organic computing is a response to the threatening prospect of being surrounded by interacting, self-organizing systems that may become unmanageable and show undesired emergent behavior. Major challenges for organic system design arise from conflicting requirements: systems must be robust and adaptive at the same time, have sufficient degrees of freedom to exhibit self-x properties, remain open to human intervention, and operate within appropriate rules and constraints to prevent the occurrence of undesired emergent behavior.


Congress on Evolutionary Computation | 2004

Parallelizing multi-objective evolutionary algorithms: cone separation

Jürgen Branke; Hartmut Schmeck; Kalyanmoy Deb

Evolutionary multi-objective optimization (EMO) may be computationally quite demanding, because instead of searching for a single optimum, one generally wishes to find the whole front of Pareto-optimal solutions. For that reason, parallelizing EMO is an important issue. Since we are looking for a number of Pareto-optimal solutions with different tradeoffs between the objectives, it seems natural to assign different parts of the search space to different processors. We propose the idea of cone separation which is used to divide up the search space by adding explicit constraints for each process. We show that the approach is more efficient than simple parallelization schemes, and that it also works on problems with a non-convex Pareto-optimal front.
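The region assignment behind cone separation can be sketched for the bi-objective case. This is a hedged illustration of the idea (function name and the fixed ideal point are assumptions): the angle of a solution's objective vector relative to an ideal point determines which cone, and hence which processor, it belongs to; each processor would then add explicit constraints keeping its subpopulation inside its own cone.

```python
import math

def cone_index(f1, f2, n_regions, ideal=(0.0, 0.0)):
    """Map a bi-objective vector (f1, f2) to one of n_regions equal-angle
    cones spanning the quadrant above the ideal point.  For minimisation
    fronts with objectives >= ideal, the angle lies in [0, pi/2]."""
    angle = math.atan2(f2 - ideal[1], f1 - ideal[0])
    width = (math.pi / 2) / n_regions
    return min(int(angle / width), n_regions - 1)  # clamp the boundary case
```

Because the cones partition the objective space rather than the decision space, every processor searches for a different stretch of the Pareto front, which is what makes the scheme effective even for non-convex fronts.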


Genetic and Evolutionary Computation Conference | 2007

Multi-objective particle swarm optimization on computer grids

Sanaz Mostaghim; Juergen Branke; Hartmut Schmeck

In recent years, a number of authors have successfully extended particle swarm optimization to problem domains with multiple objectives. This paper addresses the issue of parallelizing multi-objective particle swarms. We propose and empirically compare two parallel versions which differ in the way they divide the swarm into subswarms that can be processed independently on different processors. One of the variants works asynchronously and is thus particularly suitable for heterogeneous computer clusters as occurring e.g. in modern grid computing platforms.


Annals of Operations Research | 1999

Experiences with fine-grained parallel genetic algorithms

Udo Kohlmorgen; Hartmut Schmeck; Knut Haase

In this paper, we present some results of our systematic studies of fine-grained parallel versions of the island model of genetic algorithms and of variants of the neighborhood model (also called diffusion model) on the massively parallel computer MasPar MP1 with 16k processing elements. These parallel genetic algorithms have been applied to a range of different problems (e.g. traveling salesman, capacitated lot sizing, resource-constrained project scheduling, flow shop, and warehouse location problems) in order to obtain an empirical basis for statements on their optimization quality.


ACM Transactions on Autonomous and Adaptive Systems | 2010

Adaptivity and self-organization in organic computing systems

Hartmut Schmeck; Christian Müller-Schloer; Emre Cakar; Moez Mnif; Urban Richter

Organic Computing (OC) and other research initiatives like Autonomic Computing or Proactive Computing have developed the vision of systems possessing life-like properties: they self-organize, adapt to their dynamically changing environments, and establish other so-called self-x properties, like self-healing, self-configuration, self-optimization, etc. What we are searching for in OC are methodologies and concepts for systems that make it possible to cope with increasingly complex networked application systems by introducing self-x properties, while at the same time guaranteeing a trustworthy and adaptive response to externally provided system objectives and control actions. Therefore, in OC, we talk about controlled self-organization. Although the terms self-organization and adaptivity have been discussed for years, a clear definition of self-organization is missing from most publications with a technically motivated background. In this article, we briefly summarize the state of the art and suggest a characterization of (controlled) self-organization and adaptivity that is motivated by the main objectives of the OC initiative. We present a system classification of robust, adaptable, and adaptive systems and define a degree of autonomy to quantify how autonomously a system is working. The degree of autonomy distinguishes and measures external control that is exerted directly by the user (no autonomy) from internal control of a system, which might be fully controlled by an observer/controller architecture that is part of the system (full autonomy). This quantitative degree of autonomy provides the basis for characterizing the notion of controlled self-organization. Furthermore, we discuss several alternatives for the design of organic systems.
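The external-versus-internal control distinction can be made concrete with a simple quantity. The following is purely illustrative and NOT the article's formal definition of the degree of autonomy; it merely captures the endpoints the abstract describes (no autonomy when all control is exerted directly by the user, full autonomy when all control comes from the system's own observer/controller).

```python
def degree_of_autonomy(external_actions, internal_actions):
    """Illustrative measure (not the article's exact definition): the
    fraction of control actions issued internally by the system's
    observer/controller rather than directly by the user.
    Returns 0.0 for a fully externally controlled system and 1.0 for a
    fully autonomous one."""
    total = external_actions + internal_actions
    if total == 0:
        raise ValueError("no control actions observed")
    return internal_actions / total
```

Controlled self-organization then corresponds to operating somewhere strictly between the two endpoints: the system self-organizes internally, but external objectives and interventions remain possible.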


Applied Soft Computing | 2004

FPGA implementation of population-based ant colony optimization

Bernd Scheuermann; Keith So; Michael Guntsch; Martin Middendorf; Oliver Diessel; Hossam A. ElGindy; Hartmut Schmeck

We present a hardware implementation of population-based ant colony optimization (P-ACO) on field-programmable gate arrays (FPGAs). The ant colony optimization meta-heuristic is adopted from the natural foraging behavior of real ants and has been used to find good solutions to a wide spectrum of combinatorial optimization problems. We describe the P-ACO algorithm and present a circuit architecture that facilitates efficient FPGA implementations. The proposed design shows modest space requirements but leads to a significant reduction in runtime over software-based solutions. Several modifications and extensions of the basic algorithm are also presented, including the approximation of the heuristic function by a small, dynamically changing set of favorable decisions.
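The pheromone handling that makes P-ACO hardware-friendly can be sketched in software. This is a hedged sketch of the population-based update (names and signature are illustrative): instead of evaporating every pheromone value each iteration, the solution entering a FIFO population adds a fixed amount on its edges, and once the population is full, the leaving solution's contribution is subtracted exactly.

```python
from collections import deque

def paco_update(pheromone, population, new_solution, tau_delta=1.0):
    """Population-based pheromone update sketch.  `pheromone` maps edges to
    values, `population` is a FIFO deque of solutions (each a list of
    edges), `tau_delta` is the per-edge pheromone contribution."""
    if len(population) == population.maxlen:
        old = population.popleft()              # oldest solution leaves...
        for edge in old:
            pheromone[edge] = pheromone.get(edge, 0.0) - tau_delta
    population.append(new_solution)             # ...and the new one enters
    for edge in new_solution:
        pheromone[edge] = pheromone.get(edge, 0.0) + tau_delta
    return pheromone
```

Because each update touches only the edges of two solutions rather than the whole pheromone matrix, the rule maps naturally onto a compact circuit, which is the property the FPGA design exploits.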

Collaboration


Dive into Hartmut Schmeck's collaboration.

Top Co-Authors

Lukas König (Karlsruhe Institute of Technology)
Sanaz Mostaghim (Otto-von-Guericke University Magdeburg)
Friederike Pfeiffer-Bohnen (Karlsruhe Institute of Technology)
Ingo Mauser (Center for Information Technology)
Pradyumn Kumar Shukla (Karlsruhe Institute of Technology)
Sebastian Kochanneck (Karlsruhe Institute of Technology)
Florian Allerding (Karlsruhe Institute of Technology)
Holger Prothmann (Karlsruhe Institute of Technology)