Martina Friese
Cologne University of Applied Sciences
Publications
Featured research published by Martina Friese.
Genetic and Evolutionary Computation Conference | 2014
Martin Zaefferer; Jörg Stork; Martina Friese; Andreas Fischbach; Boris Naujoks; Thomas Bartz-Beielstein
Real-world optimization problems may require time-consuming and expensive measurements or simulations. Recently, the application of surrogate model-based approaches was extended from continuous to combinatorial spaces. This extension is based on the use of suitable distance measures such as the Hamming or swap distance. In this work, such an extension is implemented for Kriging (Gaussian process) models. Kriging provides a measure of uncertainty when determining predictions, which can be harnessed to calculate the Expected Improvement (EI) of a candidate solution. In continuous optimization, EI is used in the Efficient Global Optimization (EGO) approach to balance exploitation and exploration for expensive optimization problems. Employing the extended Kriging model, we show for the first time that EGO can successfully be applied to combinatorial optimization problems. We describe the necessary adaptations and arising issues as well as experimental results on several test problems. All surrogate models are optimized with a Genetic Algorithm (GA). To yield a comprehensive comparison, EGO and Kriging are compared to a previously suggested Radial Basis Function Network, a linear modeling approach, as well as model-free optimization with random search and a GA. EGO clearly outperforms the competing approaches on most of the tested problem instances.
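The following is a minimal sketch, not the paper's implementation, of the core idea described above: a Kriging-style surrogate over bit strings built from a Hamming-distance kernel, with Expected Improvement used to rank candidate solutions. All function and parameter names (hamming_kernel, fit_gp, theta, nugget) are illustrative assumptions.

```python
# Illustrative sketch only: a Hamming-distance Gaussian-process surrogate with
# Expected Improvement (EI), in the spirit of surrogate-based combinatorial EGO.
import numpy as np
from scipy.stats import norm

def hamming(a, b):
    return np.sum(a != b)

def hamming_kernel(X1, X2, theta=0.5):
    # Exponential kernel over the Hamming distance between binary vectors.
    return np.array([[np.exp(-theta * hamming(x1, x2)) for x2 in X2] for x1 in X1])

def fit_gp(X, y, theta=0.5, nugget=1e-8):
    K = hamming_kernel(X, X, theta) + nugget * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return {"X": X, "L": L, "alpha": alpha, "theta": theta}

def predict(model, Xnew):
    k = hamming_kernel(Xnew, model["X"], model["theta"])
    mu = k @ model["alpha"]
    v = np.linalg.solve(model["L"], k.T)
    var = np.clip(1.0 - np.sum(v**2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(model, Xcand, y_best):
    # EI for minimization: balances predicted quality (exploitation)
    # against predictive uncertainty (exploration).
    mu, s = predict(model, Xcand)
    z = (y_best - mu) / s
    return (y_best - mu) * norm.cdf(z) + s * norm.pdf(z)

# Toy usage: minimize the number of ones in a bit string.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(10, 16))
y = X.sum(axis=1).astype(float)
model = fit_gp(X, y)
cand = rng.integers(0, 2, size=(100, 16))
best = cand[np.argmax(expected_improvement(model, cand, y.min()))]
```

In an EGO-style loop, the candidate with the highest EI would be evaluated on the expensive objective, added to the training data, and the surrogate refitted.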
Genetic and Evolutionary Computation Conference | 2011
Wolfgang Konen; Patrick Koch; Oliver Flasch; Thomas Bartz-Beielstein; Martina Friese; Boris Naujoks
The complex, often redundant and noisy data in real-world data mining (DM) applications frequently lead to inferior results when out-of-the-box DM models are applied. Tuning of parameters is essential to achieve high-quality results. In this work, we aim to tune the parameters of the preprocessing and the modeling phase jointly. The framework TDM (Tuned Data Mining) was developed to facilitate the search for good parameters and the comparison of different tuners. It is shown that tuning is of great importance for high-quality results. Surrogate-model-based tuning using the Sequential Parameter Optimization Toolbox (SPOT) is compared with other tuners (CMA-ES, BFGS, LHD), and evidence is found that SPOT is well suited for this task. In benchmark tasks such as the Data Mining Cup (DMC), tuned models achieve remarkably better ranks than their untuned counterparts.
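A minimal sketch, not the TDM framework itself, of the underlying idea: preprocessing and model parameters are exposed as one joint parameter vector and evaluated with cross-validation. The tuner here is plain random search, and the parameter names, ranges, and data set are assumptions.

```python
# Illustrative sketch: joint tuning of preprocessing (feature selection) and
# modeling (random forest) parameters with a cross-validated objective.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=30, n_informative=5, random_state=0)
rng = np.random.default_rng(0)

def evaluate(k_features, n_trees, max_depth):
    # One candidate setting covers both phases: feature selection
    # (preprocessing) and the classifier (modeling).
    pipe = Pipeline([
        ("select", SelectKBest(f_classif, k=k_features)),
        ("model", RandomForestClassifier(n_estimators=n_trees,
                                         max_depth=max_depth, random_state=0)),
    ])
    return cross_val_score(pipe, X, y, cv=5).mean()

best = max(
    ((int(rng.integers(2, 31)), int(rng.integers(10, 201)), int(rng.integers(2, 11)))
     for _ in range(20)),
    key=lambda p: evaluate(*p),
)
print("best (k, trees, depth):", best)
```

A surrogate-based tuner such as SPOT would replace the random sampling with a model-guided proposal of new candidate settings.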
Genetic and Evolutionary Computation Conference | 2012
Martin Zaefferer; Thomas Bartz-Beielstein; Martina Friese; Boris Naujoks; Oliver Flasch
Many relevant industrial optimization tasks feature more than one quality criterion. State-of-the-art multi-criteria optimization algorithms require a relatively large number of function evaluations (usually more than 10^5) to approximate Pareto fronts. Due to high cost or time consumption, such a large number of function evaluations is not always available. It is therefore natural to combine techniques such as Sequential Parameter Optimization (SPO), which requires only a very small number of function evaluations, with techniques from evolutionary multi-criteria optimization (EMO). In this paper, we show how EMO techniques can be efficiently integrated into the framework of the SPO Toolbox (SPOT). We discuss the advantages of this approach in comparison to state-of-the-art optimizers. Moreover, the resulting ability to handle competing objectives opens the opportunity to aim not only for the best, but also for the most robust solution. We present an approach that optimizes not only the quality of a solution but also its robustness, treating these two goals as the objectives of a multi-criteria optimization.
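A minimal sketch of the quality-versus-robustness idea, not the paper's method: robustness is approximated here by the standard deviation of the objective under input perturbations, and the two resulting objectives are reduced to their non-dominated set. The test function, noise level, and sampling scheme are assumptions.

```python
# Illustrative sketch: quality and robustness as two objectives of a
# multi-criteria problem (both minimized).
import numpy as np

def f(x):
    return np.sum(x**2) + 2.0 * np.sin(5.0 * x[0])

def objectives(x, rng, sigma=0.05, reps=20):
    noisy = [f(x + rng.normal(0.0, sigma, size=x.shape)) for _ in range(reps)]
    return np.mean(noisy), np.std(noisy)   # (quality, robustness)

def pareto_front(points):
    # Keep points not dominated in both objectives.
    keep = []
    for i, p in enumerate(points):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return keep

rng = np.random.default_rng(2)
X = rng.uniform(-2.0, 2.0, size=(200, 2))
F = np.array([objectives(x, rng) for x in X])
front = pareto_front(F)
print(f"{len(front)} non-dominated candidates out of {len(X)}")
```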
Genetic and Evolutionary Computation Conference | 2014
Martin Zaefferer; Beate Breiderhoff; Boris Naujoks; Martina Friese; Jörg Stork; Andreas Fischbach; Oliver Flasch; Thomas Bartz-Beielstein
Cyclone separators are filtration devices frequently used in industry, e.g., to filter particles from flue gas. Optimizing the cyclone geometry is a demanding task. Accurate simulations of cyclone separators are based on time-consuming computational fluid dynamics simulations. Thus, the need to exploit cheap information from analytical, approximative models is evident. Here, we employ two multi-objective optimization algorithms on such cheap, approximative models to analyze their optimization performance on this problem. Under various limitations, we tune both algorithms with Sequential Parameter Optimization (SPO) to achieve the best possible results in the shortest time. The resulting optimal settings are validated with different seeds, as well as with a different approximative model for collection efficiency. Their optimal performance is compared against a model-based approach, in which multi-objective SPO is employed directly to optimize the cyclone model rather than to tune the optimization algorithms. It is shown that SPO finds improved parameter settings for the algorithms concerned and performs excellently when used directly as an optimizer.
Genetic and Evolutionary Computation Conference | 2011
Thomas Bartz-Beielstein; Martina Friese; Martin Zaefferer; Boris Naujoks; Oliver Flasch; Wolfgang Konen; Patrick Koch
Sequential parameter optimization (SPO) is a heuristic that combines classical and modern statistical techniques to improve the performance of search algorithms. In this study, SPO is used directly as an optimization method on different noisy mathematical test functions. SPO includes a broad variety of meta models, which can have a significant impact on its performance. Additionally, Optimal Computing Budget Allocation (OCBA), an enhanced method for handling the computational budget spent on selecting new design points, is presented. The OCBA approach can intelligently determine the most efficient replication numbers. Moreover, we study the performance of different meta models integrated in SPO. Our results reveal that the incorporation of OCBA and the selection of Gaussian process models are highly beneficial. SPO outperformed three alternative optimization algorithms on a set of five noisy mathematical test functions.
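A minimal sketch of the OCBA allocation rule (after Chen et al.) that decides how many replications each candidate design point receives, given estimated means and standard deviations of the noisy objective. This is not SPOT's own implementation, and all names and the toy inputs are assumptions.

```python
# Illustrative sketch of OCBA-style replication allocation (minimization).
import numpy as np

def ocba_allocation(means, stds, total_budget):
    """Distribute total_budget replications over candidate designs."""
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    b = int(np.argmin(means))                 # current best design
    delta = means - means[b]                  # distance to the best mean
    ratios = np.ones_like(means)
    others = [i for i in range(len(means)) if i != b]
    ref = others[0]
    for i in others:
        # N_i / N_ref = ((s_i / d_i) / (s_ref / d_ref))^2
        ratios[i] = ((stds[i] / delta[i]) / (stds[ref] / delta[ref])) ** 2
    # Best design: N_b = s_b * sqrt(sum_i (N_i / s_i)^2)
    ratios[b] = stds[b] * np.sqrt(np.sum((ratios[others] / stds[others]) ** 2))
    alloc = total_budget * ratios / ratios.sum()
    return np.maximum(1, np.round(alloc)).astype(int)

# Toy usage: three candidate designs with estimated means and noise levels.
print(ocba_allocation(means=[1.0, 1.2, 2.0], stds=[0.3, 0.4, 0.2], total_budget=60))
```

Designs whose means are close to the best and whose noise is high receive more replications, which is how the budget for re-evaluating noisy design points is spent efficiently.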
European Conference on Applications of Evolutionary Computation | 2013
Oliver Flasch; Martina Friese; Katya Vladislavleva; Thomas Bartz-Beielstein; Olaf Mersmann; Boris Naujoks; Jörg Stork; Martin Zaefferer
This work provides a preliminary study on applying state-of-the-art time-series forecasting methods to electrical energy consumption data recorded by smart metering equipment. We compare a custom-built commercial baseline method to modern ensemble-based methods from statistical time-series analysis and to a modern commercial GP system. Our preliminary results indicate that modern ensemble-based methods, as well as GP, are an attractive alternative to custom-built approaches for electrical energy consumption forecasting.
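The sketch below is not one of the systems compared in the paper; it only illustrates the kind of baseline-versus-ensemble comparison described above, on synthetic hourly load data. The data generator, horizon, and error metric are assumptions.

```python
# Illustrative sketch: seasonal-naive baseline vs. a simple two-model ensemble
# on synthetic hourly energy-consumption data, compared by mean absolute error.
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(24 * 60)
load = 10 + 3 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)
train, test = load[:-24], load[-24:]

# Baseline: repeat the last observed day (seasonal naive).
baseline = train[-24:]

# Ensemble: average of the seasonal-naive forecast and the mean profile of the
# same hour over the last 7 days.
same_hour_mean = train[-24 * 7:].reshape(7, 24).mean(axis=0)
ensemble = 0.5 * baseline + 0.5 * same_hour_mean

mae = lambda pred: np.mean(np.abs(pred - test))
print(f"baseline MAE: {mae(baseline):.3f}, ensemble MAE: {mae(ensemble):.3f}")
```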
Genetic and Evolutionary Computation Conference | 2012
Thomas Bartz-Beielstein; Martina Friese; Boris Naujoks; Martin Zaefferer
Most parameter tuning methods feature a number of parameters themselves. This also holds for the Sequential Parameter Optimization [1] Toolbox (SPOT). It provides default values, which are reasonable for many problems, but these defaults are set to favor robustness over performance. By default, a Random Forest (RF) [2] model is used for the surrogate optimization. The RF model is built rather quickly, runs robustly (i.e., it does not crash), and handles non-ordered parameters (i.e., factors) very well. However, the RF model provides poor optimization performance for a number of problems due to its inbuilt discontinuities. It would often be more reasonable to use Kriging models [4], which usually perform well for small and medium decision-space dimensions. Several existing packages provide Kriging methods that often fit the required problem well (DiceKriging, mlegp, etc.) and can be used with SPOT. However, these methods have one thing in common: they are not robust. Especially when several design points (samples in the decision space) are close to each other, these functions often fail. Hence, in SPOT versions greater than 1.0, a Kriging model based on the Matlab code by Forrester et al. [3] was introduced.
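A minimal sketch of why closely spaced design points make a plain Kriging fit fragile: the correlation matrix becomes nearly singular, and a small nugget term on the diagonal, as used in regularized Kriging formulations, restores numerical stability. The kernel, theta value, and nugget size are assumptions, not SPOT's defaults.

```python
# Illustrative sketch: near-duplicate design points make the Kriging
# correlation matrix ill-conditioned; a small nugget regularizes it.
import numpy as np

def corr_matrix(X, theta=10.0):
    # Gaussian correlation exp(-theta * squared distance).
    d = np.abs(X[:, None, :] - X[None, :, :])
    return np.exp(-theta * np.sum(d**2, axis=2))

X = np.array([[0.10], [0.50], [0.50000001], [0.90]])  # two nearly identical points
Psi = corr_matrix(X)

print("condition number without nugget:", np.linalg.cond(Psi))
print("condition number with nugget:  ", np.linalg.cond(Psi + 1e-6 * np.eye(len(X))))
```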
Archive | 2011
Thomas Bartz-Beielstein; Martina Friese; Oliver Flasch; Wolfgang Konen; Patrick Koch; Boris Naujoks
Archive | 2011
Martina Friese; Martin Zaefferer; Thomas Bartz-Beielstein; Oliver Flasch; Patrick Koch; Wolfgang Konen; Boris Naujoks; Forschungsstelle CIOP
Archive | 2012
Martin Zaefferer; Thomas Bartz-Beielstein; Martina Friese; Boris Naujoks; Oliver Flasch