Dietmar Ratz
Karlsruhe Institute of Technology
Publications
Featured research published by Dietmar Ratz.
SIAM Journal on Numerical Analysis | 1997
Tibor Csendes; Dietmar Ratz
This paper gives a short overview of the latest results on the role of the interval subdivision selection rule in branch-and-bound algorithms for global optimization. The class of rules that allow convergence for two slightly different model algorithms is characterized, and it is shown that the four rules investigated satisfy the conditions of convergence. An extensive numerical study with a wide spectrum of test problems indicates that there are substantial differences between the rules in terms of the required CPU time, the number of function and derivative evaluations and space complexity. Two of the rules can provide substantial improvements in efficiency.
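The branch-and-bound scheme the paper analyzes can be sketched in a few lines. The sketch below is a hypothetical minimal one-dimensional version, not the authors' code: boxes are `(lo, hi)` tuples, `F(lo, hi)` is assumed to return a rigorous enclosure of f over the box, and the selection rule shown is the classic Moore-Skelboe choice (subdivide the box whose enclosure has the smallest lower bound); outward rounding is omitted for brevity.

```python
# Minimal 1-D interval branch-and-bound sketch (hypothetical, not the
# authors' implementation). F(lo, hi) must return a rigorous enclosure
# [F_lo, F_hi] of the objective f over the box [lo, hi].

import heapq

def minimize(F, lo, hi, tol=1e-6):
    # Work list ordered by the lower bound of F over each box: the
    # Moore-Skelboe subdivision selection rule always picks the box
    # whose enclosure has the smallest lower bound.
    fl, fh = F(lo, hi)
    queue = [(fl, lo, hi)]
    best_upper = fh              # verified upper bound on the minimum
    while queue:
        flo, a, b = heapq.heappop(queue)
        if flo > best_upper:     # cut-off test: box cannot contain the minimum
            continue
        if b - a < tol:          # box small enough: return the enclosure
            return a, b, flo, best_upper
        m = (a + b) / 2          # bisect the selected box
        for c, d in ((a, m), (m, b)):
            gl, gh = F(c, d)
            best_upper = min(best_upper, gh)
            if gl <= best_upper:
                heapq.heappush(queue, (gl, c, d))
    return None
```

Swapping the ordering key of the priority queue is exactly the point where the different subdivision selection rules studied in the paper would plug in.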
Journal of Global Optimization | 1995
Dietmar Ratz; Tibor Csendes
This paper investigates the influence of the interval subdivision selection rule on the convergence of interval branch-and-bound algorithms for global optimization. For the class of rules that allows convergence, we study the effects of the rules on a model algorithm with special list ordering. Four different rules are investigated in theory and in practice. A wide spectrum of test problems is used for numerical tests indicating that there are substantial differences between the rules with respect to the required CPU time, the number of function and derivative evaluations, and the necessary storage space. Two rules can provide considerable improvements in efficiency for our model algorithm.
Journal of Global Optimization | 1999
Dietmar Ratz
In this paper we introduce a pruning technique based on slopes in the context of interval branch-and-bound methods for nonsmooth global optimization. We develop the theory for a slope pruning step which can be utilized as an accelerating device similar to the monotonicity test frequently used in interval methods for smooth problems. This pruning step offers the possibility to cut away a large part of the box currently investigated by the optimization algorithm. We underline the new technique's efficiency by comparing two variants of a global optimization model algorithm: one equipped with the monotonicity test and one equipped with the pruning step. To this end, we compare the required CPU time, the number of function and derivative or slope evaluations, and the necessary storage space when solving several smooth global optimization problems with the two variants. The paper concludes with test results for several nonsmooth examples.
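The monotonicity test that the slope pruning step is compared against is simple to state. The helper below is a textbook-style sketch (not the paper's pruning step): `dF(a, b)` is assumed to return a rigorous enclosure of f' over the box, and if that enclosure excludes zero, f is strictly monotone there, so the box interior cannot contain an unconstrained minimizer.

```python
# Sketch of the monotonicity test used as an accelerating device in
# interval methods for smooth problems (hypothetical helper, not the
# paper's slope-based pruning step).

def monotonicity_discard(dF, a, b):
    """Return True if the box [a, b] can be deleted.

    dF(a, b) must return a rigorous enclosure [lo, hi] of f' on
    [a, b]. If 0 is not in that enclosure, f is strictly monotone on
    the box, so its interior contains no unconstrained minimizer.
    (Boxes touching the search-region boundary need extra care.)
    """
    lo, hi = dF(a, b)
    return lo > 0 or hi < 0    # 0 not in the derivative enclosure
```

The slope pruning step of the paper goes further: instead of merely deleting or keeping a box, it can cut away part of a box, and it remains applicable when f is not differentiable.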
Reliable Computing | 1995
Christine Jäger; Dietmar Ratz
We consider the problem of finding interval enclosures of all zeros of a nonlinear system of polynomial equations. We present a method which combines the method of Gröbner bases (used as a preprocessing step), some techniques from interval analysis, and a special version of the algorithm of E. Hansen for solving nonlinear equations in one variable. The latter is applied to a triangular form of the system of equations, which is generated by the preprocessing step. Our method is able to check whether the given system has a finite number of zeros and to compute verified enclosures for all these zeros. Several test results demonstrate that our method is much faster than the application of Hansen's multidimensional algorithm (or similar methods) to the original nonlinear systems of polynomial equations.
Numerical Algorithms | 2004
Tamás Vinkó; Dietmar Ratz
In this paper a new multidimensional extension of the recently developed one-dimensional enclosure method called kite is given for interval global optimization. A more sophisticated version of the pruning technique based on the kite method is introduced. By the new componentwise approach all the one-dimensional theoretical results and procedures can be used in the higher-dimensional case. The possibilities in the implementation of the new algorithm together with numerical results on 40 standard test problems are presented.
Archive | 1995
Ulrich W. Kulisch; Rolf Hammer; Matthias Hocks; Dietmar Ratz
One of the most important tasks in scientific computing is the problem of finding zeros (or roots) of nonlinear functions. In classical numerical analysis, root-finding methods for nonlinear functions begin with an approximation and apply an iterative method (such as Newton's or Halley's method), which hopefully improves the approximation. It is a myth that no numerical algorithm can compute all zeros of a nonlinear equation with guaranteed error bounds, or, more strongly, that no method can give concrete information about the existence and uniqueness of solutions of such a problem.
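The tool that dispels this myth is the interval Newton step. The sketch below is a textbook-style illustration, not the book's PASCAL-XSC code: `df(a, b)` is assumed to return a rigorous enclosure of f' over the box with 0 outside it, and outward rounding (which a real verified implementation requires) is omitted. If the resulting interval N(X) lies strictly inside X, existence and uniqueness of a zero of f in X is proved; if N(X) and X are disjoint, X provably contains no zero.

```python
# One interval Newton step on X = [a, b] (hypothetical sketch; a real
# verified implementation must use directed/outward rounding).

def newton_step(f, df, a, b):
    """Return the intersection of N(X) = m - f(m)/F'(X) with X = [a, b].

    df(a, b) must return a rigorous enclosure [lo, hi] of f' over
    [a, b] that does not contain 0.
    """
    m = (a + b) / 2
    fm = f(m)
    dlo, dhi = df(a, b)
    assert dlo > 0 or dhi < 0, "derivative enclosure must exclude 0"
    # Interval division of the point fm by [dlo, dhi]: the quotient's
    # endpoints are among fm/dlo and fm/dhi.
    q = sorted((fm / dlo, fm / dhi))
    n_lo, n_hi = m - q[1], m - q[0]
    # Intersect N(X) with X; an empty intersection proves X has no zero.
    return max(a, n_lo), min(b, n_hi)
```

Iterating this step contracts the box quadratically toward the zero while every intermediate interval remains a guaranteed enclosure.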
Archive | 2001
András Erik Csallner; Rudi Klatte; Dietmar Ratz; Andreas Wiethoff
The global optimization problem with simple bounds, which is the scope of this work, can be defined in general as min_{x∈X} f(x), where X is a (possibly multidimensional) interval. The original problem can be solved with verified accuracy with the aid of interval subdivision methods. These algorithms are based on the well-known branch-and-bound principle. The methods pruning the search tree of these algorithms are the so-called accelerating devices. One of the most effective of these is the interval Newton step; however, its time complexity is relatively high compared with other accelerating devices. Therefore it should only be deployed if there are no other possibilities to effectively prune the search tree. Methods like the boxing method can decrease the number of applied Newton steps. The present paper discusses some of these methods and shows the numerical effects of their implementation.
Archive | 1997
Dietmar Ratz
Interval branch-and-bound methods for global optimization very often incorporate interval Newton Gauss-Seidel steps to reduce the widths of the boxes resulting from the basic branch-and-bound method. These steps try to determine the roots of the gradient of the objective function, whereas various other techniques eliminate the regions containing roots which do not correspond to global optimizers.
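A componentwise version of such a step can be sketched in a few lines. The code below is a textbook-style two-variable illustration, not Ratz's implementation: intervals are `(lo, hi)` tuples, `H` is assumed to be a matrix of interval enclosures of the Hessian over the box, `gm` the gradient at the box midpoint, outward rounding is omitted, and the extended interval division needed when a diagonal enclosure contains 0 is not sketched.

```python
# One interval Newton Gauss-Seidel sweep contracting a box toward the
# zeros of the gradient (hypothetical sketch; verified implementations
# need outward rounding and extended division).

def iadd(a, b):
    return a[0] + b[0], a[1] + b[1]

def imul(a, b):
    ps = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return min(ps), max(ps)

def idiv(a, b):  # requires 0 not in b
    ps = (a[0] / b[0], a[0] / b[1], a[1] / b[0], a[1] / b[1])
    return min(ps), max(ps)

def gauss_seidel_step(H, gm, X):
    """Contract box X componentwise toward stationary points."""
    n = len(X)
    m = [(lo + hi) / 2 for lo, hi in X]
    X = list(X)
    for i in range(n):
        num = (gm[i], gm[i])
        for j in range(n):
            if j != i:
                # off-diagonal term H_ij * (X_j - m_j), using the
                # already-contracted components (the Gauss-Seidel idea)
                num = iadd(num, imul(H[i][j], (X[j][0] - m[j], X[j][1] - m[j])))
        dlo, dhi = H[i][i]
        if dlo > 0 or dhi < 0:        # 0 not in H_ii: ordinary division
            qlo, qhi = idiv(num, H[i][i])
            lo, hi = m[i] - qhi, m[i] - qlo
            X[i] = (max(X[i][0], lo), min(X[i][1], hi))
    return X
```

Each component is updated with the freshest enclosures of the others, which is what lets a single sweep shrink the box substantially more than a Jacobi-style update.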
Archive | 1995
Ulrich W. Kulisch; Rolf Hammer; Matthias Hocks; Dietmar Ratz
We consider the complex polynomial p: C → C defined by
Archive | 1995
Ulrich W. Kulisch; Rolf Hammer; Matthias Hocks; Dietmar Ratz