Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where John E. Dennis is active.

Publication


Featured research published by John E. Dennis.


SIAM Journal on Optimization | 2006

Mesh Adaptive Direct Search Algorithms for Constrained Optimization

Charles Audet; John E. Dennis

This paper addresses the problem of minimization of a nonsmooth function under general nonsmooth constraints when no derivatives of the objective or constraint functions are available. We introduce the mesh adaptive direct search (MADS) class of algorithms which extends the generalized pattern search (GPS) class by allowing local exploration, called polling, in an asymptotically dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity for infeasible points and treating the problem as unconstrained. The main GPS convergence result is to identify limit points \hat{x}, where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case, nonnegative combinations of these directions span the whole space, the fact that there can only be finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many linear constraints for GPS. The main result of this paper is that the general MADS framework is flexible enough to allow the generation of an asymptotically dense set of refining directions along which the Clarke derivatives are nonnegative. We propose an instance of MADS for which the refining directions are dense in the hypertangent cone at \hat{x} with probability 1 whenever the iterates associated with the refining directions converge to a single \hat{x}. The instance of MADS is compared to versions of GPS on some test problems. We also illustrate the limitation of our results with examples.
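The poll step and the extreme barrier approach described above can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' algorithm or any published implementation: the names f, is_feasible, x (a list of floats), mesh_size and n_directions are hypothetical, the random unit directions only hint at how MADS obtains an asymptotically dense set of poll directions, and the search step and the mesh/frame machinery are omitted.

    import math
    import random

    def extreme_barrier(f, is_feasible, x):
        # Extreme barrier approach: infeasible points get the value +infinity,
        # so the constrained problem is treated as if it were unconstrained.
        return f(x) if is_feasible(x) else math.inf

    def poll_step(f, is_feasible, x, mesh_size, n_directions=8):
        # One illustrative poll: try trial points along random unit directions
        # scaled by the current mesh size and accept the first improvement.
        n = len(x)
        fx = extreme_barrier(f, is_feasible, x)
        for _ in range(n_directions):
            d = [random.gauss(0.0, 1.0) for _ in range(n)]
            norm = math.sqrt(sum(di * di for di in d)) or 1.0
            trial = [xi + mesh_size * di / norm for xi, di in zip(x, d)]
            if extreme_barrier(f, is_feasible, trial) < fx:
                return trial, True   # successful poll: move to the better point
        return x, False              # failed poll: the caller refines the mesh

A MADS-type method would refine the mesh after a failed poll and poll again; the specific direction-generation schemes analyzed in these papers (such as the orthogonal directions of OrthoMADS listed below) replace the naive random draws used here.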


SIAM Journal on Optimization | 2002

Analysis of Generalized Pattern Searches

Charles Audet; John E. Dennis

This paper contains a new convergence analysis for the Lewis and Torczon generalized pattern search (GPS) class of methods for unconstrained and linearly constrained optimization. This analysis is motivated by a desire to understand the successful behavior of the algorithm under hypotheses that are satisfied by many practical problems. Specifically, even if the objective function is discontinuous or extended-valued, the methods find a limit point with some minimizing properties. Simple examples show that the strength of the optimality conditions at a limit point depends not only on the algorithm, but also on the directions it uses and on the smoothness of the objective at the limit point in question. The contribution of this paper is to provide a simple convergence analysis that supplies detail about the relation of optimality conditions to objective smoothness properties and to the defining directions for the algorithm, and it gives previous results as corollaries.
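Stated loosely, and in illustrative notation rather than the paper's own, the optimality conditions referred to here take the following form at a limit point \hat{x} produced by the algorithm:

    f^{\circ}(\hat{x}; d) \ge 0 \quad \text{for every refining direction } d,
    \qquad \nabla f(\hat{x}) = 0 \quad \text{if } f \text{ is strictly differentiable at } \hat{x},

where f^{\circ}(\hat{x}; d) denotes the Clarke generalized directional derivative. The strength of the conclusion thus depends both on the smoothness of the objective at \hat{x} and on the set of directions the method uses, which is the trade-off the analysis makes precise.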


SIAM Journal on Optimization | 1994

Problem Formulation for Multidisciplinary Optimization

Evin J. Cramer; John E. Dennis; Paul D. Frank; Robert Michael Lewis; Gregory R. Shubin

This paper is about multidisciplinary (design) optimization, or MDO, the coupling of two or more analysis disciplines with numerical optimization. The paper has three goals. First, it is an expository introduction to MDO aimed at those who do research on optimization algorithms, since the optimization community has much to contribute to this important class of computational engineering problems. Second, this paper presents to the MDO research community a new abstraction for multidisciplinary analysis and design problems as well as new decomposition formulations for these problems. Third, the “individual discipline feasible” (IDF) approaches introduced here make use of existing specialized analysis codes, and they introduce significant opportunities for coarse-grained computational parallelism particularly well suited to heterogeneous computing environments. The key distinguishing characteristic of the three fundamental approaches to MDO formulation discussed here is the kind of disciplinary feasibility that...
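For a two-discipline problem, the IDF idea can be written schematically as below. The symbols are illustrative notation for this sketch (design variables x, coupling targets t_i, discipline outputs u_i, objective f, design constraints g), not the paper's exact formulation:

    \begin{aligned}
    \min_{x,\,t_1,\,t_2} \quad & f\bigl(x,\, u_1(x, t_2),\, u_2(x, t_1)\bigr) \\
    \text{subject to} \quad & t_1 = u_1(x, t_2), \qquad t_2 = u_2(x, t_1), \\
    & g\bigl(x,\, u_1(x, t_2),\, u_2(x, t_1)\bigr) \le 0,
    \end{aligned}

where u_i(x, t_j) is the output of discipline i's analysis code run with the design variables and coupling targets supplied by the optimizer. Each discipline is individually feasible at every iterate, interdisciplinary consistency is imposed only through the equality constraints, and the disciplinary solves are independent of one another, which is the source of the coarse-grained parallelism mentioned above.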


SIAM Journal on Optimization | 1991

Direct Search Methods on Parallel Machines

John E. Dennis; Virginia Torczon

Direct search methods are methods designed to solve unconstrained minimization problems of the form \min_{x \in \mathbb{R}^n} f(x) ...
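A minimal sketch of why such methods suit parallel machines, assuming a hypothetical expensive objective f: the trial points of a poll step are independent of one another, so they can be evaluated concurrently. This is only an illustration using a simple coordinate (compass) poll set, not the particular parallel scheme studied in the paper.

    from concurrent.futures import ProcessPoolExecutor

    def poll_points(x, step):
        # Compass-search poll set: +/- step along each coordinate direction.
        pts = []
        for i in range(len(x)):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                pts.append(y)
        return pts

    def parallel_poll(f, x, step, workers=4):
        # Evaluate all poll points concurrently and return the best one.
        # f must be a module-level (picklable) function, and on platforms that
        # spawn processes this should run under an if __name__ == "__main__" guard.
        pts = poll_points(x, step)
        with ProcessPoolExecutor(max_workers=workers) as pool:
            values = list(pool.map(f, pts))
        best_val, best_pt = min(zip(values, pts))
        return best_pt, best_val

A complete method would accept the best trial point when it improves on the current iterate and contract the step otherwise; that outer loop is omitted here.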


SIAM Journal on Optimization | 2000

Pattern Search Algorithms for Mixed Variable Programming

Charles Audet; John E. Dennis



SIAM Journal on Optimization | 2009

A Progressive Barrier for Derivative-Free Nonlinear Programming

Charles Audet; John E. Dennis



SIAM Journal on Optimization | 2009

OrthoMADS: A Deterministic MADS Instance with Orthogonal Directions

Mark A. Abramson; Charles Audet; John E. Dennis; Sébastien Le Digabel



SIAM Journal on Optimization | 1997

A Global Convergence Theory for General Trust-Region-Based Algorithms for Equality Constrained Optimization

John E. Dennis; Mahmoud El-Alem; María C. Maciel



SIAM Journal on Optimization | 1999

A Trust-Region Approach to Nonlinear Systems of Equalities and Inequalities

John E. Dennis; Mahmoud El-Alem; Karen Williamson



Mathematical Programming | 2004

Generalized pattern searches with derivative information

Mark A. Abramson; Charles Audet; John E. Dennis


Collaboration


Dive into John E. Dennis's collaborations.

Top Co-Authors

Charles Audet
École Polytechnique de Montréal

Mark A. Abramson
Air Force Institute of Technology

Sébastien Le Digabel
École Polytechnique de Montréal