Publication


Featured research published by Stephen G. Nash.


Archive | 2009

Linear and nonlinear optimization

Igor Griva; Stephen G. Nash; Ariela Sofer

Preface
Part I. Basics: 1. Optimization models; 2. Fundamentals of optimization; 3. Representation of linear constraints
Part II. Linear Programming: 4. Geometry of linear programming; 5. The simplex method; 6. Duality and sensitivity; 7. Enhancements of the simplex method; 8. Network problems; 9. Computational complexity of linear programming; 10. Interior-point methods of linear programming
Part III. Unconstrained Optimization: 11. Basics of unconstrained optimization; 12. Methods for unconstrained optimization; 13. Low-storage methods for unconstrained problems
Part IV. Nonlinear Optimization: 14. Optimality conditions for constrained problems; 15. Feasible-point methods; 16. Penalty and barrier methods
Part V. Appendices: Appendix A. Topics from linear algebra; Appendix B. Other fundamentals; Appendix C. Software
Bibliography; Index


SIAM Journal on Numerical Analysis | 1984

Newton-Type Minimization via the Lanczos Method

Stephen G. Nash

This paper discusses the use of the linear conjugate-gradient method (developed via the Lanczos method) in the solution of large-scale unconstrained minimization problems. At each iteration of a Newton-type method, the direction of search is defined as the solution of a quadratic subproblem. When the number of variables is very large, this subproblem may be solved using the linear conjugate-gradient method of Hestenes and Stiefel. We show how the equivalent Lanczos characterization of the linear conjugate-gradient method may be exploited to define a modified Newton method which can be applied to problems that do not necessarily have positive-definite Hessian matrices at all points of the region of interest. This derivation also makes it possible to compute a negative-curvature direction at a stationary point. The idea of a truncated Newton method is to perform only a limited number of iterations of the quadratic subproblem. This effectively gives a search direction that interpolates between the steepest-descent direction and the Newton direction.
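For illustration, here is a minimal Python sketch of the inner step, assuming only that Hessian-vector products are available. The truncation test and the steepest-descent fallback on negative curvature are simplifications of the paper's Lanczos-based treatment; the function name and tolerance are invented for the example.

```python
import numpy as np

def truncated_newton_direction(grad, hess_vec, max_cg_iters=50, tol=0.5):
    """Approximately solve H d = -g by linear CG, stopping early
    ("truncating") and bailing out when negative curvature appears.

    grad     : gradient vector g at the current point
    hess_vec : function v -> H @ v (Hessian-vector product)
    """
    d = np.zeros_like(grad)
    r = -grad.copy()              # residual of H d = -g at d = 0
    p = r.copy()
    rr = r @ r
    for _ in range(max_cg_iters):
        Hp = hess_vec(p)
        curvature = p @ Hp
        if curvature <= 0:
            # H is not positive definite along p; a simplified fallback:
            # keep the progress made so far, or use steepest descent.
            return d if d.any() else -grad
        alpha = rr / curvature
        d += alpha * p
        r -= alpha * Hp
        rr_new = r @ r
        if np.sqrt(rr_new) <= tol * np.linalg.norm(grad):
            break                 # truncation test: a loose solve suffices
        p = r + (rr_new / rr) * p
        rr = rr_new
    return d
```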


Journal of Computational and Applied Mathematics | 2000

A survey of truncated-Newton methods

Stephen G. Nash

Truncated-Newton methods are a family of methods for solving large optimization problems. Over the past two decades, a solid convergence theory has been derived for the methods. In addition, many algorithmic enhancements have been developed and studied, resulting in a number of publicly available software packages. The result has been a collection of powerful, flexible, and adaptable tools for large-scale nonlinear optimization.
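One of those publicly available packages survives in SciPy: method='TNC' is a bound-constrained truncated-Newton solver, reportedly a C descendant of Nash's TNBC Fortran code. A minimal usage sketch on a standard test function:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock function with a truncated-Newton method.
x0 = np.full(100, -1.0)
res = minimize(rosen, x0, jac=rosen_der, method='TNC',
               options={'maxfun': 10000})
print(res.success, res.nit, rosen(res.x))
```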


SIAM Journal on Optimization | 1991

A numerical study of the limited memory BFGS method and the truncated-Newton method for large scale optimization

Stephen G. Nash; Jorge Nocedal

This paper examines the numerical performance of two methods for large-scale optimization: a limited memory quasi-Newton method (L-BFGS), and a discrete truncated-Newton method (TN). Various ways of classifying test problems are discussed in order to better understand the types of problems that each algorithm solves well. The L-BFGS and TN methods are also compared with the Polak–Ribière conjugate gradient method.
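A rough modern analogue of such a comparison can be run with SciPy, which ships all three families: L-BFGS-B, TNC (truncated Newton), and a Polak–Ribière-type nonlinear CG. This sketch uses a single test function rather than the paper's problem sets; the problem size is arbitrary.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.full(100, -1.0)
for method in ('L-BFGS-B', 'TNC', 'CG'):
    res = minimize(rosen, x0, jac=rosen_der, method=method)
    print(f"{method:9s}  f={res.fun:.3e}  nfev={res.nfev}")
```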


International Journal of Control | 1989

Approaches to robust pole assignment

Ralph Byers; Stephen G. Nash

Robust pole assignment is a non-linear optimization problem in many variables. We describe numerical methods for determining robust or well-conditioned solutions to the problem of pole assignment by state feedback. The solutions are chosen to minimize various objective functions based on the condition number of the eigenvector matrix. Careful choice of parametrization and objective function avoids singularities and artificial variable constraints; explicit formulae for the gradient and Hessian permit rigorous stopping criteria and rapid local convergence. Several computational examples are included.
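For a sense of what robust pole assignment computes, SciPy's scipy.signal.place_poles implements robustness-oriented placement (Kautsky–Nichols–Van Dooren and Yang–Tits variants) rather than the paper's explicit condition-number minimization; the system matrices below are invented example data.

```python
import numpy as np
from scipy.signal import place_poles

# A small controllable system with two inputs (hypothetical example data).
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [-1.0, -2.0, -3.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
poles = np.array([-1.0, -2.0, -3.0])

result = place_poles(A, B, poles)   # robust (well-conditioned) placement
K = result.gain_matrix
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```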


SIAM Journal on Scientific and Statistical Computing | 1985

Preconditioning of Truncated-Newton Methods

Stephen G. Nash

In this paper we discuss the use of truncated-Newton methods, a flexible class of iterative methods, in the solution of large-scale unconstrained minimization problems. At each major iteration, the Newton equations are approximately solved by an inner iterative algorithm. The performance of the inner algorithm, and in addition the total method, can be greatly improved by the addition of preconditioning and scaling strategies. Preconditionings can be developed using either the outer nonlinear algorithm or using information computed during the inner iteration. Several preconditioning schemes are derived and tested. Numerical tests show that a carefully chosen truncated-Newton method can perform well in comparison with nonlinear conjugate-gradient-type algorithms. This is significant, since the two classes of methods have comparable storage and operation counts, and they are the only practical methods for solving many large-scale problems. In addition, with the Hessian matrix available, the truncated-Newton a...
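As a small illustration of the inner iteration, the sketch below runs a truncated, Jacobi-preconditioned CG solve with SciPy's sparse solver. The 1-D Laplacian standing in for the Hessian and the iteration cap are invented for the example; the paper's preconditioners are more sophisticated than a diagonal scaling.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# 1-D Laplacian as a stand-in for a Hessian (hypothetical test matrix).
n = 200
H = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format='csr')
g = np.ones(n)

# Jacobi (diagonal) preconditioner: apply diag(H)^{-1}.
d = H.diagonal()
M = LinearOperator((n, n), matvec=lambda v: v / d)

# Truncated solve of H p = -g: cap the inner iterations.
p, info = cg(H, -g, M=M, maxiter=25)
print("residual norm:", np.linalg.norm(H @ p + g))
```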


Optimization Methods & Software | 2000

A multigrid approach to discretized optimization problems

Stephen G. Nash

Many large optimization problems represent a family of models of varying size, corresponding to different discretizations. An example is optimal control, where the solution is a function that is approximated by its values at finitely many points. We discuss optimization techniques suitable for nonlinear programs of this type, with an emphasis on algorithms that guarantee global convergence. The goal is to exploit the similar structure among the subproblems, using the solutions of smaller subproblems to accelerate the solution of larger, more refined subproblems.
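The simplest instance of this idea is mesh sequencing: solve a coarse discretization, interpolate the result to the next finer grid as a warm start, and repeat. The sketch below applies it to a hypothetical 1-D model energy; the objective, grid sizes, and inner solver are all invented for illustration and omit the globalization machinery the paper discusses.

```python
import numpy as np
from scipy.optimize import minimize

def objective(u, n):
    """Discretized 1-D energy 0.5*||u'||^2 - <f, u> on n interior points
    (a hypothetical model problem; u = 0 at both boundaries)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    f = np.sin(np.pi * x)
    du = np.diff(np.concatenate(([0.0], u, [0.0])))
    return 0.5 * np.sum(du**2) / h - h * f @ u

u = np.zeros(15)                     # solve on the coarsest grid first
for n in (15, 31, 63, 127):
    if len(u) != n:                  # prolong coarse solution to the finer grid
        xc = np.linspace(0, 1, len(u) + 2)[1:-1]
        xf = np.linspace(0, 1, n + 2)[1:-1]
        u = np.interp(xf, xc, u)
    u = minimize(objective, u, args=(n,), method='L-BFGS-B').x
    print(n, objective(u, n))
```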


SIAM Journal on Scientific Computing | 2005

Model Problems for the Multigrid Optimization of Systems Governed by Differential Equations

Robert Michael Lewis; Stephen G. Nash

We discuss a multigrid approach to the optimization of systems governed by differential equations. Such optimization problems appear in many applications and are of a different nature than systems of equations. Our approach uses an optimization-based multigrid algorithm in which the multigrid algorithm relies explicitly on nonlinear optimization models as subproblems on coarser grids. Our goal is not to argue for a particular optimization-based multigrid algorithm, but instead to demonstrate how multigrid can be used to accelerate nonlinear programming algorithms. Furthermore, using several model problems we give evidence (both theoretical and numerical) that the optimization setting is well suited to multigrid algorithms. Some of the model problems show that the optimization problem may be more amenable to multigrid than the governing differential equation. In addition, we relate the multigrid approach to more traditional optimization methods as further justification for the use of an optimization-based multigrid algorithm.
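Multigrid schemes of this kind are built from grid-transfer operators. As a generic building block (not code from the paper), here is the standard 1-D linear-interpolation prolongation and its full-weighting restriction:

```python
import numpy as np

def prolongation(n_coarse):
    """Linear-interpolation operator from n_coarse interior points
    to the nested fine grid with 2*n_coarse + 1 interior points."""
    n_fine = 2 * n_coarse + 1
    P = np.zeros((n_fine, n_coarse))
    for j in range(n_coarse):
        P[2 * j, j] += 0.5       # fine point left of coarse point j
        P[2 * j + 1, j] = 1.0    # fine point coinciding with coarse point j
        P[2 * j + 2, j] += 0.5   # fine point right of coarse point j
    return P

P = prolongation(3)              # 3 coarse -> 7 fine interior points
R = 0.5 * P.T                    # full-weighting restriction (scaled transpose)
print(P)
```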


Informs Journal on Computing | 1993

A Barrier Method for Large-Scale Constrained Optimization

Stephen G. Nash; Ariela Sofer

A logarithmic barrier method is applied to the solution of a nonlinear programming problem with inequality constraints. An approximation to the Newton direction is derived that avoids the ill-conditioning normally associated with barrier methods. This approximation can be used within a truncated-Newton method, and hence is suitable for large-scale problems; the approximation can also be used in the context of a parallel algorithm. Enhancements to the basic barrier method are described that improve its efficiency and reliability. The resulting method can be shown to be a primal-dual method when the objective function is convex and all of the constraints are linear. Computational experiments are presented where the method is applied to 1000-variable problems with bound constraints.
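For orientation, the classical logarithmic-barrier loop that the paper starts from looks like the sketch below; the paper's contribution, a stabilized Newton-direction approximation inside a truncated-Newton solver, is not reproduced here. The function names, shrink factor, and derivative-free inner solver are choices made for the example.

```python
import numpy as np
from scipy.optimize import minimize

def barrier_solve(f, cons, x0, mu=1.0, shrink=0.1, n_outer=6):
    """Classical log-barrier loop for min f(x) s.t. c_i(x) >= 0.
    `cons` returns the constraint vector c(x); x0 must be strictly feasible."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_outer):
        def phi(x):
            c = cons(x)
            if np.any(c <= 0):
                return np.inf            # keep iterates strictly feasible
            return f(x) - mu * np.sum(np.log(c))
        x = minimize(phi, x, method='Nelder-Mead').x
        mu *= shrink                     # drive the barrier parameter to zero
    return x

# Example: minimize (x-2)^2 subject to 0 <= x <= 1.
f = lambda x: (x[0] - 2.0) ** 2
cons = lambda x: np.array([x[0], 1.0 - x[0]])
print(barrier_solve(f, cons, [0.5]))     # -> close to x = 1
```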


Archive | 1994

A Numerical Comparison of Barrier and Modified Barrier Methods For Large-Scale Bound-Constrained Optimization

Stephen G. Nash; Roman A. Polyak; Ariela Sofer

When a classical barrier method is applied to the solution of a nonlinear programming problem with inequality constraints, the Hessian matrix of the barrier function becomes increasingly ill-conditioned as the solution is approached. As a result, it may be desirable to consider alternative numerical algorithms. We compare the performance of two methods motivated by barrier functions. The first is a stabilized form of the classical barrier method, where a numerically stable approximation to the Newton direction is used when the barrier parameter is small. The second is a modified barrier method where a barrier function is applied to a shifted form of the problem, and the resulting barrier terms are scaled by estimates of the optimal Lagrange multipliers. The condition number of the Hessian matrix of the resulting modified barrier function remains bounded as the solution to the constrained optimization problem is approached. Both of these techniques can be used in the context of a truncated-Newton method, and hence can be applied to large problems, as well as on parallel computers. In this paper, both techniques are applied to problems with bound constraints and we compare their practical behavior.
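The two families differ mainly in the barrier term and in whether the barrier parameter must vanish. Here is a minimal sketch of the modified-barrier (Polyak-style) loop, with shifted, multiplier-scaled log terms and the standard multiplier update; the test problem and the derivative-free inner solver are invented for the example, whereas the paper uses truncated-Newton inner solves.

```python
import numpy as np
from scipy.optimize import minimize

def modified_barrier_solve(f, cons, x0, mu=1.0, n_outer=8):
    """Modified-barrier loop for min f(x) s.t. c_i(x) >= 0.
    The constraints are shifted by mu and scaled by multiplier estimates;
    unlike the classical barrier, mu need not be driven to zero."""
    x = np.asarray(x0, dtype=float)
    lam = np.ones(len(cons(x)))
    for _ in range(n_outer):
        def F(x):
            c = cons(x)
            if np.any(1.0 + c / mu <= 0):
                return np.inf            # stay inside the shifted region
            return f(x) - mu * np.sum(lam * np.log(1.0 + c / mu))
        x = minimize(F, x, method='Nelder-Mead').x
        lam = lam / (1.0 + cons(x) / mu) # multiplier update; mu stays fixed
    return x, lam

f = lambda x: (x[0] - 2.0) ** 2          # bound-constrained example: 0 <= x <= 1
cons = lambda x: np.array([x[0], 1.0 - x[0]])
x, lam = modified_barrier_solve(f, cons, [0.5])
print(x, lam)                            # x -> 1; lam[1] -> roughly 2 (active bound)
```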

Collaboration

Frequent co-authors of Stephen G. Nash and their affiliations:

Paul T. Boggs (National Institute of Standards and Technology)
Giovanni Fasano (Ca' Foscari University of Venice)
Massimo Roma (Sapienza University of Rome)
Ralph Byers (Northern Illinois University)
Richard H. F. Jackson (National Institute of Standards and Technology)
Robert H. Nilson (Sandia National Laboratories)
Stewart K. Griffiths (Sandia National Laboratories)