Publications


Featured research published by Peter N. Brown.


ACM Transactions on Mathematical Software | 2005

SUNDIALS: Suite of nonlinear and differential/algebraic equation solvers

Alan C. Hindmarsh; Peter N. Brown; Keith E. Grant; Steven L. Lee; Radu Serban; D.E. Shumaker; Carol S. Woodward

SUNDIALS is a suite of advanced computational codes for solving large-scale problems that can be modeled as a system of nonlinear algebraic equations, or as initial-value problems in ordinary differential or differential-algebraic equations. The basic versions of these codes are called KINSOL, CVODE, and IDA, respectively. The codes are written in ANSI standard C and are suitable for either serial or parallel machine environments. Common and notable features of these codes include inexact Newton-Krylov methods for solving large-scale nonlinear systems; linear multistep methods for time-dependent problems; a highly modular structure to allow incorporation of different preconditioning and/or linear solver methods; and clear interfaces allowing for users to provide their own data structures underneath the solvers. We describe the current capabilities of the codes, along with some of the algorithms and heuristics used to achieve efficiency and robustness. We also describe how the codes stem from previous and widely used Fortran 77 solvers, and how the codes have been augmented with forward and adjoint methods for carrying out first-order sensitivity analysis with respect to model parameters or initial conditions.
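
As a rough illustration of the class of problems these solvers target, the sketch below takes a single backward-Euler step of a stiff ODE y' = f(t, y) and solves the resulting implicit system with a plain Newton iteration. This is not the SUNDIALS API; the test problem, step size, and tolerance are invented for illustration.

    # Minimal sketch (not the SUNDIALS API): one backward-Euler step of y' = f(t, y),
    # with the implicit equation solved by a Newton iteration using a
    # finite-difference Jacobian.  Problem, step size, and tolerance are invented.
    import numpy as np

    def f(t, y):
        # A made-up mildly stiff linear test problem y' = A y.
        A = np.array([[-1000.0, 1.0],
                      [    0.0, -0.5]])
        return A @ y

    def backward_euler_step(f, t, y, h, tol=1e-10, max_newton=20):
        """Solve g(z) = z - y - h*f(t+h, z) = 0 for the new state z."""
        z = y.copy()                        # predictor: start from the old state
        n = len(y)
        for _ in range(max_newton):
            g = z - y - h * f(t + h, z)
            # Dense finite-difference Jacobian of g; large problems would use
            # the Krylov / matrix-free alternatives described in the abstract.
            J = np.empty((n, n))
            eps = 1e-8
            for j in range(n):
                dz = np.zeros(n); dz[j] = eps
                J[:, j] = ((z + dz) - y - h * f(t + h, z + dz) - g) / eps
            step = np.linalg.solve(J, -g)
            z = z + step
            if np.linalg.norm(step) < tol:
                break
        return z

    print(backward_euler_step(f, t=0.0, y=np.array([1.0, 1.0]), h=0.01))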


SIAM Journal on Scientific and Statistical Computing | 1989

VODE: a variable-coefficient ODE solver

Peter N. Brown; George D. Byrne; Alan C. Hindmarsh

VODE is a new initial value ODE solver for stiff and nonstiff systems. It uses variable-coefficient Adams-Moulton and Backward Differentiation Formula (BDF) methods in Nordsieck form, as taken from the older solvers EPISODE and EPISODEB, treating the Jacobian as full or banded. Unlike the older codes, VODE has a highly flexible user interface that is nearly identical to that of the ODEPACK solver LSODE. In the process, several algorithmic improvements have been made in VODE, aside from the new user interface. First, a change in stepsize and/or order that is decided upon at the end of one successful step is not implemented until the start of the next step, so that interpolations performed between steps use the more correct data. Second, a new algorithm for setting the initial stepsize has been included, which iterates briefly to estimate the required second derivative vector. Efficiency is often greatly enhanced by an added algorithm for saving and reusing the Jacobian matrix J, as it occurs in the Newton m...
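
The initial-stepsize idea mentioned above can be illustrated with a generic heuristic of the same flavor (this is not VODE's exact algorithm): take a trial Euler step to estimate the second derivative, then choose h so that a rough local error estimate of 0.5*h^2*||y''|| stays within the tolerance. The test problem and tolerance below are invented.

    # Generic initial-stepsize heuristic (a sketch, not VODE's exact algorithm).
    import numpy as np

    def estimate_initial_step(f, t0, y0, tol):
        f0 = f(t0, y0)
        # Small trial step, scaled by the sizes of y0 and f0.
        h_trial = 1e-4 * max(1.0, np.linalg.norm(y0)) / max(np.linalg.norm(f0), 1e-12)
        y1 = y0 + h_trial * f0                       # trial explicit Euler step
        ydd = (f(t0 + h_trial, y1) - f0) / h_trial   # finite-difference estimate of y''
        norm_ydd = max(np.linalg.norm(ydd), 1e-12)
        return np.sqrt(2.0 * tol / norm_ydd)         # enforce 0.5*h^2*||y''|| <= tol

    f = lambda t, y: np.array([-50.0 * y[0]])        # made-up test problem
    print(estimate_initial_step(f, 0.0, np.array([1.0]), tol=1e-6))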


SIAM Journal on Scientific and Statistical Computing | 1990

Hybrid Krylov methods for nonlinear systems of equations

Peter N. Brown; Youcef Saad

Several implementations of Newton-like iteration schemes based on Krylov subspace projection methods for solving nonlinear equations are considered. The simplest such class of methods is Newton's algorithm in which a (linear) Krylov method is used to solve the Jacobian system approximately. A method in this class is referred to as a Newton–Krylov algorithm. To improve the global convergence properties of these basic algorithms, hybrid methods based on Powell's dogleg strategy are proposed, as well as linesearch backtracking procedures. The main advantage of the class of methods considered in this paper is that the Jacobian matrix is never needed explicitly.
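
A minimal matrix-free Newton-GMRES iteration with backtracking linesearch, in the spirit of the methods described above, is sketched below. It is not the paper's exact dogleg or linesearch algorithm; the sufficient-decrease constant, finite-difference increment, and toy test system are all invented.

    # Sketch of a matrix-free Newton-GMRES iteration with backtracking linesearch.
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def newton_krylov(F, x, tol=1e-8, max_iter=50):
        for _ in range(max_iter):
            Fx = F(x)
            if np.linalg.norm(Fx) < tol:
                break
            eps = 1e-7 * max(1.0, np.linalg.norm(x))
            # Jacobian-vector products via finite differences,
            # J v ~ (F(x + eps*v) - F(x)) / eps, so the Jacobian matrix itself
            # is never formed or stored.
            Jv = LinearOperator((x.size, x.size),
                                matvec=lambda v: (F(x + eps * v) - Fx) / eps)
            s, _ = gmres(Jv, -Fx)                     # approximate Newton direction
            lam = 1.0
            # Backtrack until the residual norm shows sufficient decrease.
            while np.linalg.norm(F(x + lam * s)) > (1.0 - 1e-4 * lam) * np.linalg.norm(Fx):
                lam *= 0.5
                if lam < 1e-8:
                    break
            x = x + lam * s
        return x

    F = lambda x: np.array([x[0]**2 + x[1] - 3.0,     # toy nonlinear system,
                            x[0] + x[1]**2 - 5.0])    # solution near (1, 2)
    print(newton_krylov(F, np.array([1.0, 1.0])))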


SIAM Journal on Scientific Computing | 1994

Using Krylov methods in the solution of large-scale differential-algebraic systems

Peter N. Brown; Alan C. Hindmarsh; Linda R. Petzold

In this paper, a new algorithm for the solution of large-scale systems of differential-algebraic equations is described. It is based on the integration methods in the solver DASSL, but instead of a direct method for the associated linear systems which arise at each time step, we apply the preconditioned GMRES iteration in combination with an Inexact Newton Method. The algorithm, along with those in DASSL, is implemented in a new solver called DASPK. We outline the algorithms and strategies used, and discuss the use of the solver. We develop and analyze some preconditioners for a certain class of DAE systems, and finally demonstrate the application of DASPK on two example problems.
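
To make the setting concrete: a BDF discretization turns the DAE residual F(t, y, y') = 0 into a nonlinear algebraic system in the new state at every time step, which is what the Newton/GMRES machinery must solve. The sketch below takes a single backward-Euler (BDF1) step of a small invented index-1 DAE and hands the resulting system to a generic nonlinear solver; it is not DASPK's algorithm.

    # One backward-Euler step of a DAE in residual form F(t, y, y') = 0 (a sketch).
    import numpy as np
    from scipy.optimize import fsolve

    def residual(t, y, ydot):
        # Invented semi-explicit index-1 DAE:
        #   y0' = y1 - y0       (differential equation)
        #   0   = y0 + y1 - 1   (algebraic constraint)
        return np.array([ydot[0] - (y[1] - y[0]),
                         y[0] + y[1] - 1.0])

    def bdf1_step(residual, t, y, h):
        # Solve F(t + h, z, (z - y)/h) = 0 for the new state z.
        g = lambda z: residual(t + h, z, (z - y) / h)
        return fsolve(g, y)

    y0 = np.array([1.0, 0.0])     # consistent initial condition: y0 + y1 = 1
    print(bdf1_step(residual, t=0.0, y=y0, h=0.1))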


SIAM Journal on Optimization | 1994

Convergence Theory of Nonlinear Newton–Krylov Algorithms

Peter N. Brown; Youcef Saad

This paper presents some convergence theory for nonlinear Krylov subspace methods. The basic idea of these methods, which have been described by the authors in an earlier paper, is to use variants of Newton’s iteration in conjunction with a Krylov subspace method for solving the Jacobian linear systems. These methods are variants of inexact Newton methods where the approximate Newton direction is taken from a subspace of small dimension. The main focus of this paper is to analyze these methods when they are combined with global strategies such as linesearch techniques and model trust region algorithms. Most of the convergence results are formulated for projection onto general subspaces rather than just Krylov subspaces.
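
For reference, the inexact Newton framework these results build on requires the step s_k at the iterate x_k to satisfy a residual bound of the form

    || F(x_k) + F'(x_k) s_k || <= eta_k || F(x_k) ||,     0 <= eta_k < 1,

where the forcing term eta_k controls how accurately the Jacobian system is solved; in the nonlinear Krylov methods above, s_k is additionally restricted to a low-dimensional (Krylov or other projection) subspace before a linesearch or trust region strategy decides how far to move along it.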


SIAM Journal on Numerical Analysis | 1986

Matrix-free methods for stiff systems of ODE's

Peter N. Brown; Alan C. Hindmarsh

We study here a matrix-free method for solving stiff systems of ordinary differential equations (ODE’s). In the numerical time integration of stiff ODE initial value problems by BDF methods, the resulting nonlinear algebraic system is usually solved by a modified Newton method and an appropriate linear system algorithm. In place of that, we substitute Newton’s method (unmodified) coupled with an iterative linear system method. The latter is a projection method called the Incomplete Orthogonalization Method (IOM), developed mainly by Y. Saad. A form of IOM, with scaling included to enhance robustness, is studied in the setting of Inexact Newton Methods. The implementation requires no Jacobian matrix storage whatever. Tests on several stiff problems, of sizes up to 16,000, show the method to be quite effective and much more economical, in both computational cost and storage, than standard solution methods, at least when the problem has a certain amount of clustering in its spectrum.
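
The ingredient that makes such methods "matrix-free" is the directional finite-difference approximation of a Jacobian-vector product, so the Jacobian is never formed or stored. A minimal sketch, with an invented test function for checking against the analytic Jacobian:

    # Matrix-free Jacobian-vector product via a directional finite difference.
    import numpy as np

    def jac_vec(f, y, v, sigma=1e-7):
        """Approximate J(y) v ~ (f(y + sigma*v) - f(y)) / sigma."""
        return (f(y + sigma * v) - f(y)) / sigma

    f = lambda y: np.array([y[0] * y[1], np.sin(y[0]) + y[1] ** 2])
    y = np.array([1.0, 2.0])
    v = np.array([0.3, -0.5])
    J = np.array([[y[1], y[0]],                  # analytic Jacobian, for comparison
                  [np.cos(y[0]), 2.0 * y[1]]])
    print(jac_vec(f, y, v))   # finite-difference approximation
    print(J @ v)              # exact product; agrees to several digits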


SIAM Journal on Matrix Analysis and Applications | 1997

GMRES On (Nearly) Singular Systems

Peter N. Brown; Homer F. Walker

We consider the behavior of the GMRES method for solving a linear system Ax = b when A is singular or nearly so.


Applied Mathematics and Computation | 1989

Reduced storage matrix methods in stiff ODE systems

Peter N. Brown; Alan C. Hindmarsh


SIAM Journal on Scientific and Statistical Computing | 1991

A theoretical comparison of the Arnoldi and GMRES algorithms

Peter N. Brown


SIAM Journal on Scientific Computing | 1998

Consistent Initial Condition Calculation for Differential-Algebraic Systems

Peter N. Brown; Alan C. Hindmarsh; Linda R. Petzold


Collaboration


Dive into Peter N. Brown's collaborations.

Top Co-Authors

Carol S. Woodward, Lawrence Livermore National Laboratory
Alan C. Hindmarsh, Lawrence Livermore National Laboratory
Barna L. Bihari, Lawrence Livermore National Laboratory
Frank Graziani, Lawrence Livermore National Laboratory
Jim E. Jones, Lawrence Livermore National Laboratory
Robert D. Falgout, Lawrence Livermore National Laboratory
Barry Lee, Lawrence Livermore National Laboratory