Publication


Featured research published by Michael Pernice.


SIAM Journal on Scientific Computing | 1998

NITSOL: A Newton Iterative Solver for Nonlinear Systems

Michael Pernice; Homer F. Walker

We introduce a well-developed Newton iterative (truncated Newton) algorithm for solving large-scale nonlinear systems. The framework is an inexact Newton method globalized by backtracking. Trial steps are obtained using one of several Krylov subspace methods. The algorithm is implemented in a Fortran solver called NITSOL that is robust yet easy to use and provides a number of useful options and features. The structure offers the user great flexibility in addressing problem specificity through preconditioning and other means and allows easy adaptation to parallel environments. Features and capabilities are illustrated in numerical experiments.
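The inexact Newton/backtracking structure described above can be sketched in a few lines. The following Python fragment is an illustration only, not NITSOL itself: it forms a dense finite-difference Jacobian and solves the Newton equation directly, whereas NITSOL obtains an inexact step with a Krylov method (GMRES, BiCGSTAB, or TFQMR) using only Jacobian-vector products and user-supplied preconditioning. The function names, tolerances, and the small test system are invented for the example.

```python
import numpy as np

def newton_backtracking(F, x0, tol=1e-8, max_iter=50, t=1e-4):
    """Newton iteration with a simple backtracking (sufficient-decrease) globalization.

    The step here uses a dense finite-difference Jacobian and a direct solve;
    NITSOL instead computes an inexact step with a Krylov method using only
    Jacobian-vector products.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        norm_Fx = np.linalg.norm(Fx)
        if norm_Fx < tol:
            return x
        # Finite-difference Jacobian (illustration only).
        n = x.size
        J = np.empty((n, n))
        h = np.sqrt(np.finfo(float).eps)
        for j in range(n):
            e = np.zeros(n)
            e[j] = h * max(1.0, abs(x[j]))
            J[:, j] = (F(x + e) - Fx) / e[j]
        step = np.linalg.solve(J, -Fx)
        # Backtracking: halve the step until the residual norm decreases enough.
        lam = 1.0
        while np.linalg.norm(F(x + lam * step)) > (1.0 - t * lam) * norm_Fx:
            lam *= 0.5
            if lam < 1e-10:
                break
        x = x + lam * step
    return x

# Toy usage: a small nonlinear system.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, np.exp(x[0]) + x[1] - 1.0])
print(newton_backtracking(F, np.array([1.0, 1.0])))
```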


IEEE International Conference on High Performance Computing, Data and Analytics | 2013

Multiphysics simulations: Challenges and opportunities

David E. Keyes; Lois Curfman McInnes; Carol S. Woodward; William Gropp; Eric Myra; Michael Pernice; John B. Bell; Jed Brown; Alain Clo; Jeffrey M. Connors; Emil M. Constantinescu; Donald Estep; Kate Evans; Charbel Farhat; Ammar Hakim; Glenn E. Hammond; Glen A. Hansen; Judith C. Hill; Tobin Isaac; Kirk E. Jordan; Dinesh K. Kaushik; Efthimios Kaxiras; Alice Koniges; Kihwan Lee; Aaron Lott; Qiming Lu; John Harold Magerlein; Reed M. Maxwell; Michael McCourt; Miriam Mehl

We consider multiphysics applications from algorithmic and architectural perspectives, where “algorithmic” includes both mathematical analysis and computational complexity, and “architectural” includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. We examine several of these, expose some commonalities among them, and attempt to extrapolate best practices to future systems. From our study, we summarize challenges and forecast opportunities.
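As a schematic illustration of the "common algebraic coupling paradigm" mentioned above, a generic two-field problem can be collected into a single nonlinear algebraic system; the notation below is invented for illustration and is not taken from the paper.

```latex
% Two coupled fields u_1, u_2 written as one algebraic system with its block Jacobian
F(u) =
\begin{pmatrix} F_1(u_1, u_2) \\ F_2(u_1, u_2) \end{pmatrix} = 0,
\qquad
J(u) =
\begin{pmatrix}
  \partial F_1/\partial u_1 & \partial F_1/\partial u_2 \\
  \partial F_2/\partial u_1 & \partial F_2/\partial u_2
\end{pmatrix}.
```

Loosely coupled (operator-split) schemes work with the diagonal blocks one field at a time, while fully coupled schemes apply Newton-type iterations to the whole block system.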


SIAM Journal on Scientific Computing | 2005

Solution of Equilibrium Radiation Diffusion Problems Using Implicit Adaptive Mesh Refinement

Michael Pernice; Bobby Philip

Diffusion approximations to radiation transport feature a nonlinear conduction coefficient that leads to formation of a sharp front, or Marshak wave, under suitable initial and boundary conditions. The front can vary several orders of magnitude over a very short distance. Resolving the shape of the Marshak wave is essential, but using a global fine mesh can be prohibitively expensive. In such circumstances it is natural to consider using adaptive mesh refinement (AMR) to place a fine mesh only in the vicinity of the propagating front. In addition, to avoid any loss of accuracy due to linearization, implicit time integration should be used to solve the equilibrium radiation diffusion equation. Implicit time integration on AMR grids introduces a new challenge, as algorithmic complexity must be controlled to fully realize the performance benefits of AMR. A Newton-Krylov method together with a multigrid preconditioner addresses this latter issue on a uniform grid. A straightforward generalization is to use a multilevel preconditioner that is tuned to the structure of the AMR grid, such as the fast adaptive composite grid (FAC) method. We describe the resulting Newton-Krylov-FAC method and demonstrate its performance on simple equilibrium radiation diffusion problems.
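A toy, uniform-grid analogue of this setup can be sketched as follows. The nonlinear coefficient D(u) = u^3, the boundary values, the grid, and the step sizes are all invented for the illustration, and the dense direct Newton solve stands in for the paper's FAC-preconditioned Newton-Krylov method on AMR grids.

```python
import numpy as np

# Toy 1D Marshak-wave-like problem: u_t = (D(u) u_x)_x with D(u) = u^3,
# backward Euler in time, Newton iteration per step (uniform grid only).
N, dt, nsteps = 100, 2e-4, 100
dx = 1.0 / (N - 1)
u = np.full(N, 1e-3)          # cold background
u[0] = 1.0                    # heated left boundary drives a sharp front

def residual(u_new, u_old):
    D = np.maximum(u_new, 0.0) ** 3        # guard against negative iterates
    Df = 0.5 * (D[:-1] + D[1:])            # face-centered coefficients
    flux = Df * (u_new[1:] - u_new[:-1]) / dx
    r = np.zeros_like(u_new)
    r[1:-1] = (u_new[1:-1] - u_old[1:-1]) / dt - (flux[1:] - flux[:-1]) / dx
    r[0] = u_new[0] - 1.0                  # Dirichlet boundary conditions
    r[-1] = u_new[-1] - 1e-3
    return r

for _ in range(nsteps):
    u_old = u.copy()
    for _ in range(30):                    # Newton with a finite-difference Jacobian
        r = residual(u, u_old)
        if np.linalg.norm(r) < 1e-6:
            break
        J = np.empty((N, N))
        h = 1e-7
        for j in range(N):
            up = u.copy()
            up[j] += h
            J[:, j] = (residual(up, u_old) - r) / h
        u = u + np.linalg.solve(J, -r)

print("approximate front position:", np.argmax(u < 0.5) * dx)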


Journal of Computational and Applied Mathematics | 2009

A posteriori error analysis of a cell-centered finite volume method for semilinear elliptic problems

Donald Estep; Michael Pernice; Du Pham; Simon Tavener; Haiying Wang

In this paper, we conduct a goal-oriented a posteriori analysis for the error in a quantity of interest computed from a cell-centered finite volume scheme for a semilinear elliptic problem. The a posteriori error analysis is based on variational analysis, residual errors and the adjoint problem. To carry out the analysis, we use an equivalence between the cell-centered finite volume scheme and a mixed finite element method with special choice of quadrature.
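Schematically, the adjoint-weighted residual representation underlying this kind of goal-oriented analysis looks as follows. This is a generic statement, not the paper's precise result, which additionally exploits the equivalence with a mixed finite element method and a special quadrature choice.

```latex
% Semilinear problem: -\nabla\cdot(a\nabla u) + f(u) = g, computed approximation U,
% quantity of interest Q(u) = (\psi, u).
% Adjoint problem, with f linearized along the segment between U and u:
-\nabla\cdot(a\nabla\varphi) + \bar{f}'\,\varphi = \psi,
\qquad
\bar{f}' = \int_0^1 f'\bigl(sU + (1-s)u\bigr)\,ds .
% Error representation: the weak residual of U weighted by the adjoint solution:
Q(u) - Q(U) = (g,\varphi) - (a\nabla U,\nabla\varphi) - (f(U),\varphi).
```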


Journal of Computational Physics | 2014

Dynamic implicit 3D adaptive mesh refinement for non-equilibrium radiation diffusion

Bobby Philip; Zhen Wang; M. Berrill; Manuel Birke; Michael Pernice

The time dependent non-equilibrium radiation diffusion equations are important for solving the transport of energy through radiation in optically thick regimes and find applications in several fields including astrophysics and inertial confinement fusion. The associated initial boundary value problems that are encountered often exhibit a wide range of scales in space and time and are extremely challenging to solve. To efficiently and accurately simulate these systems we describe our research on combining techniques that will also find use more broadly for long term time integration of nonlinear multi-physics systems: implicit time integration for efficient long term time integration of stiff multi-physics systems, local control theory based step size control to minimize the required global number of time steps while controlling accuracy, dynamic 3D adaptive mesh refinement (AMR) to minimize memory and computational costs, Jacobian Free Newton-Krylov methods on AMR grids for efficient nonlinear solution, and optimal multilevel preconditioner components that provide level independent solver convergence.
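The step-size-control ingredient can be illustrated with a minimal, generic error-controlled update. The paper's controller is based on local control theory (a smoothed, feedback-style refinement of this idea), so the sketch below shows only the standard elementary mechanism, and all names and values are invented.

```python
def next_step_size(dt, err_est, tol, order, safety=0.9, grow=2.0, shrink=0.1):
    """Minimal error-based time step controller (elementary controller).

    dt_new = safety * dt * (tol / err_est) ** (1 / (order + 1)), with the change
    factor clipped to [shrink, grow] to avoid abrupt step-size swings.
    """
    if err_est == 0.0:
        return dt * grow
    factor = safety * (tol / err_est) ** (1.0 / (order + 1))
    return dt * min(grow, max(shrink, factor))

# A step whose local error estimate exceeds the tolerance gets a smaller dt;
# a very accurate step gets a larger one (capped by the growth limit).
print(next_step_size(1e-2, 3e-5, 1e-5, order=2))   # shrinks
print(next_step_size(1e-2, 1e-7, 1e-5, order=2))   # grows (clipped to 2x)
```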


Computational Science and Engineering | 2014

NKS method for the implicit solution of a coupled Allen-Cahn/Cahn-Hilliard system

Chao Yang; Xiao-Chuan Cai; David E. Keyes; Michael Pernice

The coupled Allen-Cahn/Cahn-Hilliard system consists of high-order partial differential equations that make explicit methods hard to apply due to the severe restriction on the time step size. In order to relax the restriction and obtain steady-state solution(s) in an efficient way, we use a fully implicit method for the coupled system and employ a Newton-Krylov-Schwarz algorithm to solve the nonlinear algebraic equations arising at each time step. In the Schwarz preconditioner we impose low-order homogeneous boundary conditions for subdomain problems. We investigate several choices of subdomain solvers as well as different overlaps. Numerical experiments on a supercomputer with thousands of processor cores are provided to show the scalability of the fully implicit solver.
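A one-level additive Schwarz preconditioner of the kind used inside such a Newton-Krylov-Schwarz iteration can be sketched as follows. This toy numpy version applies it to a 1D Laplacian inside a damped Richardson iteration rather than a Krylov accelerator, and the subdomain setup is invented for the example; taking the principal submatrix for each subdomain corresponds to the homogeneous subdomain boundary conditions mentioned in the abstract.

```python
import numpy as np

def build_subdomains(n, nsub, overlap):
    """Overlapping index sets covering range(n), split into nsub pieces."""
    size = n // nsub
    subs = []
    for i in range(nsub):
        lo = max(0, i * size - overlap)
        hi = n if i == nsub - 1 else min(n, (i + 1) * size + overlap)
        subs.append(np.arange(lo, hi))
    return subs

def additive_schwarz(A, r, subs):
    """One-level additive Schwarz: z = sum_i R_i^T A_i^{-1} R_i r."""
    z = np.zeros_like(r)
    for idx in subs:
        Ai = A[np.ix_(idx, idx)]       # subdomain block (homogeneous Dirichlet cut)
        z[idx] += np.linalg.solve(Ai, r[idx])
    return z

# Toy usage: 1D Laplacian solved by a damped, Schwarz-preconditioned Richardson
# iteration (a Krylov method such as GMRES would normally play this role).
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
subs = build_subdomains(n, nsub=4, overlap=4)
x = np.zeros(n)
for _ in range(300):
    x += 0.5 * additive_schwarz(A, b - A @ x, subs)
print("residual norm:", np.linalg.norm(b - A @ x))
```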


Archive | 2014

Scalable Nonlinear Solvers for Fully Implicit Coupled Nuclear Fuel Modeling. Final Report

Xiao-Chuan Cai; David E. Keyes; Chao Yang; Xiang Zheng; Michael Pernice

The focus of the project is on the development and customization of highly scalable domain-decomposition-based preconditioning techniques for the numerical solution of nonlinear, coupled systems of partial differential equations (PDEs) arising from nuclear fuel simulations. These high-order PDEs represent multiple interacting physical fields (for example, heat conduction, oxygen transport, solid deformation), each modeled by a certain type of Cahn-Hilliard and/or Allen-Cahn equation. Most existing approaches involve a careful splitting of the fields and the use of field-by-field iterations to obtain a solution of the coupled problem. Such approaches have advantages, such as ease of implementation since only single-field solvers are needed, but also exhibit disadvantages. For example, certain nonlinear interactions between the fields may not be fully captured, and for unsteady problems, stable time integration schemes are difficult to design. In addition, when implemented on large-scale parallel computers, the sequential nature of the field-by-field iterations substantially reduces parallel efficiency. To overcome these disadvantages, fully coupled approaches have been investigated in order to obtain full-physics simulations.
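The contrast between field-by-field iteration and a fully coupled Newton solve can be illustrated on a toy two-field system. The equations below are invented for the example and bear no relation to the actual fuel-performance physics.

```python
import numpy as np

# Toy coupled "two-field" system (invented for illustration):
#   F1(u, v) = u - cos(v)        = 0
#   F2(u, v) = v - 0.5 * sin(u)  = 0

def field_by_field(u, v, sweeps=20):
    """Gauss-Seidel field split: solve each field with the other one frozen.
    (Here each single-field solve happens to be available in closed form.)"""
    for _ in range(sweeps):
        u = np.cos(v)            # solve F1 = 0 for u with v frozen
        v = 0.5 * np.sin(u)      # solve F2 = 0 for v with u frozen
    return u, v

def fully_coupled(u, v, iters=20):
    """Fully coupled Newton on the block system, using the 2x2 block Jacobian."""
    x = np.array([u, v], dtype=float)
    for _ in range(iters):
        F = np.array([x[0] - np.cos(x[1]), x[1] - 0.5 * np.sin(x[0])])
        if np.linalg.norm(F) < 1e-12:
            break
        J = np.array([[1.0, np.sin(x[1])],
                      [-0.5 * np.cos(x[0]), 1.0]])
        x = x - np.linalg.solve(J, F)
    return x[0], x[1]

print(field_by_field(0.0, 0.0))
print(fully_coupled(0.0, 0.0))
```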


Archive | 2013

Exploratory Nuclear Reactor Safety Analysis and Visualization via Integrated Topological and Geometric Techniques

Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer; Diego Mandelli; Michael Pernice; Robert Nourgaliev

A recent trend in the nuclear power engineering field is the implementation of heavily computational and time-consuming algorithms and codes for both design and safety analysis. In particular, the new generation of system analysis codes aims to embrace several phenomena such as thermo-hydraulics, structural behavior, and system dynamics, as well as uncertainty quantification and sensitivity analyses. The use of dynamic probabilistic risk assessment (PRA) methodologies allows a systematic approach to uncertainty quantification. Dynamic methodologies in PRA account for possible coupling between triggered or stochastic events through explicit consideration of the time element in system evolution, often through the use of dynamic system models (simulators). They are usually needed when the system has more than one failure mode, control loops, and/or hardware/process/software/human interaction. Dynamic methodologies are also capable of modeling the consequences of epistemic and aleatory uncertainties. The Monte Carlo (MC) and Dynamic Event Tree (DET) approaches belong to this new class of dynamic PRA methodologies.

The major challenges in using MC and DET methodologies (as well as other dynamic methodologies) are the heavier computational and memory requirements compared to classical event tree (ET) analysis. Each branch generated can contain the time evolution of a large number of variables (about 50,000 data channels are typically present in RELAP), and a large number of scenarios can be generated from a single initiating event (possibly on the order of hundreds or even thousands). Such large amounts of information are usually very difficult to organize in order to identify the main trends in scenario evolutions and the main risk contributors for each initiating event.

This report aims to improve dynamic PRA methodologies by tackling the two challenges mentioned above using: 1) adaptive sampling techniques to reduce the computational cost of the analysis, and 2) topology-based methodologies to interactively visualize multidimensional data and extract risk-informed insights. Regarding item 1), we employ learning algorithms that aim to infer/predict simulation outcomes and choose the coordinates in the input space of the next sample so as to maximize the amount of information that can be gained from it. Such methodologies can be used to both explore and exploit the input space. The latter is especially useful for safety analysis, where samples are focused along the limit surface, i.e., the boundary in the input space between system failure and system success.

Regarding item 2), we present a software tool that is designed to analyze multi-dimensional data. We model a large-scale nuclear simulation dataset as a high-dimensional scalar function defined over a discrete sample of the domain. First, we provide structural analysis of such a function at multiple scales and provide insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis for users, where we help the users differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. Our analysis is performed by exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations, and providing interactive visual interfaces to facilitate such explorations.
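As a hedged illustration of item 1), one simple flavor of limit-surface-focused adaptive sampling scores unsampled candidate points by how mixed the outcomes of their nearest already-simulated neighbors are. The report's learning algorithms are more sophisticated; the function names and the toy failure criterion below are invented for the example.

```python
import numpy as np

def pick_next_sample(X, y, candidates, k=5):
    """Pick the unsampled candidate point closest to the estimated limit surface.

    X: (n, d) previously simulated input points; y: (n,) outcomes
    (1.0 = system failure, 0.0 = success); candidates: (m, d) unsampled points.
    Each candidate is scored by how mixed its k nearest labeled neighbors are;
    the score peaks where the estimated failure probability is 0.5, i.e. near
    the failure/success boundary.
    """
    scores = []
    for c in candidates:
        d = np.linalg.norm(X - c, axis=1)
        p_fail = y[np.argsort(d)[:k]].mean()
        scores.append(p_fail * (1.0 - p_fail))
    return candidates[int(np.argmax(scores))]

# Toy usage: 2D input space with a failure region x0 + x1 > 1 (unknown to the sampler).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(40, 2))
y = (X.sum(axis=1) > 1.0).astype(float)
candidates = rng.uniform(0.0, 1.0, size=(200, 2))
print("next sample near the limit surface:", pick_next_sample(X, y, candidates))
```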


Archive | 2012

Exploration of High-dimensional Scalar Function for Nuclear Reactor Safety Analysis and Visualization: A User's Guide to TopoXG*

Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer; Michael Pernice; Diego Mandelli

The next generation of methodologies for nuclear reactor Probabilistic Risk Assessment (PRA) explicitly accounts for the time element in modeling the probabilistic system evolution and uses numerical simulation tools to account for possible dependencies between failure events. The Monte Carlo (MC) and Dynamic Event Tree (DET) approaches belong to this new class of dynamic PRA methodologies. A challenge of dynamic PRA algorithms is the large amount of data they produce, which may be difficult to visualize and analyze in order to extract useful information. We present a software tool that is designed to address these goals. We model a large-scale nuclear simulation dataset as a high-dimensional scalar function defined over a discrete sample of the domain. First, we provide structural analysis of such a function at multiple scales and provide insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis for users, where we help the users differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. Our analysis is performed by exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations, and providing interactive visual interfaces to facilitate such explorations. We provide a user’s guide to our software tool by highlighting its analysis and visualization capabilities, along with a use case involving data from
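A crude sketch of the kind of topological segmentation such a tool builds is to assign each sample to the local maximum reached by steepest ascent over a k-nearest-neighbor graph. The actual software computes Morse-Smale segmentations with persistence-based simplification, so the fragment below is only a rough stand-in with invented names and a toy dataset.

```python
import numpy as np

def ascent_segmentation(X, f, k=8):
    """Label each sample by the local maximum reached via steepest ascent over a
    k-nearest-neighbor graph; a rough stand-in for the ascending part of a
    Morse-Smale segmentation, with no persistence-based simplification."""
    n = len(f)
    nbrs = [np.argsort(np.linalg.norm(X - X[i], axis=1))[1:k + 1] for i in range(n)]
    up = np.arange(n)                       # steepest-ascent pointer (self = local max)
    for i in range(n):
        j = nbrs[i][np.argmax(f[nbrs[i]])]
        if f[j] > f[i]:
            up[i] = j
    labels = np.empty(n, dtype=int)
    for i in range(n):                      # follow pointers to a local maximum
        j = i
        while up[j] != j:
            j = up[j]
        labels[i] = j
    return labels

# Toy usage: a scattered sample of a 2D scalar function with two peaks.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(300, 2))
f = (np.exp(-8 * ((X[:, 0] - 0.5) ** 2 + X[:, 1] ** 2))
     + np.exp(-8 * ((X[:, 0] + 0.5) ** 2 + X[:, 1] ** 2)))
print("number of ascending segments:", len(np.unique(ascent_segmentation(X, f))))
```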


Archive | 2010

Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

Michael Pernice

INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

Collaboration


Dive into Michael Pernice's collaborations.

Top Co-Authors

Bobby Philip, Oak Ridge National Laboratory

David E. Keyes, King Abdullah University of Science and Technology

Diego Mandelli, Idaho National Laboratory

Donald Estep, Colorado State University

Luis Chacon, Oak Ridge National Laboratory

Xiao-Chuan Cai, University of Colorado Boulder