
Publication


Featured research published by Jay Walter Larson.


IEEE International Conference on High Performance Computing, Data, and Analytics | 2005

The Model Coupling Toolkit: A New Fortran90 Toolkit for Building Multiphysics Parallel Coupled Models

Jay Walter Larson; Robert L. Jacob; Everest T. Ong

Many problems in science and engineering are best simulated as a set of mutually interacting models, resulting in a coupled or multiphysics model. These models present challenges stemming from their interdisciplinary nature and from their computational and algorithmic complexities. The computational complexity of individual models, combined with the popularity of the distributed-memory parallel programming model used on commodity microprocessor-based clusters, results in a parallel coupling problem when building a coupled model. We define and elucidate this problem and show how it yields a set of requirements for software capable of simplifying the construction of parallel coupled models. We describe the Model Coupling Toolkit (MCT), a package we have developed to meet these general requirements and the specific requirements of a parallel climate model. We present the MCT programming model with illustrative code examples, along with representative results that measure MCT's scalability, performance portability, and a proxy for coupling overhead.


IEEE International Conference on High Performance Computing, Data, and Analytics | 2005

M × N Communication and Parallel Interpolation in Community Climate System Model Version 3 Using the Model Coupling Toolkit

Robert L. Jacob; Jay Walter Larson; Everest T. Ong

The Model Coupling Toolkit (MCT) is a software library for constructing parallel coupled models from individual parallel models. MCT was created to address the challenges of creating a parallel coupler for the Community Climate System Model (CCSM). Each of the submodels that make up CCSM is a separate parallel application with its own domain decomposition, running on its own set of processors. This application contains multiple instances of the M × N problem, the problem of transferring data between two parallel programs running on disjoint sets of processors. CCSM also requires efficient data transfer to facilitate its interpolation algorithms. MCT was created as a generalized solution to handle these and other common functions in parallel coupled models. Here we describe MCT’s implementation of the data transfer infrastructure needed for a parallel coupled model. The performance of MCT scales satisfactorily as processors are added to the system. However, the types of decompositions used in the submodels can affect performance. MCT’s infrastructure provides a flexible and high-performing set of tools for enabling interoperability between parallel applications.
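The M × N problem described above can be illustrated with a toy sketch (this is not MCT's actual API; all names here are hypothetical): each of M source processes owns a block of a global index space, each of N destination processes owns a different block, and a "router" is built by intersecting the two decompositions to determine who must send what to whom.

```python
# Hypothetical sketch of the M x N transfer problem (not MCT code):
# intersect a source decomposition over M processes with a destination
# decomposition over N processes to build a send/receive routing table.

def block_decomposition(n_global, n_procs):
    """Assign global indices 0..n_global-1 to n_procs in contiguous blocks."""
    base, extra = divmod(n_global, n_procs)
    owned, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)
        owned.append(set(range(start, start + size)))
        start += size
    return owned

def build_router(src_decomp, dst_decomp):
    """For each (src, dst) process pair, the global indices src sends to dst."""
    router = {}
    for s, s_idx in enumerate(src_decomp):
        for d, d_idx in enumerate(dst_decomp):
            common = s_idx & d_idx
            if common:
                router[(s, d)] = sorted(common)
    return router

# 12 grid points, decomposed over M=3 source and N=4 destination processes.
src = block_decomposition(12, 3)   # blocks of 4 points
dst = block_decomposition(12, 4)   # blocks of 3 points
router = build_router(src, dst)
# Source process 1 (owning indices 4-7) must split its data between
# destination 1 (owning 3-5) and destination 2 (owning 6-8).
```

In a real coupler each router entry would drive a point-to-point message between the two disjoint process sets; the sketch only captures the index-intersection bookkeeping.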


IEEE International Conference on High Performance Computing, Data, and Analytics | 2005

CPL6: The New Extensible, High Performance Parallel Coupler for the Community Climate System Model

Anthony P. Craig; Robert L. Jacob; Brian Kauffman; Thomas W. Bettge; Jay Walter Larson; Everest T. Ong; Chris H. Q. Ding; Yun He

Coupled climate models are large, multiphysics applications designed to simulate the Earth’s climate and predict the response of the climate to any changes in the forcing or boundary conditions. The Community Climate System Model (CCSM) is a widely used state-of-the-art climate model that has released several versions to the climate community over the past ten years. Like many climate models, CCSM employs a coupler, a functional unit that coordinates the exchange of data between parts of the climate system such as the atmosphere and ocean. In this paper we describe the new coupler, cpl6, contained in the latest version of CCSM, CCSM3. Cpl6 introduces distributed-memory parallelism to the coupler, a class library for important coupler functions, and a standardized interface for component models. Cpl6 is implemented entirely in Fortran90 and uses the Model Coupling Toolkit as the base for most of its classes. Cpl6 gives improved performance over previous versions and scales well on multiple platforms.


Journal of Hydrometeorology | 2003

Hydrological processes in regional climate model simulations of the central United States flood of June-July 1993.

Christopher J. Anderson; Raymond W. Arritt; Zaitao Pan; Eugene S. Takle; William J. Gutowski; Francis O. Otieno; Renato da Silva; Daniel Caya; Jesper Christensen; Daniel Lüthi; Miguel Angel Gaertner; Clemente Gallardo; Filippo Giorgi; René Laprise; Song-You Hong; Colin Jones; H-M. H. Juang; Jack J. Katzfey; John L. McGregor; William M. Lapenta; Jay Walter Larson; John A. Taylor; Glen E. Liston; Roger A. Pielke; John O. Roads

Thirteen regional climate model (RCM) simulations of June–July 1993 were compared with each other and with observations. Water vapor conservation and precipitation characteristics in each RCM were examined for a 10° × 10° subregion of the upper Mississippi River basin, containing the region of maximum 60-day accumulated precipitation in all RCMs and station reports. All RCMs produced positive precipitation minus evapotranspiration (P − E > 0), though most RCMs produced P − E below the observed range. RCM recycling ratios were within the range estimated from observations. No evidence of common errors of E was found. In contrast, a common dry bias of P was found in the simulations. Daily cycles of terms in the water vapor conservation equation were qualitatively similar in most RCMs. Nocturnal maxima of P and C (convergence) occurred in 9 of 13 RCMs, consistent with observations. Three of the four driest simulations failed to couple P and C overnight, producing afternoon maximum P. Further, dry simulations tended to produce a larger fraction of their 60-day accumulated precipitation from low 3-h totals. In station reports, accumulation from high (low) 3-h totals had a nocturnal (early morning) maximum. This time lag occurred, in part, because many mesoscale convective systems had reached peak intensity overnight and had declined in intensity by early morning. None of the RCMs contained such a time lag. It is recommended that short-period experiments be performed to examine the ability of RCMs to simulate mesoscale convective systems prior to generating long-period simulations for hydroclimatology.


Quarterly Journal of the Royal Meteorological Society | 2001

An adaptive buddy check for observational quality control

Dick P. Dee; Leonid Rukhovets; Ricardo Todling; Arlindo da Silva; Jay Walter Larson

An adaptive buddy-check algorithm is presented that adjusts tolerances for suspect observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality-control decisions are not very sensitive to the prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place over Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would otherwise have been excluded on the basis of prescribed statistics. The analysis of the storm's development was much improved as a result of these additional observations.
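The adaptive idea can be sketched in a few lines. This is a deliberately simplified illustration, not the paper's hypothesis-test formulation: a suspect observation is compared against the mean of its "buddies", with the acceptance tolerance inflated by the buddies' own spread, so that tolerances widen exactly where the surrounding data are highly variable (as in a storm).

```python
import statistics

def buddy_check(obs, suspect_idx, base_tol):
    """Simplified adaptive buddy check (illustrative only): accept a suspect
    observation if it lies within a tolerance of its buddies' mean, where the
    tolerance grows with the buddies' local variability."""
    buddies = [v for i, v in enumerate(obs) if i != suspect_idx]
    mean = statistics.fmean(buddies)
    spread = statistics.pstdev(buddies)
    tol = base_tol + 2.0 * spread   # adaptive term: wider tolerance in noisy data
    return abs(obs[suspect_idx] - mean) <= tol

# In calm, uniform conditions a 6-unit departure is rejected as an outlier...
calm = [10.0, 10.1, 9.9, 10.0, 16.0]
# ...but amid highly variable (storm-like) neighbours the same value passes.
stormy = [4.0, 15.0, 8.0, 13.0, 16.0]
```

This mirrors the qualitative behavior reported in the abstract: observations far from the first guess survive quality control when the surrounding data independently suggest large local variability.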


International Conference on Computational Science | 2001

The Model Coupling Toolkit

Jay Walter Larson; Robert L. Jacob; Ian T. Foster; Jing Guo

The advent of coupled earth system models has raised an important question in parallel computing: what is the most effective method for coupling many parallel models to form one high-performance coupled modeling system? We present our solution to this problem: the Model Coupling Toolkit (MCT). We describe how our effort to construct the Next-Generation Coupler for the NCAR Community Climate System Model motivated us to create the Toolkit. We describe in detail the conceptual design of the MCT and explain its usage in constructing parallel coupled models. We present some preliminary performance results for the Toolkit's parallel data transfer facilities. Finally, we outline an agenda for future development of the MCT.


International Conference on Conceptual Structures | 2013

Fault-Tolerant Grid-Based Solvers: Combining Concepts from Sparse Grids and MapReduce

Jay Walter Larson; Markus Hegland; Brendan Harding; Stephen Roberts; Linda Stals; Alistair P. Rendell; Peter E. Strazdins; Md. Mohsin Ali; Christoph Kowitz; Ross Nobes; James Southern; Nicholas Wilson; Michael Li; Yasuyuki Oishi

A key issue confronting petascale and exascale computing is the growth in the probability of soft and hard faults with increasing system size. A promising approach to this problem is the use of algorithms that are inherently fault tolerant. We introduce such an algorithm for the solution of partial differential equations, based on the sparse grid approach. Here, the solutions on multiple component grids are efficiently combined to achieve a solution on a full grid. The technique also lends itself to a (modified) MapReduce framework on a cluster of processors, with the map stage corresponding to allocating each component grid for solution over a subset of the processors, and the reduce stage corresponding to their combination. We describe how the sparse grid combination method can be modified to robustly solve partial differential equations in the presence of faults. This is based on a modified combination formula that can accommodate the loss of one or two component grids. We also discuss accuracy issues associated with this formula. We give details of a prototype implementation within a MapReduce framework using the dynamic process features and asynchronous message passing facilities of MPI. Results on a two-dimensional advection problem show that the errors after the loss of one or two sub-grids are within a factor of 3 of the fault-free sparse grid solution. They also indicate that the sparse grid technique with four times the resolution has approximately the same error as a full grid, while requiring (for a sufficiently high resolution) much lower computation and memory. We finally outline a MapReduce variant capable of responding to faults in ways other than re-scheduling failed tasks. We discuss the likely software requirements for such a flexible MapReduce framework, the requirements it will impose on users' legacy codes, and the system's runtime behavior.
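The bookkeeping behind the classical two-dimensional combination technique can be sketched as follows. Component grids with refinement levels (i, j) on the fine diagonal i + j = n enter with coefficient +1, and those on i + j = n − 1 with coefficient −1. The fault-recovery rule shown here (dropping back to the level n − 1 combination when a fine-diagonal grid is lost) is a crude stand-in for the paper's modified combination formula, not the authors' actual method.

```python
# Illustrative sketch of 2-D sparse grid combination coefficients
# (not the paper's implementation).

def combination_coefficients(n):
    """Standard 2-D combination: +1 on the diagonal i + j = n,
    -1 on the diagonal i + j = n - 1, keyed by grid level (i, j)."""
    coeffs = {}
    for i in range(1, n):
        coeffs[(i, n - i)] = +1
    for i in range(1, n - 1):
        coeffs[(i, n - 1 - i)] = -1
    return coeffs

def fault_tolerant_coefficients(n, lost):
    """Toy recovery rule (a simplification of the paper's modified formula):
    if a fine-diagonal grid is lost, fall back to the level n - 1 combination,
    which needs only coarser component grids."""
    coeffs = combination_coefficients(n)
    if coeffs.get(lost) == +1:
        return combination_coefficients(n - 1)
    return coeffs

c = combination_coefficients(4)
# The coefficients always sum to 1, so constant functions are reproduced
# exactly by the combined solution, with or without the lost grid.
ft = fault_tolerant_coefficients(4, (2, 2))
```

In the MapReduce framing of the abstract, each key in the coefficient dictionary is one map task (solve on that component grid), and the reduce stage forms the weighted sum of the surviving grids' solutions.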


Geophysical Research Letters | 1994

A comparison of GCM sensitivity to changes in CO2 and solar luminosity

Susan Marshall; Robert J. Oglesby; Jay Walter Larson; Barry Saltzman

We evaluate the equilibrium response of the atmospheric climate to a wide range of systematic variations in carbon dioxide (100 to 1000 ppm) and solar luminosity (±5% of today's value) using the NCAR CCM1. The quantitative responses of each set of simulations are compared using a formal sensitivity analysis. Although the forcings on the climate differ, both temporally and spatially, similarities appear in the climatic response. The results of this analysis have broader implications for the use of simulations of past climates as a guide to possible future climatic change.
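A back-of-envelope comparison shows why the two forcing ranges studied above are commensurable. The formulas below are standard textbook approximations, not taken from the paper: a logarithmic fit for CO2 forcing and simple spherical geometry for solar forcing.

```python
import math

# Standard approximations (assumptions, not from the paper):
#   CO2:   dF = 5.35 * ln(C / C0)         [W m^-2]
#   Solar: dF = (dS / 4) * (1 - albedo)   [W m^-2], the factor 4 from
#          spreading the intercepted beam over the whole sphere.

def co2_forcing(c_ppm, c0_ppm=355.0):
    """Global-mean radiative forcing from changing CO2 relative to c0_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def solar_forcing(fractional_change, s0=1361.0, albedo=0.3):
    """Global-mean forcing from a fractional change in solar luminosity."""
    return (fractional_change * s0 / 4.0) * (1.0 - albedo)

# A CO2 doubling and a +2% solar change (within the paper's +/-5% range)
# yield global-mean forcings of similar magnitude.
f_co2 = co2_forcing(710.0)    # roughly 3.7 W m^-2
f_sun = solar_forcing(0.02)   # roughly 4.8 W m^-2
```

Despite these similar global-mean magnitudes, the two forcings are distributed very differently in space and time, which is precisely why the paper's formal sensitivity analysis of the simulated responses is informative.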


SIAM Journal on Scientific Computing | 2015

Fault Tolerant Computation with the Sparse Grid Combination Technique

Brendan Harding; Markus Hegland; Jay Walter Larson; James Southern

This paper continues to develop a fault tolerant extension of the sparse grid combination technique recently proposed in [B. Harding and M. Hegland, ANZIAM J. Electron. Suppl., 54 (2013), pp. C394--C411]. This approach to fault tolerance is novel for two reasons: First, the combination technique adds an additional level of parallelism, and second, it provides algorithm-based fault tolerance so that solutions can still be recovered if failures occur during computation. Previous work indicates how the combination technique may be adapted for a low number of faults. In this paper we develop a generalization of the combination technique for which arbitrary collections of coarse approximations may be combined to obtain an accurate approximation. A general fault tolerant combination technique for large numbers of faults is a natural consequence of this work. Using a renewal model for the time between faults on each node of a high performance computer, we also provide bounds on the expected error for interpolati...


Conference on High Performance Computing (Supercomputing) | 1997

Parallel Computing at the NASA Data Assimilation Office (DAO)

M. P. Lyster; K. Ekers; Jing Guo; M. Harber; David J. Lamich; Jay Walter Larson; Robert Lucchesi; Richard B. Rood; Siegfried D. Schubert; William Sawyer; M. Sienkiewicz; Arlindo da Silva; J. Stobie; Lawrence L. Takacs; R. Todling; Jose Zero; Chris H. Q. Ding; Robert D. Ferraro

The goal of atmospheric data assimilation is to produce accurate gridded datasets of fields by assimilating a range of observations along with physically consistent model forecasts. The NASA Data Assimilation Office (DAO) is currently upgrading its end-to-end data assimilation system (GEOS DAS) to support NASA's Mission To Planet Earth (MTPE) Enterprise. This effort is also part of a NASA HPCC Earth and Space Sciences (ESS) Grand Challenge PI project. Future Core computing, using a modular Fortran 90 design and distributed-memory (MPI) software, will be carried out at Ames Research Center. The algorithmic and performance issues involved in the Core system are the main subjects of this presentation.
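The analysis step at the heart of any such assimilation system can be reduced to a scalar sketch (illustrative only; GEOS DAS uses a far more elaborate multivariate scheme): blend a model background with an observation, weighting each by the inverse of its error variance.

```python
# Scalar sketch of the statistical analysis update in data assimilation.
# Illustrative only; not the GEOS DAS algorithm.

def analysis(x_b, var_b, y, var_o):
    """Optimal scalar analysis for a directly observed variable:
        x_a = x_b + K * (y - x_b),  K = var_b / (var_b + var_o).
    The gain K weights the observation by how uncertain the background is."""
    gain = var_b / (var_b + var_o)
    x_a = x_b + gain * (y - x_b)
    var_a = (1.0 - gain) * var_b   # analysis variance is always reduced
    return x_a, var_a

# A background forecast of 280 K (error variance 4) meets an observation of
# 282 K (error variance 1): the analysis lands nearer the more trustworthy
# observation, and its variance falls below both inputs.
x_a, var_a = analysis(280.0, 4.0, 282.0, 1.0)
```

The full multivariate problem replaces the scalars with state vectors, an observation operator, and covariance matrices, which is where the parallel-computing challenges described above arise.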

Collaboration


Dive into Jay Walter Larson's collaborations.

Top Co-Authors

Robert L. Jacob
University of Wisconsin-Madison

Everest T. Ong
Argonne National Laboratory

Anthony P. Craig
National Center for Atmospheric Research

Brendan Harding
Australian National University

Markus Hegland
Australian National University

Jing Guo
Goddard Space Flight Center

John A. Taylor
Australian National University

Arlindo da Silva
Goddard Space Flight Center

Michael Tobis
University of Wisconsin-Madison