Publication


Featured research published by Joseph A. Spahr.


Journal of Climate | 1990

The impact of sea surface temperature anomalies on the Rainfall over Northeast Brazil

Carlos R. Mechoso; Steven W. Lyons; Joseph A. Spahr

Abstract The response of the tropical atmosphere to the sea surface temperature (SST) anomalies in the Northern Hemisphere spring of 1984 is investigated. The methodology for investigation consists of comparing simulations with and without the global distribution of SST anomalies in the boundary conditions of the UCLA General Circulation Model (GCM). At low levels, the response includes weaker southeast trade winds over the Atlantic, increased precipitation off the northeast coast of Brazil, and reduced precipitation west of this region. The increased precipitation is due to enhanced convergence of moisture advected by the southeast trade winds, although the trades themselves are weaker. The results for the western equatorial Atlantic are in apparent agreement with the observed anomalous southern migration of the ITCZ in years with warm SST anomalies in the southern tropical Atlantic. There are strong anomalous trade winds over the Pacific extending east of the date line and weak wind anomalies over the ma...


Parallel Computing | 1995

Performance of a distributed memory finite difference atmospheric general circulation model

Michael F. Wehner; Arthur A. Mirin; Peter G. Eltgroth; William Paul Dannevik; Carlos R. Mechoso; John D. Farrara; Joseph A. Spahr

Abstract A new version of the UCLA atmospheric general circulation model suitable for massively parallel computer architectures has been developed. This paper presents the principles behind the code's design and examines performance on a variety of distributed memory computers. A two-dimensional domain decomposition strategy is used to achieve parallelism and is implemented by message passing. This parallel algorithm is shown to scale favorably as the number of processors is increased. In the fastest configuration, performance roughly equivalent to that of multitasking vector supercomputers is achieved.
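The two-dimensional decomposition described above can be sketched in a few lines. The function below is a generic illustration of the technique, not code from the UCLA model; the function name and the 4 x 8 process mesh are hypothetical.

```python
# Illustrative sketch (not the UCLA AGCM code): split a global
# lat x lon grid into a 2-D mesh of rectangular subdomains, one
# per process, as in a message-passing domain decomposition.

def decompose_2d(nlat, nlon, prows, pcols):
    """Return {rank: ((lat0, lat1), (lon0, lon1))} half-open index ranges."""
    def split(n, parts):
        # Distribute n points over `parts` pieces as evenly as possible.
        base, extra = divmod(n, parts)
        bounds, start = [], 0
        for p in range(parts):
            size = base + (1 if p < extra else 0)
            bounds.append((start, start + size))
            start += size
        return bounds

    lat_bounds = split(nlat, prows)
    lon_bounds = split(nlon, pcols)
    return {r * pcols + c: (lat_bounds[r], lon_bounds[c])
            for r in range(prows) for c in range(pcols)}

# Example: a 64 x 128 grid on a 4 x 8 process mesh.
domains = decompose_2d(64, 128, 4, 8)
print(domains[0])  # rank 0 owns latitude rows 0-15, longitude columns 0-15
```

Each rank then exchanges halo rows and columns with its four mesh neighbors, which is where the message passing enters.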


International Conference on Computational Science | 2001

A Data Broker for Distributed Computing Environments

L. A. Drummond; James Demmel; Carlos R. Mechoso; H. Robinson; Keith Sklower; Joseph A. Spahr

This paper presents a toolkit for managing distributed communication in multi-application systems that are targeted to run in high performance computing environments: the Distributed Data Broker (DDB). The DDB provides a flexible mechanism for coupling codes with different grid resolutions and data representations. The target applications are coupled systems that deal with large volumes of data exchange and/or are computationally expensive. These application codes need to run efficiently in massively parallel computer environments, generating a need for distributed coupling to minimize long synchronization points. Furthermore, with the DDB, coupling is realized in a plug-in manner rather than by hard-wired inclusion of programming language statements. The DDB performance on the CRAY T3E-600 and T3E-900 systems is examined.
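The core problem such a broker solves — translating a field from a producer's grid to a consumer's grid of different resolution — can be illustrated with a toy regridder. The 1-D linear interpolation below is a stand-in for the broker's data translation, not the DDB's actual API; all names are hypothetical.

```python
# Toy regridding sketch: interpolate a field sampled on a coarse
# producer grid onto a finer consumer grid (1-D, linear, for brevity).

def regrid(src_x, src_f, dst_x):
    """Linearly interpolate samples (src_x, src_f) onto dst_x (src_x sorted)."""
    out = []
    for x in dst_x:
        if x <= src_x[0]:          # clamp below the source domain
            out.append(src_f[0]); continue
        if x >= src_x[-1]:         # clamp above the source domain
            out.append(src_f[-1]); continue
        # Index of the source point at or just below x.
        j = max(i for i in range(len(src_x)) if src_x[i] <= x)
        t = (x - src_x[j]) / (src_x[j + 1] - src_x[j])
        out.append((1 - t) * src_f[j] + t * src_f[j + 1])
    return out

coarse_x = [0.0, 1.0, 2.0]           # producer grid
coarse_f = [10.0, 20.0, 30.0]        # field on producer grid
fine_x = [0.0, 0.5, 1.0, 1.5, 2.0]   # consumer grid
print(regrid(coarse_x, coarse_f, fine_x))  # [10.0, 15.0, 20.0, 25.0, 30.0]
```

In the plug-in style the paper describes, producer and consumer codes would register their grids with the broker and the translation step would happen inside it, rather than being hand-coded into either model.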


Monthly Weather Review | 1993

Parallelization and Distribution of a Coupled Atmosphere–Ocean General Circulation Model

Carlos R. Mechoso; Chung-Chun Ma; John D. Farrara; Joseph A. Spahr; Reagan W. Moore

Abstract The distribution of a climate model across homogeneous and heterogeneous computer environments with nodes that can reside at geographically different locations is investigated. This scientific application consists of an atmospheric general circulation model (AGCM) coupled to an oceanic general circulation model (OGCM). Three levels of code decomposition are considered to achieve a high degree of parallelism and to mask communication with computation. First, the domains of both the gridpoint AGCM and OGCM are divided into subdomains for which calculations are carried out concurrently (domain decomposition). Second, the model is decomposed based on the diversity of tasks performed by its major components (task decomposition). Three such components are identified: (a) AGCM/physics, which computes the effects on the grid-scale flow of subgrid-scale processes such as convection and turbulent mixing; (b) AGCM/dynamics, which computes the evolution of the flow governed by the primitive equations; and (c) ...


IEEE Parallel & Distributed Technology: Systems & Applications | 1994

Achieving superlinear speedup on a heterogeneous, distributed system

Carlos R. Mechoso; John D. Farrara; Joseph A. Spahr

The CASA Gigabit Network Testbed, part of NSF and ARPA's Gigabit Project, is investigating whether a metacomputer consisting of widely distributed, heterogeneous supercomputers connected by a high-speed network is viable for large scientific applications. A particular challenge is to determine if such a metacomputer can produce superlinear speedup despite latency and communication overheads. One of the applications in the CASA testbed is a model we developed that couples a global atmosphere model to a world ocean model. Simulations using such coupled general circulation models for climate studies demand considerable computer resources. When distributing such a model, we need to consider the methods for masking latency with computation, the communications bandwidth requirements for different decomposition strategies, the optimal computer architecture for each major phase of the computation, and the effects of latency and communication costs for different decomposition strategies. Here we focus on the last two issues, and demonstrate that choosing the appropriate computer architectures and masking communication with computation can produce superlinear speedup.
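The criterion at issue is easy to state in code: speedup is S(p) = T(1) / T(p), and a run is superlinear when S(p) > p. The helpers below just encode that definition; the timings are made up for illustration and are not from the paper.

```python
# Speedup bookkeeping for a distributed run.
# S(p) = T(1) / T(p); "superlinear" means S(p) > p, i.e. p processors
# run more than p times faster than one (possible when, e.g., the
# per-node working set starts fitting in faster memory, or communication
# is fully overlapped with computation).

def speedup(t_serial, t_parallel):
    """Ratio of single-processor time to p-processor time."""
    return t_serial / t_parallel

def is_superlinear(t_serial, t_parallel, nprocs):
    """True when the measured speedup exceeds the processor count."""
    return speedup(t_serial, t_parallel) > nprocs

# Hypothetical timings (seconds per simulated day):
t1, t16 = 3200.0, 160.0
print(speedup(t1, t16))             # 20.0
print(is_superlinear(t1, t16, 16))  # True
```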


Monthly Weather Review | 1982

A Study of the Sensitivity of Numerical Forecasts to an Upper Boundary in the Lower Stratosphere

Carlos R. Mechoso; Max J. Suarez; Koji Yamazaki; Joseph A. Spahr; Akio Arakawa

Abstract The impact of an upper boundary on numerical forecasts is studied by comparing the results of a nine-layer model with a top in the lower stratosphere, to those of a 15-layer model with a top near the stratopause. A single case is considered for which initial conditions are taken from a climatologically adjusted winter simulation produced by the 15-layer model. It is found that, as a result of the lowered upper boundary, there is a marked equatorward shift of upper-level westerlies. Significant errors in the ultra-long waves appear at 500 mb within the first five days. Errors at 500 mb then spread to progressively shorter waves with large errors in cyclone-scale waves by day 12. Large errors in an ultra-long wave (wavenumber 3) after day 10 are associated with the climatological adjustment of the stationary flow to the lowered boundary. Two different assumptions in the radiation calculation in the nine-layer model are used. Results indicate that radiative effects are of secondary importance to the pr...


Conference on High Performance Computing (Supercomputing) | 1991

Distribution of a climate model across high-speed networks

Carlos R. Mechoso; Chung-Chun Ma; John D. Farrara; Joseph A. Spahr

No abstract available


High Performance Distributed Computing | 1993

Toward a high performance distributed memory climate model

Michael F. Wehner; J. J. Ambrosiano; J.C. Brown; William Paul Dannevik; Peter G. Eltgroth; Arthur A. Mirin; John D. Farrara; Chung-Chun Ma; Carlos R. Mechoso; Joseph A. Spahr

As part of a long range plan to develop a comprehensive climate systems modeling capability, the authors have taken the atmospheric general circulation model originally developed by Arakawa and collaborators at UCLA and have recast it in a portable, parallel form. The code uses an explicit time-advance procedure on a staggered three-dimensional Eulerian mesh. They have implemented a two-dimensional latitude/longitude domain decomposition message passing strategy. Both dynamic memory management and interprocess communication are handled with macro constructs that are preprocessed prior to compilation. The code can be moved across a variety of platforms, including massively parallel processors, workstation clusters, and vector processors, with a mere change of three parameters. Performance on the various platforms as well as issues associated with coupling different models for major components of the climate system are discussed.


Conference on High Performance Computing (Supercomputing) | 1998

The UCLA AGCM in High Performance Computing Environments

Carlos R. Mechoso; L. A. Drummond; John D. Farrara; Joseph A. Spahr

General Circulation Models (GCMs) are at the top of the hierarchy of numerical models that are used to study the Earth's climate. To increase the significance of predictions using GCMs requires ensembles of integrations that in turn demand large amounts of computing resources. GCM codes are particularly difficult to optimize in view of their heterogeneity. In this paper we focus on code optimization for GCMs of the atmosphere (AGCMs), one of the major components of the climate system, and present our efforts in optimizing the parallel UCLA AGCM code. The UCLA AGCM is a state-of-the-art finite-difference model of the global atmosphere. Our optimization efforts include the implementation of load balancing schemes, new physical parameterizations of atmospheric processes, code restructuring, and use of special mathematical functions. At the beginning of this work, the overall execution time of the code was 459 seconds per simulated day on 256 nodes of a CRAY T3D. At present, the same model configuration requires 51 seconds per simulated day on 256 nodes of a CRAY T3E-900, which is approximately 9 times faster. The peak model performance is about 40 GFLOPs on 512 T3E-900 nodes. We present results in support of our conclusion that major advances in our ability to carry out longer and more detailed climate simulations depend primarily upon development of more powerful supercomputers, and that code optimization for a particular computer architecture and development of more efficient algorithms can be nearly as important.


Journal of Geophysical Research | 2001

On‐line simulations of passive chemical tracers in the University of California, Los Angeles, atmospheric general circulation model: 1. CFC‐11 and CFC‐12

Mohan Gupta; Richard P. Turco; Carlos R. Mechoso; Joseph A. Spahr

Long-term simulations of the response of atmospheric CFC-11 and CFC-12 to standard emission scenarios have been carried out using the University of California, Los Angeles (UCLA) atmospheric general circulation model (AGCM) coupled on-line with the UCLA atmospheric chemistry model. For both compounds, photochemical loss rates are computed interactively over the entire model domain at each time step of the integration. Using industrial-based emission estimates, the simulations for CFC-12 closely track the long-term trends recorded in both hemispheres by the Atmospheric Lifetime Experiment/Global Atmospheric Gases Experiment/Advanced Global Atmospheric Gases Experiment (ALE/GAGE/AGAGE) and Climate Monitoring and Diagnostics Laboratory monitoring networks. The agreement between simulations and observations is best when ALE/GAGE/AGAGE-deduced emissions are employed. The predicted surface mixing ratios of CFC-11, on the other hand, are somewhat overestimated by the model. Because the transport and loss processes, as well as source distributions, are roughly similar for these halocarbons, the divergence in surface concentrations points to the possibility that emissions of CFC-11 may be overestimated for the period extending from the late 1980s through the early 1990s, and perhaps even at earlier times. As for CFC-12, the best agreement is achieved using ALE/GAGE/AGAGE emissions. The simulated interhemispheric exchange time constant for these CFCs is about 0.6 year. In the annual cycle, maximum transport occurs from the Northern to Southern Hemisphere within the lowest atmospheric layers during northern winter. Our best estimates of the annually averaged mean global lifetimes of CFC-11 and CFC-12 are about 55 and 100 years, respectively. The simulations indicate that both the mean residence time and interhemispheric exchange rate depend on the assumed model vertical domain.
For the mass balance analysis, when the upper boundary of the AGCM is artificially fixed below ∼35 km for CFC-11, or ∼43 km for CFC-12, there is a tendency for the timescales (lifetimes and interhemispheric exchange times) to be overestimated. Comparisons between CFC distributions and trends calculated using low and high spatial resolution show relatively small differences in the present case. These results, especially regarding CFC persistence and interhemispheric exchange, suggest that the present model accurately represents the global dispersion of long-lived chemical tracers.
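The two timescales reported above — a chemical lifetime and an interhemispheric exchange time — can be mimicked with a standard two-box (Northern/Southern Hemisphere) tracer model. The sketch below is schematic (forward-Euler, arbitrary unit emissions), not the AGCM, and the specific numbers are illustrative choices taken from the abstract.

```python
# Standard two-box hemispheric tracer model, used here only to show how
# the paper's two timescales enter: a chemical lifetime tau_life
# (~55 yr for CFC-11) and an interhemispheric exchange time tau_ex
# (~0.6 yr for these CFCs).

def step(n, s, emis_n, emis_s, tau_life, tau_ex, dt):
    """Advance hemispheric burdens (n, s) by one Euler step of dt years."""
    dn = emis_n - n / tau_life - (n - s) / tau_ex
    ds = emis_s - s / tau_life + (n - s) / tau_ex
    return n + dt * dn, s + dt * ds

# Emit only in the Northern Hemisphere; because tau_ex << tau_life,
# the southern burden closes most of the gap on the exchange timescale.
n = s = 0.0
for _ in range(2000):  # 20 years at dt = 0.01 yr
    n, s = step(n, s, 1.0, 0.0, 55.0, 0.6, 0.01)
print(round(s / n, 2))
```

The same structure explains the abstract's point about the vertical domain: anything that changes where loss occurs changes the effective tau_life, and with it the inferred exchange and residence times.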

Collaboration


Top co-authors of Joseph A. Spahr:


- Chung-Chun Ma, University of California
- L. A. Drummond, University of California
- Arthur A. Mirin, Lawrence Livermore National Laboratory
- Michael F. Wehner, Lawrence Livermore National Laboratory
- Peter G. Eltgroth, Lawrence Livermore National Laboratory
- William Paul Dannevik, Lawrence Livermore National Laboratory
- Judith G. Cohen, California Institute of Technology