Thomas L. Clune
Goddard Space Flight Center
Publications
Featured research published by Thomas L. Clune.
Journal of Advances in Modeling Earth Systems | 2014
Gavin A. Schmidt; Max Kelley; Larissa Nazarenko; Reto Ruedy; Gary L. Russell; Igor Aleinov; Mike Bauer; Susanne E. Bauer; Maharaj K. Bhat; Rainer Bleck; V. M. Canuto; Thomas L. Clune; Rosalinda de Fainchtein; Anthony D. Del Genio; Nancy Y. Kiang; A. Lacis; Allegra N. LeGrande; Elaine Matthews; Ron L. Miller; Amidu Oloso; William M. Putman; David Rind; Drew T. Shindell; Rahman A. Syed; Jinlun Zhang
We present a description of the ModelE2 version of the Goddard Institute for Space Studies (GISS) General Circulation Model (GCM) and the configurations used in the simulations performed for the Coupled Model Intercomparison Project Phase 5 (CMIP5). We use six variations related to the treatment of atmospheric composition, the calculation of aerosol indirect effects, and the ocean model component. Specifically, we test the difference between atmospheric models that have noninteractive composition, where radiatively important aerosols and ozone are prescribed from precomputed decadal averages, and interactive versions where atmospheric chemistry and aerosols are calculated given decadally varying emissions. The impact of the first aerosol indirect effect on clouds is either specified using a simple tuning or parameterized using a cloud microphysics scheme. We also use two dynamic ocean components: the Russell ocean model and the HYbrid Coordinate Ocean Model (HYCOM), which differ significantly in their basic formulations and grids. Results are presented for the climatological means over the satellite era (1980–2004) taken from transient simulations starting from the preindustrial (1850) and driven by estimates of the appropriate forcings over the 20th century. Differences in base climate and variability related to the choice of ocean model are large, indicating an important structural uncertainty. The impact of interactive atmospheric composition on the climatology is relatively small except in regions such as the lower stratosphere, where ozone plays an important role, and the tropics, where aerosol changes affect the hydrological cycle and cloud cover. While key improvements over previous versions of the model are evident, these are not uniform across all metrics.
Journal of Advances in Modeling Earth Systems | 2014
Ron L. Miller; Gavin A. Schmidt; Larissa Nazarenko; Nick Tausnev; Susanne E. Bauer; Anthony D. DelGenio; Max Kelley; Ken K. Lo; Reto Ruedy; Drew T. Shindell; Igor Aleinov; Mike Bauer; Rainer Bleck; V. M. Canuto; Yonghua Chen; Y. Cheng; Thomas L. Clune; Greg Faluvegi; James E. Hansen; Richard J. Healy; Nancy Y. Kiang; D. Koch; A. Lacis; Allegra N. LeGrande; Jean Lerner; Surabi Menon; Valdar Oinas; Carlos Pérez García-Pando; Jan Perlwitz; Michael J. Puma
Observations of climate change during the CMIP5 extended historical period (1850-2012) are compared to trends simulated by six versions of the NASA Goddard Institute for Space Studies ModelE2 Earth System Model. The six models are constructed from three versions of the ModelE2 atmospheric general circulation model, distinguished by their treatment of atmospheric composition and the aerosol indirect effect, combined with two ocean general circulation models, HYCOM and Russell. Forcings that perturb the model climate during the historical period are described. Five-member ensemble averages from each of the six versions of ModelE2 simulate trends of surface air temperature, atmospheric temperature, sea ice and ocean heat content that are in general agreement with observed trends, although simulated warming is slightly excessive within the past decade. Only simulations that include increasing concentrations of long-lived greenhouse gases match the warming observed during the twentieth century. Differences in twentieth-century warming among the six model versions can be attributed to differences in climate sensitivity, aerosol and ozone forcing, and heat uptake by the deep ocean. Coupled models with HYCOM export less heat to the deep ocean, associated with reduced surface warming in regions of deepwater formation, but greater warming elsewhere at high latitudes along with reduced sea ice. All ensembles show twentieth-century annular trends toward reduced surface pressure at southern high latitudes and a poleward shift of the midlatitude westerlies, consistent with observations.
Geophysical Research Letters | 2015
M. J. Way; Anthony D. Del Genio; Nancy Y. Kiang; Linda E. Sohl; David H. Grinspoon; Igor Aleinov; Maxwell Kelley; Thomas L. Clune
Present-day Venus is an inhospitable place with surface temperatures approaching 750 K and an atmosphere 90 times as thick as Earth's. Billions of years ago the picture may have been very different. We have created a suite of 3-D climate simulations using topographic data from the Magellan mission, solar spectral irradiance estimates for 2.9 and 0.715 Gya, present-day Venus orbital parameters, an ocean volume consistent with current theory, and an atmospheric composition estimated for early Venus. Using these parameters we find that such a world could have had moderate temperatures if Venus had a rotation period slower than ~16 Earth days, despite an incident solar flux 46–70% higher than Earth receives. At its current rotation period, Venus's climate could have remained habitable until at least 715 million years ago. These results demonstrate the role rotation and topography play in understanding the climatic history of Venus-like exoplanets discovered in the present epoch.
Journal of Advances in Modeling Earth Systems | 2015
L. Nazarenko; Gavin A. Schmidt; R. L. Miller; N. Tausnev; M. Kelley; R. Ruedy; Gary L. Russell; I. Aleinov; M. Bauer; S. Bauer; Rainer Bleck; V. M. Canuto; Y. Cheng; Thomas L. Clune; A. D. Del Genio; G. Faluvegi; James E. Hansen; R. J. Healy; N. Y. Kiang; Dorothy M. Koch; A. Lacis; Allegra N. LeGrande; J. Lerner; Kenneth K.-W. Lo; Surabi Menon; V. Oinas; J. Perlwitz; Michael J. Puma; David Rind; Anastasia Romanou
We examine the anthropogenically forced climate response for the 21st century representative concentration pathway (RCP) emission scenarios and their extensions for the period 2101–2500. The experiments were performed with ModelE2, a new version of the NASA Goddard Institute for Space Studies (GISS) coupled general circulation model that includes three different versions of the atmospheric composition components: a noninteractive version (NINT) with prescribed composition and a tuned aerosol indirect effect (AIE), the TCAD version with fully interactive aerosols, whole-atmosphere chemistry, and the tuned AIE, and the TCADI version, which further includes a parameterized first indirect aerosol effect on clouds. Each atmospheric version is coupled to two different ocean general circulation models: the Russell ocean model (GISS-E2-R) and HYCOM (GISS-E2-H). By 2100, global mean warming in the RCP scenarios ranges from 1.0 to 4.5°C relative to the 1850–1860 mean temperature in the historical simulations. In the RCP2.6 scenario, the surface warming in all simulations stays below the 2°C threshold at the end of the 21st century. For RCP8.5, the range is 3.5–4.5°C at 2100. Decadally averaged sea ice area changes are highly correlated with global mean surface air temperature anomalies and show steep declines in both hemispheres, with a larger sensitivity during winter months. By the year 2500, the globally averaged surface air temperature recovers completely for all versions of the GISS climate model in the low-forcing scenario RCP2.6. TCADI simulations show enhanced warming due to greater sensitivity to CO2, aerosol effects, and greater methane feedbacks, and their recovery is much slower in RCP2.6 than with the NINT and TCAD versions. All coupled models show decreases in the Atlantic overturning stream function by 2100. In RCP2.6, the Atlantic overturning stream function recovers completely by the year 2500, while under RCP8.5 the E2-R climate model produces a complete shutdown of deep water formation in the North Atlantic.
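The six model versions described above are simply the Cartesian product of the three atmospheric-composition treatments and the two ocean components. A minimal, purely illustrative Python sketch of that configuration matrix follows; the labels are taken from the abstract, while the descriptions and the loop itself are paraphrases for illustration, not part of ModelE2.

```python
# Illustrative only: enumerate the six ModelE2 configurations described above
# as combinations of atmospheric-composition treatment and ocean component.
from itertools import product

atmospheres = {
    "NINT": "prescribed composition, tuned aerosol indirect effect (AIE)",
    "TCAD": "interactive aerosols and whole-atmosphere chemistry, tuned AIE",
    "TCADI": "as TCAD, plus parameterized first indirect aerosol effect",
}
oceans = {"R": "Russell ocean model", "H": "HYCOM"}

for (atm, atm_desc), (oc, oc_desc) in product(atmospheres.items(), oceans.items()):
    print(f"GISS-E2-{oc} ({atm}): {atm_desc}; ocean = {oc_desc}")
```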
IEEE Software | 2011
Thomas L. Clune; Richard B. Rood
Over the past 30 years, most climate models have grown from relatively simple representations of a few atmospheric processes to complex multidisciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Verification processes for model implementations rely almost exclusively on some combination of detailed analyses of output from full climate simulations and system-level regression tests. Besides being costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects they can detect, isolate, and diagnose. Mitigating these weaknesses of coarse-grained testing with finer-grained unit tests has been perceived as cumbersome and counterproductive. Recent advances in commercial software tools and methodologies have led to a renaissance of systematic fine-grained testing. This opens new possibilities for testing methodologies in climate-modeling software.
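To make the contrast with system-level regression testing concrete, the sketch below shows what a fine-grained unit test of a single model routine might look like. It is written in Python with the standard unittest module purely for illustration (the climate-model setting would typically be Fortran); the saturation_vapor_pressure kernel is a hypothetical stand-in, not code from any GISS model.

```python
# A minimal sketch of fine-grained testing: check one small physics kernel
# against known values and properties instead of running a full simulation.
import math
import unittest

def saturation_vapor_pressure(temp_k: float) -> float:
    """Bolton (1980) approximation for saturation vapor pressure in Pa."""
    temp_c = temp_k - 273.15
    return 611.2 * math.exp(17.67 * temp_c / (temp_c + 243.5))

class TestSaturationVaporPressure(unittest.TestCase):
    def test_reference_value_at_freezing(self):
        # At 0 °C the formula reduces to its own reference constant, 611.2 Pa.
        self.assertAlmostEqual(saturation_vapor_pressure(273.15), 611.2, places=6)

    def test_monotonic_in_temperature(self):
        # A cheap property check: warmer air holds more water vapor.
        self.assertLess(saturation_vapor_pressure(280.0),
                        saturation_vapor_pressure(300.0))

if __name__ == "__main__":
    unittest.main()
```

A defect in such a kernel surfaces in seconds at the routine level, rather than emerging, if at all, as a subtle anomaly in a multi-day climate simulation.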
Software Engineering for High Performance Computing in Computational Science and Engineering | 2014
Michael L. Rilee; Thomas L. Clune
Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation in finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
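The TDD pattern argued for here can be sketched as a single fine-grained numerical step tested against an analytic oracle with an explicit error tolerance, rather than a whole simulation compared to archived regression output. The example below is a hypothetical illustration in Python (the paper's context is Fortran with pFUnit); trapezoid() is not production CSE code.

```python
# Sketch: one algorithmic step, one analytic oracle, one explicit tolerance.
import math
import unittest

def trapezoid(f, a: float, b: float, n: int) -> float:
    """Composite trapezoidal rule with n uniform subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return h * total

class TestTrapezoid(unittest.TestCase):
    def test_against_analytic_oracle(self):
        # Oracle: the integral of sin(x) over [0, pi] is exactly 2.
        # The trapezoidal rule error is O(h^2); with n = 1000, h^2 ~ 1e-5,
        # so a tolerance of 1e-4 comfortably covers truncation plus round-off.
        approx = trapezoid(math.sin, 0.0, math.pi, 1000)
        self.assertAlmostEqual(approx, 2.0, delta=1e-4)

if __name__ == "__main__":
    unittest.main()
```

Because the step is small, the oracle is trivial to state and the error bound is known a priori, which is precisely the argument against the perception that numerical software cannot be unit tested.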
Astrophysical Journal Supplement Series | 2017
M. J. Way; Igor Aleinov; David S. Amundsen; Mark A. Chandler; Thomas L. Clune; A. D. Del Genio; Y. Fujii; Maxwell Kelley; Nancy Y. Kiang; Linda E. Sohl; Kostas Tsigaridis
Resolving Orbital and Climate Keys of Earth and Extraterrestrial Environments with Dynamics (ROCKE-3D) is a three-dimensional General Circulation Model (GCM) developed at the NASA Goddard Institute for Space Studies for the modeling of atmospheres of solar system and exoplanetary terrestrial planets. Its parent model, known as ModelE2, is used to simulate modern Earth and near-term paleo-Earth climates. ROCKE-3D is an ongoing effort to expand the capabilities of ModelE2 to handle a broader range of atmospheric conditions, including higher and lower atmospheric pressures, more diverse chemistries and compositions, larger and smaller planet radii and gravity, different rotation rates (from slower to more rapid than modern Earth's, including synchronous rotation), diverse ocean and land distributions and topographies, and potential basic biosphere functions. The first aim of ROCKE-3D is to model planetary atmospheres on terrestrial worlds within the solar system such as paleo-Earth, modern and paleo-Mars, paleo-Venus, and Saturn's moon Titan. By validating the model for a broad range of temperatures, pressures, and atmospheric constituents, we can then further expand its capabilities to those exoplanetary rocky worlds that have already been discovered, as well as those to be discovered in the future. We also discuss the current and near-future capabilities of ROCKE-3D as a community model for studying planetary and exoplanetary atmospheres.
Geophysical and Astrophysical Fluid Dynamics | 2010
Paul H. Roberts; Gary A. Glatzmaier; Thomas L. Clune
In a celebrated recent experiment, Monchaux et al. (Phys. Rev. Lett., 2007, 98, 044502) created a self-excited dynamo in a cylindrical container of liquid sodium using a turbulent flow driven by counter-rotating impellers at the plane ends of the container. A strange feature of the experiment was its failure to generate magnetic field when the impellers were made of stainless steel; success required the impellers to be made of soft iron. The results reported here were generated by numerical simulations of an idealization of the experiment. The container is a sphere and the impellers are replaced by a differential zonal motion of its surface, the northern and southern hemispheres turning about the symmetry axis in opposite senses, the whole system being contained in a thin shell with which it is in perfect electrical contact. This shell generally has a finite electrical conductance and a magnetic permeability that can greatly exceed that of the fluid. The electrodynamic effect of the shell is represented by a thin-wall boundary condition, similar but not identical to that used in MHD duct flow theory. Eleven cases were considered, in four of which the surface shell is an electrical insulator; in the others it is made of a conducting material which, like soft iron, might have a large permeability. In eight cases, a seed field decays away, but in three it is amplified and becomes a turbulent self-excited dynamo. From four cases with the same surface motion and shell permeability, it is inferred that an increase in shell conductance assists the regeneration of magnetic field. It is also shown that enhancing the shell permeability assists the field creation.
International Conference on Big Data | 2016
Khoa Doan; Amidu Oloso; Kwo-Sen Kuo; Thomas L. Clune; Hongfeng Yu; Brian R. Nelson; Jian Zhang
We investigate the impact of data placement on two Big Data technologies, Spark and SciDB, with a use case from Earth science in which the data arrays are multidimensional. This investigation also provides an opportunity to evaluate the performance of the technologies involved. Two datastores, HDFS and Cassandra, are used with Spark for our comparison. We find that Spark with Cassandra performs better than Spark with HDFS, but that SciDB performs better still than Spark with either datastore. The investigation also underscores the value of aligning data in advance for the most common analysis scenarios on a shared-nothing architecture. Otherwise, repartitioning must be carried out on the fly, degrading overall performance.
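The data-placement issue can be illustrated with a minimal PySpark sketch, which is not taken from the paper's benchmark code: if array chunks are not already partitioned by the key an analysis groups on, Spark must shuffle (repartition) the data on the fly, which is the cost observed in the study. The dataset, column names, and partition count below are hypothetical.

```python
# Minimal sketch of aligning data placement with the dominant analysis pattern.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("placement-sketch").getOrCreate()

# Toy stand-in for multidimensional Earth-science array chunks:
# (time, lat_band, value)
rows = [(t, band, float(t * band)) for t in range(4) for band in range(3)]
df = spark.createDataFrame(rows, ["time", "lat_band", "value"])

# Align placement once, up front, with the most common analysis pattern
# (here: per-lat_band statistics), instead of shuffling in every query.
aligned = df.repartition(3, "lat_band")

# With placement already hash-partitioned on the grouping key, this
# aggregation can typically avoid another full shuffle.
aligned.groupBy("lat_band").avg("value").show()

spark.stop()
```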
International Conference on Big Data | 2016
Michael L. Rilee; Kwo-Sen Kuo; Thomas L. Clune; Amidu Oloso; Paul Brown; Hongfeng Yu
We have implemented an updated Hierarchical Triangular Mesh (HTM) as the basis for a unified data model and an indexing scheme for geoscience data, to address the variety challenge of Big Earth Data. In the absence of variety, the volume challenge of Big Data is relatively easily addressed with parallel processing. The more important challenge in achieving optimal value with a Big Data solution for Earth Science (ES) data analysis, however, is achieving good scalability with variety. With HTM unifying at least the three popular data models used by current ES data products, i.e., Grid, Swath, and Point, data preparation time for integrative analysis of diverse datasets can be drastically reduced and better variety scaling can be achieved. HTM is also an indexing scheme, and when it is applied to all ES datasets, data placement alignment (or co-location) on the shared-nothing architecture that most Big Data systems are based on is guaranteed, ensuring better performance. With HTM, most geospatial set operations become integer interval operations, yielding further performance advantages.
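The final claim can be illustrated with a simplified Python sketch, which is not the actual HTM encoding or implementation: once two regions are represented as sorted lists of integer index intervals, their geospatial intersection reduces to a linear-time merge of integer intervals. The example regions and index ranges below are hypothetical.

```python
# Simplified illustration: spatial intersection as integer interval intersection.
from typing import List, Tuple

Interval = Tuple[int, int]  # inclusive [start, end] index range

def intersect_intervals(a: List[Interval], b: List[Interval]) -> List[Interval]:
    """Intersect two sorted, non-overlapping interval lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo <= hi:                  # the current intervals overlap
            out.append((lo, hi))
        if a[i][1] < b[j][1]:         # advance whichever interval ends first
            i += 1
        else:
            j += 1
    return out

# Hypothetical index ranges for two regions (e.g., a grid tile and a swath):
region_a = [(10, 40), (55, 60)]
region_b = [(30, 58)]
print(intersect_intervals(region_a, region_b))   # [(30, 40), (55, 58)]
```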