Steven H. Langer
Lawrence Livermore National Laboratory
Publications
Featured research published by Steven H. Langer.
IEEE International Conference on High Performance Computing, Data, and Analytics | 2012
Abhinav Bhatele; Todd Gamblin; Steven H. Langer; Peer-Timo Bremer; Erik W. Draeger; Bernd Hamann; Katherine E. Isaacs; Aaditya G. Landge; Joshua A. Levine; Valerio Pascucci; Martin Schulz; Charles H. Still
The placement of tasks in a parallel application on specific nodes of a supercomputer can significantly impact performance. Traditionally, this task mapping has focused on reducing the distance between communicating tasks on the physical network. This minimizes the number of hops that point-to-point messages travel and thus reduces link sharing between messages and contention. However, for applications that use collectives over sub-communicators, this heuristic may not be optimal. Many collectives can benefit from an increase in bandwidth even at the cost of an increase in hop count, especially when sending large messages. For example, placing communicating tasks in a cube configuration rather than a plane or a line on a torus network increases the number of possible paths messages might take. This increases the available bandwidth which can lead to significant performance gains. We have developed Rubik, a tool that provides a simple and intuitive interface to create a wide variety of mappings for structured communication patterns. Rubik supports a number of elementary operations such as splits, tilts, or shifts, that can be combined into a large number of unique patterns. Each operation can be applied to disjoint groups of processes involved in collectives to increase the effective bandwidth. We demonstrate the use of Rubik for improving performance of two parallel codes, pF3D and Qbox, which use collectives over sub-communicators.
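The split/shift style of operation described above can be illustrated with a small sketch. This is not Rubik's actual API; the function names and the 1D task list are simplified stand-ins for the tool's operations on multi-dimensional process grids.

```python
# Illustrative sketch of Rubik-style mapping operations (names and behavior
# simplified; this is not Rubik's real API). Tasks in a 1D list are split
# into blocks, and ranks within one block are cyclically shifted so that
# communicating groups land on more distinct network links.
def split(tasks, parts):
    """Split a task list into equal-sized contiguous blocks."""
    size = len(tasks) // parts
    return [tasks[i * size:(i + 1) * size] for i in range(parts)]

def shift(block, amount):
    """Cyclically shift ranks within a block."""
    return block[amount:] + block[:amount]

ranks = list(range(16))
blocks = split(ranks, parts=4)
blocks[1] = shift(blocks[1], amount=1)  # stagger one group's placement
mapping = [r for block in blocks for r in block]

print(mapping[:8])  # → [0, 1, 2, 3, 5, 6, 7, 4]
```

In Rubik itself these operations compose over structured multi-dimensional grids, producing many distinct mappings from a few primitives.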
Journal of Quantitative Spectroscopy & Radiative Transfer | 1995
C. J. Keane; G.W. Pollak; R. Cook; T. R. Dittrich; B. A. Hammel; O. L. Landen; Steven H. Langer; W.K. Levedahl; D. H. Munro; Howard A. Scott; G.B. Zimmerman
Rayleigh-Taylor (RT) instability of the pusher-fuel interface, occurring upon acceleration and deceleration of the pusher, is of major concern for current and future ICF experiments. One common diagnostic technique for measuring pusher-fuel mix in spherical implosion experiments involves placing spectroscopic dopants both in the capsule fuel region and in the innermost region of the capsule wall adjacent to the fuel. As the degree of pusher-fuel mix increases, the pusher dopant x-ray emission increases relative to that of the fuel dopant. Spherical implosion experiments of this type using Ar and Ti dopants in the fuel and pusher, respectively, are being carried out on Nova. We first show that the Ti He-α/Ar He-β line ratio shows promise as a mix diagnostic for high-growth-factor targets. We then discuss some of the important physical processes underlying Ar and Ti spectral line formation in these targets and discuss how these processes affect the calculation of simulated spectra. The importance of radiative transfer, as well as of high-density plasma phenomena such as continuum lowering and Stark broadening, is demonstrated. The simulated spectra are also observed to be sensitive to assumptions regarding the treatment of electron thermal conduction in the mix region. Spectral postprocessing of 2-D hydrodynamic simulations using detailed line transfer methods has been carried out and implies that simple escape factor treatments must be tested carefully before they can be relied upon. Preliminary comparisons of experimental data with simulation are presented. It is shown that the computed spectra are sensitive to the laser energy and pusher temperature. These comparisons to data also imply that including convective effects in computing the electron temperature profile through the mix region is necessary to satisfactorily model experimental spectra.
IEEE International Conference on High Performance Computing, Data, and Analytics | 2013
Daniel E. Laney; Steven H. Langer; C. R. Weber; Peter Lindstrom; Al Wegener
This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3--5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
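The evaluation strategy described above can be sketched in miniature: run a field through a lossy round-trip, then judge the result with a physics-based metric (here, error in a conserved sum) alongside a pointwise signal-processing metric. The uniform quantizer below is an illustrative stand-in, not the compressor the paper evaluated, and all names are hypothetical.

```python
# Sketch of the evaluation idea (not the paper's actual compressor): apply a
# simple uniform-quantization "lossy compressor" to a 1D field, then judge
# the result with a physics-based metric (error in a conserved total)
# alongside the maximum pointwise error.
import math

def lossy_roundtrip(field, bits):
    """Quantize each value to 2**bits levels over the field's range."""
    lo, hi = min(field), max(field)
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [lo + round((x - lo) / scale) * scale for x in field]

# A smooth density-like field sampled at 100 points.
field = [1.0 + 0.5 * math.sin(0.1 * i) for i in range(100)]
decoded = lossy_roundtrip(field, bits=8)

# Physics-based metric: relative error in the field's total ("mass"),
# reported alongside the pointwise maximum error.
mass_err = abs(sum(decoded) - sum(field)) / sum(field)
max_err = max(abs(a - b) for a, b in zip(field, decoded))
print(mass_err, max_err)
```

The point mirrors the paper's conclusion: a compressor can look acceptable under one metric and not the other, so the error characteristics must be matched to the physics being modeled.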
Review of Scientific Instruments | 1995
C. J. Keane; R. Cook; T. R. Dittrich; B. A. Hammel; W.K. Levedahl; O. L. Landen; Steven H. Langer; D. H. Munro; Howard A. Scott
Of primary concern in next generation inertial confinement fusion (ICF) implosion experiments is Rayleigh–Taylor (RT) instability of the pusher‐fuel interface occurring upon acceleration and deceleration of the pusher. This results in mixing of hot fuel with cold pusher material. One method of diagnosing mix in this case is to place spectroscopic dopants both in the capsule fuel region and the innermost region of the capsule wall adjacent to the fuel. As the degree of pusher/fuel mix is increased (typically through placement of controlled perturbations on the outer surface of the capsule) the pusher dopant x‐ray emission increases relative to that of the fuel dopant. Experiments of this type using indirectly driven implosions have been carried out on Nova. In this paper we describe some of the important physics issues underlying spectral line formation in these targets and discuss how they are manifested in the modeling and interpretation of experimental data. The importance of radiative transfer as well ...
IEEE Visualization | 2005
Daniel E. Laney; Steven P. Callahan; Nelson L. Max; Cláudio T. Silva; Steven H. Langer; Randall Frank
We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32-bit floating point texture capabilities to obtain solutions to the radiative transport equation for X-rays. The hardware accelerated solutions are accurate enough to enable scientists to explore the experimental design space with greater efficiency than the methods currently in use. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedral meshes that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester.
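The absorption-only regime mentioned above reduces, per ray, to Beer-Lambert attenuation: the detected intensity is the source intensity scaled by exp(-τ), where the optical depth τ accumulates the absorption coefficient times the path length through each mesh cell the ray crosses. A minimal sketch, with made-up cell data:

```python
# Minimal sketch of the absorption-only radiograph computation: accumulate
# optical depth along one ray through a series of cells and attenuate the
# source intensity by Beer-Lambert absorption. Cell values are illustrative.
import math

def radiograph_intensity(i0, cells):
    """cells: list of (absorption_coefficient, path_length) per cell hit."""
    tau = sum(mu * ds for mu, ds in cells)  # optical depth along the ray
    return i0 * math.exp(-tau)

# A ray passing through three cells of varying opacity.
cells = [(0.2, 1.0), (1.5, 0.5), (0.1, 2.0)]
intensity = radiograph_intensity(1.0, cells)
print(round(intensity, 4))  # → 0.3166
```

The GPU algorithms in the paper evaluate this integral per pixel by projecting hexahedral (or, with emission, sorted tetrahedral) cells in floating-point textures rather than looping in software.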
Journal of Quantitative Spectroscopy & Radiative Transfer | 2000
Steven H. Langer; Howard A. Scott; M. M. Marinak; O. L. Landen
Hydrodynamic instabilities can reduce the yield in inertial confinement fusion (ICF) implosions. Line emission from dopants placed in the capsule can be used to diagnose the extent of the instabilities. In earlier work we compared line emission measured in experiments performed on the Nova laser to one-dimensional mix models and two-dimensional models in which many different instability wavelengths interact. Three-dimensional simulations are required to model properly the saturation of the hydrodynamic instabilities. This paper presents the results of the first three-dimensional simulations of line emission from ICF capsules. The simulations show that the three lines considered here come from different spatial regions. Line ratios cannot be used to determine temperatures without accounting for the spatial dependence of the line emission. Future papers will present detailed comparisons to experiments.
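The temperature sensitivity that makes line ratios attractive can be seen in a simple Boltzmann picture for a uniform plasma: the ratio of two lines whose upper levels differ in energy by ΔE scales as exp(-ΔE/kT). The abstract's caveat is that this single-temperature inference fails when each line is emitted from a different spatial region. The numbers below are illustrative, not taken from the paper.

```python
# Hedged illustration of line-ratio temperature sensitivity in a uniform
# plasma: ratio ~ exp(-delta_E / kT) in a simple Boltzmann picture.
# Energies and temperatures below are illustrative values in eV.
import math

def line_ratio(delta_e_ev, t_ev):
    """Boltzmann factor for two lines separated by delta_e_ev at temperature t_ev."""
    return math.exp(-delta_e_ev / t_ev)

# The same energy gap gives very different ratios at different temperatures,
# which is why the ratio encodes temperature -- for a *single* emitting region.
for t in (500.0, 1000.0, 2000.0):
    print(t, line_ratio(300.0, t))
```

When the two lines originate in regions at different temperatures, no single t reproduces the observed ratio, which is the spatial-dependence problem the 3D simulations expose.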
IEEE International Conference on High Performance Computing, Data, and Analytics | 2016
Edgar A. León; Ian Karlin; Abhinav Bhatele; Steven H. Langer; Chris Chambreau; Louis H. Howell; Trent D'Hooge; Matthew L. Leininger
Understanding the characteristics and requirements of applications that run on commodity clusters is key to properly configuring current machines and, more importantly, procuring future systems effectively. There are only a few studies, however, that are current and characterize realistic workloads. For HPC practitioners and researchers, this limits our ability to design solutions that will have an impact on real systems. We present a systematic study that characterizes applications with an emphasis on communication requirements. It includes cluster utilization data, identifying a representative set of applications from a U.S. Department of Energy laboratory, and characterizing their communication requirements. The driver for this work is understanding application sensitivity to a tapered fat-tree network. These results provided key insights into the procurement of our next generation commodity systems. We believe this investigation can provide valuable input to the HPC community in terms of workload characterization and requirements from a large supercomputing center.
Journal of Quantitative Spectroscopy & Radiative Transfer | 2001
Steven H. Langer; Howard A. Scott; M. M. Marinak; O. L. Landen
Hydrodynamic instabilities reduce the yield in inertial confinement fusion (ICF) implosions. Line emission from dopants placed in the capsule can be used to diagnose the extent of the instabilities. We present the results of 3D simulations of line emission from titanium placed in the inner layers of the plastic shell of an ICF capsule. The simulations show that the line emission has strong spatial variations and that peak line emission occurs at the same time as peak fusion burn.
Journal of Quantitative Spectroscopy & Radiative Transfer | 1997
Steven H. Langer; Howard A. Scott; C. J. Keane; O. L. Landen; M. M. Marinak
Mix due to hydrodynamic instabilities reduces the yield in inertial confinement fusion implosions. Line emission from dopants placed in the capsule can be used to diagnose the extent of the instabilities. This paper compares line emission measured in experiments performed on the Nova laser to 1D mix models and 2D models in which many different instability wavelengths interact. The 2D models properly handle the distorted interface between the fuel and the capsule shell, instead of assuming a region where fuel and shell material are atomically mixed as in the 1D models. Both models agree reasonably well with the dependence of thermonuclear yield on capsule roughness. The 2D models do a better job of matching the line emission for capsules with rough surfaces.
IEEE International Conference on High Performance Computing, Data, and Analytics | 2014
Abhinav Bhatele; Nikhil Jain; Katherine E. Isaacs; Ronak Buch; Todd Gamblin; Steven H. Langer; Laxmikant V. Kalé
Six of the ten fastest supercomputers in the world in 2014 use a torus interconnection network for message passing between compute nodes. Torus networks provide high bandwidth links to near-neighbors and low latencies over multiple hops on the network. However, large diameters of such networks necessitate a careful placement of parallel tasks on the compute nodes to minimize network congestion. This paper presents a methodological study of optimizing application performance on a five-dimensional torus network via the technique of topology-aware task mapping. Task mapping refers to the placement of processes on compute nodes while carefully considering the network topology between the nodes and the communication behavior of the application. We focus on the IBM Blue Gene/Q machine and two production applications - a laser-plasma interaction code called pF3D and a lattice QCD application called MILC. Optimizations presented in the paper improve the communication performance of pF3D by 90% and that of MILC by up to 47%.
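The distance metric underlying topology-aware mapping can be sketched directly: on a torus, each dimension wraps around, so the hop count per dimension is the shorter of the direct path and the wraparound path. The 5D shape below is illustrative and is not Blue Gene/Q's actual dimensions.

```python
# Sketch of the hop-count metric used in topology-aware task mapping:
# minimum hops between two nodes on a torus, where every dimension wraps.
# The torus shape here is illustrative, not a real machine's dimensions.
def torus_hops(a, b, dims):
    """Minimum hops between coordinates a and b on a torus of shape dims."""
    hops = 0
    for x, y, d in zip(a, b, dims):
        delta = abs(x - y)
        hops += min(delta, d - delta)  # direct path vs wraparound path
    return hops

dims = (4, 4, 4, 4, 2)
print(torus_hops((0, 0, 0, 0, 0), (3, 0, 0, 0, 0), dims))  # → 1 (wraparound)
print(torus_hops((0, 1, 2, 3, 0), (2, 3, 0, 1, 1), dims))  # → 9
```

A mapper evaluates placements against metrics like this (and against the application's measured communication graph) to keep heavy communication on short paths while avoiding link congestion.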