Publication


Featured research published by David Lange.


IEEE Nuclear Science Symposium | 2007

Cosmic-ray shower generator (CRY) for Monte Carlo transport codes

Chris Hagmann; David Lange; Douglas Wright

The CRY software library generates correlated cosmic-ray particle shower distributions at one of three elevations (sea level, 2100 m, and 11300 m) for use as input to transport and detector simulation codes. Our simulation is based on precomputed input tables derived from full MCNPX simulations of primary cosmic rays on the atmosphere and benchmarked against published cosmic-ray measurements. Our simulation provides all particle production (muons, neutrons, protons, electrons, photons, and pions) with the proper flux within a user-specified area and altitude. The code generates individual showers of secondary particles sampling the energy, time of arrival, zenith angle, and multiplicity with basic correlations, and has user controls for latitude (geomagnetic cutoff) and solar cycle effects. We provide a function library, callable from C, C++, and Fortran, and interfaces to popular Monte Carlo transport codes (MCNP, MCNPX, COG, Geant4). The software library and examples can be downloaded from http://nuclear.llnl.gov/simulation.
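The sampling approach described above can be illustrated with a minimal, self-contained sketch. This is not CRY's actual API (CRY draws energy, arrival time, zenith angle, and multiplicity from its precomputed MCNPX-derived tables); it only rejection-samples a zenith angle from the textbook sea-level muon angular distribution, dN/dΩ ∝ cos²θ:

```python
import math
import random

# Illustrative sketch only -- NOT the CRY API. We rejection-sample a zenith
# angle theta from dN/dtheta ~ cos^2(theta) * sin(theta) on [0, pi/2], the
# per-angle form of the approximate sea-level muon distribution cos^2(theta).

F_MAX = 0.385  # just above the max of cos^2(t)*sin(t), reached at tan(t) = 1/sqrt(2)

def sample_zenith(rng: random.Random) -> float:
    """Return one zenith angle in radians via rejection sampling."""
    while True:
        theta = rng.uniform(0.0, math.pi / 2)
        if rng.uniform(0.0, F_MAX) <= math.cos(theta) ** 2 * math.sin(theta):
            return theta

rng = random.Random(42)
angles = [sample_zenith(rng) for _ in range(20000)]
# The analytic mean of this distribution is 2/3 rad (about 38 degrees).
```

A real shower generator such as CRY additionally correlates these angles with particle type, energy, and multiplicity; this sketch shows only the single-variable sampling idea.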


Archive | 2007

Monte Carlo Simulation of Proton-induced Cosmic Ray Cascades in the Atmosphere

Chris Hagmann; David Lange; Doug Wright

We have developed a Monte Carlo model of the Earth's atmosphere and implemented it in three different codes (GEANT4, MCNPX, and FLUKA). Primary protons in the energy range of 1 GeV-100 TeV are injected at the top of the atmosphere. The codes follow the tracks of all relevant secondary particles (neutrons, muons, gammas, electrons, and pions) and tally their fluxes at selectable altitudes. Comparisons with cosmic ray data at sea level show good agreement.


IEEE Nuclear Science Symposium | 2005

First-generation hybrid compact Compton imager

Mark F. Cunningham; Morgan T. Burks; Dan Chivers; C.P. Cork; Lorenzo Fabris; Donald Gunter; Thomas Krings; David Lange; Ethan L. Hull; Lucian Mihailescu; Karl Nelson; T. Niedermayr; D. Protic; John D. Valentine; K. Vetter; Doug Wright

At Lawrence Livermore National Laboratory, we are pursuing the development of a gamma-ray imaging system using the Compton effect. We have built our first-generation hybrid Compton imaging system, and we have conducted initial calibration and image measurements using this system. In this paper, we present the details of the hybrid Compton imaging system and the initial calibration and image measurements.


IEEE Nuclear Science Symposium | 2005

Imaging performance of the Si/Ge hybrid Compton imager

Morgan T. Burks; D. Chivers; Christopher P. Cork; Mark F. Cunningham; Lorenzo Fabris; D. Gunter; Ethan L. Hull; David Lange; H. Manini; L. Mihailescu; Karl Nelson; T. Niedermayr; John D. Valentine; K. Vetter; Doug Wright

The point spread function (PSF) of a fully-instrumented silicon/germanium Compton telescope has been measured as a function of energy and angle. Overall, the resolution was 3° to 4° FWHM over most of the energy range and field of view. The various contributions to the resolution have been quantified. These contributions include the energy and position uncertainty of the detector; source energy; Doppler broadening; and the 1/r broadening characteristic of Compton back-projection. Furthermore, a distortion of the PSF is observed for sources imaged off-axis from the detector. These contributions are discussed and compared to theory and simulations.
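The Compton back-projection mentioned above rests on standard Compton kinematics: for a two-site event, the cone opening angle follows from cos θ = 1 − m_e c² (1/E' − 1/E₀), where E₀ is the incident energy and E' the scattered-photon energy. A hedged sketch (the function name and example energies are illustrative, not from the paper's code):

```python
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def compton_cone_angle(e1_kev: float, e2_kev: float):
    """Opening angle (radians) of the Compton back-projection cone for a
    two-site event: e1_kev deposited in the scatterer (e.g. Si), e2_kev
    absorbed in the second detector (e.g. Ge). Assumes full absorption,
    so the incident energy is e1 + e2. Returns None if the deposits are
    kinematically forbidden for a single Compton scatter."""
    e0 = e1_kev + e2_kev
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e2_kev - 1.0 / e0)
    if not -1.0 <= cos_theta <= 1.0:
        return None
    return math.acos(cos_theta)

# e.g. a 662 keV (Cs-137) photon depositing 200 keV in the first scatter
angle = compton_cone_angle(200.0, 462.0)
```

Uncertainties in the measured energies and interaction positions propagate directly into this angle, which is one way the energy/position resolution and Doppler broadening terms above enter the PSF.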


Archive | 2005

The new BaBar Analysis Model

D. Brown; David Lange; E. Charles; G. Finocchiaro; Gallieno De Nardo; L. Wilden; P. Elmer

As part of a general Computing Model upgrade, BaBar has deployed a new analysis model. The new analysis model was designed to overcome the major shortcomings of the previous analysis model. In particular, the new analysis model consolidates several redundant data formats, allows users to customize the data format for their analysis, and provides a wider range of data access options than the old model, while maintaining backwards compatibility with BaBar’s existing analysis interface and analysis code base. The new analysis model was used in roughly half the analyses submitted to the recent ICHEP 2004 conference, and has been enthusiastically welcomed by the BaBar analysis community.


arXiv: Computational Physics | 2018

Building a scalable python distribution for HEP data analysis

David Lange

There are numerous approaches to building analysis applications across the high-energy physics community. Among them are Python-based, or at least Python-driven, analysis workflows. We aim to ease the adoption of a Python-based analysis toolkit by making it easier for non-expert users to gain access to Python tools for scientific analysis. Experimental software distributions and individual user analyses have quite different requirements. Distributions tend to worry most about stability, usability, and reproducibility, while users usually strive to be fast and nimble. We discuss how we built and now maintain a Python distribution for analysis while satisfying the requirements of both a large software distribution (in our case, that of CMSSW) and user-level, or laptop, analysis. We pursued the integration of tools used by the broader data science community as well as HEP-developed Python packages (e.g., histogrammar, root_numpy). We discuss concepts we investigated for package integration and testing, as well as issues we encountered through this process. Distribution and platform support are important topics. We discuss our approach and progress towards a sustainable infrastructure for supporting this Python stack for the CMS user community and for the broader HEP user community.
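The kind of package-set consistency check that a distribution must run can be sketched in a few lines (a toy example under assumed names, not the actual CMSSW or distribution tooling, which would use a full version-specifier parser such as PEP 440):

```python
def version_tuple(v: str) -> tuple:
    """Parse a dotted-integer version string like '1.13.3'."""
    return tuple(int(part) for part in v.split("."))

def meets_pin(installed: str, pin: str) -> bool:
    """Check an installed version against a '>=' pin. Toy logic only:
    a real distribution would handle the full version-specifier grammar."""
    if not pin.startswith(">="):
        raise ValueError("only '>=' pins supported in this sketch")
    return version_tuple(installed) >= version_tuple(pin[2:])

# Hypothetical pin set for an analysis stack (names for illustration only)
pins = {"numpy": ">=1.13.0", "root_numpy": ">=4.7.0"}
installed = {"numpy": "1.14.2", "root_numpy": "4.7.3"}
ok = all(meets_pin(installed[p], pins[p]) for p in pins)
```

Tuple comparison is used deliberately here: naive string comparison would rank "1.9" above "1.10", a classic pitfall when reconciling user environments against a distribution's pins.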


arXiv: Computational Physics | 2018

HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation

L. A. T. Bauerdick; Martin Ritter; Oliver Gutsche; M. D. Sokoloff; N. F. Castro; M. Girone; T. Sakuma; P. Elmer; Brian Bockelman; Elizabeth Sexton-Kennedy; G. Watts; J. Letts; F. Würthwein; C. Vuosalo; Jim Pivarski; Daniel S. Katz; Riccardo Maria Bianchi; K. Cranmer; Robert Gardner; Shawn Patrick McKee; B. Hegner; E. Rodrigues; David Lange; Christoph Paus; JoséM. Hernández; K. Pedro; Bodhitha Jayatilaka; Lukasz Kreczko

At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.


Journal of Physics: Conference Series | 2010

CMS partial releases: Model, tools, and applications online and framework-light releases

Christopher D Jones; David Lange; E. Meschi; Shahzad Muzaffar; Andreas Pfeiffer; Natalia Ratnikova; Elizabeth Sexton-Kennedy

With the integration of all CMS software packages into one release, the CMS software release management team faced the problem that for some applications a big distribution size and a large number of unused packages have become a real issue. We describe a solution to this problem. Based on functionality requirements and dependency analysis, we define a self-contained subset of the full CMS software release and create a Partial Release for such applications. We describe a high-level architecture for this model, and the tools that are used to automate the release preparation. Finally, we discuss the two most important use cases for which this approach is currently implemented.


arXiv: Computational Physics | 2013

Snowmass Computing Frontier: Software Development, Staffing and Training

David R. Brown; P. Elmer; R. Pordes; D. M. Asner; Gregory Dubois-Felsmann; V.Daniel Elvira; Robert Hatcher; Christopher E. Jones; Robert Kutschke; David Lange; Elizabeth Sexton-Kennedy; Craig Tull


Journal of Physics: Conference Series | 2018

Toward real-time data query systems in HEP

Jim Pivarski; David Lange; Thanat Jatuphattharachat

Collaboration


Dive into David Lange's collaboration.

Top Co-Authors

Doug Wright, Lawrence Livermore National Laboratory
P. Elmer, Princeton University
Chris Hagmann, Lawrence Livermore National Laboratory
Ethan L. Hull, Lawrence Berkeley National Laboratory
John D. Valentine, Lawrence Livermore National Laboratory
K. Vetter, Lawrence Berkeley National Laboratory
Karl Nelson, Lawrence Livermore National Laboratory