J. T. Childers
Argonne National Laboratory
Publications
Featured research published by J. T. Childers.
IEEE Transactions on Nuclear Science | 2007
S. Nam; H. S. Ahn; P. Allison; M. G. Bagliesi; Louis M. Barbier; J. J. Beatty; G. Bigongiari; T. J. Brandt; J. A. Jeon; J. T. Childers; N. B. Conklin; S. Coutu; Michael A. DuVernois; O. Ganel; J. H. Han; K. C. Kim; M.H. Lee; L. Lutz; P. Maestro; A. Malinine; P.S. Marrocchesi; Stephen Anthony Minnick; S. I. Mognet; Scott Lowry Nutter; I. H. Park; N. Park; E. S. Seo; R. Sina; P. Walpole; J. Wu
The balloon-borne cosmic-ray experiment CREAM (Cosmic Ray Energetics And Mass) has completed two flights in Antarctica, with a combined duration of 70 days. One of the detectors in the payload is the SCD (silicon charge detector), which measures the charge of high-energy cosmic rays. The SCD was assembled from silicon sensors; each sensor is a 4 × 4 array of DC-coupled PIN diode pixels with a total active area of 21 × 16 mm². The SCD used during the first flight (December 2004-January 2005) was a single-layer device; it was upgraded to a dual-layer device for the second flight (December 2005-January 2006), covering a total sensitive area of 779 × 795 mm². Flight data demonstrated that adding a second layer improved SCD performance, showing excellent particle charge resolution. With a total power dissipation of 136 W for the dual-layer system, special care was needed in designing thermal paths to keep the detector temperature within its operational range. As a consequence, flight temperatures of the SCD, even at the diurnal maximum, were kept below 38 °C. The SCD mechanical structure was designed to minimize the possibility of damage to the sensors and electronics from the impacts of parachute deployment and landing. The detector was recovered successfully following the flight and is being refurbished for the next flight in 2007. Details of construction, operation, and performance are presented for the dual-layer SCD flown on the second CREAM flight.
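As a rough cross-check of the figures quoted above, the sketch below combines the per-sensor active area with the total sensitive area to bound the sensor and channel counts. This is purely illustrative arithmetic on the numbers in the abstract; the real ladder layout overlaps sensors and includes dead regions, so the actual counts differ.

```python
# Back-of-the-envelope estimate from the figures quoted in the abstract.
# Purely illustrative: the real SCD layout overlaps sensors and has dead
# regions between ladders, so the actual sensor count differs.

sensor_active_mm2 = 21 * 16    # active area of one 4x4-pixel sensor
total_area_mm2 = 779 * 795     # quoted total sensitive area
pixels_per_sensor = 4 * 4

sensors_upper_bound = total_area_mm2 / sensor_active_mm2
channels_upper_bound = sensors_upper_bound * pixels_per_sensor

print(f"~{sensors_upper_bound:.0f} sensors if the active area tiled perfectly")
print(f"~{channels_upper_bound:.0f} pixel channels as an upper bound")
```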
Journal of Physics: Conference Series | 2015
J. T. Childers; Thomas D. Uram; Thomas LeCompte; Michael E. Papka; Doug Benjamin
Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth-fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. With these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.
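The pattern described above, many independent instances of a serial generator, each with its own random seed, with unweighting done in-process rather than through intermediate files, can be sketched with mpi4py. The generate_and_unweight function below is a hypothetical placeholder, not the actual Alpgen interface.

```python
# Minimal sketch of the embarrassingly parallel pattern described above:
# each MPI rank runs an independent instance of a serial generator with a
# unique random seed, and generation plus unweighting happen in a single
# process so no intermediate files hit the shared filesystem.
from mpi4py import MPI


def generate_and_unweight(seed: int) -> int:
    """Placeholder for one coupled generate+unweight pass (not Alpgen)."""
    # A real port would invoke the Fortran generator here via a wrapper.
    return seed % 7  # pretend this many events survived unweighting


comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Derive a distinct seed per rank so the event streams are independent.
n_kept = generate_and_unweight(seed=1_000_003 + rank)

# Aggregate bookkeeping on rank 0 instead of writing per-rank files.
total = comm.reduce(n_kept, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} ranks kept {total} events after unweighting")
```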
Journal of Physics: Conference Series | 2011
J. T. Childers
The ATLAS Level-1 Calorimeter Trigger identifies high-pT objects in the Liquid Argon and Tile Calorimeters with a fixed latency of up to 2.5 μs using a hardware-based, pipelined system built with custom electronics. The Preprocessor Module conditions and digitizes about 7200 pre-summed analogue signals from the calorimeters at the LHC bunch-crossing frequency of 40 MHz, and performs bunch-crossing identification (BCID) and deposited energy measurement for each input signal. This information is passed to further processors for object classification and total energy calculation, and the results are used to make the Level-1 trigger decision for the ATLAS detector. The BCID and energy measurement in the trigger depend on precise timing adjustments to achieve correct sampling of the input signal peak. Test pulses from the calorimeters were analysed to derive the initial timing and energy calibration, and first data from the LHC restart in autumn 2009 and early 2010 were used for validation and further optimization. The results from these calibration measurements are presented.
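As an illustration of the BCID step, the sketch below implements a simple three-sample peak finder on 25 ns samples: the crossing whose sample exceeds both neighbours, and a noise threshold, is assigned the deposit. This is a simplified stand-in for the Preprocessor's pipelined firmware logic, not the actual algorithm or its calibration.

```python
# Illustrative bunch-crossing identification by peak finding: the pulse is
# sampled every 25 ns (40 MHz), and the crossing whose sample is a local
# maximum above threshold is taken to contain the energy deposit. A
# simplified stand-in for the Preprocessor firmware, not the real logic.

def peak_bcid(samples: list[int], threshold: int = 0) -> int | None:
    """Return the index of the bunch crossing containing the pulse peak."""
    for i in range(1, len(samples) - 1):
        if samples[i] > threshold and samples[i - 1] < samples[i] >= samples[i + 1]:
            return i
    return None  # no unambiguous peak found


# Five 25 ns samples around a calorimeter pulse (ADC counts, made-up values).
print(peak_bcid([32, 48, 190, 120, 60], threshold=40))  # -> 2
```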
Journal of Physics: Conference Series | 2015
Thomas D. Uram; J. T. Childers; Thomas LeCompte; Michael E. Papka; Doug Benjamin
HEP's demand for computing resources has grown beyond the capacity of the Grid, and these demands will accelerate with the higher energy and luminosity planned for Run II. Mira, the ten-petaflops supercomputer at the Argonne Leadership Computing Facility, is a potentially significant compute resource for HEP research. Through an award of fifty million hours on Mira, we have delivered millions of events to LHC experiments by establishing the means of marshaling jobs through serial stages on local clusters and parallel stages on Mira. We are running several HEP applications, including Alpgen, Pythia, Sherpa, and Geant4. Event generators such as Sherpa typically have a split workload: a small-scale integration phase and a second, more scalable, event-generation phase. To accommodate this workload on Mira we have developed two Python-based Django applications, Balsam and ARGO. Balsam is a generalized scheduler interface which uses a plugin system for interacting with scheduler software such as HTCondor, Cobalt, and TORQUE. ARGO is a workflow manager that submits jobs to instances of Balsam. Through these mechanisms, the serial and parallel tasks within jobs are executed on the appropriate resources. This approach and its integration with the PanDA production system will be discussed.
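A plugin-based scheduler interface of the kind attributed to Balsam above could look like the following sketch. The class and method names here are invented for illustration and are not Balsam's actual API.

```python
# Hypothetical sketch of a plugin-style scheduler interface like the one
# the paper attributes to Balsam; names are invented for illustration.
from abc import ABC, abstractmethod


class SchedulerPlugin(ABC):
    """One backend per batch system (HTCondor, Cobalt, TORQUE, ...)."""

    @abstractmethod
    def submit(self, script_path: str) -> str:
        """Submit a job script; return the backend's job identifier."""


class CobaltPlugin(SchedulerPlugin):
    def submit(self, script_path: str) -> str:
        # A real plugin would shell out to the batch system's submit
        # command and parse the returned job identifier.
        return f"cobalt-job-for-{script_path}"


def dispatch(plugin: SchedulerPlugin, script_path: str) -> str:
    # Workflow code talks only to the abstract interface, so supporting a
    # new batch system means adding one plugin class, not new call sites.
    return plugin.submit(script_path)


print(dispatch(CobaltPlugin(), "generate_events.sh"))
```

The design point, under these assumptions, is that a workflow manager like ARGO never needs to know which batch system sits underneath a given Balsam instance.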
Journal of the Korean Physical Society | 2006
N. Park; S. Nam; J. H. Han; J. H. Hyun; J. A. Jeon; Jik Lee; I. H. Park; J. Yang; H. S. Ahn; O. Ganel; K. C. Kim; M.H. Lee; L. Lutz; A. Malinine; E. S. Seo; R. Sina; J. Wu; Y.S. Yoon; P. Allison; J. J. Beatty; M. G. Bagliesi; G. Bigongiari; P. Maestro; P. S. Marrocchesi; R. Zei; P. J. Boyle; Simon P. Swordy; S. P. Wakely; J. T. Childers; Michael A. DuVernois
29th International Cosmic Ray Conference | 2005
H. S. Ahn; P. Allison; M. G. Bagliesi; J. J. Beatty; G. Bigongiari; P. J. Boyle; J. T. Childers; N. B. Conklin; S. Coutu; Michael A. DuVernois; O. Ganel; J. H. Han; H. J. Hyun; Jongbum Jeon; K. C. Kim; Jik Lee; L. Lutz; P. Maestro; A. Malinine; P.S. Marrocchesi; Stephen Anthony Minnick; S. I. Mognet; S. Nam; Scott Lowry Nutter; N. Park; H. Park; I. H. Park; E. S. Seo; R. Sina; Simon P. Swordy
Archive | 2005
Melissa Lee; Hyo-sung Ahn; P. Allison; M. G. Bagliesi; G. Bigongiari; James J. Beatty; P. J. Boyle; J. T. Childers; N. B. Conklin; S. Coutu; Michael A. DuVernois; O. Ganel; Jeong-hee Han; H. J. Hyun; Jongbum Jeon; K. C. Kim; Jung-kyuen Lee; L. Lutz; P. Maestro; A. Malinine; P. S. Marrocchesi; Stephen Anthony Minnick; S. I. Mognet; Suk Woo Nam; Scott Lowry Nutter; H. Park; Il Han Park; N. Park; E. S. Seo; R. Sina
29th International Cosmic Ray Conference | 2005
E. S. Seo; H. S. Ahn; P. Allison; M. G. Bagliesi; J. J. Beatty; G. Bigongiari; P. J. Boyle; J. T. Childers; N. B. Conklin; S. Coutu; Michael A. DuVernois; O. Ganel; J. H. Han; H. J. Hyun; Jongbum Jeon; K. C. Kim; Jik Lee; M.H. Lee; L. Lutz; P. Maestro; A. Malinine; P. S. Marrocchesi; Stephen Anthony Minnick; S. I. Mognet; S. Nam; Scott Lowry Nutter; N. Park; H. Park; I. H. Park; R. Sina
Archive | 2008
J. T. Childers; Michael A. DuVernois
Archive | 2005
Y.S. Yoon; H. S. Ahn; P. Allison; M. G. Bagliesi; J. J. Beatty; G. Bigongiari; P. J. Boyle; J. T. Childers; N. B. Conklin; S. Coutu; Michael A. DuVernois; O. Ganel; J. H. Han; H. J. Hyun; Jongbum Jeon; K. C. Kim; Jik Lee; M.H. Lee; L. Lutz; P. Maestro; A. Malinine; P.S. Marrocchesi; Stephen Anthony Minnick; S. Nam; Scott Lowry Nutter; H. Park; E. S. Seo; R. Sina; Simon P. Swordy; S. P. Wakely