J. J. Nebrensky
Brunel University London
Publication
Featured research published by J. J. Nebrensky.
Nuclear Instruments & Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment | 2011
M. Ellis; P. R. Hobson; P. Kyberd; J. J. Nebrensky; A. Bross; J. Fagan; T. Fitzpatrick; R. Flores; R. Kubinski; J. Krider; R. Rucinski; P. Rubinov; C. Tolian; T. L. Hart; Daniel M. Kaplan; W. Luebke; B. Freemire; M. Wojcik; G. Barber; D. Clark; I. Clark; P.J. Dornan; A. Fish; S. Greenwood; R. Hare; A.K. Jamdagni; V. Kasey; M. Khaleeq; J. Leaver; Kenneth Long
Charged-particle tracking in the international Muon Ionisation Cooling Experiment (MICE) will be performed using two solenoidal spectrometers, each instrumented with a tracking detector based on 350 μm diameter scintillating fibres. The design and construction of the trackers is described along with the quality-assurance procedures, photon-detection system, readout electronics, reconstruction and simulation software, and the data-acquisition system. Finally, the performance of the MICE tracker, determined using cosmic rays, is presented.
Optical Diagnostics for Industrial Applications | 2000
Gary Craig; Stephen J. Alexander; S. Anderson; David C. Hendry; P.R. Hobson; Richard S. Lampitt; Benjamin Lucas-Leclin; H. Nareid; J. J. Nebrensky; M. A. Player; Kevin Saw; K. Tipping; John Watson
The HoloCam system is a major component of a multi-national, multi-discipline project known as HoloMar (funded by the European Commission under the MAST III initiative). The project is concerned with the development of pulsed laser holography to analyse and monitor the populations of living organisms and inanimate particles within the world's oceans. We describe here the development, construction and evaluation of a prototype underwater camera, the purpose of which is to record marine organisms and particles in situ. Recording using holography provides several advantages over conventional sampling methods in that it allows non-intrusive, non-destructive, high-resolution imaging of large volumes (up to 10⁵ cm³) in three dimensions. The camera incorporates both in-line and off-axis holographic techniques, which allows particles from a few micrometres to tens of centimetres to be captured. In tandem with development of the HoloCam, a dedicated holographic replay system and an automated data extraction and image processing facility are being developed. These will allow optimisation of the images recorded by the camera, identification of species, and particle concentration plotting.
IEEE Nuclear Science Symposium | 2003
D. Bonacorsi; David Colling; L. Field; S. M. Fisher; C. Grandi; P.R. Hobson; P. Kyberd; B. C. MacEvoy; J. J. Nebrensky; H. Tallini; S. Traylen
High-energy physics experiments, such as the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC), have large-scale data-processing requirements. The Grid has been chosen as the solution. One important challenge when using the Grid for large-scale data processing is the ability to monitor the large numbers of jobs that are being executed simultaneously at multiple remote sites. The Relational Grid Monitoring Architecture (R-GMA) is a monitoring and information management service for distributed resources, based on the GMA of the Global Grid Forum. We report on the first measurements of R-GMA as part of a monitoring architecture to be used for batch submission of multiple Monte Carlo simulation jobs running on a CMS-specific LHC computing grid test bed. Monitoring information was transferred in real time from remote execution nodes back to the submitting host and stored in a database. In scalability tests, the job submission rates supported by successive releases of R-GMA improved significantly, approaching that expected in full-scale production.
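The producer/consumer flow described above can be illustrated with a toy Python sketch: remote jobs publish status tuples, and the submitting host queries them from a single store. This is only a stand-in for the pattern R-GMA implements; the SQLite store, table name, and columns are illustrative assumptions, not the R-GMA API or schema.

```python
import sqlite3
import time

# Toy stand-in for the producer/consumer monitoring pattern: each remote job
# publishes status tuples; the submitter consumes them from one central store.
# Table and column names are illustrative, not the R-GMA schema.
db = sqlite3.connect("monitoring.db")
db.execute("""CREATE TABLE IF NOT EXISTS job_status
              (job_id TEXT, timestamp REAL, state TEXT)""")

def publish(job_id, state):
    """Producer side: a remote execution node reports its job's state."""
    db.execute("INSERT INTO job_status VALUES (?, ?, ?)",
               (job_id, time.time(), state))
    db.commit()

def consume(state):
    """Consumer side: the submitting host queries current progress."""
    return db.execute("SELECT job_id FROM job_status WHERE state = ?",
                      (state,)).fetchall()

for i in range(5):                       # five simulated Monte Carlo jobs
    publish(f"job-{i}", "RUNNING")
publish("job-0", "DONE")
print(len(consume("RUNNING")), "running,", len(consume("DONE")), "done")
```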
Journal of Physics: Conference Series | 2012
I D Reid; J. J. Nebrensky; P. R. Hobson
In-line holographic imaging is used for small particulates, such as cloud or spray droplets, marine plankton, and alluvial sediments, and enables a true 3D object field to be recorded at high resolution over a considerable depth. To reconstruct a digital hologram, a 2D FFT must be calculated for every depth slice desired in the replayed image volume. A typical in-line hologram of ~100 micrometre-sized particles over a depth of a few hundred millimetres will require O(1000) 2D FFT operations to be performed on a hologram of typically a few million pixels. In previous work we have reported on our experiences with reconstruction on a computational grid. In this paper we discuss the technical challenges in making efficient use of the NVIDIA Tesla and Fermi GPU systems and show how our reconstruction code was optimised for near real-time video slice reconstruction with holograms as large as 4K by 4K pixels. We also consider the implications for grid and cloud computing approaches to hologram replay, and the extent to which a GPU can replace these approaches, when the important step of locating focussed objects within a reconstructed volume is included.
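A minimal NumPy sketch of the per-slice replay step is shown below, using angular-spectrum propagation, one of the standard transfer-function methods; this is an illustration, not the authors' GPU code, and the wavelength, pixel pitch, and slice spacing are assumed values.

```python
import numpy as np

def reconstruct_slice(hologram, wavelength, pitch, z):
    """Replay one depth slice by angular-spectrum propagation: FFT the
    hologram, multiply by the free-space transfer function for distance z,
    then inverse-FFT back to the image plane."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pitch)     # spatial frequencies (cycles/m)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Clipping suppresses evanescent components beyond the propagation limit.
    arg = np.clip(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2,
                  0.0, None)
    H = np.exp(2j * np.pi * (z / wavelength) * np.sqrt(arg))
    return np.fft.ifft2(np.fft.fft2(hologram) * H)

# Illustrative parameters: 532 nm laser, 7.4 um pixels, 1 mm slice spacing.
holo = np.random.rand(1024, 1024)        # stand-in for a recorded hologram
for z in np.arange(0.05, 0.30, 0.001):   # ~250 slices, two FFTs each
    amplitude = np.abs(reconstruct_slice(holo, 532e-9, 7.4e-6, z))
```

Note that the forward FFT of the hologram could be computed once and reused across all slices; hoisting it out of the loop is exactly the kind of optimisation a GPU implementation would exploit.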
IEEE Symposium Conference Record Nuclear Science 2004 | 2004
R. Byrom; David Colling; S. M. Fisher; C. Grandi; P.R. Hobson; P. Kyberd; B. C. MacEvoy; J. J. Nebrensky; H. Tallini; S. Traylen
High Energy Physics experiments, such as the Compact Muon Solenoid (CMS) at the CERN laboratory in Geneva, have large-scale data processing requirements, with stored data accumulating at a rate of 1 Gbyte/s. This load comfortably exceeds any previous processing requirements and we believe it may be most efficiently satisfied through Grid computing. Management of large Monte Carlo productions (~3000 jobs) or data analyses and the quality assurance of the results requires careful monitoring and bookkeeping, and an important requirement when using the Grid is the ability to monitor transparently the large number of jobs that are being executed simultaneously at multiple remote sites. R-GMA is a monitoring and information management service for distributed resources based on the Grid Monitoring Architecture of the Global Grid Forum. We have previously developed a system allowing us to test its performance under a heavy load while using few real Grid resources. We present the latest results on this system and compare them with the data collected while running actual CMS simulation jobs on the LCG2 Grid test bed.
Adaptive Optics: Analysis and Methods/Computational Optical Sensing and Imaging/Information Photonics/Signal Recovery and Synthesis Topical Meetings on CD-ROM (2007), paper DWB5 | 2007
J. J. Nebrensky; P.R. Hobson
Each plane within an in-line hologram of a particle field can be reconstructed by a separate computer. We investigate strategies to reproduce the sample volume as quickly and efficiently as possible using Grid computing.
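Because each depth plane is independent, replay parallelises trivially across machines. The sketch below uses Python's multiprocessing as a local stand-in for Grid worker nodes; the replay kernel is reduced to a placeholder, and all parameters are illustrative assumptions.

```python
from multiprocessing import Pool
import numpy as np

def replay_plane(args):
    """Reconstruct one depth plane; each call is independent work that could
    run on a separate machine. The kernel here is a placeholder for a real
    transfer-function replay (see the angular-spectrum sketch above)."""
    hologram, z = args
    field = np.fft.ifft2(np.fft.fft2(hologram) * np.exp(2j * np.pi * z))
    return z, np.abs(field)

if __name__ == "__main__":
    holo = np.random.rand(512, 512)          # stand-in hologram
    depths = np.linspace(0.05, 0.30, 64)     # 64 planes through the volume
    with Pool() as pool:                     # one local worker per CPU core
        volume = dict(pool.map(replay_plane,
                               [(holo, z) for z in depths]))
    print(f"reconstructed {len(volume)} planes")
```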
Holography 2005: International Conference on Holography, Optical Recording, and Processing of Information | 2006
J. J. Nebrensky; P.R. Hobson
Digital holography is greatly extending the range of holography's applications and moving it from the lab into the field: a single CCD or other solid-state sensor can capture any number of holograms, while numerical reconstruction within a computer eliminates the need for chemical development and readily allows further processing and visualisation of the holographic image. The steady increase in sensor pixel count leads to the possibility of larger sample volumes, while smaller-area pixels enable the practical use of digital off-axis holography. However, this increase in pixel count also drives a corresponding expansion of the computational effort needed to numerically reconstruct such holograms, to an extent where the reconstruction process for a single depth slice takes significantly longer than the capture process for each single hologram. Grid computing, a recent innovation in large-scale distributed processing, provides a convenient means of harnessing significant computing resources in an ad-hoc fashion that might match the field deployment of a holographic instrument. We describe here the reconstruction of digital holograms on a trans-national computational Grid with over 10 000 nodes available at over 100 sites. A simplistic scheme of deployment was found to provide no computational advantage over a single powerful workstation. Based on these experiences we suggest an improved strategy for workflow and job execution for the replay of digital holograms on a Grid.
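Why a simplistic one-slice-per-job deployment can lose to a single workstation is easy to see with a toy queueing model: per-job Grid overhead swamps the per-slice FFT time unless slices are batched into fewer, larger jobs. All numbers below are illustrative assumptions, not measurements from the paper.

```python
# Toy model of Grid replay scheduling: per-job overhead (queueing, staging,
# middleware latency) dwarfs the per-slice FFT time, so one-slice-per-job
# deployment loses to a single workstation. All numbers are assumed.
n_slices = 1000        # depth slices to reconstruct
t_slice = 5.0          # seconds of FFT work per slice
t_overhead = 1800.0    # seconds of per-job Grid overhead (assumed)
n_workers = 200        # worker nodes available

def makespan(slices_per_job):
    n_jobs = -(-n_slices // slices_per_job)         # ceiling division
    t_job = t_overhead + slices_per_job * t_slice   # wall time of one job
    waves = -(-n_jobs // n_workers)                 # sequential rounds of jobs
    return waves * t_job

print(f"single workstation: {n_slices * t_slice / 60:6.1f} min")
for batch in (1, 10, 100, 500):
    print(f"{batch:4d} slices/job -> {makespan(batch) / 60:6.1f} min")
```

Under these assumptions, one slice per job takes roughly twice as long as the workstation, while batching around a hundred slices per job amortises the overhead and lets the Grid win.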
IEEE Nuclear Science Symposium | 2005
R. Byrom; David Colling; S. M. Fisher; C. Grandi; P.R. Hobson; P. Kyberd; B. C. MacEvoy; J. J. Nebrensky; S. Traylen
High energy physics experiments, such as the Compact Muon Solenoid (CMS) at the CERN laboratory in Geneva, have large-scale data processing requirements, with data accumulating at a rate of 1 Gbyte/s. This load comfortably exceeds any previous processing requirements and we believe it may be most efficiently satisfied through grid computing. Furthermore, the production of large quantities of Monte Carlo simulated data provides an ideal test bed for grid technologies and will drive their development. One important challenge when using the grid for data analysis is the ability to monitor transparently the large number of jobs that are being executed simultaneously at multiple remote sites. R-GMA is a monitoring and information management service for distributed resources based on the Grid Monitoring Architecture of the Global Grid Forum. We have previously developed a system allowing us to test its performance under a heavy load while using few real grid resources. We present the latest results on this system running on the LCG 2 grid test bed using the LCG 2.6.0 middleware release. For a sustained load equivalent to 7 generations of 1000 simultaneous jobs, R-GMA was able to transfer all published messages and store them in a database for 98% of the individual jobs. The failures experienced were at the remote sites, rather than at the archiver's MON box as had been expected.
Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments | 2005
J. J. Nebrensky; P.R. Hobson; P. C. Fryer
Digital holography has the potential to greatly extend holography's applications and move it from the lab into the field: a single CCD or other solid-state sensor can capture any number of holograms, while numerical reconstruction within a computer eliminates the need for chemical processing and readily allows further processing and visualization of the holographic image. The steady increase in sensor pixel count and resolution leads to the possibilities of larger sample volumes and of higher spatial resolution sampling, enabling the practical use of digital off-axis holography. However, this increase in pixel count also drives a corresponding expansion of the computational effort needed to numerically reconstruct such holograms, to an extent where the reconstruction process for a single depth slice takes significantly longer than the capture process for each single hologram. Grid computing, a recent innovation in large-scale distributed processing, provides a convenient means of harnessing significant computing resources in an ad-hoc fashion that might match the field deployment of a holographic instrument. In this paper we consider the computational needs of digital holography and discuss the deployment of numerical reconstruction software over an existing Grid testbed. The analysis of marine organisms is used as an exemplar for workflow and job execution of in-line digital holography.
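The scale of those computational needs can be estimated with a back-of-envelope flop count, using the conventional ~5N log₂N estimate for an N-point FFT; the hologram size and slice count below are illustrative assumptions.

```python
import math

# Back-of-envelope cost of replaying one hologram, using the usual
# 5*N*log2(N) flop estimate for an N-point FFT. Values are illustrative.
nx = ny = 2048                  # hologram pixels per side
n_slices = 1000                 # depth slices through the sample volume
n = nx * ny                     # total pixels per hologram
flops_fft = 5 * n * math.log2(n)        # one 2D FFT (forward or inverse)
flops_slice = 2 * flops_fft + 6 * n     # FFT + kernel multiply + inverse FFT
total = n_slices * flops_slice
print(f"~{total / 1e12:.1f} Tflop per hologram")
```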
Optical Diagnostics for Industrial Applications | 2000
J. J. Nebrensky; Gary Craig; P.R. Hobson; Richard S. Lampitt; H. Nareid; A. Pescetto; Andrea Trucco; John Watson
Pulsed laser holography is an extremely powerful technique for the study of particle fields as it allows instantaneous, non-invasive, high-resolution recording of substantial volumes. By replaying the real image one can obtain the size, shape, position and, if multiple exposures are made, velocity of every object in the recorded field. Manual analysis of large volumes containing thousands of particles is, however, an enormous and time-consuming task, with operator fatigue an unpredictable source of errors. Clearly the value of holographic measurements also depends crucially on the quality of the reconstructed image: not only will poor resolution degrade the size and shape measurements, but aberrations such as coma and astigmatism can change the perceived centroid of a particle, affecting position and velocity measurements. For large-scale applications of particle field holography, specifically the in situ recording of marine plankton with HoloCam, we have developed an automated data extraction system that can be readily switched between the in-line and off-axis geometries and provides optimised reconstruction from holograms recorded underwater. As a video camera is automatically stepped through the 200 × 200 × 1000 mm sample volume, image processing and object tracking routines locate and extract particle images for further classification by a separate software module.
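A toy Python stand-in for the locate-and-extract step (not the actual HoloCam software) is sketched below: candidate particle images in one replayed frame are found by thresholding and connected-component labelling. The threshold value and the synthetic frame are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_particles(frame, threshold):
    """Locate candidate particle images in one replayed slice by thresholding
    and connected-component labelling; returns (centroid, bounding box) pairs.
    A toy stand-in for an automated extraction pipeline."""
    mask = frame < threshold          # particles appear dark against the beam
    labels, n = ndimage.label(mask)   # group connected dark pixels into blobs
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    boxes = ndimage.find_objects(labels)
    return list(zip(centroids, boxes))

frame = np.ones((512, 512))
frame[100:108, 200:209] = 0.2         # one synthetic dark particle
for (cy, cx), box in extract_particles(frame, 0.5):
    print(f"particle at ({cx:.1f}, {cy:.1f}) px, bounding box {box}")
```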