C. Meessen
CERN
Publications
Featured research published by C. Meessen.
IEEE Nuclear Science Symposium | 2003
S.R. Armstrong; John Baines; C. P. Bee; M. Biglietti; A. Bogaerts; V. Boisvert; M. Bosman; S. Brandt; B. Caron; P. Casado; G. Cataldi; D. Cavalli; M. Cervetto; G. Comune; A. Corso-Radu; A. Di Mattia; M.D. Gomez; A. Dos Anjos; J.G. Drohan; N. Ellis; M. Elsing; B. Epp; F. Etienne; S. Falciano; A. Farilla; S. George; V. M. Ghete; S. Gonzalez; M. Grothe; A. Kaczmarska
The primary function of the ATLAS High Level Trigger (HLT), event selection, will be accomplished with a Level-2 trigger farm and an Event Filter (EF) farm, both running software components developed in the ATLAS offline reconstruction framework. While this approach provides a unified software framework for event selection, it places strict requirements on the offline components critical for the Level-2 trigger. A Level-2 decision in ATLAS must typically be reached within 10 ms, with multiple events processed concurrently in separate threads. To address these constraints, prototypes have been developed that incorporate elements of the ATLAS data flow, high level trigger, and offline framework software. To realize a homogeneous software environment for offline components in the HLT, the Level-2 Steering Controller was developed. With electron/gamma and muon selection slices it has been shown that the required performance can be reached if the offline components used are carefully designed and optimized for application in the HLT.
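The key constraint here, a ~10 ms decision per event with several events processed in concurrent threads, can be illustrated with a minimal sketch. This is not the ATLAS Level-2 Steering Controller: the types and the fake workload below are purely hypothetical, and only the idea of a fixed per-event time budget shared across worker threads is taken from the abstract.

```cpp
// Illustrative sketch only: a small thread pool that processes "events" under
// a fixed per-event time budget, mimicking the ~10 ms Level-2 constraint.
// Event, processEvent and all numbers are hypothetical, not ATLAS code.
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

struct Event { int id; };

// Stand-in for the Level-2 selection algorithms run on one event.
bool processEvent(const Event& ev) {
    std::this_thread::sleep_for(std::chrono::milliseconds(2));  // fake work
    return ev.id % 100 != 0;                                    // fake decision
}

int main() {
    constexpr auto kBudget = std::chrono::milliseconds(10);  // per-event budget
    constexpr int kWorkers = 4;                               // concurrent threads
    std::atomic<int> nextId{0}, overBudget{0}, accepted{0};

    auto worker = [&] {
        for (int id = nextId++; id < 1000; id = nextId++) {
            auto t0 = std::chrono::steady_clock::now();
            if (processEvent(Event{id})) ++accepted;
            if (std::chrono::steady_clock::now() - t0 > kBudget) ++overBudget;
        }
    };

    std::vector<std::thread> pool;
    for (int i = 0; i < kWorkers; ++i) pool.emplace_back(worker);
    for (auto& t : pool) t.join();

    std::cout << "accepted " << accepted << " events, "
              << overBudget << " exceeded the 10 ms budget\n";
}
```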
IEEE-NPSS Real Time Conference | 2007
H. P. Beck; M. Abolins; A. Battaglia; R. E. Blair; A. Bogaerts; M. Bosman; M. D. Ciobotaru; R. Cranfield; G. Crone; J. W. Dawson; R. Dobinson; M. Dobson; A. Dos Anjos; G. Drake; Y. Ermoline; R. Ferrari; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; B. Gorini; B. Green; W. Haberichter; C. Haberli; R. Hauser; Christian Hinkelbein; R. E. Hughes-Jones; M. Joos; G. Kieft; S. Klous
Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three-level trigger system, which reduces the initial bunch crossing rate of 40 MHz at its first two trigger levels (LVL1+LVL2) to ~3 kHz. At this rate the Event Builder collects the data from all Read-Out System PCs (ROSs) and provides fully assembled events to the Event Filter (EF), the third trigger level, which achieves a further rate reduction to ~200 Hz for permanent storage. The Event Builder is based on a farm of O(100) PCs, interconnected via Gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs and one third of the Event Builder PCs are already installed and commissioned. We report on performance tests of this initial system, which show promising results towards reaching the final data throughput required for the ATLAS experiment.
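The event-building step can be pictured with a hedged sketch: fragments arriving from individual ROSs are keyed by event number, and an event is shipped to the Event Filter once all expected fragments are present. The types, member names and fragment counts below are illustrative assumptions, not the ATLAS DataFlow software, which is a distributed multi-threaded application spread over O(100) PCs.

```cpp
// Illustrative event-builder sketch: collect one fragment per ROS for each
// event number and emit the event once it is complete. Hypothetical types.
#include <cstdint>
#include <iostream>
#include <map>
#include <vector>

struct Fragment { uint64_t eventId; int rosId; std::vector<char> data; };

class EventBuilder {
public:
    explicit EventBuilder(std::size_t nRos) : nRos_(nRos) {}

    // Returns true when this fragment completes its event.
    bool add(const Fragment& f) {
        auto& frags = pending_[f.eventId];
        frags.push_back(f);
        if (frags.size() == nRos_) {
            std::cout << "event " << f.eventId << " fully built ("
                      << nRos_ << " fragments), ship to Event Filter\n";
            pending_.erase(f.eventId);
            return true;
        }
        return false;
    }

private:
    std::size_t nRos_;
    std::map<uint64_t, std::vector<Fragment>> pending_;  // incomplete events
};

int main() {
    EventBuilder eb(3);  // pretend there are only 3 ROSs (really O(150))
    for (uint64_t ev = 0; ev < 2; ++ev)
        for (int ros = 0; ros < 3; ++ros)
            eb.add({ev, ros, {}});
}
```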
Nuclear Instruments & Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment | 2004
S. Armstrong; K. Assamagan; John Baines; C. P. Bee; M. Biglietti; A. Bogaerts; V. Boisvert; M. Bosman; S. Brandt; B. Caron; P. Casado; G. Cataldi; D. Cavalli; M. Cervetto; G. Comune; A. Corso-Radu; A. Di Mattia; M.M. Diaz Gomez; A. Dos Anjos; J.G. Drohan; N. Ellis; M. Elsing; B. Epp; F. Etienne; S. Falciano; A. Farilla; Simon George; V. M. Ghete; S. Gonzalez; M. Grothe
We present an overview of the strategy for Event Selection at the ATLAS High Level Trigger and describe the architecture and main components of the software developed for this purpose.
IEEE Transactions on Nuclear Science | 2005
S.R. Armstrong; A. Dos Anjos; John Baines; C. P. Bee; M. Biglietti; J. A. Bogaerts; V. Boisvert; M. Bosman; B. Caron; P. Casado; G. Cataldi; D. Cavalli; M. Cervetto; G. Comune; Pc Muino; A. De Santo; M.D. Gomez; M. Dosil; N. Ellis; D. Emeliyanov; B. Epp; F. Etienne; S. Falciano; A. Farilla; Simon George; V. M. Ghete; S. Gonzalez; M. Grothe; S. Kabana; A. Khomich
The Event Filter (EF) selection stage is a fundamental component of the ATLAS Trigger and Data Acquisition architecture. Its primary function is the reduction of data flow and rate to values acceptable for mass storage operations and for the subsequent offline data reconstruction and analysis steps. The computing instrument of the EF is organized as a set of independent subfarms, each connected to one output of the Event Builder (EB) switch fabric. Each subfarm comprises a number of processors analyzing several complete events in parallel. This paper describes the design of the ATLAS EF system and its deployment in the 2004 ATLAS combined test beam, together with examples of integrated selection and monitoring algorithms. Since the processing algorithms are not explicitly designed for the EF but are adapted from offline ones, special emphasis is placed on system reliability and data security, in particular in the case of failures in the processing algorithms. Other key design elements have been system modularity and scalability: the EF must be able to follow technology evolution and should allow the use of additional processing resources, possibly remotely located.
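The point about data security when an adapted offline algorithm fails can be illustrated with a hedged sketch: each event is processed inside an exception barrier, and an event whose processing throws is routed to a dedicated error stream instead of being lost. This mirrors only the design intent described above; the names and the error handling are hypothetical.

```cpp
// Illustrative sketch of fault isolation in an Event Filter node: a failure
// in a processing algorithm must not lose the event, which is instead kept
// in a separate error stream for later inspection. Hypothetical names.
#include <iostream>
#include <stdexcept>
#include <vector>

struct Event { int id; };

// Stand-in for an offline-derived selection algorithm that may misbehave.
bool selectEvent(const Event& ev) {
    if (ev.id == 2) throw std::runtime_error("algorithm crash");
    return ev.id % 2 == 0;
}

int main() {
    std::vector<Event> accepted, errorStream;
    for (int id = 0; id < 5; ++id) {
        Event ev{id};
        try {
            if (selectEvent(ev)) accepted.push_back(ev);
        } catch (const std::exception& e) {
            // The event is preserved, not dropped, when the algorithm fails.
            std::cerr << "event " << id << " failed: " << e.what() << '\n';
            errorStream.push_back(ev);
        }
    }
    std::cout << accepted.size() << " accepted, "
              << errorStream.size() << " sent to the error stream\n";
}
```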
Archive | 2004
P. Conde-Muíño; C. Santamarina-Rios; A. Negri; J. Masik; Philip A. Pinto; S. George; S. Resconi; S. Tapprogge; Z. Qian; V. Vercesi; V. Pérez-Réale; M. Grothe; L. Luminari; John Baines; B. Caron; P. Werner; N. Panikashvili; R. Soluk; A. Di Mattia; A. Kootz; C. Sanchez; B. Venda-Pinto; F. Touchard; N. Nikitin; S. Gonzalez; E. Stefanidis; A. J. Lowe; M. Dosil; V. Boisvert; E. Thomas
During the runtime of any experiment, a central monitoring system that detects problems as soon as they appear plays an essential role. In a large experiment like ATLAS, the online data acquisition system is distributed across the nodes of large farms, each of them running several processes that analyse a fraction of the events. In this architecture, it is necessary to have a central process that collects all the monitoring data from the different nodes, produces full-statistics histograms and analyses them. In this paper we present the design of such a system, called the gatherer. It allows the collection of any monitoring object, such as histograms, from the farm nodes, from any process in the
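Conceptually, the gatherer's core operation is summing identically named per-node histograms into one full-statistics histogram. A minimal sketch is given below, assuming plain containers in place of the real ATLAS online histogram services; the histogram name, bin contents and node layout are invented for illustration.

```cpp
// Illustrative monitoring-gatherer sketch: sum identically named histograms
// published by many farm nodes into one full-statistics histogram.
#include <iostream>
#include <map>
#include <string>
#include <vector>

using Histogram = std::vector<double>;  // bin contents only, for brevity

// Add the bins of 'update' into the accumulated histogram 'total'.
void merge(Histogram& total, const Histogram& update) {
    if (total.empty()) total.assign(update.size(), 0.0);
    for (std::size_t i = 0; i < update.size(); ++i) total[i] += update[i];
}

int main() {
    // Pretend three nodes each published a 4-bin histogram named "muon_pt".
    std::vector<std::map<std::string, Histogram>> nodes = {
        {{"muon_pt", {1, 0, 2, 1}}},
        {{"muon_pt", {0, 3, 1, 0}}},
        {{"muon_pt", {2, 1, 0, 4}}},
    };

    std::map<std::string, Histogram> gathered;
    for (const auto& node : nodes)
        for (const auto& [name, h] : node) merge(gathered[name], h);

    for (double bin : gathered["muon_pt"]) std::cout << bin << ' ';
    std::cout << '\n';  // prints the summed bin contents: 3 4 3 5
}
```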
IEEE Transactions on Nuclear Science | 2008
W. Vandelli; M. Abolins; A. Battaglia; H. P. Beck; R. E. Blair; A. Bogaerts; M. Bosman; M. D. Ciobotaru; R. Cranfield; G. Crone; J. W. Dawson; R. Dobinson; M. Dobson; A. Dos Anjos; G. Drake; Y. Ermoline; R. Ferrari; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; B. Gorini; B. Green; W. Haberichter; C. Haberli; R. Hauser; Christian Hinkelbein; R. E. Hughes-Jones; M. Joos; G. Kieft
Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three-level trigger system, which, at its first two trigger levels (LVL1+LVL2), reduces the initial bunch crossing rate of 40 MHz to ~3 kHz. At this rate, the Event Builder collects the data from the Read-Out System PCs (ROSs) and provides fully assembled events to the Event Filter (EF). The EF is the third trigger level and its aim is to achieve a further rate reduction to ~200 Hz for permanent storage. The Event Builder is based on a farm of O(100) PCs, interconnected via Gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs and substantial fractions of the Event Builder and Event Filter PCs have been installed and commissioned. We report on performance tests of this initial system, which is capable of going beyond the required data rates and bandwidths for event building in the ATLAS experiment.
IEEE Transactions on Nuclear Science | 2006
S.R. Armstrong; A. Dos Anjos; John Baines; C. P. Bee; M. Biglietti; J. A. Bogaerts; V. Boisvert; M. Bosman; B. Caron; P. Casado; G. Cataldi; D. Cavalli; M. Cervetto; G. Comune; Pc Muino; A. De Santo; A. Di Mattia; M.D. Gomez; M. Dosil; N. Ellis; D. Emeliyanov; B. Epp; S. Falciano; A. Farilla; Simon George; V. M. Ghete; S. Gonzalez; M. Grothe; S. Kabana; A. Khomich
To cope with the 40 MHz event production rate of the LHC, the trigger of the ATLAS experiment selects events in three sequential steps of increasing complexity and accuracy, whose final results are close to the offline reconstruction. The Level-1, implemented with custom hardware, identifies physics objects within Regions of Interest and operates with a first reduction of the event rate to 75 kHz. The higher trigger levels, Level-2 and Level-3, provide a software-based event selection which further reduces the event rate to about 100 Hz. This paper presents the algorithm (μFast) employed at Level-2 to confirm the muon candidates flagged by the Level-1. μFast identifies hits of muon tracks inside the barrel region of the Muon Spectrometer and provides a precise measurement of the muon momentum at the production vertex. The algorithm must process the Level-1 muon output rate (~20 kHz), thus particular care has been taken over its optimization. The result is a very fast track reconstruction algorithm with good physics performance which, in some cases, approaches that of the offline reconstruction: it finds muon tracks with an efficiency of about 95% and computes the pT of prompt muons with a resolution of 5.5% at 6 GeV and 4.0% at 20 GeV. The algorithm requires an overall execution time of ~1 ms on a 100 SpecInt95 machine and has been tested in the online environment of the ATLAS detector test beam.
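A fast momentum estimate of this kind can be pictured with the textbook sagitta relation pT ≈ 0.3 B L² / (8 s) for a track segment of chord length L in a field B. The actual μFast algorithm relies on calibrated look-up tables rather than this closed formula, so the sketch below, with invented field and geometry numbers, is only an illustration of the idea.

```cpp
// Illustrative sketch of a sagitta-based transverse-momentum estimate, in the
// spirit of a fast Level-2 muon algorithm. The real muFast uses calibrated
// look-up tables; formula and numbers here are generic textbook values.
#include <iostream>

// pT [GeV] from the sagitta s [m] of a track segment of chord length L [m]
// in a magnetic field B [T]:  s ~ 0.3 * B * L^2 / (8 * pT).
double ptFromSagitta(double sagitta_m, double chord_m, double field_T) {
    return 0.3 * field_T * chord_m * chord_m / (8.0 * sagitta_m);
}

int main() {
    const double L = 5.0, B = 0.5;            // hypothetical chord and field
    for (double s : {0.026, 0.0078}) {        // hypothetical sagittas in metres
        std::cout << "sagitta " << s * 1000 << " mm -> pT ~ "
                  << ptFromSagitta(s, L, B) << " GeV\n";
    }
}
```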
IEEE Transactions on Nuclear Science | 2008
H. P. Beck; M. Abolins; A. Battaglia; R. E. Blair; A. Bogaerts; M. Bosman; M. D. Ciobotaru; R. Cranfield; G. Crone; J. W. Dawson; R. Dobinson; M. Dobson; A. Dos Anjos; G. Drake; Y. Ermoline; R. Ferrari; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; B. Gorini; B. Green; W. Haberichter; C. Haberli; R. Hauser; Christian Hinkelbein; R. E. Hughes-Jones; M. Joos; G. Kieft; S. Klous
Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment with a three-level trigger system, which reduces the initial bunch crossing rate of 40 MHz at its first two trigger levels (LVL1+LVL2) to ~3 kHz. At this rate the Event Builder collects the data from all Read-Out System PCs (ROSs) and provides fully assembled events to the Event Filter (EF), the third trigger level, which achieves a further rate reduction to ~200 Hz for permanent storage. The Event Builder is based on a farm of O(100) PCs, interconnected via Gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs and one third of the Event Builder PCs are already installed and commissioned. Performance measurements have been carried out on this initial system, with promising results indicating that the required final data rates and bandwidth for the ATLAS Event Builder are within reach.
IEEE Transactions on Nuclear Science | 2006
C. Santamarina; Pc Muino; A. Dos Anjos; S.R. Armstrong; Jt Baines; C. P. Bee; M. Biglietti; J. A. Bogaerts; M. Bosman; B. Caron; P. Casado; G. Cataldi; D. Cavalli; G. Comune; G. Crone; D. Damazio; A. De Santo; M.D. Gomez; A. Di Mattia; N. Ellis; D. Emeliyanov; B. Epp; S. Falciano; H. Garitaonandia; Simon George; A.G. Mello; V. M. Ghete; R. Gonçalo; J. Haller; S. Kabana
ATLAS is one of the four major Large Hadron Collider (LHC) experiments that will start data taking in 2007. It is designed to cover a wide range of physics topics. The ATLAS trigger system has to be able to reduce an initial 40 MHz event rate, corresponding to an average of 23 proton-proton inelastic interactions per 25 ns bunch crossing, to the 200 Hz admissible by the data acquisition system. The ATLAS trigger is divided into three levels. The first provides a signal describing an event signature using dedicated custom hardware. This signature must be confirmed by the High Level Trigger (HLT), which, using commercial computing farms, performs event reconstruction by running a sequence of algorithms; the validity of the signature is checked after every algorithm execution. A main characteristic of the ATLAS HLT is that only the data in a certain window around the position flagged by the first level trigger are analyzed. In this work, the performance of one sequence that runs at the Event Filter level (the third level) is demonstrated. The goal of this sequence is to reconstruct and identify high transverse momentum electrons by performing cluster reconstruction in the electromagnetic calorimeter, track reconstruction in the Inner Detector, and cluster-track matching.
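The stepwise structure of such a sequence, where each algorithm must re-confirm the signature before the next, more expensive step runs, can be sketched as below. The cut values, data structures and helper functions are hypothetical placeholders, not the ATLAS selection; only the ordering (calorimeter cluster, then Inner Detector track, then cluster-track matching with early rejection) follows the abstract.

```cpp
// Illustrative sketch of a stepwise electron selection with early rejection.
// Cut values and data structures are hypothetical, not ATLAS ones.
#include <cmath>
#include <iostream>
#include <optional>

struct Cluster { double et, eta, phi; };   // EM calorimeter cluster
struct Track   { double pt, eta, phi; };   // Inner Detector track

std::optional<Cluster> reconstructCluster() { return Cluster{28.0, 0.45, 1.20}; }
std::optional<Track>   reconstructTrack()   { return Track{26.0, 0.46, 1.21}; }

bool isElectron() {
    auto cl = reconstructCluster();
    if (!cl || cl->et < 20.0) return false;      // step 1: cluster + ET cut

    auto tr = reconstructTrack();
    if (!tr) return false;                       // step 2: require a track

    double dEta   = std::fabs(cl->eta - tr->eta);  // step 3: cluster-track match
    double dPhi   = std::fabs(cl->phi - tr->phi);
    double eOverP = cl->et / tr->pt;
    return dEta < 0.05 && dPhi < 0.05 && eOverP > 0.7 && eOverP < 1.5;
}

int main() { std::cout << (isElectron() ? "accept" : "reject") << '\n'; }
```

Early rejection matters here because the cheaper calorimeter step can discard most background events before the costlier tracking and matching steps are ever run.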
Nuclear Science Symposium and Medical Imaging Conference | 2004
S. Armstrong; A.D. Anjos; John Baines; C. P. Bee; Michela Biglietti; J. A. Bogaerts; V. Boisvert; M. Bosman; B. Caron; P. Casado; G. Cataldi; D. Cavalli; M. Cervetto; G. Comune; Pc Muino; A. De Santo; M.D. Gomez; M. Dosil; N. Ellis; D. Emeliyanov; B. Epp; S. Falciano; A. Farilla; S. George; V. Ghete; S. Gonzalez; M. Grothe; S. Kabana; A. Khomich; G. Kilvington
The ATLAS experiment at the Large Hadron Collider (LHC) will face the challenge of efficiently selecting interesting candidate events in pp collisions at 14 TeV centre-of-mass energy, while rejecting the enormous number of background events stemming from an interaction rate of up to 10^9 Hz. The First Level trigger will reduce this rate to around O(100 kHz). Subsequently, the High Level Trigger (HLT), which comprises the Second Level trigger and the Event Filter, will need to further reduce this rate by a factor of O(10^3). The HLT selection is software based and will be implemented on commercial CPUs, using a common framework built on the standard ATLAS object-oriented software architecture. In this paper an overview of the current implementation of the selection for electrons and photons in the HLT is given. The performance of this implementation has been evaluated using Monte Carlo simulations in terms of the efficiency for the signal channels, the rate expected for the selection, data access and manipulation times, and algorithm execution times. Besides the efficiency and rate estimates, some physics examples are discussed, showing that the triggers are well adapted to the physics programme envisaged at the LHC. The electron and photon trigger software is also being exercised at the ATLAS 2004 Combined Test Beam, where components from all ATLAS subdetectors are taking data together along the H8 SPS extraction line at CERN; from these tests a validation of the selection architecture chosen in a real online environment is expected.
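Efficiency figures of the kind quoted here are obtained by running the selection over simulated signal samples and counting how many events pass. A minimal sketch of that bookkeeping, with a binomial uncertainty on the efficiency, is shown below; the event counts are invented for illustration and do not come from the paper.

```cpp
// Illustrative sketch: trigger efficiency from a Monte Carlo signal sample,
// with a simple binomial uncertainty. Counts are hypothetical.
#include <cmath>
#include <iostream>

int main() {
    const double nGenerated = 10000.0;  // simulated signal events (invented)
    const double nPassed    = 9485.0;   // events passing the trigger (invented)

    const double eff = nPassed / nGenerated;
    const double err = std::sqrt(eff * (1.0 - eff) / nGenerated);  // binomial error

    std::cout << "efficiency = " << 100.0 * eff << " +/- "
              << 100.0 * err << " %\n";
}
```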