Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where D. Prigent is active.

Publication


Featured research published by D. Prigent.


IEEE-NPSS Real Time Conference | 2004

The base-line DataFlow system of the ATLAS trigger and DAQ

H. Beck; M. Abolins; A. Dos Anjos; M. Barisonzi; M. Beretta; R. E. Blair; J. A. Bogaerts; H. Boterenbrood; D. Botterill; M. D. Ciobotaru; E.P. Cortezon; R. Cranfield; G. Crone; J. Dawson; R. Dobinson; Y. Ermoline; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; P. Golonka; B. Gorini; B. Green; M. Gruwe; S. Haas; C. Haeberli; Y. Hasegawa; R. Hauser; Christian Hinkelbein; R. E. Hughes-Jones

The base-line design and implementation of the ATLAS DAQ DataFlow system is described. The main components of the DataFlow system, their interactions, bandwidths, and rates are discussed, and performance measurements on a 10% scale prototype of the final ATLAS TDAQ DataFlow system are presented. This prototype is a combination of custom-designed components and multithreaded software applications implemented in C++, running in a Linux environment on commercially available PCs interconnected by a fully switched Gigabit Ethernet network.
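
The abstract describes the DataFlow applications as multithreaded C++ programs running on Linux PCs. Purely as an illustrative sketch of that style of application (none of the names below come from the ATLAS code base; EventFragment and FragmentQueue are hypothetical), a minimal multithreaded fragment consumer might look like this:

#include <condition_variable>
#include <cstdint>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct EventFragment {            // hypothetical stand-in for readout data
    uint32_t l1_id;               // Level 1 trigger identifier
    std::vector<uint8_t> payload; // raw detector data
};

class FragmentQueue {             // thread-safe queue between I/O and workers
public:
    void push(EventFragment f) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(f)); }
        cv_.notify_one();
    }
    bool pop(EventFragment& f) {  // blocks until a fragment arrives or shutdown
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [this] { return !q_.empty() || done_; });
        if (q_.empty()) return false;   // shut down and fully drained
        f = std::move(q_.front());
        q_.pop();
        return true;
    }
    void shutdown() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
    }
private:
    std::queue<EventFragment> q_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    FragmentQueue queue;
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i)        // several worker threads per PC
        workers.emplace_back([&queue] {
            EventFragment f;
            while (queue.pop(f))
                ;                      // e.g. buffer data, answer requests, ...
        });
    for (uint32_t id = 0; id < 1000; ++id)   // fake input stage
        queue.push(EventFragment{id, std::vector<uint8_t>(1024)});
    queue.shutdown();
    for (auto& t : workers) t.join();
    std::cout << "all fragments processed\n";
}

The producer/consumer split shown here is a generic pattern for decoupling network input from per-fragment processing; the real applications' structure is not specified in the abstract.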


IEEE-NPSS Real Time Conference | 2005

Deployment and use of the ATLAS DAQ in the combined test beam

S. Gadomski; M. Abolins; I. Alexandrov; A. Amorim; C. Padilla-Aranda; E. Badescu; N. Barros; H. P. Beck; R. E. Blair; D. Burckhart-Chromek; M. Caprini; M. Ciobotaru; P. Conde-Muíño; A. Corso-Radu; M. Diaz-Gomez; R. Dobinson; M. Dobson; Roberto Ferrari; M. L. Ferrer; David Francis; S. Gameiro; B. Gorini; M. Gruwe; S. Haas; C. Haeberli; R. Hauser; R. E. Hughes-Jones; M. Joos; A. Kazarov; D. Klose

The ATLAS collaboration at CERN operated a combined test beam (CTB) from May until November 2004. The prototype of the ATLAS data acquisition (DAQ) system was used to integrate the other subsystems into a common CTB setup. Data were collected synchronously from all the ATLAS detectors, which represented nine different detector technologies. Electronics and software of the first level trigger were used to trigger the setup. Event selection algorithms of the high level trigger were integrated with the system and were tested with real detector data. The possibility of operating a remote event filter farm synchronized with the ATLAS TDAQ was also tested. Event data, as well as detector conditions data, were made available for offline analysis.


Archive | 2004

Performance of the ATLAS DAQ DataFlow system

G. Unel; E. Pasqualucci; M. Gruwe; H. Beck; H. Zobernig; R. Ferrari; M. Abolins; D. Prigent; K. Nakayoshi; Pérez-Réale; R. Hauser; G. Crone; A. J. Lankford; A. Kaczmarska; D. Botterill; Fred Wickens; Y. Nagasaka; L. Tremblet; R. Spiwoks; E Palencia-Cortezon; S. Gameiro; P. Golonka; R. E. Blair; G. Kieft; J. L. Schlereth; J. Petersen; J. A. Bogaerts; A. Misiejuk; Y. Hasegawa; M. Le Vine

The baseline DAQ architecture of the ATLAS experiment at the LHC is introduced, and its present implementation and the performance of the DAQ components as measured in a laboratory environment are summarized. It is shown that the discrete event simulation model of the DAQ system, tuned using these measurements, predicts the behaviour of the prototype configurations well; predictions for the final ATLAS system are then presented. With the currently available hardware and software, a system using ~140 ROSs with a single 3 GHz CPU each, ~100 SFIs with dual 2.4 GHz CPUs and ~500 L2PUs with dual 3.06 GHz CPUs can sustain the dataflow for a 100 kHz Level 1 rate, with a 97% reduction at Level 2 and a 3 kHz event building rate.

ATLAS DATAFLOW SYSTEM. The 40 MHz collision rate at the LHC produces about 25 interactions per bunch crossing, resulting in terabytes of data per second, which have to be handled by the detector electronics and the trigger and DAQ system [1]. A Level 1 (L1) trigger system based on custom electronics will reduce the event rate to 75 kHz (upgradeable to 100 kHz; this paper uses the more demanding 100 kHz). The DAQ system is responsible for: the readout of the detector-specific electronics via 1630 point-to-point read-out links (ROLs) hosted by Readout Subsystems (ROSs); the collection and provision of "Region of Interest" (RoI) data to the Level 2 (L2) trigger; and the building of events accepted by the L2 trigger and their subsequent input to the Event Filter (EF) system, where they are subject to further selection criteria. The DAQ also provides the functionality for the configuration, control, information exchange and monitoring of the whole ATLAS detector readout [2]. The applications in the DAQ software dealing with the flow of event and monitoring data, as well as the trigger information, are called "DataFlow" applications. The DataFlow applications up to the EF input and their interactions are shown in Figure 1.

[Figure 1: ATLAS DAQ-DataFlow applications and their interactions, up to the Event Filter: ROS, pROS, L2PU, L2SV, DFM, SFI.]
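
The rates quoted above are easy to cross-check. A 97% reduction applied to a 100 kHz Level 1 rate leaves 100 kHz × (1 − 0.97) = 3 kHz for event building, matching the paper. The short sketch below does this arithmetic; the per-event size used to estimate bandwidth is an illustrative assumption, not a figure from the paper:

#include <cstdio>

int main() {
    const double l1_rate_hz   = 100e3;  // Level 1 output rate (from the paper)
    const double l2_reduction = 0.97;   // 97% rejected at Level 2 (from the paper)
    const double eb_rate_hz   = l1_rate_hz * (1.0 - l2_reduction);  // 3 kHz

    const double event_size_bytes = 1.5e6;  // assumed ~1.5 MB/event (illustrative)
    const double eb_bandwidth     = eb_rate_hz * event_size_bytes;  // ~4.5 GB/s

    std::printf("event building rate: %.1f kHz\n", eb_rate_hz / 1e3);
    std::printf("event building bandwidth: %.1f GB/s (assumed event size)\n",
                eb_bandwidth / 1e9);
    return 0;
}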


IEEE Transactions on Nuclear Science | 2006

Deployment and Use of the ATLAS DAQ in the Combined Test Beam

S. Gadomski; M. Abolins; I. Alexandrov; A. Amorim; C. Padilla-Aranda; E. Badescu; N. Barros; H. Beck; R. E. Blair; D. Burckhart-Chromek; M. Caprini; M. D. Ciobotaru; P. Conde-Muíño; A. Corso-Radu; M. Diaz-Gomez; R. Dobinson; M. Dobson; R. Ferrari; M. L. Ferrer; D. Francis; S. Gameiro; B. Gorini; M. Gruwe; S. Haas; C. Haeberli; R. Hauser; R. E. Hughes-Jones; M. Joos; A. Kazarov; D. Klose


IEEE Transactions on Nuclear Science | 2006

ATLAS DataFlow: the read-out subsystem, results from trigger and data-acquisition system testbed studies and from modeling

J. C. Vermeulen; M. Abolins; I. Alexandrov; A. Amorim; A. Dos Anjos; E. Badescu; N. Barros; H. Beck; R. E. Blair; D. Burckhart-Chromek; M. Caprini; M. D. Ciobotaru; A. Corso-Radu; R. Cranfield; G. Crone; J. W. Dawson; R. Dobinson; M. Dobson; G. Drake; Y. Ermoline; R. Ferrari; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; B. Gorini; B. Green; M. Gruwe; S. Haas; W. Haberichter

Collaboration


Dive into D. Prigent's collaboration.

Top Co-Authors

M. Abolins (Michigan State University)
R. E. Blair (Argonne National Laboratory)
H. Beck (Heidelberg University)
R. Hauser (Michigan State University)