Publication


Featured research published by R. Spiwoks.


Journal of Instrumentation | 2008

The ATLAS central level-1 trigger logic and TTC system

S. Ask; D. Berge; P Borrego-Amaral; D. Caracinha; N. Ellis; P. Farthouat; P. Gallno; S. Haas; J. Haller; P. Klofver; A. Krasznahorkay; A. Messina; C. C. Ohm; T. Pauly; M. Perantoni; H Pessoa Lima Junior; G. Schuler; D. Sherman; R. Spiwoks; T. Wengler; J.M. de Seixas; R Torga Teixeira

The ATLAS central level-1 trigger logic consists of the Central Trigger Processor and the interface to the detector-specific muon level-1 trigger electronics. It is responsible for forming the level-1 trigger decision in the ATLAS experiment. The timing, trigger and control information is distributed from the Central Trigger Processor to the readout electronics of the ATLAS subdetectors by the TTC system. Both systems are presented.


IEEE-NPSS Real-Time Conference | 2005

The ROD Crate DAQ of the ATLAS data acquisition system

S. Gameiro; G. Crone; Roberto Ferrari; D. Francis; B. Gorini; M. Gruwe; M. Joos; G. Lehmann; L. Mapelli; A. Misiejuk; E. Pasqualucci; J. Petersen; R. Spiwoks; L. Tremblet; G. Unel; W. Vandelli; Y. Yasu

In the ATLAS experiment at the LHC, the ROD Crate DAQ provides a complete framework to implement data acquisition functionality at the boundary between the detector-specific electronics and the common part of the data acquisition system. Based on a plugin mechanism, it allows selecting and using common services (such as data output and data monitoring channels) and developing simple libraries to control, monitor, acquire and/or emulate detector-specific electronics. As it also provides event-building functionality, the ROD Crate DAQ is intended to be the main data acquisition tool for the first phase of detector commissioning. This paper presents the design, functionality and performance of the ROD Crate DAQ and its usage in the ATLAS DAQ and during detector tests.
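The plugin mechanism described in the abstract can be sketched as follows. This is a minimal illustration, not the actual ROD Crate DAQ API: the names (`ReadoutModule`, `makeModule`, the `Emulator` module) are assumptions, and a static factory registry stands in for the real loading of plugins from shared libraries.

```cpp
#include <cstdint>
#include <functional>
#include <map>
#include <memory>
#include <string>
#include <vector>

// Illustrative plugin interface: each detector group would implement
// this for its specific electronics, or use an emulator.
struct ReadoutModule {
    virtual ~ReadoutModule() = default;
    virtual void configure() = 0;
    // Produce one event fragment for the given level-1 event ID,
    // either read from hardware or emulated.
    virtual std::vector<std::uint32_t> getFragment(std::uint32_t l1id) = 0;
};

using ModuleFactory = std::function<std::unique_ptr<ReadoutModule>()>;

std::map<std::string, ModuleFactory>& moduleRegistry() {
    static std::map<std::string, ModuleFactory> registry;
    return registry;
}

// Select a common service or detector-specific module by name.
std::unique_ptr<ReadoutModule> makeModule(const std::string& name) {
    auto it = moduleRegistry().find(name);
    return it == moduleRegistry().end() ? nullptr : it->second();
}

// An emulator module, standing in for the emulation plugins used when
// no detector electronics are attached.
struct EmulatorModule : ReadoutModule {
    void configure() override {}
    std::vector<std::uint32_t> getFragment(std::uint32_t l1id) override {
        // Minimal fragment: a marker word followed by the event ID.
        return {0xaa1234aau, l1id};
    }
};

// Each plugin library would register its factory on load.
struct EmulatorRegistrar {
    EmulatorRegistrar() {
        moduleRegistry()["Emulator"] =
            [] { return std::unique_ptr<ReadoutModule>(new EmulatorModule()); };
    }
} emulatorRegistrar;
```

The registry keyed by name is what lets the same framework drive real hardware, monitoring channels, or emulation without recompiling the core.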


IEEE-NPSS Real-Time Conference | 2004

The base-line DataFlow system of the ATLAS trigger and DAQ

H. Beck; M. Abolins; A. Dos Anjos; M. Barisonzi; M. Beretta; R. E. Blair; J. A. Bogaerts; H. Boterenbrood; D. Botterill; M. D. Ciobotaru; E.P. Cortezon; R. Cranfield; G. Crone; J. Dawson; R. Dobinson; Y. Ermoline; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; P. Golonka; B. Gorini; B. Green; M. Gruwe; S. Haas; C. Haeberli; Y. Hasegawa; R. Hauser; Christian Hinkelbein; R. E. Hughes-Jones

The base-line design and implementation of the ATLAS DAQ DataFlow system is described. The main components of the DataFlow system, their interactions, bandwidths, and rates are discussed and performance measurements on a 10% scale prototype for the final ATLAS TDAQ DataFlow system are presented. This prototype is a combination of custom design components and of multithreaded software applications implemented in C++ and running in a Linux environment on commercially available PCs interconnected by a fully switched gigabit Ethernet network.


Computer Physics Communications | 1998

The ATLAS DAQ and event filter prototype “−1” project

G. Ambrosini; D. Burckhart; M. Caprini; M. Cobal; P.-Y. Duval; F. Etienne; Roberto Ferrari; David Francis; R. W. L. Jones; M. Joos; S. Kolos; A. Lacourt; A. Le Van Suu; A. Mailov; L. Mapelli; M. Michelotto; G. Mornacchi; R. Nacasch; M. Niculescu; K. Nurdan; C. Ottavi; A. Patel; Frédéric Pennerath; J. Petersen; G. Polesello; D. Prigent; Z. Qian; J. Rochez; F. Scuri; M. Skiadelli

A project has been approved by the ATLAS Collaboration for the design and implementation of a Data Acquisition and Event Filter prototype, based on the functional architecture described in the ATLAS Technical Proposal. The prototype consists of a full “vertical” slice of the ATLAS Data Acquisition and Event Filter architecture, including all the hardware and software elements of the data flow, its control and monitoring as well as all the elements of a complete on-line system. This paper outlines the project, its goals, structure, schedule and current status and describes details of the system architecture and its components.


IEEE Transactions on Nuclear Science | 2008

The Configuration System of the ATLAS Trigger

A. Dos Anjos; P.J. Bell; D. Berge; J. Haller; S. Head; Shumin Li; A. Hocker; T. Kono; T. McMahon; M. Nozicka; H. von der Schmitt; R. Spiwoks; J. Stelzer; T. Wengler; Werner Wiedenmann

The ATLAS detector at CERN's LHC will be exposed to proton-proton collisions at a rate of 40 MHz. To reduce the data rate, only potentially interesting events are selected by a three-level trigger system. The first level is implemented in custom-made electronics, with an output rate of less than 100 kHz. The second and third levels are software triggers with a final output rate of 100 to 200 Hz. A system has been designed and implemented that holds and records the full configuration information of all three trigger levels at a centrally maintained location. This system provides fast access to consistent configuration information of the online trigger system for the purpose of data taking, as well as to all parts of the offline trigger simulation. The use of relational database technology provides a means of reliably recording the trigger configuration history over the lifetime of the experiment. In addition to the online system, tools for flexible browsing and manipulation of trigger configurations, and for their distribution across the ATLAS reconstruction sites, have been developed. The usability of this design has been demonstrated in dedicated configuration tests of the ATLAS level-1 Central Trigger and of a 600-node software trigger computing farm. Further tests on a computing cluster which is part of the final high-level trigger system were also successful.
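The central idea — one key identifying a consistent, immutable configuration for all trigger levels, usable both online and in offline simulation — can be sketched in miniature. The names (`ConfigStore`, `record`, the key type) are illustrative assumptions, not the ATLAS database schema, and an in-memory map stands in for the relational database.

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Illustrative model: one configuration key identifies the menus of
// all three trigger levels at once, so online data taking and later
// offline simulation see exactly the same selection.
struct TriggerConfiguration {
    std::vector<std::string> level1Items;  // e.g. "L1_MU6"
    std::vector<std::string> hltChains;    // level-2 and event-filter chains
};

using ConfigKey = unsigned;

class ConfigStore {
public:
    void record(ConfigKey key, TriggerConfiguration cfg) {
        // History is append-only: a key, once written, is never changed,
        // which is what makes the recorded history reliable over time.
        store_.emplace(key, std::move(cfg));
    }
    const TriggerConfiguration* lookup(ConfigKey key) const {
        auto it = store_.find(key);
        return it == store_.end() ? nullptr : &it->second;
    }
private:
    std::map<ConfigKey, TriggerConfiguration> store_;
};
```

The append-only `record` (via `emplace`, which never overwrites an existing key) models the immutability of a recorded configuration history.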


IEEE-NPSS Real-Time Conference | 2012

Topological and Central Trigger Processor for 2014 LHC luminosities

G. Anders; B. Bauss; D. Berge; V. Büscher; Taylor Childers; R. Degele; Eleanor Dobson; Andreas Ebling; Nick Ellis; Philippe Farthouat; Carolina Gabaldon; B. Gorini; S. Haas; Weina Ji; M. Kaneda; Stefan Mattig; A. Messina; Carsten Meyer; S. Moritz; T. Pauly; Ruth Pottgen; U. Schafer; R. Spiwoks; S. Tapprogge; T. Wengler; Volker Wenzel

The ATLAS experiment is located at the European Center for Nuclear Research (CERN) in Switzerland. It is designed to observe collisions at the Large Hadron Collider (LHC), the world's largest and highest-energy particle accelerator. Event triggering and data acquisition are among the extraordinary challenges faced by the detectors at the high-luminosity LHC collider upgrade. During 2011, the LHC reached instantaneous luminosities of 4 × 10³³ cm⁻²s⁻¹ and produced events with up to 24 interactions per colliding proton bunch. This places stringent operational and physical requirements on the ATLAS Trigger in order to reduce the nominal 40 MHz collision rate to a manageable event storage rate of up to 400 Hz and, at the same time, select those events considered interesting. The Level-1 Trigger is the first rate-reducing step in the ATLAS Trigger, with an output rate of 75 kHz and a decision latency of less than 2.5 μs. It is primarily composed of the Calorimeter Trigger, the Muon Trigger, the Central Trigger Processor (CTP) and, by 2014, a completely new electronics module: the Topological Processor (TP). The TP will make it possible, for the first time, to concentrate detailed information from sub-detectors in a single Level-1 module. This allows the determination of angles between jets and/or leptons, or even more complex observables such as muon isolation or invariant mass. It requires receiving a total bandwidth of about 1 Tb/s on a single module and processing the data in less than 100 ns. In order to accept this new information from the TP, the CTP will be upgraded to process double the number of trigger inputs and logical combinations of these trigger inputs. These upgrades also address the growing needs of the complete Level-1 trigger system as LHC luminosity increases. During the LHC shutdown in 2013, the TP and the upgraded CTP will be installed.
We present the justification for such an upgrade, the proposed upgrade to the CTP, and tests on the TP demonstrator and prototype, emphasizing the characterization of the high-speed links and tests of the topological algorithms' latency and logic utilization.
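The invariant-mass observable mentioned in the abstract reduces, for trigger objects treated as massless, to a short calculation: M² = 2·pT1·pT2·(cosh Δη − cos Δφ). This sketch uses a hypothetical `L1Object` record and floating-point arithmetic; the real Topological Processor implements the equivalent logic in fixed-point FPGA firmware within the 100 ns budget.

```cpp
#include <cmath>

// Hypothetical Level-1 trigger object: transverse momentum plus coarse
// eta/phi coordinates, roughly what the Topological Processor receives.
struct L1Object {
    double pt;   // transverse momentum, GeV
    double eta;  // pseudorapidity
    double phi;  // azimuthal angle, radians
};

// Invariant mass of two trigger objects, treated as massless:
// M^2 = 2 * pT1 * pT2 * (cosh(eta1 - eta2) - cos(phi1 - phi2)).
double invariantMass(const L1Object& a, const L1Object& b) {
    double m2 = 2.0 * a.pt * b.pt *
                (std::cosh(a.eta - b.eta) - std::cos(a.phi - b.phi));
    return m2 > 0.0 ? std::sqrt(m2) : 0.0;
}

// Azimuthal separation wrapped into [0, pi], e.g. for angle cuts
// between jets and/or leptons.
double deltaPhi(const L1Object& a, const L1Object& b) {
    const double pi = std::acos(-1.0);
    double d = std::fabs(a.phi - b.phi);
    return d > pi ? 2.0 * pi - d : d;
}
```

For two back-to-back objects at η = 0 the formula gives M = 2·√(pT1·pT2), a quick sanity check on any implementation.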


IEEE-NPSS Real-Time Conference | 2007

Performance of the final Event Builder for the ATLAS Experiment

H. P. Beck; M. Abolins; A. Battaglia; R. E. Blair; A. Bogaerts; M. Bosman; M. D. Ciobotaru; R. Cranfield; G. Crone; J. W. Dawson; R. Dobinson; M. Dobson; A. Dos Anjos; G. Drake; Y. Ermoline; R. Ferrari; M. L. Ferrer; D. Francis; S. Gadomski; S. Gameiro; B. Gorini; B. Green; W. Haberichter; C. Haberli; R. Hauser; Christian Hinkelbein; R. E. Hughes-Jones; M. Joos; G. Kieft; S. Klous

Event data from proton-proton collisions at the LHC will be selected by the ATLAS experiment in a three-level trigger system, which reduces the initial bunch crossing rate of 40 MHz at its first two trigger levels (LVL1+LVL2) to ~3 kHz. At this rate the event builder collects the data from all read-out system PCs (ROSs) and provides fully assembled events to the event filter (EF), the third trigger level, which achieves a further rate reduction to ~200 Hz for permanent storage. The event builder is based on a farm of O(100) PCs, interconnected via gigabit Ethernet to O(150) ROSs. These PCs run Linux and multi-threaded software applications implemented in C++. All the ROSs and one third of the event-builder PCs are already installed and commissioned. We report on performance tests of this initial system, which show promising results for reaching the final data throughput required for the ATLAS experiment.


IEEE Symposium Conference Record Nuclear Science 2004 | 2004

The ATLAS level-1 central trigger system

P. Amaral; N. Ellis; Philippe Farthouat; P. Gallno; J. Haller; T. Pauly; H.P. Lima; Tadashi Maeno; I.R. Arcas; J.M. de Seixas; G. Schuler; R. Spiwoks; R.T. Teixeira; T. Wengler

The central part of the ATLAS level-1 trigger system consists of the central trigger processor (CTP), the local trigger processors (LTPs), the timing, trigger and control (TTC) system, and the read-out driver busy (ROD_BUSY) modules. The CTP combines information from the calorimeter and muon trigger processors, as well as from other sources, and makes the final level-1 accept decision (L1A) on the basis of lists of selection criteria, implemented as a trigger menu. Timing and trigger signals are fanned out to about 40 LTPs which inject them into the sub-detector TTC partitions. The LTPs also support stand-alone running and can generate all necessary signals from memory. The TTC partitions fan out the timing and trigger signals to the sub-detector front-end electronics. The ROD_BUSY modules receive busy signals from the front-end electronics and send them to the CTP (via an LTP) to throttle the generation of L1As. An overview of the ATLAS level-1 central trigger system will be presented, with emphasis on the design and tests of the CTP modules.
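The trigger-menu logic described above can be sketched in miniature: each item of the menu is one selection criterion over the input multiplicities, and the L1A is the OR over all items. The item names and the closure-based representation are illustrative assumptions; the real CTP evaluates the menu in look-up tables and programmable logic, not in software.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Trigger inputs as multiplicities per threshold, e.g. {"MU6", 2}
// meaning two muon candidates above the MU6 threshold.
using TriggerInputs = std::map<std::string, unsigned>;

// One selection criterion of the trigger menu (names are illustrative).
struct TriggerItem {
    std::string name;
    std::function<bool(const TriggerInputs&)> condition;
};

unsigned multiplicity(const TriggerInputs& in, const std::string& thr) {
    auto it = in.find(thr);
    return it == in.end() ? 0u : it->second;
}

// The level-1 accept (L1A) is the logical OR over all menu items.
bool level1Accept(const std::vector<TriggerItem>& menu,
                  const TriggerInputs& in) {
    for (const auto& item : menu)
        if (item.condition(in)) return true;
    return false;
}
```

Changing the physics selection then amounts to loading a different menu, with the evaluation machinery untouched.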


8th Workshop on Electronics for LHC Experiments, Colmar, France, 9–13 Sep 2002, p. 227–231 | 2002

The ATLAS level-1 muon to central trigger processor interface (MUCTPI)

N. Ellis; Philippe Farthouat; K. Nagano; G. Schuler; C. Schwick; R. Spiwoks; T. Wengler

The Level-1 Muon to Central Trigger Processor Interface (MUCTPI) receives trigger information synchronously with the 40 MHz LHC clock from all trigger sectors of the muon trigger. The MUCTPI combines the information and calculates total multiplicity values for each of six programmable pT thresholds. It avoids double counting of single muons by taking into account the fact that some muons cross more than one sector. The MUCTPI sends the multiplicity values to the Central Trigger Processor, which takes the final Level-1 decision. For every Level-1 Accept the MUCTPI also sends region-of-interest information to the Level-2 trigger and event data to the data acquisition system. Results will be presented on the functionality and performance of a demonstrator of the MUCTPI in full-system stand-alone tests and in several integration tests with other elements of the trigger and data acquisition system. Lessons learned from the demonstrator will be discussed along with plans for the final system.
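The multiplicity counting with overlap handling can be sketched as below. The `overlapFlagged` field and the adjacent-sector rule are simplifying assumptions standing in for the MUCTPI's real overlap treatment, and the saturation of each count at 7 models a small fixed-width multiplicity field.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// A muon candidate as the MUCTPI might see it: the trigger sector that
// found it and the highest pT threshold (0-5) it passed. Field names
// are illustrative.
struct MuonCandidate {
    int sector;
    int threshold;        // index into the six programmable pT thresholds
    bool overlapFlagged;  // candidate lies in a sector-overlap zone
};

// Count multiplicities per threshold while avoiding double counting:
// of two overlap-flagged candidates in adjacent sectors, assume they
// are the same muon and keep only one. Each count saturates at 7.
std::vector<unsigned> muonMultiplicities(std::vector<MuonCandidate> cands) {
    // Sort by sector so candidates from adjacent sectors are neighbours.
    std::sort(cands.begin(), cands.end(),
              [](const MuonCandidate& a, const MuonCandidate& b) {
                  return a.sector < b.sector;
              });
    std::vector<unsigned> mult(6, 0);
    for (std::size_t i = 0; i < cands.size(); ++i) {
        if (i > 0 && cands[i].overlapFlagged && cands[i - 1].overlapFlagged &&
            cands[i].sector == cands[i - 1].sector + 1)
            continue;  // same muon seen twice: count it once
        int t = cands[i].threshold;
        if (t >= 0 && t < 6 && mult[t] < 7) ++mult[t];
    }
    return mult;
}
```

The real resolution of which candidate to keep is more refined than "drop the second of an adjacent flagged pair", but the shape of the problem — combine per-sector counts while collapsing duplicates — is as above.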


IEEE Transactions on Nuclear Science | 2006

The ROD crate DAQ software framework of the ATLAS data acquisition system

S. Gameiro; G. Crone; R Ferrari; D. Francis; B. Gorini; M. Gruwe; M. Joos; G. Lehmann; L. Mapelli; A. Misiejuk; E. Pasqualucci; J. Petersen; R. Spiwoks; L. Tremblet; G. Unel; W. Vandelli; Y. Yasu

In the ATLAS experiment at the LHC, the ROD Crate DAQ provides a complete software framework to implement data acquisition functionality at the boundary between the detector-specific electronics and the common part of the data acquisition system. Based on a plugin mechanism, it allows selecting and using common services (such as data output and data monitoring channels) and developing software to control and acquire data from detector-specific modules, providing the infrastructure for control, monitoring and calibration. As it also includes event-building functionality, the ROD Crate DAQ is intended to be the main data acquisition tool for the first phase of detector commissioning. This paper presents the design, functionality and performance of the ROD Crate DAQ and its usage in the ATLAS data acquisition system and during detector tests.

Collaboration


Dive into R. Spiwoks's collaborations.

Top Co-Authors

D. Berge

University of Amsterdam

R. Ferrari

Massachusetts Institute of Technology
