Testing new-physics models with global comparisons to collider measurements: the Contur toolkit
A. Buckley, J. M. Butterworth, L. Corpe, M. Habedank, D. Huang, D. Yallup, M. Altakach, G. Bassman, I. Lagwankar, J. Rocamonde, H. Saunders, B. Waugh, G. Zilgalvis
SciPost Physics Submission

School of Physics & Astronomy, University of Glasgow, University Place, G12 8QQ, Glasgow, UK
Department of Physics & Astronomy, University College London, Gower St., WC1E 6BT, London, UK
Department of Physics, Humboldt University, Berlin, Germany
Department of Computer Science and Engineering, PES University, Bangalore, India

February 9, 2021
Abstract
Measurements at particle collider experiments, even if primarily aimed at understanding Standard Model processes, can have a high degree of model independence, and implicitly contain information about potential contributions from physics beyond the Standard Model. The Contur package allows users to benefit from the hundreds of measurements preserved in the Rivet library to test new models against the bank of LHC measurements to date. This method has proven to be very effective in several recent publications from the Contur team, but ultimately, for this approach to be successful, the authors believe that the Contur tool needs to be accessible to the wider high-energy physics community. As such, this manual accompanies the first user-facing version, Contur v2. It describes the design choices that have been made, as well as detailing pitfalls and common issues to avoid. The authors hope that, with the help of this documentation, external groups will be able to run their own Contur studies, for example when proposing a new model, or pitching a new search.
a Now at CERN; b Now at University of Cambridge; c Now at Tessella.

Contents

2.1 The Contur workflow 5
2.2 The Contur philosophy 7
2.3 Limitations of Contur
3 Rivet analyses 12
3.1 Rivet routines into orthogonal pools 13
3.2 Adding user-provided or modified Rivet analyses 13
3.3 Rivet routine special cases and common pitfalls 13
4.2.1 Run arguments 17
4.2.2 Parameter card / Parameters arguments 17
4.3 Generator template 18
4.4 Grid structure and HPC support 19
5 Contur likelihood analysis 25
A Contur study with Herwig
A.1 Use of Docker
A.4 Running Contur on a single YODA file 32
A.5 Setting up for Contur batch jobs (HPC system) 33
A.6 Running Contur on a grid 34
A.7 Making Contur heatmaps 35
B Contur tools and utilities 35
B.1 Contur Docker containers 35
B.2 Exporting the results of a Contur statistical analysis to a CSV file using contur-export
B.3 Contur scan directories with contur-gridtool
B.4 Contur statistical analyses 37
B.5 Submitting Contur scans to HPC systems using contur-batch
B.6 contur-zoom
B.7 Contur scan using contur-plot
B.8 contur-mkhtml
B.9 Herwig-specific cross-section visualisation tools 41
B.10 Interactive visualisation 42
B.11 Other tools 45
C BSM models as UFO files 46
D Support for SLHA files 47
D.1 Scanning over a directory 47
D.2 Modifying SLHA files 47
E Support for pandas DataFrames 48
E.1 Creating pickle files 48
E.2 Loading pickle files 48
E.3 Interoperability 49
F Support for other event generators 49
F.1 MadGraph support 49
F.2 Powheg support 50
G The analysis database 52
G.1 General configuration 52
G.2 Special cases 53
References 53
Introduction

The discovery of the Higgs boson was the capstone of decades of research, and cemented the validity of the Standard Model (SM) as our best understanding so far of the building blocks of the universe. The SM boasts a predictive track record worthy of its position as one of the triumphs of modern science. It led to the discovery of the vector bosons W and Z, the top quark, and the Higgs boson, and SM cross-section predictions, ranging across ten orders of magnitude from the inclusive jet cross-section to electroweak VVjj processes, have been found to agree with experimental data through decades of scrutiny, with no significant deviations.

Despite this monumental achievement, the SM is ostensibly an approximation. Qualitative phenomena such as the cosmological matter-antimatter asymmetry, and astrophysical observations consistent with dark-matter and dark-energy contributions to cosmic structure and dynamics, suggest directly that the SM is not the whole story. These indications are reinforced by technical issues within the SM, such as the "unnatural" need for fine-tuning of its key parameters, and its formal incompatibility with relativistic gravity.

With the absence so far of evidence for electroweak-scale supersymmetry, or of obvious new resonances in measured spectra, the field of collider physics finds itself at a crossroads. For the first time in fifty years, there is no single guiding theory to motivate discoveries. On the other hand, the LHC has delivered the largest dataset ever collected in particle physics, with the promise of a dataset an order of magnitude larger to be delivered by the high-luminosity (HL) LHC in the coming years.
A transition from a top-down, theory-driven approach to a bottom-up, data-driven one is needed if we are to use these data to achieve the widest possible coverage of possible extensions to the SM.

The problem is that the field of particle physics does not currently work efficiently in data-driven mode. Searches may take years to produce, and concentrate only on certain signatures of a handful of models at a time. These models may even already be excluded, since the new particles and interactions which they feature would have modified well-understood and measured SM spectra. What if we could harness the power of the hundreds of existing LHC measurements preserved in Rivet [1] to rapidly tell whether a model is already excluded? A more comprehensive approach to ruling out models could liberate person-power and resources to focus on the trickiest signatures. This is the purpose of Constraints On New Theories Using Rivet (Contur), a project first described in Ref. [2].

The Contur method has proven an effective and complementary approach to ruling out new-physics models in a series of case studies [3-6], as well as in providing a "due diligence" check for newly proposed models [7,8]. Running a Contur-like scan whenever a new-physics model is proposed, or when a new search is being designed, should be routine in experimental particle physics, and would potentially liberate search teams to focus on models which have not already been ruled out. This shortcut around models which, no matter how theoretically elegant, are already incompatible with model-independent observations will accelerate the feedback loop between theorists and experimentalists, and bring us more efficiently to the long-sought understanding of what lies beyond the SM.

The Contur code is now mature enough to turn this vision into a reality, and this manual is intended to accompany the first major user-facing release of the Contur code (Contur v2, tagged on Zenodo as Ref. [9]), so that theorists and experimentalists who are not Contur developers can use this technology to test new models themselves.
This document is structured as follows: this section gives a general introduction to the Contur workflow and design philosophy. Section 3 deals with the relationship between Rivet and Contur, and how Rivet analyses in the Contur database are classified into orthogonal pools, with advice on adding new analyses. Section 4 runs through setting up and running Contur scans over a set of parameter points in a given model. Section 5 explains how Contur builds a likelihood function to perform the statistical analysis of the results, and how exclusion values are calculated and analysed. Section 6 takes the user through the various plotting and visualisation tools which come with Contur, to help validate and digest the results of a scan. Finally, Section 7 concludes the manual. Some of the explanations and figures in these sections have been adapted from a PhD thesis partially focused on the development of Contur [10].

Several appendices are provided to give further detail on some functionality, as well as detailed examples. Appendix A provides the user with a complete didactic example of the analysis of a beyond-the-SM (BSM) model with Contur, using the Herwig [11] event generator. Appendix B provides detailed descriptions of the various helper executables and other utilities which are provided in the Contur package, including details about Contur Docker containers. Appendix C gives further details about the UFO [12] format, which is used to encapsulate the details of BSM models, while Appendix D details Contur compatibility with the SLHA [13,14] format. Appendix E documents how model parameter values can be provided to Contur via pandas DataFrame [15,16] objects. Appendix F provides further details about how to use generators other than Herwig with Contur. Finally, Appendix G provides further detail about the various databases and classifications which are used in the Contur workflow.
Contur workflow
The basic premise of Contur is that modifications to the SM Lagrangian typically introduce changes to already well-understood and measured differential cross-sections. Therefore, if adding a beyond-SM component to the Lagrangian, i.e. a new interaction involving either SM or new BSM fields, would change a measured distribution beyond its experimental uncertainties, then, in simple terms, "we'd already have seen it". This can be quantified more precisely in terms of statistical limit-setting, but the upshot is that if one can predict how a given BSM model would modify the hundreds of observables measured in existing LHC measurements, then it is already possible to exclude regions of its parameter space without the need for a dedicated search.

This perspective turns the immediate model-testing challenge from an experimental one into a computational and book-keeping one. Can we design a workflow to take a BSM model with a set of parameter values, generate simulated events from it, quickly infer the effect of those events in each bin of the LHC measurements to date, and compute the p-value (and hence exclusion status at some confidence level) for that model point? Can one then efficiently repeat that procedure over a range of parameter points, to determine the regions of parameter space which are excluded? Contur is a tool that implements such a process. It builds on several existing data formats, conventions and packages to achieve this goal, and automatically handles the steering of model parameters and associated book-keeping on the user's behalf. The basic workflow is illustrated schematically in Figure 1.

The first requirement is that the BSM model be implemented in a Monte Carlo event generator (MCEG) such that its parameters can be set, and simulated events generated for analysis. Historically this required manual coding, and hence focused on BSM models such as supersymmetry, technicolor, and new quarks and vector bosons, which were considered leading candidates for new physics before LHC operation.
The Supersymmetric (SUSY) Les Houches Accord (SLHA [13,14]) format was developed as an MCEG-independent way of specifying the mass and decay spectra of such models, and is understood by many MCEGs. As the "obvious" BSM models waned and gave way to a much wider spectrum of possibilities, a complementary format, the Universal FeynRules Output (UFO) [12], was developed to transmit not just parameter choices but the entire model, built up from a Python-based encoding of the BSM Lagrangian. The combination of UFO and SLHA files provides an industry-standard way to package the details of any BSM model, such that most MCEGs can interpret it without needing model-specific code. Its ubiquity means that theorists routinely publish UFO files when proposing a new model, making them easy to study and test. Details on how to use a new UFO file as an input to Contur can be found in Appendix C, and the use of SLHA-driven configurations in Appendix D.

MCEGs use the specified BSM model and parameters to simulate new-physics events in high-energy collisions. In the default Contur workflow, the Herwig [11] event generator is used (see Appendix A for an example), but other event generators, such as
MadGraph5_aMC@NLO [17] and Powheg [18], are also supported. Additionally, if events are already generated, and parameter steering is therefore not required, Rivet, and thus Contur, can analyse events stored in HepMC [19,20] format. More details on support for various event generators in Contur are given in Appendix F.

[Figure 1 graphic: workflow stages 'Sampling model parameters', 'Calculate observables', 'Evaluating the likelihood for a model', 'Visualisation of parameter space'; tools: contur-batch, external tools, contur, contur-plot.]

Figure 1: An example schematic of the Contur workflow. The dotted box denotes the portion of the workflow that makes extensive use of external packages, affording multiple options, such as the choice of MCEG. These two steps are described together in Section 4, 'Sampling model parameters'. The next stage, taking physics observables as inputs to a statistical analysis, 'Evaluating the likelihood for the model', is described in Section 5. Finally, some of the tools to visualise the output of the likelihood analysis, 'Visualisation of parameter space', are covered in Section 6.

The generated events are fed into Rivet (see Section 3), the output of which then corresponds to the extra BSM contribution which would have been present in any of the hundreds of spectra measured at the LHC so far, if the generated model existed in nature. The BSM component can then be compared to the size of the uncertainty of the measurement, and optionally to the SM expectation. Measurements are grouped into orthogonal pools (see Section 3.1), and Contur uses the best constraint from each pool to form a global exclusion measure for a given model at a given set of parameter values. The details of the statistical treatment can be found in Section 5.1.

This whole process typically takes under an hour for a single point on a single compute node. Repeated for a grid of parameter values, and running in parallel on a compute farm, Contur can determine in a few hours whether wide regions of a model's parameter space are still potentially viable, or already excluded by existing LHC measurements. The Contur package comes with plotting and visualisation tools to present and digest the results of a scan. These are discussed in Section 6 and Appendix B.
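The best-constraint-per-pool bookkeeping can be sketched as follows. This is a deliberately naive toy, not Contur's actual implementation or API: it keeps the most constraining histogram in each pool, combines pools as if fully independent, and converts the summed significance-squared into a rough confidence level. The pool names and significance measure here are purely illustrative.

```python
from math import erf, sqrt

def best_per_pool(results):
    # results maps pool name -> {histogram name: significance};
    # keep only the most constraining histogram from each orthogonal pool,
    # since correlations within a pool cannot be accounted for
    return {pool: max(histos.values()) for pool, histos in results.items()}

def combined_exclusion(results):
    # naive independent-pool combination: sum the per-pool test statistics
    # (significance squared) and convert the net z-score to a confidence level
    ts = sum(z * z for z in best_per_pool(results).values())
    return erf(sqrt(ts) / sqrt(2.0))

# illustrative input: two orthogonal pools, with hypothetical significances
results = {
    "ATLAS_13_EEJET": {"mll": 3.0, "njets": 1.0},
    "CMS_13_GAMMA": {"pt_gamma": 2.0},
}
exclusion = combined_exclusion(results)
```

In the real workflow the combination is done at the likelihood level (Section 5), but the structure is the same: one number per pool, combined into a single global exclusion per parameter point.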
Contur philosophy
Contur is designed to efficiently address the question "How compatible is a proposed physics model with published LHC results?" This question needs to be asked each time a new model is proposed. The ability to answer it depends on a number of factors.

Firstly, one must define what is meant by "LHC results". Collider physics experiments produce a variety of different types of results, which can be broadly classified as follows.

1. Extraction of fundamental parameters of the SM, such as the W mass, the Weinberg angle, etc. Such results give experimental constraints on SM parameters which are calculable analytically in perturbative field theory.

2. Extraction of so-called inclusive quantities, such as (for example) the total t-tbar or WW production cross-section. This usually involves theory input to extrapolate into regions outside the acceptance of the measurement.

3. Measurements of fiducial particle-level observables. In other words, observables corrected for detector effects, or "unfolded", but not extrapolated beyond detector acceptance. Comparing predictions to such measurements requires the generation of simulated events, making use of perturbative field theory but also non-perturbative models and numerical MC techniques, so that fiducial phase-space selections may be applied to final-state particles.

4. Measurement of detector-level distributions. This is the most common type of result used in searches by ATLAS and CMS. They are faster to produce than unfolded results, since the step of validating the model-independence of the unfolding (but not of calibration) can be skipped. However, they cannot be compared to theory without an additional detector-simulation step.

5. Exclusion regions from searches, usually derived from the detector-level distributions mentioned above. These can sometimes be reinterpreted in terms of new models, but may have significant implicit model-dependence.

[Figure 2 graphic: the typical experiment recast workflow, from encoding BSM physics, through fundamental parameter extraction (1) or suitably inclusive quantities (2), particle-level experimental observables (3), to detector-level experimental observables (4) or derived model exclusions (5); Contur combines measurement data at levels (3) and (4), with caveats on the inclusion of (4). Side annotations: parton shower, hadronisation, anti-kT jets, Geant material simulation, reconstruction.]

Figure 2: A schematic illustrating the levels at which data and theory may be compared in LHC physics. The vertical (downward) arrows show the direction of increasing complexity of the theoretical prediction; in the reverse direction, increasingly complex corrections must be applied to the data. Horizontal arrows show the comparison data available at each level.

This categorisation is shown schematically in Figure 2. The direction of the arrow indicates increasing calculational complexity required of the theory to compare the result to SM predictions. At first, just an analytical calculation of the SM parameter is needed. Then, MC simulations at parton and particle level are required. Finally, the effects of the detector must be modelled. The level of model assumption built into the experimental data increases in the opposite direction.

All interpretations, or re-interpretations, of results involve compromises and approximations. The
Contur philosophy is to strive for speed and coverage of new models, at the expense of some precision and sensitivity. To do this, we focus primarily on fiducial, particle-level measurements, as a compromise between model dependence and detector independence: that is, minimal theory extrapolation in the measurement, and minimal detector dependence in the BSM predictions. This means using results of type 3, and in some circumstances 4, from the list above. A general discussion of reinterpretation tools and requirements is given in Ref. [21].

In addition to making use of particle-level measurements to help exclude new-physics models, another pillar of the Contur philosophy is to use inclusive event generation instead of exclusively generating individual processes. Inclusive event generation has the advantage of covering all allowed final states which would be affected if that BSM model were realised. Generating events in this way, Contur can paint a more comprehensive picture of the exclusion across all manner of final states, rather than focusing on the most spectacular signatures of a new model. Indeed, there are several cases in recent Contur papers where exclusion power for a model in some region of parameter space has come from an unexpected signature, which might not have been tested if the user had to actively switch on individual processes. By contrast, determining which processes are most important in different regions of model parameter space is not trivial if one is not an expert in the phenomenology of a particular BSM model.
Herwig is an event generator which features an inclusive mode, generating all 2 → 2 BSM processes; Herwig is the default event generator used in Contur, as can be seen in the example in Appendix A. Nevertheless, Contur retains the possibility to study individual processes, for instance when only a particular process is of interest, or to check how much of a contribution would come from processes which are more complex than 2 → 2.

Limitations of Contur

It is important to note that Contur is not at present a 'discovery' tool. It will not identify regions of BSM parameter space which are more favoured than the SM; such regions will show up as 'allowed', but the test statistic is one-sided and gives no more positive information about a BSM scenario than that. In any case, Contur only uses data which have been shown to agree with the SM.

Most of the other limitations of the
Contur method at present stem from incomplete information published by the experiments. Three common issues arise: the SM prediction of a measurement is not published, the bin-to-bin correlation information for the systematic uncertainties is not made available, or the public information contains hidden, model-dependent assumptions. These items are discussed in more detail in the following.

The fact that most entries in the HepData library (described in Section 3) currently do not include a SM prediction means that assumptions must be made with respect to the null hypothesis in the Contur method. In particular, if the SM-prediction information is not available, Contur assumes the data are identically equal to the SM. This assumption is reasonable for distributions where the uncertainties on the SM prediction are not larger than the uncertainties on the data; it is also the assumption made in the control regions of many searches, where the background evaluation is "data-driven". When used in this mode, Contur would be blind to a signal arising as the cumulative effect of a number of statistically insignificant deviations across a range of experimental measurements.^d To extract such a signal properly requires evaluation of the theoretical uncertainties on the SM predictions for each channel. These predictions and uncertainties are gradually being added to Contur and can be tried out using a command-line option (see Ref. [6] for a first demonstration). For these reasons, limits derived by Contur where the theory predictions are not used directly are best described as expected limits, delineating regions where the measurements are sensitive and deviations are disfavoured. In regions where the confidence level is high, they do represent a real exclusion.

A further limitation comes from a lack of information about correlations between bins in some published measurements. For measurements which are not statistically limited, systematic correlations between bins may be important. Without knowing the size of the correlations between bins, Contur must use only the single most sensitive bin in a given distribution, to avoid double-counting correlated excesses across multiple bins. This limits the sensitivity of Contur when a BSM signal is spread over several bins in a distribution. However, in an increasing number of cases, a breakdown of the size of each major source of correlated uncertainty in each bin is provided by the experiments, and in these cases Contur is able to make use of it.

More fundamentally, some measurements are defined in ways which make their use in Contur limited or impossible. This usually occurs because SM assumptions have been built into the measurement (for example, extrapolations to parton level, or into significant unmeasured phase-space regions), important selection cuts (a common example being jet vetoes) have not been implemented in the fiducial phase-space definition, or large data-driven background subtractions (for example in H → γγ) have been made. Existing examples, and the conditions under which some such routines may or may not be used, are discussed in Section 3.3.

Finally, and trivially, if a Rivet routine and HepData entry are not available for a measurement, Contur cannot use it.

^d This would be particularly worrisome in low-statistics regions, where outlying events in the tails of the data will not lead to a weakening of the limit, as would be the case in a search. However, measurements unfolded to the particle level are typically performed in bins with a requirement of a minimum number of events in any given bin, reducing the impact of this effect (and also weakening the exclusion limits).
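The single most sensitive bin strategy can be illustrated with a short sketch. This is illustrative only: Contur's real ranking is done at the likelihood level, not with the simple signal-over-uncertainty ratio used here.

```python
def most_sensitive_bin(signal, uncertainty):
    # rank the bins of one distribution by BSM signal over total uncertainty,
    # and return only the single most sensitive bin, so that correlated
    # excesses across several bins are not double-counted
    ratios = [s / u for s, u in zip(signal, uncertainty)]
    i = max(range(len(ratios)), key=ratios.__getitem__)
    return i, ratios[i]

# a toy distribution with the BSM excess concentrated in the second bin
index, z = most_sensitive_bin([1.0, 4.0, 2.0], [2.0, 1.0, 2.0])
```

When a per-bin breakdown of correlated uncertainties is available, the whole distribution can be used instead, which is why such breakdowns significantly strengthen the resulting limits.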
The Contur tool is mostly structured in the standard Python form, with a package directory contur containing the majority of the processing logic, and secondary bin and data directories respectively containing executable "user-facing" scripts, and various types of supporting data files. A set of unit tests is implemented in the tests directory, using the pytest framework. An outline of the directory structure can be seen in Listing 1.

The contur Python package is internally divided into modules reflecting the distinct parameter-scanning, MC-run management, and statistical post-processing tasks of Contur operation. The scan module provides helper functions for generating and combining parameter-space scan data, plot implements the standard data-presentation formats, and the run module provides the logical cores of the main user scripts in a form amenable to pytest testing. The statistical machinery central to Contur lives in the main contur namespace, supported by utility functions, data-loading tools, and Contur's analysis-pool data from the util, factories and data modules. These classes are documented inline using Pydoc, which is linked from the main Contur repository.

The bin directory contains the main user scripts described in this paper, plus the auxiliary ones described in Appendix B. The data directory contains a mixture of bundled and generated data. Included in the release are:

• sets of model files and generation templates created so far,
• any modified or new Rivet analysis codes and reference data not bundled with the Rivet release,
• theory-based background estimates for analyses where the data need not be assumed to be purely SM.

Files generated by the user after installation include:

• the compiled Rivet analysis-plugin libraries,
• the analysis-pool database (see Appendix G),
• MC-run template files.

Listing 1: An illustration of the Contur software file structure.

bin                 User-facing executables and scripts
  contur            The main Contur executable
  contur-batch      A utility for submitting Contur scans to HPC systems
  contur-plot       A utility for plotting Contur results
  contur-export     A utility for exporting Contur results to a CSV file
  ...
contur              Main Python package containing internal processing logic
  config            Configuration and initialisation logic
  data              Data manipulation and database-management code
  factories         Internal tools to produce scan and likelihood objects
  plot              Plotting and visualisation code
  run               Logical core of the main Contur script
  scan              Tools for parameter-space scan data manipulation
  util              General-purpose utilities and helper functions
  ...
contur-visualiser   An auxiliary utility for interactive visualisation
...
data                Directory containing supporting data files
  DB                SQL files for the database of available Rivet analyses
  Models            A library of example BSM UFO models
  Rivet             New or overloaded Rivet analysis sources and files
  Theory            Additional SM theory predictions for Rivet analyses
  TheoryRaw         Files from which SM theory predictions were imported
  ...
docker              Directory containing build files for Docker containers
...
tests               Directory encapsulating the unit test framework
...

The Contur package relies on a compiled
Rivet installation and a set of analysis overrides, and requires manually copying template files from data/Models to run MC scans. For these reasons, it is not usually recommended to perform the installation using the Python setuptools scheme.^e Installing and using Contur is instead normally done directly from the downloaded project directory, by sourcing a script called setupContur.sh in each shell session, and by running the make command once upon installation. Sourcing setupContur.sh sets environment variables (such as CONTUR_ROOT) that Rivet uses to locate custom analyses and data. Additionally, this script appends the Contur Python module path and the executable bin directory to the system PYTHONPATH and PATH respectively, mirroring the function of Python's standard setuptools. As the Contur package makes use of a compiled database, referencing analysis lists derived from it at run time, setting these environment variables is needed to operate parts of the workflow. Furthermore, a short Python script is run when setupContur.sh is sourced, which checks the various dependencies and paths.

Rivet analyses
Rivet functions as a library of preserved particle-level measurements from colliders. Each publication has a corresponding Rivet routine: a runnable C++ code snippet which encapsulates, at particle level, the definition of the measured cross-section. Rivet routines can be thought of as filters that select generated events which would enter the fiducial region, and project their properties into histograms with the same observables and binnings as the measurements. The native Rivet format for histograms and associated analysis objects is called YODA. Several hundred measurements are preserved in this way, many of them from LHC experiments.
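Conceptually, a routine acts like the following sketch. This is purely illustrative Python; real Rivet routines are C++ plugins with a much richer system of projections, and the event and cut names below are hypothetical.

```python
import bisect

def toy_routine(events, fiducial_cut, observable, bin_edges):
    # select events passing the fiducial selection, project each one onto
    # the measured observable, and fill a histogram with the publication's
    # binning; underflow and overflow are simply discarded here
    counts = [0] * (len(bin_edges) - 1)
    for ev in events:
        if not fiducial_cut(ev):
            continue
        i = bisect.bisect_right(bin_edges, observable(ev)) - 1
        if 0 <= i < len(counts):
            counts[i] += 1
    return counts

# toy "measurement": a lepton pT spectrum with a 20 GeV fiducial cut
hist = toy_routine([{"pt": 30.0}, {"pt": 80.0}, {"pt": 10.0}],
                   lambda ev: ev["pt"] > 20.0,
                   lambda ev: ev["pt"],
                   [0.0, 50.0, 100.0])
```

The same filter-and-project structure applies whether the input events are SM or BSM, which is exactly what Contur exploits.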
The HepData [22] repository contains a digitized record of the measured cross-section values and their uncertainties, sometimes also including the best SM theory predictions at the time, and sometimes including a breakdown of uncertainties in each bin, or other correlation information. This information is synchronised between Rivet and HepData whenever a new release of Rivet is made, so that a faithful comparison of generator output to measured data and uncertainties can be made.

The measurements present in Rivet and used by Contur in the current version are those in Refs. [23-114], although new measurements are continually being added. Contur re-uses these encapsulated analysis routines, but runs with generated BSM events rather than the SM process which was typically the target of the measurement. Rivet is specifically designed to run multiple (or indeed, all) analysis plugins simultaneously for a given beam configuration, and it has been optimised to do this quickly and efficiently. Thus BSM events generated by Herwig or another MCEG are filtered through all available plugins, leading to a multitude of histograms showing if, and where, the signal would have appeared in existing LHC measurements. The size of the signal can then be compared to the relevant HepData reference histogram, to decide if the set of BSM parameters in question would have produced a distortion of the SM spectrum beyond measured uncertainties.

^e The repository does contain a setup.py file that allows installing Contur as a package, for example to allow the statistical interpretation function to be accessible from other programmes. However, most of the Contur functionality will not be available with this method, which is currently a work in progress, and for most users it is not recommended.

Rivet routines into orthogonal pools
If injection of BSM signal leads to an excess in a measured distribution, there may also be excesses in measurements of similar final states arising from essentially the same events, and since correlations between the measurements cannot be accounted for, this could lead to an overestimate of the sensitivity. To avoid such double counting, Contur classifies Rivet histograms into orthogonal pools based on the centre-of-mass energy of the LHC beams, the experiment which performed the measurement, and the final state which was probed. For analyses which measured several final states implemented as different options within the Rivet plugin, it is possible to sort histograms from the same analysis into different pools. If there are orthogonal phase-space regions measured within the same analysis (for example, different rapidity regions in a jet cross-section), it is possible to combine the non-overlapping histograms into a "subpool", in which case the combined exclusion of the subpool will be evaluated, and treated as though it came from a single histogram.

The results from each pool can then be combined without risk of over-stating the sensitivity to a given signal. Analysis pools are named as <Experiment> <Center-of-mass energy> <Final state>, where:

• <Experiment> can be ATLAS, LHCB or CMS at present;
• <Center-of-mass energy> is the centre-of-mass energy in TeV;
• <Final state> is a short string which loosely describes the final state, with details given in Table 1.

These pools, and other information, are stored in a database, described in Appendix G.
To make a local modification to an existing
Rivet routine or include one not yet in the
Rivet release, the new or modified analysis plugin can be copied into the contur/data/Rivet directory along with any updated reference files. Further, if a new theory calculation becomes available, this can be added to the contur/data/Theory directory. The new routine can be compiled with a simple make call, followed by setupContur.sh. This will override the default (unmodified) version of the replaced analysis for the next run. The new analysis should also be added to the analysis.sql file, documented in Appendix G.
Most analyses preserved in
Rivet are particle-level measurements, meaning the measurement is to a large extent defined in terms of an observable final state, and the effects of the detector have already been corrected for during the unfolding procedure, within some fiducial region. As a result, predictions and measurements can be compared directly, without the need for smearing or detector simulation. Some exceptions and caveats exist, however, limiting the applicability of some analyses. The current known special cases are discussed below, and their categorisation in the
Contur database structure is discussed in Appendix G.2.

⟨Final state⟩ tag   Description of target final state
3L                  Three leptons
4L                  Four leptons
EEJET               e+e− at the Z pole, plus optional jets
EE GAMMA            e+e− plus photon(s)
EMETJET             Electron, missing transverse momentum, plus optional jets (typically W, semi-leptonic tt̄ analyses)
EMET GAMMA          Electron, missing transverse momentum, plus photon
GAMMA               Inclusive (multi)photons
GAMMA MET           Photon plus missing transverse momentum
HMDY                Dileptons above the Z pole
HMDY EL             Dileptons above the Z pole, electron channel
HMDY MU             Dileptons above the Z pole, muon channel
JETS                Inclusive hadronic final states
LLJET               Dileptons (electrons or muons) at the Z pole, plus optional jets
LL GAMMA            Dileptons (electrons or muons) plus a photon
LMDY                Dileptons below the Z pole
LMETJET             Lepton, missing transverse momentum, plus optional jets (typically W, semi-leptonic tt̄ analyses)
METJET              Missing transverse momentum plus jets
MMETJET             Muon, missing transverse momentum, plus optional jets (typically W, semi-leptonic tt̄ analyses)
MMET GAMMA          Muon, missing transverse momentum, plus photon
MMJET               µ+µ− at the Z pole, plus optional jets
MM GAMMA            µ+µ− plus photon(s)
TTHAD               Fully hadronic top events
L1L2MET             Different-flavour dileptons plus missing transverse momentum (i.e. WW and tt̄ measurements)

Table 1: Description of the currently considered ⟨Final state⟩ tags used to sort analysis histograms into orthogonal pools.

Ratio plots   The current most powerful particle-level measurement of missing energy is in the form of the measurement of a ratio of ℓℓ plus jets to missing energy plus jets [77]. The cancellations involved bring greater precision, but the SM leptonic process is hard-coded as the denominator, so the results are not reliable for models that would change this: for example, enhanced Z production will contribute to both the numerator and the denominator. For models where this is expected to be an issue, the analysis may be excluded by setting the --xr flag at Contur runtime.

H → γγ   These fiducial measurements [111] are very powerful for models which enhance SM Higgs production. However, they rely on a fit to the γγ mass continuum to subtract background. Signals from models which enhance non-resonant γγ production would presumably have influenced this fit, and might have been absorbed into it, so looking at their contribution only in the H mass window will overestimate the sensitivity. These analyses may be excluded in such cases by setting the --xhg flag at Contur run time.
Searches
Detector-level Rivet routines do exist for some searches, and can be used by Contur [80, 83]. In this case, Rivet's custom smearing functionality is used, and the SM background from HepData is used for comparison. These searches may be turned on by setting the -s flag at Contur run-time.

H → WW   Like H → γγ, these measurements [59, 94] could potentially be very important when SM Higgs production is enhanced. However, they involve very large data-driven background subtraction (principally for top), and the reliability of this for non-SM production mechanisms (of Higgs, WW, or just dileptons and missing energy) is in general hard to determine. These analyses may be turned on by setting the --whw flag at Contur run time.
ATLAS WZ   This analysis [75] may be useful for models which enhance WZ production, but it calculates event kinematics using the flavour of the neutrinos, and so its impact on other missing-energy signals is difficult to evaluate. The analysis may be turned on by setting the --awz flag at Contur run time.

b-jet veto   Analyses targeting WW production processes generally use b-jet vetoes to suppress WW production via tt̄. In some cases, this kinematic requirement is made only at detector level, and not included in the fiducial cross-section definition implemented in the Rivet routine [57, 94, 95, 101]. These analyses are therefore likely to give misleading results when used on non-SM WW production processes.

These exceptions are catalogued in the analysis database (see Appendix G) and should be taken into account when implementing or adding a new analysis to Contur. Some guidelines for designing analyses to minimise their model dependence and maximise their impact in a Contur-like approach are given in Ref. [21]. The most important principle is that theory-based extrapolations should be avoided where possible, both for background subtraction and for unmeasured signal regions. This essentially means defining a fiducial measurement region in terms of final-state particles which as far as possible faithfully reflects the actual detector-level event selection.
New physics models usually have a number of parameters which are not fixed. Surveying such a model begins with identifying the parameters of interest and sampling points within that parameter space. Contur provides a simple custom tool-set to facilitate this, currently limited to sampling a small number of parameters in a single scan.

The scanning functionality is implemented in the scan module, and user interaction is mostly controlled with the contur-batch executable. This executable requires three core user-defined components governing the behaviour of the scan:

• A run-information directory containing the required common files, such as the model definition and analysis lists. Preparing this directory is outlined in Section 4.1;
• A parameter card dictating how the parameters of interest should be sampled. The structure of this file is explained in Section 4.2;
• A template generator steering file. This depends on the MCEG being used, and is discussed in Appendices A.5 and F.

As an alternative to constructing scans of model parameters, specific parameter choices can be sampled manually. By calculating observables for a chosen set of parameters (demonstrated using Herwig in Appendix A.3), a file containing YODA analysis objects can be fed directly to the likelihood machinery described in Section 5. This allows manual sampling of a parameter space using the Contur likelihood machinery.
The list of observables to calculate is dependent on the available list of compiled Rivet analyses. As discussed in Section 3, this list can be augmented by the user and is subject to change depending on the Rivet version used. The contur-mkana command-line utility is called to generate some static lists of available analyses to feed into the MCEG codes. Specifically, Herwig-style template files are created in a series of .ana files, and a shell script to set environment variables containing lists of analyses useful for command-line-steered MCEGs is also written. After contur-mkana is invoked, re-sourcing the setupContur.sh script defines the necessary environment variables. The Herwig analysis list files will also now exist in data/share. Local model files and, if using Herwig, the analysis list files, should be copied to a subdirectory of the local run area (default name RunInfo) f. This subdirectory is then supplied to the main contur-batch executable via the grid command-line argument.

The parameter card is supplied to the contur-batch executable via the --param file command-line argument. The structure of this file is based on the input/output structure defined by the Python configObj package. Entries delineated by square braces define dictionaries named as the contained string. Double square braces denote a dictionary within the parent dictionary. The two main dictionaries steering the parameter sampler are Run and Parameters. An example parameter card with three model parameters is given in Listing 2.

f These files will be copied automatically by contur-batch if not already present.

Listing 2: An example
Contur configuration file for a model with three free parameters.

    [Run]
    generator = "/path/to/generatorSetup.sh"
    contur = "/path/to/setupContur.sh"

    [Parameters]
    [[x0]]
    mode = LOG
    start = 1.0
    stop = 10000.0
    number = 15

    [[x1]]
    mode = CONST
    value = 2.0

    [[x2]]
    mode = REL
    form = {x0}/{x1}
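The nesting rules of such a card can be illustrated with a minimal parser sketch. The real toolkit relies on the configObj package for this, so the hand-rolled function below (parse_param_card, an invented name) is purely illustrative of the square-brace structure:

```python
# Minimal illustrative parser for a Contur-style parameter card.
# The real toolkit uses the configObj package; this sketch only
# demonstrates the [section] / [[subsection]] nesting rules.
def parse_param_card(text):
    card, section, subsection = {}, None, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line:
            continue
        if line.startswith("[[") and line.endswith("]]"):
            subsection = line[2:-2]              # dictionary within the parent
            card[section][subsection] = {}
        elif line.startswith("[") and line.endswith("]"):
            section, subsection = line[1:-1], None
            card[section] = {}
        else:
            key, _, value = line.partition("=")  # plain key = value entry
            target = card[section][subsection] if subsection else card[section]
            target[key.strip()] = value.strip().strip('"')
    return card

example = """
[Run]
generator = "/path/to/generatorSetup.sh"
[Parameters]
[[x0]]
mode = LOG
start = 1.0
stop = 10000.0
number = 15
"""
card = parse_param_card(example)
```

All values are left as strings here; the real sampler would convert numeric fields as required by the declared mode.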
Two additional dictionaries are implemented, which allow the user to make processing more efficient by skipping certain points (using a block named SkippedPoints) g and scaling the number of events generated at each point (using a block named NEventScalings), since some points may need to be probed with more precision than others. The NEventScalings dictionary is only applied if the grid is submitted using the --variablePrecision option of contur-batch. Both the SkippedPoints and NEventScalings dictionaries can be added automatically to a parameter card using the contur-zoom utility, which is designed to help the user iteratively refine a parameter scan, and which is documented in Appendix B.6.
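The combined effect of these two blocks can be sketched schematically. The function and dictionary layout below are invented for illustration only and do not reflect the actual card syntax, which is documented in Appendix B.6:

```python
# Invented illustration of point skipping and per-point event scaling:
# points listed in the skip set are dropped entirely, and a scaling
# factor multiplies the base event count for the remaining points.
def events_per_point(point_ids, base_events, skipped, scalings):
    plan = {}
    for pid in point_ids:
        if pid in skipped:
            continue                                 # analogue of SkippedPoints
        plan[pid] = int(base_events * scalings.get(pid, 1.0))  # NEventScalings
    return plan

plan = events_per_point(["0000", "0001", "0002"], 30000,
                        skipped={"0001"}, scalings={"0002": 2.0})
```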
Run arguments
The
Run block is intended to control high-level steering of the parameter sampler. Two dictionary keys are defined in this block:

• generator, the path to a shell script that configures the necessary variables to set up the event generator;
• contur, the path to a shell script that configures the necessary variables to load the Contur package.

Both of these callable scripts are expected to set up the required software stack to execute the calculation of observables on a High Performance Computing (HPC) node.
Parameters arguments
Within the
Parameters dictionary, a series of sub-dictionaries (in double square braces) define the treatment of each parameter in the model. The string used as the name of this dictionary is the name of the parameter, and must also appear in the MCEG run-card template. The mode field defines the type of the parameter, and opens additional allowed fields modifying its behaviour. The available values for mode, with the sub-list detailing the unique additional parameters for each, are given below:

• CONST, a constant parameter.
  – value, a float with the value to assume for this parameter.
• LOG/LIN, a uniform logarithmically- or linearly-spaced parameter.
  – start/stop, floats giving the boundaries of the target sampled space for this parameter (note: start must be a smaller number than stop).
  – number, an integer number of values to sample in the range.
• REL, a relative parameter, defined with reference to one or more of the other parameters.
  – form, any mathematical expression that Python can evaluate using the eval() function of the standard library, where parameter names wrapped in curly braces, as seen in Listing 2, will be replaced by the value of that parameter before evaluating the expression. The name between braces must match exactly that of the parameter as specified in the Parameters block. For safety and efficiency, it is preferable (and often necessary) to use the DATAFRAME mode if complex mathematical expressions (i.e. anything beyond basic arithmetic operations) are required to generate the desired value for this parameter.
• SINGLE/SCALED, single string substitution. If the parameter name is "slha_file", provide a path to a single SLHA file as name, which will be treated as described in Appendix D.
• DIR, if using the SLHA specification, giving a directory containing the SLHA files as name. Each file in the directory will generate a separate run point with the parameters set accordingly.
• DATAFRAME, one can also provide a pandas DataFrame in a pickle file as name, which provides the parameters to vary and their values, one point for each row of the table. pandas DataFrame support is further documented in Appendix E.

With these tools, many parameters available in the model can be scoped in the Contur parameter sampler. The parameters whose mode is LOG, LIN or DATAFRAME are the scanned parameters, and the number of such parameters is the dimensionality of the scan. REL or CONST parameters are then ways to correctly set the additional parameters of the model. Any dimension of scan is technically possible, but typically only up to two or three parameters have been considered in physics studies using Contur. For high-dimensional scans, contur-export allows exporting results to a CSV file, so that alternative visualisation tools beyond contur-plot can be used (see Appendix B.2).

g In future, we intend to make further use of pandas DataFrame compatibility to provide such functionality more elegantly.
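To illustrate how the LOG, CONST and REL modes interact, the sketch below expands the specification of Listing 2 into a list of parameter points. expand_grid is a hypothetical helper written for this manual, not the Contur API:

```python
import math

# Hypothetical sketch: expand a parameter specification in the style of
# Listing 2 into a list of parameter-space points. Not the Contur API.
def expand_grid(spec):
    scanned = {n: s for n, s in spec.items() if s["mode"] in ("LOG", "LIN")}
    points = [{}]
    for name, s in scanned.items():
        lo, hi, num = s["start"], s["stop"], s["number"]
        if s["mode"] == "LOG":   # uniform in log10 of the parameter
            vals = [10 ** (math.log10(lo)
                           + i * (math.log10(hi) - math.log10(lo)) / (num - 1))
                    for i in range(num)]
        else:                    # LIN: uniform in the parameter itself
            vals = [lo + i * (hi - lo) / (num - 1) for i in range(num)]
        points = [dict(p, **{name: v}) for p in points for v in vals]
    for p in points:
        for name, s in spec.items():
            if s["mode"] == "CONST":
                p[name] = s["value"]
        for name, s in spec.items():
            if s["mode"] == "REL":
                # substitute {x} placeholders, then evaluate the arithmetic
                p[name] = eval(s["form"].format(**p))
    return points

spec = {
    "x0": {"mode": "LOG", "start": 1.0, "stop": 10000.0, "number": 15},
    "x1": {"mode": "CONST", "value": 2.0},
    "x2": {"mode": "REL", "form": "{x0}/{x1}"},
}
grid = expand_grid(spec)
```

The same curly-brace substitution is what places the numeric values into the generator template described in Section 4.3.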
To interface an MCEG with the
Contur parameter sampler, a template of the generator input has to be provided. This template file is supplied to the contur-batch executable via the --template file command-line argument.

Listing 3: Snippet of a Herwig input card for the same three free parameters as previously defined in Listing 2.

    read FRModel.model
    set /Herwig/FRModel/Particles/X:NominalMass {x0}
    set /Herwig/FRModel/FRModel:x1 {x1}
    set /Herwig/FRModel/FRModel:x2 {x2}
    insert HPConstructor:Incoming 0 /Herwig/Particles/u
    insert HPConstructor:Incoming 0 /Herwig/Particles/ubar
    insert HPConstructor:Outgoing 0 /Herwig/FRModel/Particles/X
    set HPConstructor:Processes SingleParticleInclusive

The parameters that are scoped in
Contur, as described in Section 4, are then substituted into this file, thus defining the generator run conditions. Following the example of Listing 2 for a Contur parameter file, a snippet of the matching Herwig input card is shown in Listing 3. Much of the syntax is Herwig-specific, and further discussion is left to the Herwig documentation. An important feature to notice is that the parameters are being defined in the Herwig FRModel (short for FeynRules model, the placeholder for a Herwig-parsed UFO model file). The names within curly braces match the parameter dictionary names in the parameter card file, allowing numeric values for each to be substituted in, following the rules defined in Section 4.2. Since this workflow is based upon string parsing and substitution, any event generator configuration that can be steered in a similar way can be substituted. In the example, the mass of the X particle has been scanned with the Contur sampler by varying the defined parameter, x0.

An example of the process definition is also included in Listing 3 for this toy model. In this example, the instruction given to the generator is to inclusively generate all processes producing the X particle. According to the Feynman rules in the parsed FRModel.model file, all allowed diagrams will be generated. This is the ideal generator running mode for
Contur, consistent with its inclusive philosophy. However, not all generators provide this inclusive option; also, in some cases it may be useful to focus on specific processes.

As motivated in Section 2.2, this generator setup should be set to generate signal-only contributions to the relevant observables. The statistical analysis detailed in Section 5 will treat the observables resulting from generator runs as additive signal contributions to the background model.

Specific worked examples of setting up the generator template for the default
Herwig event generator chain are given in Appendix A. Support for MadGraph and Powheg workflows is also implemented, and examples are presented in Appendix F.1 and Appendix F.2, respectively. The choice of generator is controlled by the --mceg command-line variable, defaulting to
Herwig.

The execution of observable calculation in Contur is realised in two steps: definition of the event-generation and observable-construction jobs, and execution of those jobs.

First, if .ana files are required and do not already exist locally, they will be automatically copied by contur-batch from $CONTUR_ROOT/data/share to the local RunInfo directory. Next, a run directory will be created (named myscan by default), with a subdirectory for each distinct set of run conditions (currently the three available LHC beam energies). In a dedicated subdirectory of each of these, for each point in the parameter grid, the sampler creates all the associated generator files, with the required commands to run the generator and the selected Rivet analyses. A shell script containing all the commands to execute the generator run from a fresh login shell is also written. An example scan directory is shown in Listing 4.

Listing 4: An example of the Contur formatted grid directory for a single beam energy and single-point scan.

    myscan00                   Parent scan directory
      ⟨beam energy⟩            Subdirectory for each requested beam energy
        sampled_points.dat     A flat file, listing all of the sampled points
        ⟨point⟩                Directory for each sampled point
          params.dat           Flat file with the chosen parameter point
          LHC.in               Generator template with substituted parameter values
          runpoint_000.sh      Shell script to execute

Next, the scripts which perform the calculations for each parameter point need to be executed. The contur-batch executable will automatically send each job to a HPC node.
Contur supports the
PBS , HTCondor and
Slurm batch systems, the one in current use being controlled by the --batch command-line argument. The default behaviour is to use PBS submission, where the queue name is controlled by the queue command-line argument.
Slurm differs from
PBS only in the use of the sbatch command-line tool in place of qsub, while the HTCondor system differs from the others in not having queues and in having to generate a job description file (JDF) for each scan point's condor batch call.

Alternatively, if the --scan-only command-line option is used, contur-batch will only generate the batch scripts (and JDFs if necessary) but not submit them, leaving detailed run control entirely to the user. In either mode, no batch-system management is performed by Contur once the jobs are running: for this you should use the suite of tools specific to your batch system (qstat, qdel, etc., or their Slurm or HTCondor equivalents).

The contur-batch executable also controls the number of events which are generated for each parameter point (using the --numevents option, defaulting to 30,000) h. During execution, once the generator has reached the requested number of events, the observables calculated by Rivet are stored as filled histograms in the
YODA histogram format. Each parameter-space point subdirectory in the grid, as shown in Listing 4, will have a corresponding
YODA file containing the calculated observables.

h In general this should correspond to an effective luminosity comparable to the luminosity of any statistically-limited measurements they are to be compared to. For many BSM models the cross sections are small, so this number of events is not enormous. Recent Contur publications have, for example, typically generated the default 30,000 events per set of parameter values.

Calculation of the CLs exclusion at a given point in parameter space requires the construction of a likelihood function for that point. The main analysis executable, called simply contur, is responsible for this task. Taking as input a series of calculated observables in YODA format, it can be run either on a single point in parameter space, or on a grid of points generated using contur-batch, in which case a map of the likelihood over the parameter points explored by the parameter sampler is constructed. This section describes the calculation of the likelihood for an individual point in parameter space.

As this is the main analysis component in
Contur, the functionality is implemented throughout the package modules. The core analysis classes are implemented in the factories module. The entry-point analysis class is named Depot, which should contain the majority of the relevant user-access methods. Several intermediate classes handle various aspects of the data flow, down to the lowest-level class defining the statistical method, Likelihood. The data module implements much of the interaction between Rivet and Contur, defining, for example, how to build covariance matrices. The run module implements the behaviour of the executable, and the interaction of this calculation with the rest of the modules.
A test statistic based on the profiled log-likelihood ratio can be written as

    t_\mu = -2\ln\lambda(\mu) = -2\ln\frac{L(\mu, \hat{\hat{\nu}}(\mu))}{L(\hat{\mu}, \hat{\nu}(\hat{\mu}))} ,   (1)

with µ being the parameter of interest (POI) and ν being the nuisance parameters. A single hat, e.g. ν̂, denotes the maximum-likelihood estimator for the parameter. A double hat, e.g. ν̂̂, denotes the conditional maximum-likelihood estimator for the parameter, conditioned on the assumed value of the POI. This test statistic can be used to construct a frequentist confidence interval on the POI. The convention in High Energy Physics (HEP) is to use the CLs prescription [115, 116], defined as a ratio of p-values,

    \mathrm{CL}_s = \frac{p_{s+b}}{1 - p_b} ,   (2)

with the p-values defined as

    p = \int_{t_{\mu,\mathrm{obs}}}^{\infty} f(t_\mu | \mu)\, \mathrm{d}t_\mu ,   (3)

where f(t_µ|µ) is the probability density function of the test statistic under an assumed POI. The test statistic in the asymptotic limit [117] can be approximated by

    t_\mu = \frac{(\mu - \hat{\mu})^2}{\sigma^2} + \mathcal{O}\!\left(\frac{1}{\sqrt{N}}\right) ,   (4)

with σ² the variance of the POI and N the data sample size. Likelihoods in HEP are often written as a Poisson distribution composed of three separate counts: the hypothesised signal count (s), the expected background count (b) and the observed count (n). The POI is now a signal-strength parameter. This likelihood is written as

    L(\mu) = \frac{(\mu s + b)^n}{n!}\, e^{-(\mu s + b)} .   (5)

The form of the test statistic in equation (4) can then be rewritten as

    \chi^2_\mu = \frac{((\mu s + b) - n)^2}{\sigma^2} - \frac{((\hat{\mu} s + b) - n)^2}{\sigma^2} ,   (6)

where σ² is now the variance of the counting test. As these are now standard χ² distributions, they are approximated by a normal distribution in the large-sample limit (via the central limit theorem). This model can be extended by incorporating nuisance parameters on the background model into the likelihood function given in equation (5), and by taking a product of multiple counting tests.
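As a toy numerical illustration of equation (6) for a single counting test (the numbers below are invented, and this is not the Contur implementation):

```python
from math import erf, sqrt

# Toy single-bin counting test statistic of eq. (6). The unconstrained
# best-fit signal strength mu_hat makes the second ("reference") term
# vanish when the background exactly matches the observation.
def chi2_mu(mu, s, b, n, sigma2):
    mu_hat = (n - b) / s
    return (((mu * s + b) - n) ** 2) / sigma2 \
         - (((mu_hat * s + b) - n) ** 2) / sigma2

# s = 10 signal events, b = 100 expected background, n = 100 observed,
# sigma^2 = 100 (Poisson variance of the background-sized count)
t = chi2_mu(1.0, s=10.0, b=100.0, n=100.0, sigma2=100.0)

# One-sided asymptotic p-value for the signal hypothesis, using the
# normal approximation (significance Z = sqrt(t))
p_sb = 0.5 * (1 - erf(sqrt(t) / sqrt(2)))
```

Here t = 1, i.e. the injected signal sits one standard deviation above the data, giving a one-sided p-value of about 0.16: nowhere near exclusion for a single bin.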
A likelihood for a single histogram with i bins and j sources of correlated background nuisance in each bin can be written as

    L(\mu, \vec{\nu}) = \prod_i \frac{(\mu s_i + b_i + \sum_j \nu_{i,j})^{n_i}}{n_i!}\, e^{-(\mu s_i + b_i + \sum_j \nu_{i,j})} \prod_j \exp\!\left(-\tfrac{1}{2}\, \vec{\nu}_j^{\,T} \Sigma_j^{-1} \vec{\nu}_j\right)   (7)

                    = \prod_i \mathrm{Pois}\!\left(\mu s_i + b_i + \sum_j \nu_{i,j} \,\Big|\, n_i\right) \prod_j \mathrm{Gauss}_{iD}(\vec{\nu}_j \,|\, 0, \Sigma_j) ,   (8)

with Σ_j being the covariance matrix for each correlated source of nuisance, and ν⃗_j being the corresponding vector of the correlated nuisance parameters across the bins. In this case there are now multiple sources of nuisance common to each counting test, or bin, in the histogram. In this example there would be j different sources of nuisance, so there are j constraint terms. The constraints are now i-dimensional Gaussians, to account for the covariance of each nuisance parameter between bins. The sources of nuisance can be profiled by maximising the log likelihood for the hypothesised µ.

The practical implementation of this has relied on the inclusion of the uncertainty breakdown into the YODA reference data files included with
Rivet. In the asymptotic regime, maximising the log likelihood is equivalent to minimising the χ². The minimisation itself, and the handling of the covariance information, are achieved with the help of the SciPy [118] statistics package along with NumPy [119] for array manipulation. Assuming each individual uncertainty arising from a common named source in the reference data is fully correlated, the correlation matrix for each source of uncertainty can be built. This gives the set of Σ_j matrices needed to maximise the likelihood. Minimising the χ² for all nuisances simultaneously gives the requisite conditional maximum-likelihood estimators, ν̂̂_{i,j}, required to form the profile likelihood as given in equation (1). Following similar asymptotic arguments, a CLs confidence interval can be calculated with the full set of nuisances suitably profiled. As an example, the test statistic in the limiting case leading to equation (6), for two counting tests with one correlated source of nuisance, can be written as

    \chi^2_\mu = \begin{pmatrix} \mu s_1 + b_1 + \hat{\hat{\nu}}_1 - n_1 \\ \mu s_2 + b_2 + \hat{\hat{\nu}}_2 - n_2 \end{pmatrix}^{\!T} \cdot \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}^{\!-1} \cdot \begin{pmatrix} \mu s_1 + b_1 + \hat{\hat{\nu}}_1 - n_1 \\ \mu s_2 + b_2 + \hat{\hat{\nu}}_2 - n_2 \end{pmatrix} .   (9)

For more complex cases, the sum of the covariance matrices built from each named uncertainty gives the full covariance matrix between bins, which can be used to calculate the likelihood combining all bins in a histogram. Note that, after using the breakdown of the total uncertainty into its component sources to profile the nuisances, the resulting total covariance (\Sigma = \sum_j \Sigma_j) between bins can be used to construct the test. This test statistic omits the second "reference" µ̂ term seen in equation (6); this term is trivial when running the default mode of generating background models from data, but it does need full treatment in a similar manner when extending to non-trivial background models (see Section 5.3 for a more detailed discussion of background models).

The --correlations flag can be set in the contur executable to enable the calculation to use the correlation information where it is available.
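A minimal numerical illustration of a correlated test of the form v^T Σ^{-1} v, as in equation (9), for two bins (the function and the numbers are invented for illustration; the real implementation uses NumPy/SciPy for arbitrary dimensions):

```python
# Illustrative two-bin correlated test statistic: v^T Sigma^{-1} v,
# with the 2x2 covariance matrix inverted by hand. Invented numbers;
# not the Contur implementation.
def chi2_correlated(v, cov):
    (s11, s12), (s21, s22) = cov
    det = s11 * s22 - s12 * s21            # 2x2 determinant
    inv = ((s22 / det, -s12 / det),
           (-s21 / det, s11 / det))        # closed-form 2x2 inverse
    return sum(v[i] * inv[i][j] * v[j]
               for i in range(2) for j in range(2))

# With a diagonal (uncorrelated) covariance this reduces to a sum of
# independent chi^2 terms: 3^2/9 + 4^2/16 = 2
t_diag = chi2_correlated((3.0, 4.0), ((9.0, 0.0), (0.0, 16.0)))

# Positive off-diagonal terms change the result, illustrating why
# neglecting correlations alters the computed exclusion
t_corr = chi2_correlated((3.0, 4.0), ((9.0, 6.0), (6.0, 16.0)))
```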
The default behaviour of Contur is, however, not to build the correlations between counting tests, and hence to fall back to collecting the single bins with the largest individual CLs to represent each histogram. This is because the full use of correlations can make the main Contur run over a large parameter grid quite slow, due to the nuisance-parameter minimisation step. Various command-line options are provided to speed up convergence of the fit, or to apply a minimum threshold on the size of the error sources considered, all of which may speed up the process without significantly affecting the result. Neglecting the systematic uncertainty correlations entirely, and falling back on the "single bin" approach for all histograms, is very fast and in most cases gives a result reasonably close to the full exclusion, albeit with more vulnerability to binning effects. The user is encouraged to experiment with these settings, perhaps neglecting correlations for initial scoping scans, and reserving the full correlation treatment for final results.

Currently there is no functionality to correlate named systematics between histograms, which might in principle allow the combination of all bins in an entire analysis, for example. For most purposes, correlating a given histogram gives the required information. Combining different histograms then amounts to taking a product of likelihoods of the form of equation (7), where the histograms chosen for combination are deemed to be sufficiently statistically uncorrelated. How these likelihood blocks are chosen to "safely" minimise correlations when combining histograms is described in the following section.
In Section 3.1 the division of available Rivet analyses into pools was shown. As described in Section 3, the data and simulation used for comparison come in the form of YODA objects from the relevant HepData entries. Some of these carry information about the correlations between systematic uncertainties. A likelihood of the form given in equation (7) can be used to calculate a representative CLs for each histogram.

There are also overlaps between the event samples used in many different measurements, which lead to non-trivial correlations in the statistical uncertainties. To avoid spuriously high exclusion rates due to multiple counting of a single unusual feature against several datasets, an algorithm is used to combine histograms safely. To represent this, a pseudo-code realisation of the three main components of the algorithm is given in Listing 5, starting with an imagined function, Likelihood, which builds a likelihood function of a form similar to that described in Section 5.1 from the input histogram(s), and returns a computed CLs value. The stages to combine all the available information into a full likelihood are realised as follows:

1. Calling BuildFullLikelihood loops through the defined pools in Contur, and calls EvaluatePool on each pool.
2. Within each pool, work through all histograms, calling EvaluateHistogram on each.
Listing 5: Pseudo-code implementation of the three components of the pool-sorting algorithm.

    function BuildFullLikelihood():
        for Pool in ConturPools:
            tests.append(EvaluatePool(Pool))
        Concatenate(tests)
        return Likelihood(tests)

    function EvaluatePool(Pool):
        for Histogram in Pool:
            scores.append(EvaluateHistogram(Histogram))
        Concatenate all orthogonal Histograms
        return Histogram with max(scores)

    function EvaluateHistogram(Histogram):
        if Histogram has correlation and (build_correlation == True):
            return Likelihood(Histogram)
        else:
            return max(Likelihood(Histogram.bins))
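The control flow of Listing 5 can be sketched concretely as follows. The likelihood stand-in here simply returns the largest bin "score", and the pool contents are invented, so this illustrates only the pool/histogram traversal, not the real statistics:

```python
# Toy realisation of the Listing 5 algorithm. 'likelihood' is a
# placeholder returning a fake CLs-like score (the maximum bin value);
# the pool and histogram data are invented for illustration.
def likelihood(bins):
    return max(bins)                    # stand-in for a real CLs computation

def evaluate_histogram(hist):
    return likelihood(hist["bins"])     # "single bin" fallback of Listing 5

def evaluate_pool(pool):
    # keep only the most discrepant histogram in the pool, which is
    # conservative in a limit-setting context
    return max(evaluate_histogram(h) for h in pool)

def build_full_likelihood(pools):
    # pools are treated as statistically independent, so their
    # representative tests can simply be collected together
    return [evaluate_pool(p) for p in pools]

pools = [
    [{"bins": [0.1, 0.5]}, {"bins": [0.2, 0.3]}],   # one analysis pool
    [{"bins": [0.05, 0.6]}],                        # an orthogonal pool
]
tests = build_full_likelihood(pools)
```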
3. Depending on the desired behaviour, EvaluateHistogram either builds the correlation matrix where possible, returning the CLs of the full correlated histogram, or defaults to finding the bin within the histogram with the maximum discrepancy, returning this to EvaluatePool.
4. Now, with each histogram evaluated, Concatenate can be called, combining orthogonal counting tests within the pool where allowed. Where a single bin has been used (if no correlation information is requested or found), the histogram is reduced to this single-bin representation. The histogram (or concatenated histogram) with the largest CLs within the pool is returned to BuildFullLikelihood.
5. Now that the representative histogram from each pool has been appended to a list, this list can also be concatenated. The bins (or bin) extracted from each pool are treated as uncorrelated counting tests, with a block-diagonal correlation matrix between the pools. The representative CLs forming the full likelihood can then be returned.

While selecting the most significant deviation within each pool sounds intuitively suspect, in this case it is a conservative approach: operating in the context of limit setting means that discarding the less-significant deviations simply reduces sensitivity.

The formal argument for a test statistic based on the profile likelihood ratio was given in Section 5.1. An alternative would be to use a test statistic based on a simple hypothesis likelihood ratio between the signal (µ = 1) and no-signal (µ = 0) hypotheses. Such a test statistic could be written, in a similar form to equation (6), as

    \chi^2_\mu = \frac{((\mu s + b) - n)^2}{\sigma^2} - \frac{(b - n)^2}{\sigma^2} ,   (10)

where the background model, b, can have nuisance parameters included in a similar fashion, which can in turn also be profiled. In the case that the modelled background value, b, approaches the observed count, n, the two forms of the test statistic converge. This is equivalent to the statement that, as the most likely signal strength (µ̂) tends to zero, the "reference" values in the χ² test statistics both tend to zero. In this limiting case, the argument followed that the form of the test statistic omitting these reference values, given in equation (9), was sufficient. The example Rivet plot shown in Figure 3 illustrates a signal model appearing in a region of a generated histogram that would, however, yield different CLs intervals from the two constructions.
If the resonance in this spectrum were instead to appear in one of the regions where the theoretical expectation closely matches the data, the two forms would largely coincide.

The default mode of running Contur is to generate the background model from the data, and with this the coincidence of the two forms of the test statistic is guaranteed. Typically, state-of-the-art theoretical predictions are not automatically provided alongside measurement data. If such data are provided (as detailed in Section 3.2), then invoking the --theory command-line option in the contur executable will load and use these data where appropriate. As extensive use of theoretically-generated background models has not yet been made in any physics studies, the default implemented behaviour is to report the CLs resulting from a direct hypothesis test, essentially as written in equation (9). When more use is made of theoretically-generated "non-trivial" background models in physics studies, it is intended to report both forms of the test statistic as standard. In cases where the non-trivial background model is known to poorly model the data, such as in Figure 3, it is expected that the two forms of the test statistic would start to significantly diverge. The combination of a full profile likelihood with correlated nuisances will enable sophisticated physics studies; however, it is expected that the current standard based on a simple 'direct' hypothesis test will remain useful for a range of fast, pragmatic studies.

5.4 Contur likelihood analysis
The method described thus far in this section is handled automatically in the contur executable. Either a single YODA file, or a directory containing a structured grid steered by the parameter sampler (as described in Section 4.4), can be supplied to this executable with the --grid command-line argument. In the former case, a summary file is written, which may then be processed by the contur-mkhtml script to produce a web page summarising the exclusion and displaying all the tested Rivet histograms, highlighting those which actually contributed to the likelihood. In the latter case, the grid will be processed point-wise, evaluating the full likelihood at each parameter point that has been sampled. The resultant grid of evaluated likelihoods is written into a .map file, which contains a serialised instance of the Depot class. This is written out using the standard-library pickle functionality and can be read and manipulated for further processing. The Pydoc documentation describing the details of this class is linked from the main Contur repository. The executable implements a number of high-level control options for vetoing analyses and controlling the statistical treatment.
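The .map reading described above follows the standard pickle pattern. A minimal sketch, using a stand-in dictionary in place of a real Depot instance (whose attributes are documented in the Pydoc pages), since the round-trip mechanics are the same:

```python
import pickle

# Stand-in object: a real .map file deserialises to a Contur Depot
# instance rather than a dict, but the load pattern is identical.
stand_in = {"points": [{"mXm": 10.0, "mY1": 10.0, "CLs": 0.97}]}

with open("example.map", "wb") as f:
    pickle.dump(stand_in, f)

# Reading the serialised object back for further processing:
with open("example.map", "rb") as f:
    depot = pickle.load(f)

for point in depot["points"]:
    print(point["CLs"])
```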
The .map files described in the previous section contain the Contur likelihood analysis for a sampled collection of points. The core plotting tools that interact with these files are described in this section. There are multiple auxiliary tools to aid visual understanding of the .map files, which are detailed in Appendix B. The core plotting library, which is built upon matplotlib [122, 123], is implemented in the plot module, and user interaction with this module is driven by the contur-plot executable. This executable requires three arguments: the .map file generated from the main contur executable, and the names of two parameters on which to draw the axes. Visualisation is limited to two dimensions, but if more than two dimensions were scanned, then multiple 2D plotting instances can be invoked. The names of the requested parameters should match what they were initially called in the parameter sampler (see Section 4.2). The main default visualisation of the likelihood space is demonstrated in Section 6.1. Some methods to interface additional information (such as exclusion contours from other tools) into the default visualisation tools are reviewed in Section 6.2.

Figure 3: A comparison of a generic axion-like-particle model [120] producing a 50 GeV resonance in a diphoton mass measurement [47]. An NNLO QCD diphoton background prediction is shown in green [121] for comparison. Figure 3a shows the effect of using the theoretically calculated background model, with Figure 3b showing the background model generated from data.
The sensitivities calculated by Contur for each grid point can be expressed as 2D heatmaps, for the overall sensitivity or for each pool separately. The heatmaps indicate where the considered signal model can be excluded due to existing LHC measurements available in Rivet, and which part of the phase space is still open. The per-pool heatmaps give more detailed insights into where a specific pool contributes, allowing conclusions to be drawn on the production processes and decay modes involved. An overview of how the individual pools' sensitivities compare to each other is provided by plotting the dominant pools: Contur then shows, colour-coded, which pool has the highest sensitivity for a given grid point, in the same plane as for the heatmaps. Finally, Contur also provides exclusion contours at 68% and 95% confidence level, as interpolated from the 2D sensitivity grids. Examples of these types of output plots can be found in Figure 4. Further information about available options can be found in Appendix B.7.
The default grid visualisation tools described in Section 6.1 provide two methods to supply additional data, allowing the creation of additional grids to overlay on the native Contur grid. Both methods use the Python importlib package, defining a series of Python functions to import via command-line arguments to the contur-plot executable. Both methods require user-defined functions that take as an argument a Python dictionary of the parameters, named as specified in the scan (see Section 4). Both methods are expected to return a pseudo-"exclusion" value specifying the exclusion at the requested point in parameter space. The values are expected to be set such that negative numbers are allowed and positive numbers are excluded (i.e. the contour is drawn at the level set of zero); it is generally advised to use the "distance" of the value from 0 to accurately fit the contour. The two methods, and examples of functions expected for the two formats, are given in the following subsections.
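To illustrate why a signed "distance from zero" fits the contour better than a binary excluded/allowed flag, the sketch below locates the zero crossing of pseudo-exclusion values along one axis by linear interpolation. This is a hypothetical helper for illustration, not part of Contur:

```python
def zero_crossing(xs, vals):
    """Find where a signed pseudo-exclusion value crosses zero along one
    axis, interpolating linearly between adjacent sampled points."""
    for (x0, v0), (x1, v1) in zip(zip(xs, vals), zip(xs[1:], vals[1:])):
        if v0 == 0.0:
            return x0
        if v0 * v1 < 0.0:
            # Linear interpolation between the two sign-changing samples
            return x0 + (x1 - x0) * (-v0) / (v1 - v0)
    return None  # no boundary inside the sampled range

# Signed values in the style of Listing 6 (p1 / p2 - 0.5), here with
# p2 fixed to 50, so the exclusion boundary sits at p1 = 25.
xs = [10.0, 20.0, 30.0, 40.0]
vals = [x / 50.0 - 0.5 for x in xs]
boundary = zero_crossing(xs, vals)
```

With a binary flag the boundary could only be placed at a sampled point; the signed value pins it between samples.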
The first method of adding additional data to a plot is invoked by supplying a file containing the functions with the command-line argument -d (or --data) NameOfFile. These functions define user-supplied grids, allowing arbitrary numbers of points within the space to be considered. These can be read from additional data sources within the supplied file, or simply used to calculate analytic constraints at a much higher resolution than in the Contur scan. The function should return a tuple with the first argument being the list of parameter-space dictionaries of the new considered points, and the second argument being a list of floats of the pseudo-exclusion values. An example data function is shown in Listing 6.

Figure 4: Contur 2D heatmaps for a Dark Matter vector mediator model, in the vector mediator mass versus dark matter mass plane. The 95% CL (solid white line) and 68% CL (dashed white line) exclusion limits are superimposed. The plot on the right shows the breakdown into the most sensitive analysis pool for each scan point.
The second method of adding additional data to a plot is invoked by supplying a file containing the functions with the command-line argument -t (or --theory) NameOfFile. These functions recycle the existing Contur sample of points, to evaluate functions on a grid of the same resolution as the Contur scan. The function should return a float of the pseudo-exclusion value. Internally, Contur will then evaluate this function on the Contur grid. An example function is shown in Listing 7.
Listing 6: An example user-defined data function.

    def ExampleDataFunction(paramDict):
        import numpy as np
        pts = []
        vals = []
        p1_axes = np.linspace(10, 1000., 1)
        p2_axes = np.linspace(10, 100., 1)
        for p1 in p1_axes:
            for p2 in p2_axes:
                temp = dict.fromkeys(paramDict)
                temp["contur_p1_name"] = p1
                temp["contur_p2_name"] = p2
                pts.append(temp)
                vals.append(p1 / p2 - 0.5)
        return pts, vals

Listing 7: An example user-defined theory function.

    def ExampleTheoryFunction(paramDict):
        p1 = float(paramDict["contur_p1_name"])
        p2 = float(paramDict["contur_p2_name"])
        return p1 / p2 - 0.5

This manual accompanies the release of Contur v2, which is the first public-facing version. In this document, the method and the structure of the Contur package were set out, the core functionality of the Contur code was described, and the motivations behind key design choices were given. On a more philosophical note, the objective of the Contur package is to allow the HEP community easily to re-use the LHC analyses preserved in Rivet and HepData to derive exclusions on new physics models. These analyses, the bulk of which are particle-level measurements of SM processes, are often highly model-independent, and can be used to rule out models which would have interfered with otherwise well-understood spectra. The fact that models can be tested programmatically, making use of the runnable code snippets in Rivet which encapsulate their fiducial regions, implies that large regions of parameter space can be probed with minimal "hands-on" effort from analysts. The authors believe that this ability to interrogate existing LHC data directly, rather than construct a new search for each new model proposed by theorists, is a key step in the necessary paradigm shift in HEP from "top-down" (theory-driven) to "bottom-up" (data-driven), which is being brought about by the proliferation of candidate new-physics models, in the face of increasingly large datasets and corresponding pressure on computing and human resources. The Contur developers are always happy to receive feature requests, and new members of the team are welcome to contribute.
Acknowledgements

The authors would like to thank David Grellscheid, Michael Krämer and Björn Sarrazin for helping to realise the first Contur proof of principle. We would also like to thank all the colleagues (students, post-docs and academics) who have used development versions of Contur and helped us to develop and validate the code in the process.

Funding information MA, AB, JMB, DY and DH from the European Union's Horizon 2020 research and innovation programme as part of the Marie Skłodowska-Curie Innovative Training Network MCnetITN3 (grant agreement no. 722104). AB, JMB, LC and BW from UKRI STFC consolidated grants for experimental particle physics at UCL and Glasgow, and DY from an STFC studentship. AB from the Royal Society University Research Fellowship scheme, grant UF160548.
A Example Contur study with Herwig

Contur supports various event generators, as documented in Appendix F, but the default choice is Herwig. This is because Herwig features an inclusive event generation mode, where one can very easily generate all 2 → 2 processes involving the new particles of a model. The example study below uses the DM vector mediator UFO file.
A.1 Use of Docker

As documented in Appendix B.1, the user may find it convenient to run Contur within a Docker container on a local machine. While this avoids a formal installation of Contur's dependencies, it does also prohibit the user from submitting jobs to an HPC cluster: for this, one needs to do a full installation on the relevant cluster. Nonetheless, one can still generate events, run Contur on individual parameter points, and analyse the results of Contur scans which have been performed elsewhere. If the user wishes to use a Docker container to run this example, they can follow the commands in Listing 8 and proceed with the rest of the example.
Listing 8: Initiating a Docker session

    $ docker pull hepstore/contur-herwig:latest
    $ docker run -it hepstore/contur-herwig
    $ [container] setupContur.sh

A.2 Setting up the run area

The model used in this example is one of many pre-loaded example UFO files and associated templates that come with the Contur package. These can be found in the contur/data/Models directory. The first step is to make a work area, and to copy into it a template RunInfo directory as well as the model's UFO files. Once this has been done, one needs to convert the UFO to the Herwig format, and compile it. This will render the model readable by Herwig. Listing 9 shows the steps to set up a run area for a DM vector mediator model.

Listing 9: Compiling a UFO file with Herwig

    mkdir runarea
    cd runarea
    cp -r $CONTUR_ROOT/data/share RunInfo
    cd RunInfo
    cp -r $CONTUR_ROOT/data/Models/DM/DM_vector_mediator_UFO .
    ufo2herwig DM_vector_mediator_UFO/
    make
A.3 Event generation

In Herwig, an EventGenerator object is built to generate events. The configuration for this object is done in a Herwig input file (see Listing 10 for an example), with filename extension .in. In Contur these files are usually called LHC.in, and for each example model Contur provides there is an associated example input file in the model directory. A recommended starting point for Contur studies is to complete a single run of Herwig on the chosen model.

The LHC.in file needs to be customised for a particular model, by specifying the values of the parameters in the UFO file one is considering. This might mean setting particle masses, coupling strengths or other model parameters. The input file should therefore contain lines like those in Listing 10, customised to the parameters of a given model. One should also specify which BSM particles should be considered during event generation (either as outgoing or intermediate particles), and add the setting to inclusively generate all processes involving that particle. Finally, one needs to tell Herwig to pipe the generated events into Rivet, so that they can be analysed directly and used in the Contur workflow. For batch runs, Contur's steering code automatically appends these lines.

Listing 10: An example of how to set up a configuration file for Herwig

    set /Herwig/FRModel/Particles/Y1:NominalMass 120
    set /Herwig/FRModel/Particles/Xm:NominalMass 130
    set /Herwig/FRModel/FRModel:gYq 0.1
    set /Herwig/FRModel/FRModel:gYXm 0.3
    ...
    insert HPConstructor:Outgoing 0 /Herwig/FRModel/Particles/Y1
    insert ResConstructor:Intermediates 0 /Herwig/FRModel/Particles/Y1
    set HPConstructor:Processes SingleParticleInclusive
    ...
    set EventGenerator:EventHandler:LuminosityFunction:Energy 13000
    create ThePEG::RivetAnalysis Rivet RivetAnalysis.so
    insert EventGenerator:AnalysisHandlers 0 Rivet
    read 13TeV.ana
    saverun LHC EventGenerator

After setting up the input file, one can simply read it and generate events with Herwig. For the case of the DM vector mediator model, a template Herwig configuration file for a single run is provided in the model directory. Listing 11 shows the steps to use this template for event generation. First one must copy the template file into the run area. Next, the Herwig read step reads and builds the event generator from the configuration file LHC.in, and the Herwig run line tells the event generator to generate 200 events. Note that the Herwig run card, LHC.run, is the output of the read step. If successfully run, the commands of Listing 11 will produce the file LHC.yoda containing the results of the Herwig run. Note that the commands here will read the analysis listing file from the installed area. If you wish instead to read a local version, modify the read command's -I argument to point to that instead.

Listing 11: Generating events with Herwig

    cd runarea
    cp RunInfo/DM_vector_mediator_UFO/LHC-test.in LHC.in
    Herwig read LHC.in -I RunInfo -L RunInfo
    Herwig run LHC.run -N 200

A.4 Running Contur on a single YODA file
Following the steps of the previous section, where an LHC.yoda file was produced from the DM vector mediator model, the second command in Listing 12 tells Contur to analyse it. The computed exclusion will be printed to the terminal. Additional options when running Contur are also available, and can be accessed through contur --help.

Once Contur has successfully analysed the YODA file, an ANALYSIS folder is made, and an exclusion for the model at the specified parameter points is printed alongside some other information about the run. Following along the case study for the DM vector mediator model, the printed exclusion corresponds to the parameter values defined in Listing 10. It is often useful to plot the relevant Rivet histograms from the Contur run to get a better idea of the underlying physics in the calculated exclusion. The third line of Listing 12 runs contur-mkhtml on the analysed YODA file, which generates a contur-plots directory that contains all the histograms, alongside an index.html file to view them in a browser. An example of the output is shown in Figure 5.

Listing 12: Running Contur on a single YODA file

    cd runarea
    contur LHC.yoda
    contur-mkhtml LHC.yoda

Listing 13: Setting up to run a Contur batch job

    cd runarea
    cp RunInfo/DM_vector_mediator_UFO/LHC-example.in LHC.in
    cp RunInfo/DM_vector_mediator_UFO/param_file.dat .

Listing 14: An example of how to set up a configuration file for Herwig batch runs

    set /Herwig/FRModel/Particles/Y1:NominalMass {mY1}
    set /Herwig/FRModel/Particles/Xm:NominalMass {mXm}
    set /Herwig/FRModel/FRModel:gYXm {gYXm}
    set /Herwig/FRModel/FRModel:gYq {gYq}
    ...
    insert HPConstructor:Outgoing 0 /Herwig/FRModel/Particles/Y1
    insert HPConstructor:Outgoing 0 /Herwig/FRModel/Particles/Xm
    set HPConstructor:Processes SingleParticleInclusive
A.5 Setting up for Contur batch jobs (HPC system)

This step cannot be run from within a Docker container, as it requires access to an HPC system. Instead, one should use a full installation on an HPC cluster.
The next step is essentially to repeat the procedures described in Appendix A.3 and Appendix A.4, and complete a series of runs at various model parameter points, so that an exclusion for the model in the parameter space can be drawn as a 2D heatmap. This can be done efficiently using Contur's automated batch-job submission functionality. Here we assume that qsub is available on your system; it is the default choice for Contur. Slurm and HTCondor batch systems are also supported, as described in Section 4.4.

To set up Contur for batch-job submission, the user must tell Contur what region of parameter space to sample. To do this, the user should replace the nominal mass of dark matter (Xm), the nominal mass of the vector mediator (Y1), and the couplings gYq and gYXm in Listing 10 by arbitrary variables. A steering file, param_file.dat, will specify what values to set for each parameter. Template files are available in the data/share directory and can be copied to runarea as per Listing 13.

The newly-copied LHC.in should resemble Listing 14, where the values for the model parameters have been replaced with their respective variables inside curly brackets. Also notice that this version of the Herwig input file is missing the commands that specify the beam energy, run the Rivet pipeline, and save the event generator (cf. Listing 10). These lines will be added automatically by Contur for each beam energy when the batch jobs are submitted.

The user should then modify the steering file param_file.dat to look like Listing 15, replacing the placeholder paths for Herwig and Contur under the Run heading with local paths. The free parameters of our DM vector mediator model are listed under Parameters. The variable name for each parameter must match those in Listing 14 for Contur to recognise the parameter and substitute in the correct value.
Listing 15: Contur steering file for the DM vector mediator model

    [Run]
    generator = '/path/to/generatorSetup.sh'
    contur = '/path/to/setupContur.sh'

    [Parameters]
    [[mXm]]
    mode = LIN
    start = 10.0
    stop = 1610.0
    number = 10
    [[mY1]]
    mode = LIN
    start = 10.0
    stop = 3610.0
    number = 10
    [[gYXm]]
    mode = CONST
    value = 1.0
    [[gYq]]
    mode = CONST
    value = 0.25

Listing 16: Submitting a Contur batch job with 1000 events per point to the mediumc7 batch queue (queue names will of course depend on your local cluster)

    cd runarea
    contur-batch -n 1000 -q mediumc7

For each parameter, the mode for sampling must be specified. For this example, the particle masses of the dark matter candidate and the vector mediator are set to the LIN linear mode. The start and stop options indicate the sampling range in GeV, and number = 10 tells Contur to sample 10 points in this range in a linear fashion. The couplings of the vector mediator to dark matter and to quarks are set to the CONST constant mode, with values 1.0 and 0.25 respectively.

The user is now ready to submit a batch job over the specified range of parameter points, following the commands of Listing 16 in the runarea directory. This will create a directory called myscan00 which contains three directories corresponding to the beam energies 7, 8, and 13 TeV. Inside each beam-energy directory there will be 100 run-point subdirectories that correspond to your specified range in param_file.dat. Inspecting a few of these reveals the Herwig input files generated, and the shell scripts (runpoint_xxxx.sh). These shell scripts can be submitted manually, or run locally in your terminal for troubleshooting.
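The sampling arithmetic behind this steering file can be sketched in a few lines of Python, showing where the 100 run points per beam energy come from. This is an illustrative sketch, not Contur's own sampler code, and it assumes LIN mode spaces points linearly with inclusive endpoints (numpy.linspace-style):

```python
import itertools

def sample_axis(spec):
    """Expand one [[parameter]] block into its sampled values: LIN mode
    spaces `number` points linearly from `start` to `stop` inclusive;
    CONST mode pins a single value."""
    if spec["mode"] == "CONST":
        return [spec["value"]]
    if spec["mode"] == "LIN":
        n = spec["number"]
        step = (spec["stop"] - spec["start"]) / (n - 1)
        return [spec["start"] + i * step for i in range(n)]
    raise ValueError("unsupported mode: " + spec["mode"])

params = {  # transcription of Listing 15
    "mXm":  {"mode": "LIN", "start": 10.0, "stop": 1610.0, "number": 10},
    "mY1":  {"mode": "LIN", "start": 10.0, "stop": 3610.0, "number": 10},
    "gYXm": {"mode": "CONST", "value": 1.0},
    "gYq":  {"mode": "CONST", "value": 0.25},
}
axes = {name: sample_axis(spec) for name, spec in params.items()}
# Cartesian product: 10 x 10 x 1 x 1 = 100 run points
points = [dict(zip(axes, combo)) for combo in itertools.product(*axes.values())]
```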
A.6 Running Contur on a grid

Once the batch jobs have successfully finished, each runpoint directory in myscan00 should have produced a YODA file. Each YODA file will be named according to the scan point, for example LHC-S101-runpoint_0001.yoda. Some functionalities are provided by contur-gridtool, which performs various manipulations on a completed Contur grid and can be useful for troubleshooting. For example, running contur-gridtool -c myscan00 runs a check to see whether all grid points have valid YODA files. See Section B.3 for full details of how to use it.

Listing 17 runs Contur with the -g or --grid option, which means statistical analysis is to be performed on the specified grid myscan00. The output of this step is a directory named ANALYSIS, inside which a contur.map file of the corresponding grid is produced. You may wish to create a shell script for this step and submit it to the batch system for larger grids.

Listing 17: Running Contur on a grid

    cd runarea
    contur -g myscan00

Listing 18: Plotting 2D heatmaps with Contur

    cd runarea/ANALYSIS
    contur-plot contur.map mXm mY1
A.7 Making Contur heatmaps

The last step in a Contur sensitivity study is the visualisation of the computed limits in the form of 2D sensitivity heatmaps. Once Contur has successfully run and produced a contur.map file, one can run the contur-plot command on it while specifying the variables to plot. In the example in Listing 18, the mass of the dark matter particle mXm is on the x-axis and the mass of the vector mediator mY1 is on the y-axis. The output plots are shown in Figure 4.

B Contur tools and utilities
The Contur package provides several tools and executables to assist the user in the preparation, manipulation and visualisation of Contur results. These tools are documented below.
B.1 Contur Docker containers

Containerisation of software packages with Docker provides a convenient way to bundle a piece of software with all its dependencies, so that one can forego a formal installation and run the software simply by downloading and entering the container. This also allows fast and trouble-free deployment across operating systems. The Contur developers maintain two types of container, which are regularly updated on DockerHub:

• hepstore/contur is a container which includes the latest version of Contur along with all its dependencies except for the Herwig MCEG. This is useful for users who do not wish to generate events, but instead analyse the results of existing scans performed elsewhere, or make use of the visualisation tools. It may also be of use for expert users who wish to use the container as a base and install other MCEGs on top.

• hepstore/contur-herwig is a container which includes the latest version of Contur along with all its dependencies, including the Herwig MCEG. This is heavier than the hepstore/contur container, but allows the user to generate events.

These containers can be downloaded from the command line using, for example, docker pull hepstore/contur-herwig:latest, where the tag after the colon can be replaced by another keyword to download a particular version.

A limitation of running Contur via a Docker container is that one does not typically have access to an HPC from a local machine. Furthermore, HPC clusters often do not support jobs running through a Docker container. Therefore, it is usually not possible to submit scans to HPC clusters if using Contur via a container.

If the user wishes to build their own container locally, they can make use of the Dockerfiles which are provided in docker/*/Dockerfile. In addition to the Dockerfiles used for the above-listed containers, a contur-dev Dockerfile is provided, for users wishing to access the development branch of Contur. A detailed example of how to download or build one of the Contur containers is provided in Listing 19.

Listing 19: Examples of how to download and run Contur via a Docker container, assuming Docker is installed on the machine (instructions to install Docker on a variety of operating systems are available online).

    $ docker pull hepstore/contur-herwig:latest
    $ docker run -it -p 80:80 -v path/to/useful/directory:/mydir contur-herwig
    $ [container] source setupContur.sh
    $ [container] ...
    $ [container] exit
    $ git clone https://gitlab.com/hepcedar/contur.git
    $ cd docker/contur
    $ docker build -t contur .
    $ docker run -it -p 80:80 -v path/to/useful/directory:/mydir contur
B.2 Exporting the results of a Contur statistical analysis to a CSV file using contur-export

The .map files containing the Contur likelihood analysis for a sampled collection of points, described in Section 5.4, can also be exported to a CSV file by using the contur-export command, with the -i and -o flags specifying the input and output paths for the map and CSV files respectively. Adding the -dp or --dominant-pools flag appends a column containing the dominant pool for each point.

B.3 Manipulating Contur scan directories with contur-gridtool
Contur provides a compilation of grid tools for managing grids produced with the contur-batch command. These can be accessed with the contur-gridtool command followed by various optional parameters. They allow merging of different signal grids into a single one (--merge), removing files unnecessary for post-processing (--remove-merged), or compressing those that are crucial, in order to reduce disk space (--archive). Other options check for (--check) or resubmit (--submit) failed jobs to the batch system, or identify the grid points that are most consistent with a given set of parameters (--findPoint).

B.4 Concatenating the results of Contur statistical analyses
Running the contur command on a scan directory produces a .map file. One may want to concatenate the results of several scans by merging their relevant .map files. This can be achieved using the contur-mapmerge command.

B.5 Submitting Contur scans to an HPC system using contur-batch
An executable called contur-batch is provided to prepare a parameter-space scan and submit batch jobs. It produces a directory for each of the various beam energies (7, 8 and 13 TeV by default, but configurable with the --beams option), containing generator configuration files detailing the parameters used at that run point, and a shell script to run the generator, which is then submitted to an HPC cluster. The --param_file, --template_file and --grid options may be used to specify the names of the relevant configuration files if they differ from the defaults. The number of events to generate at each point is controlled by the --numevents option, defaulting to 30, unless variable precision is enabled with the --variablePrecision flag, which then looks for an additional section of the parameter card entitled NEventScalings, indicating how to scale the number of events for each point. See Section 4.2 for more information on the parameter card, and Section B.6 for information on the contur-zoom tool, which can be used to automatically add the NEventScalings section to a parameter card.

By default, the tool assumes that Herwig is the MCEG, but this can be changed with --mceg. If the user wishes to run their own instances of an MCEG and Rivet, and pipe this information to the jobs, the flag --pipe-hepmc can be used. The MCEG seed can be changed using the --seed option.

One can use the --out_dir option to specify where the scan directory should be written. Several options exist to specify the batch system to use (--batch_system) and the queue name (--queue), as well as the maximum wall time (--walltime) and maximum memory consumption for jobs (--memory). Finally, the --scan-only flag can be used to do a dry run: prepare the directories without submitting them to the cluster.
B.6 Iteratively refining the scanned parameter space with contur-zoom
The contur-zoom utility is designed to optimise the hyper-parameters of a parameter scan,such as the ranges, granularity of binning, and number of events to generate at each point. Thereasoning behind this tool is that not all areas of a parameter space are interesting: indeed,parts of the parameter space which are well below the exclusion level, or well above it, canbe ignored, and the more interesting regions to focus on are those where the gradient of CL s values is large. Furthermore, focusing only on interesting regions of parameter space avoidswasting computing resources on points where the result is unambiguous. A user approaching anew model may wish to begin with a coarse, wide-ranging scan of parameter space, and theniteratively “zoom” into the more interesting regions. contur-zoom automatically determines a new set of hyper-parameters for a parametercard, given the results of a previous coarser scan. It does this by defining a figure of meritfor each point in a scan, to approximate the CL s gradient at that point; this is calculated asthe average difference in the CL s value of a given point with respect to all adjacent pointsin the scan. By construction, this figure of merit will always be within 0 and 1, since thatis the range of possible CL s values. This figure of merit is implemented for Contur scans ofarbitrarily-high dimensionality.Given a figure of merit for each point, it is possible to change the ranges of a scan to focuson the region with the fastest change in CL s , by specifying a minimum threshold for the figureof merit. The “zoomed” range of parameter values is an n -dimensional parameter space thatis obtained by iterating through each dimension, and locating the smallest range of points onthat axis that contains all points above the threshold, and using this reduced parameter spacefor the next iteration. 
The result is a new rectilinear scan range of the model parameters, containing all points with a figure of merit above the threshold.

This procedure can be applied to a Contur parameter scan using the contur-zoom command, where the threshold can be specified with --thresh, defaulting to 0.25. The .map file for the original scan should be provided using --m_path or -m, and the corresponding original parameter file with --o_path. One can either choose to replace the original parameter file with the zoomed version (using --replace), or specify where to write the new files with --n_path. If one wanted to restrict the zooming to a single dimension of the n-dimensional parameter space, this can be achieved using the --param option. Finally, one can over-ride the figure of merit for particular points with special CLs values, so that they are included in the new range regardless of the gradient. For example, one may wish to keep all points on the 68% and/or 95% CL contours. This can be achieved using the --vals option, specifying a space-separated list of CLs values (between 0 and 1). The algorithm will then keep all points with CLs within 0.01 of the specified values.

To avoid wasting computing resources on uninteresting points, one may consider excluding points with a figure of merit below a given threshold from processing. This can be achieved using the --skipPoints option of contur-zoom, with the exclusion threshold specified by --thresh. The indices of all points below that threshold will be added to a new block of the parameter card, labelled SkippedPoints, and these points will not be processed during the
Contur scan directory preparation and processing.

Finally, one may want to prepare a variable-precision scan over a parameter space, i.e. one where the number of events generated at each point may change depending on the region of parameter space. One may wish to generate more events near "interesting" regions, according to the figure of merit defined above. This can be achieved using the --nEventScalings option. This option will add a section to the parameter card labelled NEventScalings, which is simply the value of the figure of merit at each point. When using contur-batch with a parameter card which has an NEventScalings section, and using the --variablePrecision (or simply -v) option to indicate a variable-precision scan, the number of events specified with -n will indicate a maximum number of events, which will be scaled by the value of the figure of merit at that point. Thus, the points with the highest figure of merit (which is always between 0 and 1 by construction) will be processed with a number of events close to the maximum, while less interesting points, where the CLs gradient is smaller, will be generated with fewer events.

Finally, the contur-zoom command allows the user to re-bin the parameter space based on the figure of merit, while maintaining the same ranges. This can be achieved using the --rebin flag. This will create a new parameter card where the number of bins along each axis is doubled. This option can be used in tandem with the --skipPoints and --nEventScalings options, where the figure of merit of the newly-generated bins will by default be set to the same value as their parent bins.

Examples of the effect of contur-zoom commands on an example parameter card can be seen in Listing 20. The suggested approach would be to begin with a broad, coarse scan over a given parameter space, and iteratively update the ranges, number of points and number of bins using contur-zoom.

B.7 Visualising the results of a Contur scan using contur-plot
The contur-plot executable produces visualisations of the results of a Contur grid scan. The tool takes as input a .map file obtained from running contur -g on a parameter scan directory. This executable can handle 2- or 3-dimensional scans. The user should therefore specify a .map file to read and 2 or 3 variables to plot as positional arguments.

In addition to the positional arguments, the user can specify --theory and --data arguments to add additional information to a plot. This is discussed in detail in Section 6.2. The plot title can be set using the --title option. The x- and/or y-axis labels can be set using the --xlabel and --ylabel options, which accept LaTeX formatting, though special characters must be escaped with a backslash. Furthermore, the user may choose to display the x- and/or y-axis on a logarithmic scale using the --xlog and --ylog flags. In addition to the overall heatmaps, the heatmaps for individual analysis pools can be generated if the --pools flag is turned on, and certain pools can be skipped using the --omit option.

Some other expert-user options exist, for example to control the interpolation between points. The full list can be viewed using --help. A few examples of contur-plot commands are shown in Listing 21.

B.8 Visualising a single parameter point with contur-mkhtml
It is often useful to run Contur on a single parameter point (i.e. a single YODA file) from a scan, to understand which analyses are providing the exclusion power, and to view the associated histograms super-imposing the measured data and the generated signal at that point. The contur-mkhtml utility prepares a summary page for the user, which concisely presents the most important histograms contributing to the CLs exclusion at a given point.

The executable can only be used after the main contur executable has been run on the YODA file of a given point, as in the example in Listing 12. An example screenshot of the
Listing 20: Examples of the usage and output of the contur-zoom command.

    $ contur-zoom --thresh 0.3 --o_path param_file.dat --n_path param_file.zoomed.dat --m_path ANALYSIS/contur.map
    $ cat param_file.zoomed.dat
    [Run]
    generator = "/path/to/generatorSetup.sh"
    contur = "/path/to/setupContur.sh"
    [Parameters]
    [[x0]]
    mode = LIN
    start = 300.0
    stop = 1000.0
    number = 15
    [[x1]]
    mode = CONST
    value = 2.0

    $ contur-zoom --thresh 0.05 --skipPoints --o_path param_file.zoomed.dat --n_path param_file.zoomed.excl.dat --m_path ANALYSIS/contur.map
    $ cat param_file.zoomed.excl.dat
    [Run]
    generator = "/path/to/generatorSetup.sh"
    contur = "/path/to/setupContur.sh"
    [Parameters]
    [[x0]]
    mode = LIN
    start = 300.0
    stop = 1000.0
    number = 15
    [[x1]]
    mode = CONST
    value = 2.0
    [SkippedPoints]
    points = 5 6 7 21 22 25 26 29 30 33 34 37 38 45 46 47 48 49 50 51 52

    $ contur-zoom --nEventScalings --skipPoints --o_path param_file.zoomed.excl.dat --n_path param_file.zoomed.excl.scaled.dat --m_path ANALYSIS/contur.map
    $ cat param_file.zoomed.excl.scaled.dat
    [Run]
    generator = "/path/to/generatorSetup.sh"
    contur = "/path/to/setupContur.sh"
    [Parameters]
    [[x0]]
    mode = LIN
    start = 300.0
    stop = 1000.0
    number = 15
    [[x1]]
    mode = CONST
    value = 2.0
    [SkippedPoints]
    points = 5 6 7 21 22 25 26 29 30 33 34 37 38 45 46 47 48 49 50 51 52
    [NEventScalings]
    points = 0.16713080576991923 0.16713080576991923 0.16713080576991923 0.2614096406084278 0.08944325063426813 ..

Listing 21: Examples of the usage of the contur-plot command.

    $ contur-plot myscan.map Xm Y1
    $ contur-plot myscan.map Xm Y1 gYq -s 50
    $ contur-plot myscan.map Xm Y1 gYq -sc
    $ contur-plot myscan.map Xm Y1 gYq -a
Figure 5: Example of the summary web page for a single-point contur run, as produced by contur-mkhtml.

summary web page which is produced can be seen in Figure 5. The --reducedPlots flag causes only the most important histograms to be plotted, thus speeding up processing time.
B.9 Herwig-specific cross-section visualisation tools

For each point in a scan of parameters, Herwig produces log files which detail the generated cross-sections and branching ratios for the processes which contribute. This is valuable information, as one can use it to understand how the phenomenology of the model changes in different regions of the parameter space. Two helper Python executables are provided to parse this information and present it in a digestible format.

First, contur-extract-herwig-xs-br parses the log files for a given point in the parameter scan, and returns an ordered list of processes and their cross-sections to the terminal. This list represents all the processes which contribute some configurable fraction of the total cross-section (option --tolerance, by default 1%) at that point. To aid digestion of results, similar processes are grouped together, with their cross-sections summed. The default summation rules are summarised below.

• Differences between leptons (electrons, muons and taus) are ignored. This behaviour can be over-ridden using --splitLeptons, --sl;
• Differences between incoming partons are ignored: all flavours of quarks and gluons are merged. This behaviour can be over-ridden using --splitPartons, --sp;
• Differences between particles and antiparticles are ignored. This behaviour can be over-ridden using --splitAntiparticles, --sa;
• Differences between "light" quarks are ignored: u, d, s, c, b are grouped. This behaviour can be over-ridden using --splitLightQuarks, --sq, or just the b can be split out using --split_b, --sb;
• Optionally, one can choose to ignore differences between the electroweak bosons W, Z, H using --mergeBosons, --mb.

The resulting output shows the outgoing particles from the matrix element. It may be that one is interested not in the particles which come out of the hard scatter, but in the stable particles one would find in the final state.
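These merging rules can be sketched in a few lines. The following is an illustrative re-implementation, not the Contur code; the particle labels and the mapping to the options above are assumed for the example:

```python
# Sketch of the default label-merging rules (illustrative, not the Contur
# implementation): leptons are unified, antiparticles are merged with
# particles, light quarks (u, d, s, c, b) are grouped as a generic "q", and
# optionally W/Z/H are merged into a generic boson "V".

LEPTONS = {"e", "mu", "tau"}
LIGHT_QUARKS = {"u", "d", "s", "c", "b"}

def merge_label(particle, merge_bosons=False):
    p = particle.rstrip("~")   # merge antiparticles with particles
    p = p.rstrip("+-")         # ignore electric-charge labels
    if p in LEPTONS:
        return "l"
    if p in LIGHT_QUARKS:
        return "q"
    if merge_bosons and p in {"W", "Z", "H"}:
        return "V"
    return p

def merge_final_state(particles, **kwargs):
    return sorted(merge_label(p, **kwargs) for p in particles)

print(merge_final_state(["Y1", "u~"]))                   # → ['Y1', 'q']
print(merge_final_state(["W", "Z"], merge_bosons=True))  # → ['V', 'V']
```

Summing the cross-sections of all processes that map to the same merged final state then produces the grouped lists shown in Listing 22.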
To help determine this information, contur-extract-herwig-xs-br can recursively apply the SM branching fractions of unstable SM particles, and can also extract the predicted branching fractions of BSM particles from the log files and apply those recursively too. This behaviour can be activated using the --foldBRs, --br option, or --foldBSMBRs, --br_bsm for BSM decays only. Some examples of the output of the script can be found in Listing 22.

A second executable, contur-scan-herwig-xs-br, calls contur-extract-herwig-xs-br at each point in a parameter scan, and presents the cross-section information as a cross-section heatmap for each process. At present, the tool can only handle two-dimensional scans, and the variables to use as the x- and y-axes of the resulting plots should be provided via --xy. This script takes the same options as contur-extract-herwig-xs-br in terms of merging similar processes, and additionally can take a -p or --pools option, which further groups final states into pools of "analysis types", for example grouping together processes which have the same or similar numbers of leptons, photons, jets (from quarks or gluons), b-jets, or missing energy (from neutrinos or stable BSM particles). Examples of the outputs of this tool can be found in Figure 6.

B.10 Interactive visualisation
To further aid digestion of Contur results, a web-based visualisation tool, contur-visualiser, is provided. This tool combines the Contur parameter scan, the CLs calculation and the Herwig log-file parsing to build an interactive web page where these results are presented in a combined way. The page can be opened in any browser on the local machine. The result is a heatmap showing the CLs exclusion at each point of a parameter scan, where hovering the cursor over a particular point reveals the cross-section information for that point. Clicking on a given point will trigger the evaluation of a single-point Contur run (as described in Section B.8), and will open a new page showing the summary for that point.

Since the visualiser is I/O intensive, it is recommended to run contur-visualiser locally rather than via ssh. Users may find it convenient to run this tool via a Docker image
Listing 22: Examples of the usage and output of the contur-extract-herwig-xs-br script.

    [myscan00]$ contur-extract-herwig-xs-br -i 13TeV/0001/ --tolerance 0.0
    13TeV/0001 :: gYXm=1.000000, gYq=0.250000, mXm=116.666667, mY1=10.000000
    totalXS 167000000.00 fb
    145000000.00 fb, (86.83%), p p \rightarrow Y1 q
    14500000.00 fb, (8.68%), p p \rightarrow Y1 g
    5500000.00 fb, (3.29%), p p \rightarrow Y1 Y1
    900000.00 fb, (0.54%), p p \rightarrow W Y1
    700000.00 fb, (0.42%), p p \rightarrow q q
    300000.00 fb, (0.18%), p p \rightarrow Y1 \gamma
    200000.00 fb, (0.12%), p p \rightarrow Y1 Z

    [myscan00]$ contur-extract-herwig-xs-br -i 13TeV/0001/ --tolerance 0.0 \
        --splitPartons --splitAntiparticles
    13TeV/0001 :: gYXm=1.000000, gYq=0.250000, mXm=116.666667, mY1=10.000000
    totalXS 167000000.00 fb
    105000000.00 fb, (62.87%), g q \rightarrow Y1 q
    40000000.00 fb, (23.95%), \bar{q} g \rightarrow Y1 \bar{q}
    14500000.00 fb, (8.68%), \bar{q} q \rightarrow Y1 g
    5500000.00 fb, (3.29%), \bar{q} q \rightarrow Y1 Y1
    900000.00 fb, (0.54%), \bar{q} q \rightarrow W Y1
    700000.00 fb, (0.42%), \bar{q} q \rightarrow \bar{q} q
    300000.00 fb, (0.18%), \bar{q} q \rightarrow Y1 \gamma
    200000.00 fb, (0.12%), \bar{q} q \rightarrow Y1 Z

    [myscan00]$ contur-extract-herwig-xs-br -i 13TeV/0001/ --tolerance 0.0 \
        --splitPartons --splitLightQuarks --mergeBosons
    13TeV/0001 :: gYXm=1.000000, gYq=0.250000, mXm=116.666667, mY1=10.000000
    totalXS 167000000.00 fb
    83000000.00 fb, (49.70%), g u \rightarrow Y1 u
    62000000.00 fb, (37.13%), d g \rightarrow Y1 d
    9000000.00 fb, (5.39%), u u \rightarrow Y1 g
    5500000.00 fb, (3.29%), d d \rightarrow Y1 g
    3700000.00 fb, (2.22%), u u \rightarrow Y1 Y1
    1800000.00 fb, (1.08%), d d \rightarrow Y1 Y1
    900000.00 fb, (0.54%), d u \rightarrow V Y1
    700000.00 fb, (0.42%), d d \rightarrow u u
    300000.00 fb, (0.18%), u u \rightarrow Y1 \gamma
    200000.00 fb, (0.12%), d d \rightarrow V Y1

    [myscan00]$ contur-extract-herwig-xs-br -i 13TeV/0001/ --tolerance 0.0 --foldBRs
    13TeV/0001 :: gYXm=1.000000, gYq=0.250000, mXm=116.666667, mY1=10.000000
    totalXS 167000000.00 fb
    145000000.00 fb, (86.83%), p p \rightarrow q q q
    14500000.00 fb, (8.68%), p p \rightarrow g q q
    6250000.00 fb, (3.74%), p p \rightarrow q q q q
    700000.00 fb, (0.42%), p p \rightarrow q q
    300000.00 fb, (0.18%), p p \rightarrow \gamma q q
    288000.00 fb, (0.17%), p p \rightarrow \nu l q q
    41000.00 fb, (0.02%), p p \rightarrow \nu \nu q q
    21000.00 fb, (0.01%), p p \rightarrow l l q q
[Figure 6: four heatmaps in the (mY1, mXm) plane, colour-scaled by cross-section in fb, for panels (a) pp → Y1 q, (b) pp → XmXm, (c) nJets >= 2 and (d) MET.]

Figure 6: Leading-order cross-sections extracted from Herwig logs using the contur-scan-herwig-xs-br tool. The top row corresponds to the default output for the tool, showing the three dominant 2 → [...] --br --pools options activated. The stable BSM particles are assumed to show up as missing transverse energy (MET). These plots reveal that the production of certain particles, and certain decay modes, are only accessible in certain regions of the parameter space, giving an insight into how the phenomenology of the model varies from region to region.

Listing 23: Detailed guide to using the contur-visualiser tool.

    cd /contur/docker/contur-visualiser
    docker build -t contur-website .
    docker run -it -p 80:80 -v {$DATASET_DIR}:/dataset contur-website
    cd contur-visualiser
    ./contur-visualiser -d {$DATASET_DIR} -x {X} -y {Y} -m
(Dockerfiles are available in /contur/docker/contur-visualiser). When initialising the Docker image, one should map the location of the scan files and map file (which may have been downloaded from a cluster), as well as exposing port 80 to the Docker container via the docker option -p 80:80, so its output can be accessed via HTTP.

Once inside the container, the contur-visualiser tool should be run from the contur-visualiser directory, and requires the path to a .map file (using -m), the path to the scan directory (using -d), and the names of the variables to plot (using -x and -y).

The visualiser will then run through each point in the parameter scan directory and collect the output of contur-extract-herwig-xs-br, as well as the CLs values at each point. Then, it will create an interactive page which can be accessed by opening http://0.0.0.0:80/ on the local machine (outside the Docker container). An example screenshot of such a web page is provided in Figure 7. Hovering over the points on the heatmap reveals the x, y and CLs values at that point, while the side-panel shows the contur-extract-herwig-xs-br output at that point, so that the user can gauge which processes might be contributing to excluded points, for example. To dig further into the details of a given point, the user can click on a point on the heatmap, and this will trigger the terminal in the Docker container to run contur-mkhtml on that point. Once the terminal has finished running that command, a further click will open a new window, displaying the summary plots for that point, similar to those shown in Figure 5. A detailed example of how to run the contur-visualiser tool is provided in Listing 23.
B.11 Other tools

As explained in Section 4.1, contur-mkana helps the user to generate static lists of available analyses to feed into the MCEG codes. A Herwig-style template file is created in a series of .ana files, and a shell script to set environment variables containing lists of analyses useful for command-line-steered MCEGs is also written.
Figure 7: Screenshot showing the output of the contur-visualiser utility.

contur-mkthy is designed to help prepare SM theory predictions for particular Rivet analyses, for a more robust statistical treatment. This information is not always provided by the HepData entry of a given measurement, so it sometimes has to be obtained from an alternative source. This script helps the user translate the raw prediction into a format usable by Contur. This tool is not usually needed by regular users.
C BSM models as UFO files

The Universal FeynRules Object (UFO) [12] format is a Python-based way to encapsulate the Lagrangian of a new model. It contains the basic information about new couplings, particles and parameters which are required to generate BSM events. Since its inception in 2011, this format has become something of an industry standard, which is well known and commonly used by theorists, and there exists a database of models with such implementations (i). Furthermore, as the name implies, it is a format which is compatible with multiple event generators.

To get started with the study of a particular BSM model with Contur, the UFO file should be copied into the local RunInfo directory. The documentation associated with the model should give the adjustable parameter names. The precise next steps will depend upon the generator. A detailed example using Herwig is provided in Appendix A.

(i) https://feynrules.irmp.ucl.ac.be/wiki/ModelDatabaseMainPage

Listing 24: Using a directory of SLHA files as the basis for a scan.

    [Parameters]
    [[slha_file]]
    mode = DIR
    name = "/home/username/slha1_files"
Listing 25: Using a single SLHA file as the basis for a scan.

    [Parameters]
    [[slha_file]]
    mode = SINGLE
    name = C1C1WW_300_50_SLHA.slha1
    [[M1000022]]
    block = MASS
    mode = LIN
    start = 10
    stop = 210
    number = 5
    [[M1000024]]
    block = MASS
    mode = LIN
    start = 200
    stop = 300
    number = 5
D Support for SLHA files

SUSY Les Houches Accord (SLHA) files provide a mechanism for supplying the mass and decay spectra of the particle content of a given new model and, for a number of widely used BSM scenarios which are already implemented in general-purpose MCEGs, provide a convenient way of specifying the parameter point under consideration. Contur can make use of SLHA files in two ways. To do so it relies on the pyslha package [124].
D.1 Scanning over a directory

If a set of parameter points is predetermined, and the SLHA files are available, these can be used directly as the input of a Contur scan. The param_file.dat syntax is as given in Listing 24. To make the parameters from the SLHA file available for plotting in the produced .map file, use the -S parameter to pass a comma-separated list of SLHA block names to contur when running on the grid. The parameter names will have the block name prepended; for example, MASS:1000022 would be the mass of the lightest neutralino.
D.2 Modifying SLHA files

If a single SLHA file is available and the user wishes to modify it, for example scanning two of the particle masses over some range, this can be done using the param_file.dat syntax given in Listing 25.

Listing 26: Scaling values in a single SLHA file as the basis for a scan.

    [Parameters]
    [[slha_file]]
    mode = SCALED
    name = RPV-UDD.slha
    [[RVLAMUDD]]
    mode = LIN
    start = 1.00E-02
    stop = 1.00E-01
    number = 15

This would use the SLHA file C1C1WW_300_50_SLHA.slha1 as a template, and would scan the χ (PID 1000022) and χ± (PID 1000024) particle masses over the ranges specified by the mode, start, stop parameters in the specified number of steps. The modified parameters (only) would be written to the params.dat files for later use in analysis. A single letter in front of the particle ID integer is required, and should be unique to a given SLHA block, allowing (for example) several properties of the same particle (in different blocks) to be varied at once.

Alternatively, all the parameters in a single block may be scaled by a common factor, as shown in Listing 26, where the couplings in the RVLAMUDD block will be multiplied by factors from 0.01 to 0.1, in 15 steps.
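The template-modification step described above can be sketched in plain Python. This is an illustrative stand-in, not Contur's implementation, and the SLHA snippet is hypothetical:

```python
# Minimal sketch (not Contur's code) of rewriting MASS-block entries for
# PIDs 1000022 and 1000024 at each scan point. The template text below is
# a hypothetical fragment of an SLHA file.

TEMPLATE = """BLOCK MASS
  1000022  5.0E+01
  1000024  3.0E+02
"""

def set_mass(slha_text, pid, value):
    out = []
    for line in slha_text.splitlines():
        fields = line.split()
        if len(fields) >= 2 and fields[0] == str(pid):
            line = "  %d  %.6E" % (pid, value)  # replace the mass entry
        out.append(line)
    return "\n".join(out) + "\n"

# A small LIN grid over the two masses, as the mode/start/stop/number
# blocks of Listing 25 would generate:
for m22 in (10.0, 110.0, 210.0):
    for m24 in (200.0, 250.0, 300.0):
        point = set_mass(set_mass(TEMPLATE, 1000022, m22), 1000024, m24)
        assert "%.6E" % m22 in point and "%.6E" % m24 in point
```

Each modified text would then be written out as the SLHA input for one scan point.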
E Support for pandas DataFrames

pandas DataFrames provide an interoperable mechanism for supplying values for model parameters which are scoped into the Contur parameter sampler and, for models bounded by other observations, for imposing arbitrary constraints on parameter values. For improved scan efficiency, DataFrames enable the generation of non-rectilinear grids and dynamic modification of the scan resolution. Contur loads DataFrames from pickle files, which contain serialised Python object structures, and makes use of the pandas and pickle packages.
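As an illustration, a non-rectilinear set of scan points satisfying an external constraint might be prepared as follows. The column names (x2, x3) and the constraint are invented for this sketch; in practice the columns should match parameter names in param_file.dat:

```python
# Sketch of preparing a non-rectilinear scan as a pandas DataFrame and
# serialising it to a pickle file for Contur to load. Column names and the
# constraint are illustrative, not taken from a real model.
import pandas as pd

points = pd.DataFrame({"x2": [100.0, 200.0, 400.0, 800.0],
                       "x3": [0.1, 0.2, 0.4, 0.8]})
# Impose an arbitrary external constraint on the allowed parameter values:
points = points[points["x2"] * points["x3"] < 200.0]
points.to_pickle("scan_points.pkl")

loaded = pd.read_pickle("scan_points.pkl")
print(list(loaded.columns), len(loaded))  # → ['x2', 'x3'] 3
```

The resulting pickle file can then be referenced from a parameter card as described in the following subsections.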
E.1 Creating pickle files

If a pandas DataFrame with column names corresponding to parameter names in param_file.dat is available, data_frame.to_pickle('path/to/file.pkl') can be used to produce and save a pickle file to load into Contur.

E.2 Loading pickle files
If a pickle file is available, the DATAFRAME block of a parameter file can be used to specify an absolute path, or a path relative to the current working directory, under the keyword name, from which to load a pickle file into Contur. Contur only supports one pickle file for each param_file.dat, although an arbitrary number of parameters can be extracted from that file. An example is shown in Listing 27.
Listing 27: An example Contur configuration file showing the usage of a DataFrame to load a parameter.

    [Run]
    generator = "/path/to/generatorSetup.sh"
    contur = "/path/to/setupContur.sh"
    [Parameters]
    [[x0]]
    mode = LOG
    start = 1.0
    stop = 10000.0
    number = 15
    [[x1]]
    mode = CONST
    value = 2.0
    [[x2]]
    mode = DATAFRAME
    name = "/path/to/pickle_dataframe.pkl"
    [[x3]]
    mode = DATAFRAME
    name = "/path/to/pickle_dataframe.pkl"
E.3 Interoperability

The DATAFRAME mode can be used alongside other modes; for modes such as LOG/LIN with more than one parameter value, the scan will occur across each entry in the pandas DataFrame.

F Support for other event generators

F.1 MadGraph support
Events are generated with MadGraph5_aMC@NLO [17] through a steering script, an example of which is given in Listing 28. This is functionally comparable to the LHC.in file for steering Herwig shown in Listing 10. In the steering script, MadGraph-specific variables are set first. If a grid of signal points is to be generated using a batch system, it is important to include the options set run_mode 0 and set nb_core 1, as by default MadGraph runs on multiple cores, which can be problematic on some HPC systems. These two lines configure MadGraph for single-core mode and thus make more efficient use of computational resources.

The desired UFO can be used by calling import ⟨UFO model directory⟩, which, in contrast to its usage with Herwig, does not need to be compiled. Afterwards, the model-specific processes are defined (j) and MadGraph started (launch). Parton showering for the generated events, as well as giving a HepMC file as output, is taken care of by Pythia [125], initialised by shower=Pythia8. Afterwards, generation- and model-specific parameters are set. Just as for Herwig, parameters should be included in curly brackets if Contur is used to generate a signal grid; otherwise

(j) In the example, the arbitrary choice of a top quark pair produced in association with the mediator is made.

Listing 28: An example MadGraph steering script.

    set group_subprocesses Auto
    set ignore_six_quark_processes False
    set gauge unitary
    set loop_optimized_output True
    set complex_mass_scheme False
    set automatic_html_opening False
    set run_mode 0
    set nb_core 1
    import model ./DM_vector_mediator_UFO
    define p = g u c d s u~ c~ d~ s~ b b~
    generate p p > t t~ Y1 DMS=2 QCD=4
    output mgevents
    launch
    shower=Pythia8
    set MY1 120
    set MXm 130
    set gYq 0.1
    set gYXm 0.3
Listing 29: Steps from event generation with MadGraph to exclusion from Contur.

    $MG_DIR/bin/mg5_aMC mg_dmv.sh
    rivet --skip-weights -a $CONTUR_RA13TeV \
        mgevents/Events/run_01/tag_1_pythia8_events.hepmc
    contur Rivet.yoda --wn "Weight_MERGING=0.000"

concrete parameter values should be given.

After setting up the steering script, MadGraph generates events when called as $MG_DIR/bin/mg5_aMC ⟨MG steering script⟩, where $MG_DIR points to the installation directory of MadGraph. This will give HepMC files as output in mgevents/Events, which can be processed subsequently with Rivet to obtain a YODA file. Starting from this, the steps involving Contur are almost identical to those detailed for Herwig in Sections A.4 to A.7. Due to different MC weight nomenclature within MadGraph, when running on a single parameter point, the option --skip-weights should be given to the rivet command, as well as --wn "Weight_MERGING=0.000" to the contur command, to ensure the correct MC weight is picked up by Rivet and Contur. A complete example of the steps from event generation with MadGraph to a Contur exclusion is given in Listing 29.

To generate a signal grid with Contur using contur -g, specify MadGraph to be used as the MC generator by giving the option --mceg madgraph.

F.2 Powheg support
Events can be generated with Powheg in the .lhe format, using the pwhg_main executable together with an input file called powheg.input. These events can then be transformed to the .hepmc format and showered using a full-final-state generator such as Pythia. The .hepmc events can then be passed through Rivet as usual to obtain a YODA file for processing by Contur to get exclusion limits.

Machinery to steer Powheg using Contur has been created, based on the PBZpWp Powheg package, which produces events at leading and next-to-leading order for electroweak t t̄ hadroproduction in models with flavour non-diagonal Z′ boson couplings and W′ bosons [126]. Three BSM models are currently implemented, namely the Sequential Standard Model (SSM) [127], the Topcolour (TC) model [128, 129], and the Third Family Hypercharge Model (TFHMeg) [130]. In what follows we exemplify this steering chain by explaining how to run jobs on a HPC system to set exclusion limits on the mass of the Z′ in the SSM. Powheg running does not support the UFO format, but the workflow discussed in this section could be used as an example if one wanted to use other Powheg packages.

To run a batch job one needs three executables (main-pythia, pwhg_main and pbzp_input_contur.py), two files (param_file.dat and powheg.input_template), and one directory (RunInfo), all in one run directory. main-pythia is responsible for the creation of the HepMC file and for the parton showering. More details on these are listed below.

• The RunInfo directory contains the needed analysis steering files (.ana) and can be prepared as described in Section 4.4.
• The pbzp_input_contur.py script is used to create and fill the powheg.input files based on the model choice in param_file.dat; it needs powheg.input_template in order to do so.
• The param_file.dat file defines a parameter space, as with other generators.

In the SSM there are only two parameters, i.e. the mass (mZp) and the width (GZp) of the Z′ boson in GeV, but one also needs to include the name of the model (SSM in this example), and the parameters of the other models as dummies (k). The param_file.dat of the SSM should be formatted as in Listing 30, where setupPBZpWp.sh is a script which sets the environment needed to run pwhg_main, and setupEnv.sh a script which sets up the run-time environment which the batch jobs will use; as a minimum it will need to contain the lines to execute your rivetenv.sh and yodaenv.sh files. For all the setup files, one should give the full explicit path. The setupPBZpWp.sh and setupEnv.sh entries should always be given in the same order as shown in this example, i.e. in generator one first gives the full path to setupPBZpWp.sh and then the one for setupEnv.sh. In addition, one should check that the parameters defined in param_file.dat are also defined in powheg.input_template; in other words, removing or adding new parameters should be done in both files.

The HPC submission procedure using contur-batch follows the same workflow as for other MCEG options, but specifying --mceg pbzpwp and -t powheg.input_template to indicate the correct template. When the batch job is complete there should be, in every run-point directory, a runpoint_xxxx.yoda file and an output.pbzpwp directory that contains the .lhe file. Creating the heatmap can then be done as explained in Sections A.6 and A.7.

(k) The angle θ_sb (tsb) needed for the TFHMeg, and cot θ_H (cotH) needed for the TC model, since for now we only include the SSM, the TFHMeg and the TC models. This is done in order to be able to use the same powheg.input_template for all the models.

Listing 30: An example Contur configuration file for the SSM.

    [Run]
    generator = "/path/to/setupPBZpWp.sh","/path/to/setupEnv.sh"
    contur = "/path/to/setupContur.sh"
    [[mZp]]
    mode = LIN
    start = 1000.0
    stop = 5000.0
    number = 9
    [[GZp]]
    mode = LIN
    start = 50.0
    stop = 500.0
    number = 10
    [[model]]
    mode = SINGLE
    name = SSM
    [[tsb]]
    mode = SINGLE
    name = dummy
    [[cotH]]
    mode = SINGLE
    name = dummy
G The analysis database

The categorisation of Rivet analyses into pools, as described in Section 3.1, is implemented in an SQLite database distributed with Contur. The source code analysis.sql is in the data/DB directory, and after installation the compiled database will be in the same directory, named analysis.db. The database contains the following tables:
G.1 General configuration

beams Short text strings specifying known beam conditions.

analysis pool Defines the analysis pool names, associates them with a beam, and gives a short text description of the pool.

analysis Lists the known Rivet analyses, assigns them to an analysis pool, and stores the luminosity used, in the units corresponding to those used in the Rivet code.

blacklist Optionally, for a given analysis, defines any histograms (via regular-expression matching) which should be ignored.

whitelist Optionally, for a given analysis, defines any histograms (via regular-expression matching) which should be used. If an analysis has any whitelist entries, all unmatched histograms will be ignored.

subpool Optionally, for a given analysis, lists (and names) subsets of histograms which are known to be statistically “orthogonal” in the sense of containing no events in common.

normalization Some measurements are presented as area-normalised histograms (for example when the discussion focuses on shapes). Contur requires the cross-section normalisation, so that it knows the weight with which signal events should be injected. For such histograms, this table stores the normalisation factor. For searches, where the measured distribution is often just a number of events per (number of units), this sometimes results in bins of unequal width. In this case, the “number of units” should be given in the nxdiff field. The number of events in each bin is then obtained by multiplying by the bin width and dividing by nxdiff. If the bin width is constant, this field can be left as zero and will not be used.

needtheory Analyses which both require and use the SM prediction.
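The de-normalisation rule for searches described under normalization can be sketched numerically. All numbers below are illustrative, not taken from a real analysis:

```python
# Toy de-normalisation of a search histogram, following the rule described
# for the normalization table: events per bin = value * bin_width / nxdiff.
values = [0.5, 1.25, 0.25]       # measured values, events per "number of units"
widths = [100.0, 200.0, 400.0]   # bin widths (unequal, as often for searches)
nxdiff = 50.0                    # the "number of units" stored in the table

events = [v * w / nxdiff for v, w in zip(values, widths)]
print(events)  # [1.0, 5.0, 2.0]
```

For constant-width bins the stored nxdiff is zero and this conversion is skipped, since the measured value already corresponds to a fixed unit.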
G.2 Special cases
The remaining tables define various special cases of analyses which may be included in, or excluded from, a Contur run by setting command-line options at run-time. See Section 3.3 for usage and more discussion of why these special cases are treated differently.

metratio
Missing energy ratio measurement(s) from ATLAS. Included by default.

higgsgg H → γγ analyses. Included by default.

searches Search analyses (for which detector smearing is used). Excluded by default.

higgsww H → WW analyses. Excluded by default.

atlaswz ATLAS WZ analysis. Excluded by default.

bveto Analyses with a b-jet veto which is not implemented in the fiducial phase space. Excluded by default.
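The lookups these tables support can be sketched with Python's sqlite3 module. The schema below is a simplified in-memory stand-in for the real analysis.sql, and the table, column, pool and analysis names are illustrative assumptions:

```python
import re
import sqlite3

# Toy in-memory stand-in for analysis.db; the real schema lives in
# data/DB/analysis.sql, and names here are simplified guesses.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE analysis_pool (pool TEXT PRIMARY KEY, beam TEXT, description TEXT);
CREATE TABLE analysis (name TEXT PRIMARY KEY, pool TEXT, lumi REAL);
CREATE TABLE blacklist (analysis TEXT, pattern TEXT);
INSERT INTO analysis_pool VALUES
  ('ATLAS_13_JETS', '13TeV', 'ATLAS 13 TeV jet measurements');
INSERT INTO analysis VALUES ('ATLAS_2018_I1634970', 'ATLAS_13_JETS', 3.2);
INSERT INTO blacklist VALUES ('ATLAS_2018_I1634970', 'd02-.*');
""")

# Look up the pool and beam for one analysis.
name = "ATLAS_2018_I1634970"
row = con.execute(
    "SELECT a.pool, p.beam FROM analysis a "
    "JOIN analysis_pool p ON a.pool = p.pool WHERE a.name = ?", (name,)
).fetchone()
print(row)  # ('ATLAS_13_JETS', '13TeV')

# Apply the blacklist: drop histograms whose path matches any stored regex.
patterns = [p for (p,) in con.execute(
    "SELECT pattern FROM blacklist WHERE analysis = ?", (name,))]
histos = ["d01-x01-y01", "d02-x01-y01"]
kept = [h for h in histos if not any(re.match(p, h) for p in patterns)]
print(kept)  # ['d01-x01-y01']
```

A whitelist lookup would work the same way with the selection inverted: keep only the histograms that match a stored pattern.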