Featured Research

Data Analysis Statistics And Probability

Evaluating the phase dynamics of coupled oscillators via time-variant topological features

By characterizing the phase dynamics of coupled oscillators, we gain insight into fundamental phenomena of complex systems. The collective dynamics of oscillatory systems are often described by order parameters, which are insufficient for identifying more specific behaviors. To address this, we propose a topological approach that constructs quantitative features describing the phase evolution of the oscillators. The phase data are mapped into a high-dimensional space at each time step, and topological features describing the shape of the data are then extracted from the mapped points. These features are extended to time-variant topological features by adding the evolution time as an extra dimension in the topological feature space. The time-variant features provide crucial insights into the evolution of the phase dynamics. Combining these features with the kernel method, we characterize multi-clustered synchronized dynamics during the early stages of the evolution. Finally, we demonstrate that our method can qualitatively explain chimera states. The experimental results confirm the superiority of our method over approaches based on order parameters, especially when the available data are limited to the early-stage dynamics.
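
A minimal sketch of the underlying idea (not the authors' implementation): at each time step the oscillator phases are embedded as points on the unit circle, summarized by persistent homology using the third-party ripser package (assumed available here), and the per-time summaries are tracked with the evolution time kept as an extra coordinate.

import numpy as np
from ripser import ripser  # pip install ripser

def phase_features(phases_t):
    """phases_t: array of shape (n_oscillators,) at one time step."""
    # Embed each phase as a point on the unit circle in R^2.
    points = np.column_stack([np.cos(phases_t), np.sin(phases_t)])
    # Persistence diagrams in dimensions 0 and 1.
    dgms = ripser(points, maxdim=1)['dgms']
    # Total persistence (sum of finite bar lengths) per dimension as a crude feature.
    feats = []
    for dgm in dgms:
        finite = dgm[np.isfinite(dgm[:, 1])]
        feats.append(np.sum(finite[:, 1] - finite[:, 0]))
    return np.array(feats)

# Stack the per-time features with time as an extra coordinate,
# giving a "time-variant" feature trajectory.
rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, size=(200, 64))   # (time, oscillators), toy data
trajectory = np.array([[t, *phase_features(p)] for t, p in enumerate(phases)])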

Read more
Data Analysis Statistics And Probability

Event-by-Event Efficiency Fluctuations and Efficiency Correction for Cumulants of Superposed Multiplicity Distributions in Relativistic Heavy-ion Collision Experiments

We performed systematic studies of the effects of event-by-event efficiency fluctuations on the efficiency correction of cumulants in relativistic heavy-ion collision experiments. Experimentally, the particle efficiencies of events measured under different experimental conditions will in general differ, so for fluctuation measurements the final event-by-event multiplicity distributions are superpositions of the distributions of the various types of events measured under those conditions. We demonstrate the effects of efficiency fluctuations with a numerical simulation in which we construct an event ensemble consisting of events with two different efficiencies. When correcting with the mean particle efficiency, we find that the efficiency-corrected cumulants show large deviations from the original inputs when the discrepancy between the two efficiencies is large. We further studied these effects for the cumulants of net-proton distributions by passing UrQMD events of Au+Au collisions at √s_NN = 7.7 GeV through a realistic STAR detector acceptance, considering unequal efficiencies on the two sides of the Time Projection Chamber (TPC), multiplicity-dependent efficiency, and event-by-event variations of the collision vertex position along the longitudinal direction (V_z). When the efficiencies fluctuate strongly within the studied event sample, efficiency corrections of the cumulants based on the mean efficiency are significantly biased. We find that this effect can be effectively suppressed by binning the event ensemble into sub-event samples within which the efficiency variations are relatively small; the final efficiency-corrected cumulants are then calculated from the weighted average of the corrected factorial moments of the sub-event samples, each obtained with its own mean efficiency.
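
A toy numerical sketch of this effect, assuming purely binomial detection and Poisson-distributed true multiplicities (a stand-in for the UrQMD/STAR setup): the naive correction with the sample-averaged efficiency is compared with the sub-sample ("binned") correction that averages the efficiency-corrected factorial moments.

import numpy as np

rng = np.random.default_rng(1)
mu, n_events = 20.0, 200_000
true_n = rng.poisson(mu, n_events)

# Two event classes with different detection efficiencies (e.g. different V_z bins).
eps = np.where(np.arange(n_events) % 2 == 0, 0.8, 0.4)
meas_n = rng.binomial(true_n, eps)

def factorial_moments(n, kmax=2):
    # F_k = <n (n-1) ... (n-k+1)>
    return np.array([np.mean(np.prod([n - j for j in range(k)], axis=0))
                     for k in range(1, kmax + 1)])

def cumulants_from_fk(f):
    c1 = f[0]
    c2 = f[1] + f[0] - f[0] ** 2
    return c1, c2

# (a) naive correction with the sample-averaged efficiency
eps_mean = eps.mean()
f_naive = factorial_moments(meas_n) / eps_mean ** np.arange(1, 3)

# (b) correct each sub-sample with its own efficiency, then take the weighted average of the F_k
f_binned = np.zeros(2)
for e in (0.8, 0.4):
    sel = eps == e
    f_binned += sel.mean() * factorial_moments(meas_n[sel]) / e ** np.arange(1, 3)

print("true   C1,C2:", cumulants_from_fk(factorial_moments(true_n)))
print("naive  C1,C2:", cumulants_from_fk(f_naive))
print("binned C1,C2:", cumulants_from_fk(f_binned))

In this toy example the naive second-order cumulant comes out well above the Poisson input, while the binned correction recovers it.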

Read more
Data Analysis Statistics And Probability

Event-shape engineering and heavy-flavour observables in relativistic heavy-ion collisions

Traditionally, events collected at relativistic heavy-ion colliders are classified according to some centrality estimator (e.g. the number of produced charged particles) related to the initial energy density and volume of the system. In a naive picture the latter are directly related to the impact parameter of the two nuclei, which also sets the initial eccentricity of the system: zero for the most central events and growing larger for more peripheral collisions. A more realistic modelling requires taking into account event-by-event fluctuations, in particular in the nucleon positions within the colliding nuclei: collisions belonging to the same centrality class can give rise to systems with different initial eccentricities and hence different flow harmonics in the final hadron distributions. This issue can be addressed by an event-shape-engineering analysis, in which one selects events with the same centrality but different magnitudes of the average bulk anisotropic flow, and therefore of the initial-state eccentricity. In this paper we present the implementation of this analysis in the POWLANG transport model, providing predictions for the transverse-momentum and angular distributions of charm and beauty hadrons in event-shape-selected collisions. In this way one can learn how heavy quarks propagating (and hadronizing) in a hot environment respond both to its energy density and to its geometric asymmetry, breaking the perfect correlation between eccentricity and impact parameter that characterizes a modelling of the medium based on smooth average initial conditions.
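
A toy sketch of the event-shape selection itself (not the POWLANG transport calculation): events at fixed "centrality" are classified by the magnitude of the reduced second-harmonic flow vector q2, and the average elliptic flow is compared between the small-q2 and large-q2 classes. The event generator and the v2 estimate below are deliberately simplistic.

import numpy as np

rng = np.random.default_rng(2)

def sample_event(v2, mult):
    # Draw azimuthal angles from dN/dphi proportional to 1 + 2 v2 cos(2 phi) by rejection sampling.
    phi = []
    while len(phi) < mult:
        x = rng.uniform(0, 2 * np.pi, mult)
        keep = rng.uniform(0, 1 + 2 * abs(v2), mult) < 1 + 2 * v2 * np.cos(2 * x)
        phi.extend(x[keep])
    return np.array(phi[:mult])

events = [sample_event(v2=rng.normal(0.06, 0.02), mult=300) for _ in range(2000)]

# Reduced flow vector q2 = |Q2| / sqrt(M), used as the event-shape estimator.
def q2(phi):
    return np.abs(np.sum(np.exp(2j * phi))) / np.sqrt(len(phi))

q2_vals = np.array([q2(phi) for phi in events])
cut_lo, cut_hi = np.quantile(q2_vals, [0.2, 0.8])
for label, sel in [("small-q2", q2_vals < cut_lo), ("large-q2", q2_vals > cut_hi)]:
    # Crude v2 estimate: <cos 2 phi> relative to the known symmetry plane (Psi = 0) of the toy model.
    v2_est = np.mean([np.mean(np.cos(2 * phi)) for phi, s in zip(events, sel) if s])
    print(label, "mean v2 ~", round(v2_est, 3))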

Read more
Data Analysis Statistics And Probability

Evidence for increasing frequency of extreme coastal sea levels

Projections of extreme sea levels (ESLs) are critical for managing coastal risks, but are complicated by deep uncertainties. One key uncertainty is the choice of model structure used to estimate coastal hazards. Differences in model structural choices contribute to uncertainty in the estimated hazard, so it is important to characterize how these choices affect estimates of ESL. Here, we present a collection of 36 ESL data sets from tide gauge stations along the United States East and Gulf Coasts. The data are processed using both annual block maxima and peaks-over-thresholds approaches for modeling distributions of extremes. We use these data sets to fit a suite of potentially nonstationary extreme value models by covarying the ESL statistics with multiple climate variables. We demonstrate how this collection enables inquiry into the deep uncertainty surrounding coastal hazards. For all of the models and sites considered here, we find that accounting for changes in the frequency of extreme coastal sea levels provides a better fit than a stationary extreme value model.
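
As a minimal sketch of this kind of model-structure comparison, the following fits a stationary and a nonstationary GEV model (location varying linearly with a single standardized covariate) to synthetic annual maxima and compares them by AIC; the analysis in the paper uses real tide-gauge records, several climate covariates, and peaks-over-thresholds models as well.

import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(3)
years = np.arange(1950, 2020)
covariate = (years - years.mean()) / years.std()          # e.g. a standardized climate index
annual_max = genextreme.rvs(c=-0.1, loc=1.0 + 0.15 * covariate, scale=0.2, random_state=rng)

def nll(params, nonstationary):
    if nonstationary:
        mu0, mu1, log_sigma, xi = params
        mu = mu0 + mu1 * covariate
    else:
        mu0, log_sigma, xi = params
        mu = mu0
    # scipy's shape parameter c corresponds to -xi in the usual GEV convention
    return -np.sum(genextreme.logpdf(annual_max, c=-xi, loc=mu, scale=np.exp(log_sigma)))

fit_stat = minimize(nll, x0=[1.0, np.log(0.2), 0.1], args=(False,), method="Nelder-Mead")
fit_nonstat = minimize(nll, x0=[1.0, 0.0, np.log(0.2), 0.1], args=(True,), method="Nelder-Mead")

aic = lambda fit: 2 * len(fit.x) + 2 * fit.fun
print("AIC stationary   :", round(aic(fit_stat), 1))
print("AIC nonstationary:", round(aic(fit_nonstat), 1))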

Read more
Data Analysis Statistics And Probability

Experimental noise in small-angle scattering can be assessed and corrected using the Bayesian Indirect Fourier Transformation

Small-angle X-ray and neutron scattering are widely used to investigate soft matter and biophysical systems. The experimental errors are essential when assessing how well a hypothesized model fits the data, and likewise when weights are assigned to multiple datasets used to refine the same model. It is therefore problematic when experimental errors are over- or underestimated. We present a method, based on Bayesian Indirect Fourier Transformation (BIFT) of small-angle scattering data, to assess whether a given small-angle scattering dataset has over- or underestimated experimental errors. The method is effective on both simulated and experimental data, and can be used to assess and rescale the errors accordingly. Even if the estimated experimental errors are appropriate, it is ambiguous whether a model fits sufficiently well, as the "true" reduced χ² of the data is not necessarily unity. This is particularly relevant for approaches where overfitting is an inherent challenge, such as reweighting a simulated molecular dynamics trajectory against small-angle scattering data or ab initio modelling. Using the outlined method, we show that one can determine what reduced χ² to aim for when fitting a model against small-angle scattering data. The method is easily accessible via a web interface.
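
The arithmetic behind assessing over- or underestimated errors can be sketched as follows (this is not the BIFT algorithm itself): if the reported uncertainties are off by a common factor k, the reduced χ² computed against a trustworthy reference curve scales as k², so a rescaling factor can be read off. In this toy the reference is the true curve and the target reduced χ² is unity; the point of the paper is precisely that, for refined models, the appropriate target need not be one and can be determined with BIFT.

import numpy as np

rng = np.random.default_rng(4)
q = np.linspace(0.01, 0.3, 200)
I_true = 1.0 / (1.0 + (q * 30) ** 2)            # toy smooth scattering curve
sigma_true = 0.02 * I_true + 1e-4
I_data = I_true + rng.normal(0, sigma_true)
sigma_reported = 0.5 * sigma_true                # errors underestimated by a factor of 2

def reduced_chi2(data, model, sigma, n_params=3):
    return np.sum(((data - model) / sigma) ** 2) / (len(data) - n_params)

chi2_red = reduced_chi2(I_data, I_true, sigma_reported)
# For a common misestimation factor k (sigma_true = k * sigma_reported), chi2_red is roughly
# k^2 times its target value, so the errors can be rescaled by sqrt(chi2_red / target).
k_hat = np.sqrt(chi2_red / 1.0)
print("reduced chi2 with reported errors:", round(chi2_red, 2))
print("estimated error-rescaling factor :", round(k_hat, 2))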

Read more
Data Analysis Statistics And Probability

Explainable AI for ML jet taggers using expert variables and layerwise relevance propagation

A framework is presented to extract and understand decision-making information from a deep neural network (DNN) classifier for jet-substructure tagging. The general approach is to provide expert variables that augment the inputs ("eXpert AUGmented" variables, or XAUG variables), and then to apply layerwise relevance propagation (LRP) to networks both with and without XAUG variables. The XAUG variables are concatenated with the intermediate layers after the network-specific operations (such as convolution or recurrence) and used in the final layers of the network. Comparing networks with and without XAUG variables shows that XAUG variables can be used to interpret classifier behavior, increase discrimination ability when combined with low-level features, and in some cases capture the behavior of the classifier completely. The LRP technique can be used to find the relevant information the network is using and, when combined with the XAUG variables, to rank features, allowing one to find a reduced set of features that captures part of the network performance. In the studies presented, adding XAUG variables to low-level DNNs increased the efficiency of classifiers by as much as 30-40%. In addition to performance improvements, an approach to quantify numerical uncertainties in the training of these DNNs is presented.
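
A minimal PyTorch sketch of the architectural idea, with placeholder input shapes and layer sizes: the XAUG variables are concatenated with the flattened output of the convolutional part and fed into the final dense layers. The LRP step, applied to the trained network, is not shown here.

import torch
import torch.nn as nn

class XAugTagger(nn.Module):
    """Toy jet tagger: a small CNN over an image-like low-level input, with expert
    (XAUG) variables concatenated after the convolutional part, before the final
    dense layers."""
    def __init__(self, n_xaug=4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(8 * 16 * 16 + n_xaug, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, image, xaug):
        h = self.conv(image)                  # network-specific (convolutional) part
        h = torch.cat([h, xaug], dim=1)       # concatenate expert variables
        return torch.sigmoid(self.head(h))    # jet tagging score

model = XAugTagger()
score = model(torch.randn(5, 1, 32, 32), torch.randn(5, 4))   # dummy jets + XAUG variables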

Read more
Data Analysis Statistics And Probability

Extending RECAST for Truth-Level Reinterpretations

RECAST is an analysis reinterpretation framework; since analyses are often sensitive to a range of models, RECAST can be used to constrain a plethora of theoretical models without the significant investment required for a new analysis. However, experiment-specific full simulation is still computationally expensive. To facilitate rapid exploration, RECAST has therefore been extended to truth-level reinterpretations, interfacing with existing systems such as RIVET.

Read more
Data Analysis Statistics And Probability

Extracting distribution parameters from multiple uncertain observations with selection biases

We derive a Bayesian framework for incorporating selection effects into population analyses. We allow both for measurement uncertainty in individual observations and, crucially, for selection biases on the population of measurements, and show how to extract the parameters of the underlying distribution from a set of observations sampled from it. We illustrate the performance of this framework with an example from gravitational-wave astrophysics, demonstrating that the mass-ratio distribution of merging compact-object binaries can be extracted from Malmquist-biased observations with substantial measurement uncertainty.
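
A self-contained sketch of the same selection-corrected likelihood in a simpler setting: a Gaussian population observed with Gaussian measurement noise and a hard detection threshold (a Malmquist-like selection). Each detected event's marginal likelihood is divided by the detection fraction alpha(lambda), and the fit is compared with one that ignores selection. All numbers are illustrative.

import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(5)
m_true, s_true, sigma_meas, x_thr = 1.0, 0.5, 0.3, 1.2

# Simulate a population, noisy measurements, and a hard Malmquist-like selection.
theta = rng.normal(m_true, s_true, 50_000)
x = theta + rng.normal(0, sigma_meas, theta.size)
x_det = x[x > x_thr]

def neg_log_like(params, correct_selection=True):
    m, log_s = params
    s_tot = np.hypot(np.exp(log_s), sigma_meas)
    # marginal likelihood of each detected measurement under the population model
    ll = norm.logpdf(x_det, loc=m, scale=s_tot).sum()
    if correct_selection:
        # detection fraction alpha(lambda); divide each event's likelihood by it
        alpha = norm.sf(x_thr, loc=m, scale=s_tot)
        ll -= x_det.size * np.log(alpha)
    return -ll

for corr in (False, True):
    fit = minimize(neg_log_like, x0=[1.0, np.log(0.5)], args=(corr,), method="Nelder-Mead")
    m_hat, s_hat = fit.x[0], np.exp(fit.x[1])
    print("selection corrected:" if corr else "selection ignored:  ", round(m_hat, 3), round(s_hat, 3))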

Read more
Data Analysis Statistics And Probability

Extracting the oscillatory component from thermokinetic time series in the H/Pd system

The mean value theorem for integrals is applied to the construction of a base curve for non-equilibrium thermokinetic oscillations. The experimental periodic time series is first discretized into segments that approximately correspond to one oscillation period, and the mean value is calculated for each of them. The values so obtained are interpolated, and the resulting non-oscillatory curve turns out to have the properties required of a baseline for the oscillatory component of the original thermokinetic time series. Crucially, the area under the new curve is strictly identical to that under the original time series. Pointwise subtraction of the new base from the original curve yields a time series that may be considered the oscillatory component extracted from the experimental thermokinetic data. The mathematical basis of the method is outlined. Two experimental thermokinetic time series, resulting from the oscillatory sorption of H2 and D2 in metallic Pd powder, are analyzed with this procedure, revealing new empirical aspects that could not have been found otherwise.
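
A simplified sketch of the procedure, assuming the oscillation period is known: the series is cut into one-period segments, the segment means define the base curve after interpolation, and the oscillatory component is the pointwise difference. (In this simple version the areas under the base and the original curve agree only approximately, whereas the construction in the paper makes them strictly identical.)

import numpy as np

def oscillatory_component(t, y, period):
    """Split the series into consecutive segments of roughly one oscillation period,
    take the mean value of y over each segment (mean value theorem for integrals),
    interpolate those means back onto t to form a non-oscillatory base curve, and
    return both the base curve and the extracted oscillatory component."""
    edges = np.arange(t[0], t[-1] + period, period)
    centers, means = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (t >= lo) & (t < hi)
        if sel.any():
            centers.append(t[sel].mean())
            means.append(y[sel].mean())
    base = np.interp(t, centers, means)
    return base, y - base

# toy example: a drifting trend plus an oscillation of known period
t = np.linspace(0, 100, 5000)
y = 0.02 * t + np.exp(-t / 80) + 0.1 * np.sin(2 * np.pi * t / 5.0)
base, osc = oscillatory_component(t, y, period=5.0)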

Read more
Data Analysis Statistics And Probability

Extraction of Azimuthal Asymmetries using Optimal Observables

Azimuthal asymmetries play an important role in scattering processes with polarized particles. This paper introduces a new procedure, based on event weighting, to extract these asymmetries. The resulting estimator is shown to have several advantages over other estimators discussed in the literature in terms of statistical accuracy, bias, and the assumptions required on acceptance and luminosities.
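
The weighting idea in its simplest form (not the full optimal-observable construction, which also deals with acceptance and relative luminosities): for an azimuthal distribution f(phi) proportional to 1 + A sin(phi), weighting each event with sin(phi) gives <sin(phi)> = A/2, so A can be estimated directly from the weighted average.

import numpy as np

rng = np.random.default_rng(6)
A_true = 0.12

# Sample phi from f(phi) = (1 + A sin(phi)) / (2 pi) by rejection sampling.
phi = []
while len(phi) < 100_000:
    x = rng.uniform(0, 2 * np.pi, 100_000)
    keep = rng.uniform(0, 1 + A_true, x.size) < 1 + A_true * np.sin(x)
    phi.extend(x[keep])
phi = np.array(phi[:100_000])

# Event-weighting estimator: <sin(phi)> = A/2 under f, so
A_hat = 2 * np.mean(np.sin(phi))
err = 2 * np.std(np.sin(phi)) / np.sqrt(phi.size)
print(f"A = {A_hat:.4f} +/- {err:.4f}  (true {A_true})")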

Read more
