Publication


Featured research published by Nicolas Bousquet.


Journal of Theoretical Biology | 2008

Redefining the maximum sustainable yield for the Schaefer population model including multiplicative environmental noise

Nicolas Bousquet; Thierry Duchesne; Louis-Paul Rivest

This article investigates biological reference points, such as the maximum sustainable yield (MSY), in the common Schaefer (logistic) surplus production model in the presence of multiplicative environmental noise. This type of model is used in fisheries stock assessment as a first-line tool for biomass modelling. Under the assumption that catches are proportional to biomass, we derive new conditions on the environmental noise distribution under which a stationary regime exists and extinction is avoided. For a particular specification of the noise, we then obtain explicit results on the stationary behavior of the biomass, namely the biomass distribution itself and a redefinition of the MSY and related quantities, which now depend on the variance of the noise. This gives a more precise picture of how much less optimistic the stochastic MSY can be than the traditional (deterministic) MSY. In addition, we give empirical conditions on the error variance under which our specific noise can be approximated by a lognormal noise, the latter being more natural and leading to easier inference in this context. These conditions are mild enough to make the explicit results of this paper valid in a number of practical applications. The outcomes of two case studies, on northwest Atlantic haddock [Spencer, P.D., Collie, J.S., 1997. Effect of nonlinear predation rates on rebuilding the Georges Bank haddock (Melanogrammus aeglefinus) stock. Can. J. Fish. Aquat. Sci. 54, 2920-2929] and South Atlantic albacore tuna [Millar, R.B., Meyer, R., 2000. Non-linear state space modelling of fisheries biomass dynamics by using Metropolis-Hastings within-Gibbs sampling. Appl. Stat. 49, 327-342], illustrate the impact of our results in bioeconomic terms.
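To make the gap between the deterministic and stochastic reference points concrete, here is a minimal Python sketch that simulates the harvested Schaefer dynamics under multiplicative lognormal noise and searches for the harvest rate maximizing the long-run mean yield. All parameter values are illustrative, and the lognormal noise is only a stand-in for the specific noise family analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_yield(h, r=0.5, K=1.0, sigma=0.3, T=20_000, burn=2_000):
    """Long-run average yield of a Schaefer (logistic) model with
    multiplicative lognormal noise (mean 1) and catch proportional
    to biomass: B_{t+1} = (B_t + r*B_t*(1 - B_t/K) - h*B_t) * eps_t."""
    eps = rng.lognormal(-sigma**2 / 2, sigma, T)  # E[eps] = 1
    B, total = K / 2, 0.0
    for t in range(T):
        catch = h * B
        B = max((B + r * B * (1 - B / K) - catch) * eps[t], 1e-12)
        if t >= burn:
            total += catch
    return total / (T - burn)

hs = np.linspace(0.05, 0.45, 41)
yields = [mean_yield(h) for h in hs]
print(f"deterministic MSY = rK/4 = {0.5 / 4:.4f}")
print(f"stochastic optimum: h = {hs[int(np.argmax(yields))]:.3f}, "
      f"mean yield = {max(yields):.4f}")
```

With sigma = 0 the search recovers h = r/2 and yield rK/4; increasing sigma typically depresses the achievable mean yield, which is the qualitative point of the article.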


Journal of Applied Statistics | 2008

Diagnostics of prior-data agreement in applied Bayesian analysis

Nicolas Bousquet

This article focuses on the definition and study of a binary Bayesian criterion that measures the statistical agreement between a subjective prior and the data. The setting of this work is concrete applied Bayesian studies. The criterion is an alternative and complementary tool to the method recently proposed by Evans and Moshonov [M. Evans and H. Moshonov, Checking for prior-data conflict, Bayesian Anal. 1 (2006), pp. 893–914]. Both methods aim to assist the Bayesian analyst, from preliminary checks through to posterior computation. Our criterion is defined as a ratio of Kullback–Leibler divergences; two of its main features are that it simplifies the checking of hierarchical priors and that it can serve as a default calibration tool for obtaining flat but proper priors in applications. Discrete and continuous distributions exemplify the approach, and an industrial case study in reliability involving the Weibull distribution is highlighted.
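The abstract does not reproduce the criterion's exact definition, but the flavour of a "ratio of Kullback–Leibler divergences" can be conveyed on a toy Beta-Binomial example. The ratio below, and the reading of large values as prior-data conflict, are our illustrative assumptions, not the paper's definition: it compares how far the data pull a subjective prior against how far they pull a flat benchmark prior.

```python
from scipy.stats import beta
from scipy.integrate import quad

def kl_beta(p, q):
    """Kullback-Leibler divergence KL(p || q) between two Beta laws,
    computed by numerical quadrature."""
    f = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    return quad(f, 1e-9, 1.0 - 1e-9)[0]

n, k = 50, 40                     # data: k successes out of n trials
subjective = beta(2, 8)           # expert prior centred on low success rates
benchmark = beta(1, 1)            # flat (noninformative) reference prior
post_subj = beta(2 + k, 8 + n - k)
post_flat = beta(1 + k, 1 + n - k)

# Illustrative agreement ratio: how far the data pull the subjective prior,
# normalised by the same quantity for the benchmark prior.
ratio = kl_beta(post_subj, subjective) / kl_beta(post_flat, benchmark)
print(f"KL ratio = {ratio:.2f} (large values suggest prior-data conflict)")
```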


Journal of Statistical Computation and Simulation | 2015

Density modification-based reliability sensitivity analysis

Paul Lemaître; Ekatarina Sergienko; Aurélie Arnaud; Nicolas Bousquet; Fabrice Gamboa; Bertrand Iooss

Sensitivity analysis (SA) of a numerical model, for instance one simulating physical phenomena, is useful to quantify the influence of the inputs on the model responses. This paper proposes a new sensitivity index, based on modification of the probability density function (pdf) of the random inputs, for the case where the quantity of interest is a failure probability (the probability that a model output exceeds a given threshold). An input is considered influential if modifying its pdf leads to a substantial change in the failure probability. These sensitivity indices can be computed from the sole set of simulations already used to estimate the failure probability, thus limiting the number of calls to the numerical model. In the case of a Monte Carlo sample, asymptotic properties of the indices are derived. Several types of input perturbations, based on the Kullback–Leibler divergence, are introduced. The relevance of this new SA method is analysed through three case studies.
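The sample-reuse idea is easy to sketch: the failure indicator is computed once, and each perturbed failure probability is obtained by reweighting the same sample with a likelihood ratio. The toy limit-state function, the mean-shift perturbation and all numbers below are illustrative; the paper's indices use Kullback–Leibler-calibrated perturbations and come with asymptotic results not reproduced here:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def g(x):
    """Toy limit-state function standing in for an expensive computer model."""
    return x[:, 0] + 2.0 * x[:, 1]

n, threshold = 100_000, 6.0
X = rng.standard_normal((n, 2))      # independent standard normal inputs
fail = g(X) > threshold              # the only calls to the model
p_ref = fail.mean()

def perturbed_failure_prob(i, delta):
    """Failure probability after shifting the mean of input i by delta,
    re-using the SAME sample through likelihood-ratio weights."""
    w = norm.pdf(X[:, i], loc=delta) / norm.pdf(X[:, i])
    return float(np.mean(fail * w))

for i in range(2):
    print(f"input {i}: p moves from {p_ref:.2e} "
          f"to {perturbed_failure_prob(i, 0.5):.2e}")
```

Input 1, which enters g with the larger coefficient, produces the larger shift, flagging it as the more influential variable.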


Computational Statistics & Data Analysis | 2012

Estimating discrete Markov models from various incomplete data schemes

Alberto Pasanisi; Shuai Fu; Nicolas Bousquet

The parameters of a discrete stationary Markov model are the transition probabilities between states. Traditionally, the data consist of sequences of observed states for a given number of individuals over the whole observation period; in that case, transition probabilities are estimated straightforwardly by counting one-step moves from one state to another. In many real-life problems, however, inference is much harder because the state sequences are not fully observed: the state of each individual is known only at certain values of the time variable. A review of the problem is given, focusing on Markov chain Monte Carlo (MCMC) algorithms that perform Bayesian inference and evaluate the posterior distributions of the transition probabilities in this missing-data framework. Exploiting the dependence between the rows of the transition matrix, an adaptive MCMC mechanism that accelerates the classical Metropolis-Hastings algorithm is then proposed and studied empirically.
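For reference, the fully observed baseline the article starts from fits in a few lines; the missing-data setting is precisely where this counting breaks down and the MCMC machinery becomes necessary. A minimal sketch (the state sequences and state count are made up):

```python
import numpy as np

def transition_mle(sequences, n_states):
    """MLE of the transition matrix from fully observed state sequences:
    count one-step moves and normalise each row."""
    counts = np.zeros((n_states, n_states))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1.0
    rows = counts.sum(axis=1, keepdims=True)
    return counts / np.where(rows == 0.0, 1.0, rows)

seqs = [[0, 0, 1, 2, 2, 1], [1, 2, 2, 2, 0]]   # made-up observed sequences
print(transition_mle(seqs, n_states=3))

# With a conjugate Dirichlet prior on each row, the posterior is again
# Dirichlet with parameters (prior + counts). When states are observed only
# at some dates, the unobserved path segments must be integrated out, which
# is what the MCMC data-augmentation schemes discussed in the article do.
```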


PLOS ONE | 2014

Forecasting the Major Influences of Predation and Environment on Cod Recovery in the Northern Gulf of St. Lawrence

Nicolas Bousquet; Emmanuel Chassot; Daniel Duplisea; Mike O. Hammill

The northern Gulf of St. Lawrence (NGSL) stock of Atlantic cod (Gadus morhua), historically the second largest cod population in the western Atlantic, underwent a severe collapse in the early 1990s and is currently considered endangered by the Committee on the Status of Endangered Wildlife in Canada. As with many fish populations around the world that are heavily exploited or overfished, urgent management actions in the form of recovery plans are needed to restore this stock to sustainable levels. Stochastic projections based on a statistical population model incorporating predation were conducted over a period of 30 years (2010–2040) to assess the expected outcomes of alternative fishing strategies on stock recovery under different scenarios of harp seal (Pagophilus groenlandicus) abundance and environmental conditions. This sensitivity study shows that water temperature is key to the rebuilding of the NGSL cod stock. Model projections suggest that maintaining the current management practice under cooler water temperatures is likely to keep the species in an endangered status. Under current or warmer conditions in the Gulf of St. Lawrence, partial recovery might only be achieved by significant reductions in both fishing and predation pressure. In the medium term, a management strategy that reduces catch could be favoured over a complete moratorium so as to minimize socio-economic impacts on the industry.
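The projection logic can be caricatured in a few lines. Everything below, from the surplus-production form to the seal-predation and temperature terms and every coefficient, is an illustrative placeholder rather than the article's fitted population model:

```python
import numpy as np

rng = np.random.default_rng(6)

def project(F, seal_index, temp_anomaly, years=30, n_sims=1_000):
    """Toy stochastic projection of a cod stock under fishing mortality F,
    a harp-seal abundance index and a temperature anomaly. All dynamics
    and coefficients are made-up placeholders (see lead-in)."""
    B = np.full(n_sims, 100.0)                  # initial biomass (kt)
    r = 0.3 + 0.1 * temp_anomaly                # warmer water -> faster growth
    for _ in range(years):
        eps = rng.lognormal(0.0, 0.2, n_sims)   # environmental process noise
        predation = 0.05 * seal_index * B
        B = np.maximum((B + r * B * (1 - B / 300.0) - F * B - predation) * eps,
                       0.0)
    return B

for F in (0.0, 0.05, 0.10):
    B30 = project(F, seal_index=1.0, temp_anomaly=0.0)
    print(f"F = {F:.2f}: median biomass after 30 y = {np.median(B30):6.1f} kt, "
          f"P(B < 50 kt) = {(B30 < 50.0).mean():.2f}")
```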


Reliability Engineering & System Safety | 2017

Functional Weibull-based models of steel fracture toughness for structural risk analysis: estimation and selection

Nadia Pérot; Nicolas Bousquet

A key input of numerous reliability studies of industrial components or structures, steel fracture toughness is usually modelled as a random process because of its natural variability. Moreover, toughness is highly sensitive to temperature, which also plays a fundamental role as an environmental forcing in such studies. Particular attention must therefore be paid to its stochastic functional modelling, assessed through a statistical analysis of indirect measurements. While a Weibull shape arising from statistical physics is recognized as one of the most relevant approaches to representing local variability, the selection of functional parameters requires an accurate methodology for fracture toughness modelling. This article provides such a methodology, which resolves inconsistencies in former data treatments. The innovation consists of three improvements: (a) the thickness correction of the steel specimen is included throughout the calculation rather than performed a priori; (b) nonstandard but informative data are included in the assessment as censored data; (c) a chi-square test is developed to assess the model quality relative to fracture toughness data, indexed by temperature. Illustrated by the exploration of a database fed by several European manufacturers, this complete methodology is implemented in a dedicated software tool.
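Improvement (b), treating nonstandard observations as censored data, changes the likelihood in a standard way: censored points contribute their survival probability instead of their density. A minimal sketch of a right-censored Weibull fit on synthetic toughness-like data (the thickness correction and the temperature-indexed chi-square test of the article are not reproduced):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(2)

# Synthetic toughness-like data: exact measurements plus right-censored ones
# (e.g. tests stopped before fracture), standing in for real measurements.
true_shape, true_scale = 4.0, 120.0
t = weibull_min.rvs(true_shape, scale=true_scale, size=200, random_state=rng)
censor_at = 130.0
observed = np.minimum(t, censor_at)
event = t <= censor_at                 # False = right-censored

def neg_loglik(theta):
    shape, scale = np.exp(theta)       # positivity via log-parametrisation
    ll = np.where(event,
                  weibull_min.logpdf(observed, shape, scale=scale),
                  weibull_min.logsf(observed, shape, scale=scale))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.log([2.0, 100.0]), method="Nelder-Mead")
print("MLE (shape, scale):", np.exp(res.x))
```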


Risk Analysis | 2015

Nonparametric Estimation of the Probability of Detection of Flaws in an Industrial Component, from Destructive and Nondestructive Testing Data, Using Approximate Bayesian Computation

Merlin Keller; Anne‐Laure Popelin; Nicolas Bousquet; Emmanuel Remy

We consider the problem of estimating the probability of detection (POD) of flaws in an industrial steel component. Modeled as an increasing function of the flaw height, the POD characterizes the detection process; it is also involved in the estimation of the flaw size distribution, a key input parameter of physical models describing the behavior of the steel component when subjected to extreme thermodynamic loads. Such models are used to assess the resistance of highly reliable systems whose failures are seldom observed in practice. We develop a Bayesian method to estimate the flaw size distribution and the POD function, using flaw height measures from periodic in-service inspections conducted with an ultrasonic detection device, together with measures from destructive lab experiments. Our approach, based on approximate Bayesian computation (ABC) techniques, is applied to a real data set and compared to maximum likelihood estimation (MLE) and a more classical approach based on Markov chain Monte Carlo (MCMC) techniques. In particular, we show that the parametric model describing the POD as the cumulative distribution function (cdf) of a lognormal distribution, though often used in this context, can be invalidated by the data at hand. We propose an alternative nonparametric model, which assumes no predefined shape, and extend the ABC framework to this setting. Experimental results demonstrate the ability of this method to provide a flexible estimation of the POD function and to describe its uncertainty accurately.
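The ABC ingredient can be sketched with the parametric lognormal-cdf POD the article starts from: draw POD parameters from a prior, simulate detection outcomes, and keep the draws whose summary statistics fall close to the observed ones. The data, priors, binned summary and tolerance below are all illustrative choices; the article's nonparametric extension is not reproduced:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Pretend inspection data: flaw heights (mm) and detection outcomes.
# Everything here is synthetic; real data would come from in-service
# inspections and destructive lab tests.
heights = rng.uniform(0.5, 10.0, 300)
detected = rng.random(300) < norm.cdf((np.log(heights) - np.log(3.0)) / 0.5)

bins = np.digitize(heights, [2.0, 4.0, 6.0, 8.0])
obs_rates = np.array([detected[bins == b].mean() for b in range(5)])

def summary_distance(m, s):
    """Simulate detections from a lognormal-cdf POD with parameters (m, s)
    and compare binned detection rates with the observed ones."""
    sim = rng.random(heights.size) < norm.cdf((np.log(heights) - np.log(m)) / s)
    sim_rates = np.array([sim[bins == b].mean() for b in range(5)])
    return np.abs(obs_rates - sim_rates).max()

# ABC rejection with flat priors on the POD parameters (illustrative choices).
accepted = [(m, s)
            for m, s in zip(rng.uniform(1.0, 8.0, 20_000),
                            rng.uniform(0.1, 1.5, 20_000))
            if summary_distance(m, s) < 0.1]
print(f"{len(accepted)} accepted draws; posterior mean (m, s) =",
      np.round(np.mean(accepted, axis=0), 2))
```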


Quality and Reliability Engineering International | 2015

On the Practical Interest of Discrete Inverse Pólya and Weibull‐1 Models in Industrial Reliability Studies

Alberto Pasanisi; Côme Roero; Emmanuel Remy; Nicolas Bousquet

Engineers often face the problem of assessing the lifetime of industrial components on the basis of observed industrial feedback data. Usually, lifetime is modelled as a continuous random variable, for instance exponentially or Weibull distributed. In some cases, however, the features of the piece of equipment under investigation suggest the use of discrete probabilistic models instead. This is the case for equipment that operates only on cycles or on demand: the lifetime is then measured in numbers of cycles or demands before failure, so that, in theory, discrete models are more appropriate. This article aims to shed light on the practical interest, for the reliability engineer, of two of the most popular discrete models: the Inverse Pólya distribution (IPD), based on a Pólya urn scheme, and the so-called Weibull-1 (W1) model. It is shown that, for different reasons, the practical use of both models should be restricted to specific industrial situations. In particular, when nothing is known a priori about the nature of ageing and/or the data are heavily right-censored, they can be of limited interest compared with more flexible continuous lifetime models such as the usual Weibull distribution. Nonetheless, the intuitive meaning of the IPD favours its use by engineers in low (decelerated) ageing situations.
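For orientation, one common way to build a discrete Weibull lifetime, by discretizing the continuous survival function so that P(T > k) = exp(-(k/scale)^shape), is sketched below; whether this coincides exactly with the article's W1 parametrisation is an assumption on our part:

```python
import numpy as np

def discrete_weibull_pmf(k, shape, scale):
    """P(T = k), k = 1, 2, ..., for a lifetime counted in cycles/demands,
    built by discretising a continuous Weibull survival function so that
    P(T > k) = exp(-(k / scale) ** shape). Whether this matches the W1
    parametrisation of the article is an assumption (see lead-in)."""
    k = np.asarray(k, dtype=float)
    return (np.exp(-(((k - 1.0) / scale) ** shape))
            - np.exp(-((k / scale) ** shape)))

k = np.arange(1, 11)
pmf = discrete_weibull_pmf(k, shape=1.5, scale=5.0)
for ki, pi in zip(k, pmf):
    print(f"P(T = {ki:2d}) = {pi:.4f}")
print(f"P(T <= 10) = {pmf.sum():.4f}")
```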


Scientific Reports | 2018

High-throughput ovarian follicle counting by an innovative deep learning approach

Charlotte Sonigo; Stéphane Jankowski; Olivier Yoo; Olivier Trassard; Nicolas Bousquet; Michael Grynberg; Isabelle Beau; Nadine Binart

The evaluation of the number of mouse ovarian primordial follicles (PMF) can provide important information about ovarian function, the regulation of folliculogenesis, or the impact of chemotherapy on fertility. This counting, usually performed by specialized operators, is a tedious and time-consuming but indispensable procedure. The development and increasing use of deep learning algorithms promise to speed up and improve this process. Here, we present a new methodology for automatically detecting and counting PMF, using convolutional neural networks driven by labelled datasets and a sliding-window algorithm to select test data. Trained on a database of 9 million images extracted from mouse ovaries, and tested on two ovaries (3 million images to classify and 2,000 follicles to detect), the algorithm processes the digitized histological slides of a complete ovary in less than one minute, dividing the usual processing time by a factor of about 30. It also outperforms the measurements made by a pathologist through optical detection. Its ability to correct labelling errors enables an active learning process with the operator, improving the overall counting iteratively. These results suggest the methodology could be adapted to human ovarian follicles by transfer learning.
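The sliding-window mechanics are easy to illustrate. In the sketch below the CNN is replaced by a dummy scoring rule, and the window size, stride and threshold are arbitrary; the real pipeline also merges overlapping detections into follicle counts:

```python
import numpy as np

def sliding_windows(slide, size=64, stride=32):
    """Yield (row, col, patch) tiles covering a digitised histological slide."""
    h, w = slide.shape[:2]
    for r in range(0, h - size + 1, stride):
        for c in range(0, w - size + 1, stride):
            yield r, c, slide[r:r + size, c:c + size]

def follicle_score(patch):
    """Placeholder for the trained CNN: returns P(patch shows a follicle).
    In the article this is a convolutional network trained on labelled
    tiles; the brightness rule below is a dummy stand-in."""
    return float(patch.mean() > 0.8)

slide = np.random.default_rng(4).random((512, 512))   # fake greyscale slide
hits = [(r, c) for r, c, p in sliding_windows(slide) if follicle_score(p) > 0.5]
print(f"{len(hits)} candidate follicle windows (overlaps still to be merged)")
```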


SIAM/ASA Journal on Uncertainty Quantification | 2018

Approximation of Limit State Surfaces in Monotonic Monte Carlo Settings, with Applications to Classification

Nicolas Bousquet; Thierry Klein; Vincent Moutoussamy

This article investigates the theoretical convergence properties of the estimators produced by a numerical exploration of a monotonic function with multivariate random inputs in a structural reliability framework. The quantity to be estimated is a probability, typically associated with an undesirable (unsafe) event, and the function is usually implemented as a computer model. The estimators produced by a Monte Carlo numerical design are two subsets of inputs leading to safe and unsafe situations, whose measures translate into deterministic bounds on the probability. Several situations are considered: designs that are independent, identically distributed or not, or sequential. As a major consequence, a consistent estimator of the (limit-state) surface separating the two subsets can be built under isotonicity and regularity arguments, and its convergence speed can be exhibited. This estimator is built by aggregating semi-supervised binary classifiers chosen as constrained support vector machines. Numerical experiments conducted on toy examples show that they run faster than recently developed monotonic neural networks with comparable predictive power, and are therefore better suited when computational time is a key issue.
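The bounding mechanism rests on a simple fact: with a monotonic (say increasing) model, every evaluated unsafe point certifies the whole orthant above it as unsafe, and every safe point certifies the orthant below it as safe. A minimal sketch on a toy additive model with uniform inputs (the bounds themselves are estimated here with a cheap auxiliary sample, since the toy g is inexpensive):

```python
import numpy as np

rng = np.random.default_rng(5)

def g(x):
    """Toy monotone (increasing) model standing in for the computer code."""
    return x.sum(axis=-1)

t, n_calls = 1.6, 200
X = rng.random((n_calls, 2))       # uniform inputs on [0, 1]^2
unsafe = g(X) > t                  # the only calls to the "expensive" model
Xu, Xs = X[unsafe], X[~unsafe]

# Cheap auxiliary sample: a point above some unsafe point is certainly
# unsafe; a point below some safe point is certainly safe.
U = rng.random((50_000, 2))
certainly_unsafe = (U[:, None, :] >= Xu[None, :, :]).all(-1).any(-1)
certainly_safe = (U[:, None, :] <= Xs[None, :, :]).all(-1).any(-1)

p_low, p_up = certainly_unsafe.mean(), 1.0 - certainly_safe.mean()
exact = (2.0 - t) ** 2 / 2         # P(U1 + U2 > 1.6) for the toy model
print(f"bounds from {n_calls} calls: [{p_low:.4f}, {p_up:.4f}], "
      f"exact p = {exact:.4f}")
```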

Collaboration


Dive into Nicolas Bousquet's collaborations.

Top Co-Authors

Shuai Fu (University of Paris-Sud)
Emmanuel Chassot (Institut de recherche pour le développement)
Thierry Klein (Centre national de la recherche scientifique)
Daniel Duplisea (Fisheries and Oceans Canada)
Mike O. Hammill (Fisheries and Oceans Canada)