Francisco Louzada
University of São Paulo
Publications
Featured research published by Francisco Louzada.
Computational Statistics & Data Analysis | 2011
Francisco Louzada; Mari Roman; Vicente G. Cancho
In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one under different sample sizes and censoring percentages. The methodology is illustrated on four real datasets, where we also compare the two modeling approaches.
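To make the latent construction concrete, here is a minimal simulation sketch (not code from the paper): it draws the maximum of a geometric number of exponential lifetimes and checks the empirical CDF against one common closed form of the CEG distribution function, F(z) = θu/(1 − (1 − θ)u) with u = 1 − e^{−λz}; the parametrization and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, theta = 1.5, 0.3          # illustrative values, not from the paper
n = 50_000

# Latent construction: M ~ Geometric(theta) risks, observe the maximum lifetime.
M = rng.geometric(theta, size=n)                      # support {1, 2, ...}
Z = np.array([rng.exponential(1 / lam, m).max() for m in M])

def ceg_cdf(z, lam, theta):
    # Closed form implied by the geometric PGF: F(z) = theta*u / (1 - (1-theta)*u).
    u = 1.0 - np.exp(-lam * z)
    return theta * u / (1.0 - (1.0 - theta) * u)

grid = np.linspace(0.1, 5.0, 10)
emp = np.array([(Z <= t).mean() for t in grid])
print(np.round(emp - ceg_cdf(grid, lam, theta), 4))   # differences should be ~0
```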
Digital Signal Processing | 2017
Luca Martino; Jesse Read; Victor Elvira; Francisco Louzada
We design a sequential Monte Carlo scheme for the dual purpose of Bayesian inference and model selection. We consider the application context of urban mobility, where several modes of transport and different measurement devices can be employed, and therefore address the joint problem of online tracking and detection of the current modality. For this purpose, we use interacting parallel particle filters, each one addressing a different model. They cooperate to provide a global estimator of the variable of interest and, at the same time, an approximation of the posterior probability of each model given the data. The interaction occurs through a parsimonious distribution of the computational effort, with online adaptation of the number of particles in each filter according to the posterior probability of the corresponding model. The resulting scheme is simple and flexible. We have tested the novel technique in different numerical experiments with artificial and real data, which confirm the robustness of the proposed scheme.
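The particle-reallocation idea can be sketched in a few lines. The function below is a hypothetical illustration, not the paper's algorithm: given log-evidence estimates from each filter, it computes model posteriors (under a uniform model prior) and splits a fixed particle budget proportionally, with a floor so no filter is extinguished.

```python
import numpy as np

def allocate_particles(log_evidence, total, floor=10):
    """Split a particle budget across models in proportion to their posterior
    probabilities (uniform model prior assumed; illustrative sketch only)."""
    log_evidence = np.asarray(log_evidence, dtype=float)
    w = np.exp(log_evidence - log_evidence.max())      # stabilized exponentiation
    post = w / w.sum()                                 # model posteriors
    counts = np.maximum(floor, np.round(post * total).astype(int))
    counts[np.argmax(counts)] += total - counts.sum()  # re-trim to the exact budget
    return post, counts

post, counts = allocate_particles([-105.2, -103.1, -110.7], total=1000)
print(np.round(post, 3), counts)   # most particles go to the most probable model
```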
Journal of Statistical Computation and Simulation | 2014
Cynthia A. V. Tojeiro; Francisco Louzada; Mari Roman; Patrick Borges
In this paper, we propose a new three-parameter lifetime distribution whose hazard rate can be unimodal, increasing, or decreasing. The new distribution, the complementary Weibull geometric (CWG), is complementary to the Weibull geometric (WG) model proposed by Barreto-Souza et al. (The Weibull-geometric distribution, J. Statist. Comput. Simul. 1 (2010), pp. 1–13). The CWG distribution arises in a latent complementary risks scenario, where the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and hazard rate functions, moments, and the density and moments of its order statistics. We provide expressions for the Rényi and Shannon entropies. Parameter estimation is based on the usual maximum likelihood approach. We obtain the observed information matrix and discuss inference issues. We report a hazard function comparison study between the WG distribution and our complementary one. The flexibility and potential of the new distribution are illustrated by means of three real datasets, where we also compare the Weibull, WG, and CWG modelling approaches.
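The hazard shapes can be probed numerically. The sketch below assumes one plausible CWG parametrization, F(z) = θu/(1 − (1 − θ)u) with u = 1 − e^{−(βz)^α} (the paper's parametrization may differ), and evaluates the implied hazard for several Weibull shape values:

```python
import numpy as np

def cwg_hazard(z, alpha, beta, theta):
    """Hazard of a max-of-geometric-many Weibulls construction
    (one plausible CWG parametrization; an illustrative assumption)."""
    u = 1.0 - np.exp(-(beta * z) ** alpha)
    du = alpha * beta * (beta * z) ** (alpha - 1) * np.exp(-(beta * z) ** alpha)
    F = theta * u / (1.0 - (1.0 - theta) * u)
    f = theta * du / (1.0 - (1.0 - theta) * u) ** 2   # dF/dz
    return f / (1.0 - F)

z = np.linspace(0.05, 4.0, 6)
for alpha in (0.5, 1.0, 2.5):   # vary the Weibull shape to probe hazard behaviour
    print(alpha, np.round(cwg_hazard(z, alpha, beta=1.0, theta=0.4), 3))
```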
Brazilian Journal of Probability and Statistics | 2013
José Flores D.; Patrick Borges; Vicente G. Cancho; Francisco Louzada
In this paper, we introduce the complementary exponential power series distributions, with increasing failure rate, which are complementary to the exponential power series model proposed by Chahkandi and Ganjali (2009). The new class of distributions arises in a latent complementary risks scenario, where the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. This new class contains several distributions as particular cases. Properties of the proposed class, such as quantiles, moments, and order statistics, are discussed. Estimation is carried out via maximum likelihood, and simulation results on maximum likelihood estimation are presented. A real application illustrates the usefulness of the new distribution class.
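For the geometric member of the power series family, the quantile function inverts in closed form; other members generally require numerical inversion. A small sketch under the parametrization assumed above:

```python
import numpy as np

lam, theta = 1.5, 0.3   # illustrative values

def ceg_quantile(p, lam, theta):
    # Invert F(z) = theta*u / (1 - (1-theta)*u) with u = 1 - exp(-lam*z):
    # u = p / (theta + p*(1-theta)), then z = -log(1 - u) / lam.
    u = p / (theta + p * (1.0 - theta))
    return -np.log1p(-u) / lam

p = np.array([0.1, 0.5, 0.9])
q = ceg_quantile(p, lam, theta)
u = 1.0 - np.exp(-lam * q)                            # sanity check: F(q) == p
print(np.round(theta * u / (1.0 - (1.0 - theta) * u), 3))
```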
Signal Processing | 2017
Luca Martino; Victor Elvira; Francisco Louzada
The Effective Sample Size (ESS) is an important measure of efficiency of Monte Carlo methods such as Markov Chain Monte Carlo (MCMC) and Importance Sampling (IS) techniques. In the IS context, an approximation $\widehat{\text{ESS}}$ of the theoretical ESS definition is widely applied, given by the inverse of the sum of the squares of the normalized importance weights, $\widehat{\text{ESS}} = 1 / \sum_{n=1}^{N} \bar{w}_n^2$. This formula has become an essential piece within Sequential Monte Carlo (SMC) methods, used to assess the convenience of a resampling step. From another perspective, $\widehat{\text{ESS}}$ is related to the Euclidean distance between the probability mass described by the normalized weights and the discrete uniform probability mass function (pmf). In this work, we derive other possible ESS functions based on different discrepancy measures between these two pmfs. Several examples are provided involving, for instance, the geometric mean of the weights, the discrete entropy (including the perplexity measure, already proposed in the literature), and the Gini coefficient, among others. We list five theoretical requirements which a generic ESS function should satisfy, allowing us to classify different ESS measures. We also compare the most promising ones by means of numerical simulations.
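For concreteness, the snippet below computes three such ESS measures on a weight vector: the classic inverse sum of squared normalized weights, the entropy-based perplexity, and the inverse of the maximum weight. The specific trio is an illustrative selection rather than the paper's full list.

```python
import numpy as np

def ess_variants(w):
    """Three effective-sample-size measures of a set of importance weights
    (an illustrative selection; the paper surveys a broader family)."""
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                                        # normalized weights
    inv_sq = 1.0 / np.sum(w ** 2)                          # classic ESS-hat
    perplexity = np.exp(-np.sum(w * np.log(w + 1e-300)))   # entropy-based
    inv_max = 1.0 / w.max()                                # driven by the largest weight
    return inv_sq, perplexity, inv_max

uniform = np.ones(100)                    # best case: every measure returns N = 100
degenerate = np.r_[1.0, np.zeros(99)]     # worst case: every measure returns 1
print([f"{v:.1f}" for v in ess_variants(uniform)])
print([f"{v:.1f}" for v in ess_variants(degenerate)])
```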
Expert Systems With Applications | 2012
Francisco Louzada; Anderson Ara
Fraud is a global problem that has demanded increasing attention as modern technology and communication have expanded. When statistical techniques are used to detect fraud, a critical factor is whether the detection model is accurate enough to correctly classify a case as fraudulent or legitimate. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers, by obtaining the predicted values from models fitted to several replicated datasets, and then combine them into a single predictive classification in order to improve accuracy. In this paper, we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Via a large simulation study and various real datasets, we found that probabilistic networks are a strong modeling option, with high predictive capacity that increases substantially under the bagging procedure when compared to traditional techniques.
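A hand-rolled bagging loop shows the mechanics. Since k-dependence probabilistic networks are not available in scikit-learn, a decision tree stands in as the base classifier here; the dataset and settings are illustrative:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Imbalanced synthetic data, mimicking rare fraudulent cases.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Bagging: fit one classifier per bootstrap replicate, combine by majority vote.
B = 25
votes = np.zeros((B, len(yte)), dtype=int)
for b in range(B):
    idx = rng.integers(0, len(ytr), size=len(ytr))   # bootstrap resample
    clf = DecisionTreeClassifier().fit(Xtr[idx], ytr[idx])
    votes[b] = clf.predict(Xte)
y_hat = (votes.mean(axis=0) > 0.5).astype(int)        # majority vote (B is odd)
print("bagged accuracy:", (y_hat == yte).mean())
```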
Journal of Probability and Statistics | 2013
Francisco Louzada; Vitor A.A. Marchi; James Carpenter
We propose a new family of lifetime distributions, the complementary exponentiated exponential geometric distribution. This new family arises in a latent complementary risks scenario, where the lifetime associated with a particular risk is not observable; rather, only the maximum lifetime value among all risks is observed. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its survival and hazard functions, moments, the rth moment of the ith order statistic, mean residual lifetime, and modal value. Inference is implemented via a straightforward maximum likelihood procedure. The practical importance of the new distribution is demonstrated in three applications in which our distribution outperforms several established lifetime distributions, such as the exponential, the exponential geometric, the Weibull, the modified Weibull, and the generalized exponential-Poisson distributions.
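Maximum likelihood for this family is indeed straightforward to code. The sketch below fits the simpler complementary exponential geometric member (a stand-in for the exponentiated variant, which adds one shape parameter) to simulated data by direct optimization of the log-likelihood:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
lam, theta = 1.5, 0.3                                 # true values for the simulation
M = rng.geometric(theta, size=500)
z = np.array([rng.exponential(1 / lam, m).max() for m in M])

def neg_loglik(params):
    lam_ = np.exp(params[0])                          # optimize on unconstrained scales
    theta_ = np.exp(params[1]) / (1.0 + np.exp(params[1]))   # keep theta in (0, 1)
    u = 1.0 - np.exp(-lam_ * z)
    # Density f(z) = theta*lam*exp(-lam z) / (1 - (1-theta)u)^2 (illustrative form).
    f = theta_ * lam_ * np.exp(-lam_ * z) / (1.0 - (1.0 - theta_) * u) ** 2
    return -np.sum(np.log(f))

res = minimize(neg_loglik, x0=np.zeros(2), method="Nelder-Mead")
lam_hat = np.exp(res.x[0])
theta_hat = np.exp(res.x[1]) / (1.0 + np.exp(res.x[1]))
print(round(lam_hat, 3), round(theta_hat, 3))         # should be near 1.5 and 0.3
```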
Digital Signal Processing | 2016
Luca Martino; Victor Elvira; David Luengo; Jukka Corander; Francisco Louzada
Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing, and machine learning. A well-known class of MC methods is that of Markov Chain Monte Carlo (MCMC) algorithms. In order to foster better exploration of the state space, especially in high-dimensional applications, several schemes employing multiple parallel MCMC chains have recently been introduced. In this work, we describe a novel parallel interacting MCMC scheme, called orthogonal MCMC (O-MCMC), where a set of "vertical" parallel MCMC chains share information using "horizontal" MCMC techniques that work on the entire population of current states. More specifically, the vertical chains are driven by random-walk proposals, whereas the horizontal MCMC techniques employ independent proposals, allowing an efficient combination of global exploration and local approximation. The interaction is contained in these horizontal iterations. Within the analysis of different implementations of O-MCMC, we also present novel schemes for reducing the overall computational cost of parallel Multiple Try Metropolis (MTM) chains. Furthermore, a modified version of O-MCMC for optimization is provided by considering parallel Simulated Annealing (SA) algorithms. Numerical results show the advantages of the proposed sampling scheme in terms of estimation efficiency and robustness with respect to initial values and parameter choices.
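A toy version of the vertical/horizontal interplay (a sketch, not the paper's O-MCMC) runs parallel random-walk Metropolis chains on a bimodal target and, every few iterations, applies an independence-sampler step whose proposal is fitted to the current population of states:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Bimodal toy target: equal mixture of N(-4, 1) and N(4, 1), up to a constant.
    return np.logaddexp(-0.5 * (x + 4) ** 2, -0.5 * (x - 4) ** 2)

def norm_logpdf(v, mu, sd):
    return -0.5 * ((v - mu) / sd) ** 2 - np.log(sd)   # constants cancel in the ratio

N, T = 10, 2000                  # number of parallel chains, iterations
x = rng.normal(0.0, 1.0, N)      # current state of each "vertical" chain
kept = []
for t in range(T):
    if t % 10 == 0:
        # "Horizontal" step: independence sampler fitted to the population,
        # letting chains stuck in one mode borrow global information.
        mu, sd = x.mean(), x.std() + 1e-6
        prop = rng.normal(mu, sd, N)
        log_alpha = (log_target(prop) - log_target(x)
                     + norm_logpdf(x, mu, sd) - norm_logpdf(prop, mu, sd))
    else:
        # "Vertical" step: plain random-walk Metropolis, chain by chain.
        prop = x + rng.normal(0.0, 1.0, N)
        log_alpha = log_target(prop) - log_target(x)
    x = np.where(np.log(rng.uniform(size=N)) < log_alpha, prop, x)
    if t >= T // 2:
        kept.append(x.copy())

kept = np.concatenate(kept)
print(round(kept.mean(), 2), round(kept.std(), 2))    # mean near 0, std near 4.1
```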
Expert Systems With Applications | 2011
Francisco Louzada; Osvaldo Anacleto-Junior; Cecília Candolo; Josimara Mazucheli
Credit scoring modelling is one of the leading formal tools for supporting the granting of credit. Its core objective is the generation of a score by means of which potential clients can be ranked in order of their probability of default. A critical factor is whether a credit scoring model is accurate enough to correctly classify a client as a good or bad payer. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers, by obtaining the predicted values from models fitted to several replicated datasets, and then combine them into a single predictive classification in order to improve accuracy. In this paper we propose a new bagging-type variant, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is motivated by credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit-granting dataset, using up to three successions of resamplings. We observed better classification accuracy for the two-bagged and three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach can improve modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement.
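One reading of "combining predictors over a succession of resamplings" is nested bagging: an outer round of bootstrap resamples, each feeding an inner bagging stage. The sketch below implements that reading with logistic regressions; it is an interpretation, not the paper's exact procedure:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X, y = make_classification(n_samples=1500, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

def bag_votes(X, y, Xte, B):
    """One bagging stage: B bootstrap fits, returning averaged votes on Xte."""
    votes = np.zeros(len(Xte))
    for _ in range(B):
        idx = rng.integers(0, len(y), size=len(y))
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        votes += clf.predict(Xte)
    return votes / B

# "Two-bagged": an outer succession of resamplings, each feeding inner bagging.
outer = np.zeros(len(Xte))
for _ in range(10):
    idx = rng.integers(0, len(ytr), size=len(ytr))
    outer += bag_votes(Xtr[idx], ytr[idx], Xte, B=10)
y_hat = (outer / 10 > 0.5).astype(int)
print("two-bagged accuracy:", (y_hat == yte).mean())
```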
Applied Mathematics and Computation | 2013
Josimara Mazucheli; Francisco Louzada; M. E. Ghitany
The aim of this paper is to compare, through Monte Carlo simulations, the finite-sample properties of estimates of the parameters of the weighted Lindley distribution obtained by four estimation methods: maximum likelihood, method of moments, ordinary least squares, and weighted least squares. Bias and mean-squared error are used as the criteria for comparison. The study reveals that the ordinary and weighted least-squares estimation methods are highly competitive with the maximum likelihood method in both small and large samples. Statistical analyses of two real data sets are presented to corroborate the conclusions of the simulation study.
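A scaled-down version of such a comparison can be coded directly. The sketch below assumes the weighted Lindley density of Ghitany et al., f(x; θ, c) = θ^{c+1} x^{c−1}(1+x)e^{−θx} / ((θ+c)Γ(c)), samples it as a two-component gamma mixture, and compares maximum likelihood with ordinary least squares on the empirical CDF (two of the four methods studied); the sample size and replicate count are illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammainc, gammaln

rng = np.random.default_rng(4)
theta, c, n, reps = 1.0, 2.0, 50, 100   # illustrative small-sample setup

def sample_wl(size):
    # Weighted Lindley as a gamma mixture: Gamma(c) w.p. theta/(theta+c),
    # Gamma(c+1) w.p. c/(theta+c), both with rate theta.
    pick = rng.uniform(size=size) < theta / (theta + c)
    shape = np.where(pick, c, c + 1.0)
    return rng.gamma(shape, 1.0 / theta, size)

def cdf(x, th, cc):
    # Mixture of regularized incomplete gamma functions.
    return (th * gammainc(cc, th * x) + cc * gammainc(cc + 1, th * x)) / (th + cc)

def nll(p, x):   # maximum likelihood criterion
    th, cc = np.exp(p)
    return -np.sum((cc + 1) * np.log(th) - np.log(th + cc) - gammaln(cc)
                   + (cc - 1) * np.log(x) + np.log1p(x) - th * x)

def ols(p, x):   # ordinary least squares against the empirical CDF
    th, cc = np.exp(p)
    xs = np.sort(x)
    i = np.arange(1, len(x) + 1)
    return np.sum((cdf(xs, th, cc) - i / (len(x) + 1.0)) ** 2)

def fit(x, loss):
    res = minimize(loss, x0=np.zeros(2), method="Nelder-Mead", args=(x,))
    return np.exp(res.x)

est = {k: np.array([fit(sample_wl(n), f)[0] for _ in range(reps)])
       for k, f in (("mle", nll), ("ols", ols))}
for k, v in est.items():
    print(k, "bias:", round(v.mean() - theta, 3),
          "mse:", round(((v - theta) ** 2).mean(), 4))
```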