
Publication


Featured research published by A. Alexandre Trindade.


Electronic Journal of Statistics | 2010

The Hodrick-Prescott Filter: A Special Case of Penalized Spline Smoothing

Robert L. Paige; A. Alexandre Trindade

We prove that the Hodrick-Prescott Filter (HPF), a commonly used method for smoothing econometric time series, is a special case of a linear penalized spline model with knots placed at all observed time points (except the first and last) and uncorrelated residuals. This equivalence then furnishes a rich variety of existing data-driven parameter estimation methods, particularly restricted maximum likelihood (REML) and generalized cross-validation (GCV). This has profound implications for users of HPF who have hitherto typically relied on subjective choice, rather than estimation, for the smoothing parameter. By viewing estimates as roots of an appropriate quadratic estimating equation, we also present a new approach for constructing confidence intervals for the smoothing parameter. The method is akin to a parametric bootstrap where Monte Carlo simulation is replaced by saddlepoint approximation, and provides a fast and accurate alternative to exact methods, when they exist, e.g. REML. More importantly, it is also the only computationally feasible method when no other methods, exact or otherwise, exist, e.g. GCV. The methodology is demonstrated on the Gross National Product (GNP) series originally analyzed by Hodrick and Prescott (1997). With proper attention paid to residual correlation structure, we show that REML-based estimation delivers an appropriate smooth for both the GNP series and its returns.
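
The penalized view lends itself to direct computation. Below is a minimal sketch (not the authors' code) of the HP trend as the ridge-type fit implied by the penalized criterion, tau_hat = (I + lam * D2'D2)^{-1} y with D2 the second-difference matrix; lam = 1600 is the conventional quarterly-data choice, i.e. exactly the kind of subjective value the paper proposes to replace with estimation, and the series is simulated purely for illustration.

```python
# Minimal sketch: HP trend as a penalized (ridge-type) smoother.
# tau_hat = (I + lam * D2'D2)^{-1} y, D2 = second-difference matrix.
import numpy as np

def hp_trend(y, lam=1600.0):
    n = len(y)
    D2 = np.zeros((n - 2, n))              # second-difference operator
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200)) + 0.05 * np.arange(200)  # simulated series
trend = hp_trend(y)        # smooth (trend) component
cycle = y - trend          # cyclical component
```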


Computational Statistics & Data Analysis | 2007

Approximating the distributions of estimators of financial risk under an asymmetric Laplace law

A. Alexandre Trindade; Yun Zhu

Explicit expressions are derived for parametric and nonparametric estimators (NPEs) of two measures of financial risk, value-at-risk (VaR) and conditional value-at-risk (CVaR), under random sampling from the asymmetric Laplace (AL) distribution. Asymptotic distributions are established under very general conditions. Finite sample distributions are investigated by means of saddlepoint approximations. The latter are highly computationally intensive, requiring novel approaches to approximate moments and special functions that arise in the evaluation of the moment generating functions. Plots of the resulting density functions shed new light on the quality of the estimators. Calculations for CVaR reveal that the NPE enjoys greater asymptotic efficiency relative to the parametric estimator than is the case for VaR. An application of the methodology in modeling currency exchange rates suggests that the AL distribution is successful in capturing the peakedness, leptokurtosis, and skewness inherent in such data. A demonstrated superiority in the resulting parametric-based inferences delivers an important message to the practitioner.
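
As a point of reference for the two risk measures, the following minimal sketch computes the standard nonparametric estimators: VaR as an empirical quantile and CVaR as the mean loss beyond it. The AL-like losses are simulated as a difference of two exponentials (one standard construction of the AL law) with illustrative rates; this is not the paper's saddlepoint machinery.

```python
# Minimal sketch: nonparametric VaR (empirical quantile) and CVaR (tail mean).
import numpy as np

def var_cvar(losses, alpha=0.95):
    losses = np.asarray(losses)
    var = np.quantile(losses, alpha)       # value-at-risk at level alpha
    cvar = losses[losses >= var].mean()    # conditional value-at-risk
    return var, cvar

rng = np.random.default_rng(1)
# AL-type losses via a difference of exponentials (illustrative rates)
losses = rng.exponential(1.0, 10_000) - rng.exponential(0.7, 10_000)
print(var_cvar(losses))
```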


Journal of Statistical Computation and Simulation | 2010

Time series models with asymmetric Laplace innovations

A. Alexandre Trindade; Yun Zhu; Beth Andrews

We propose autoregressive moving average (ARMA) and generalized autoregressive conditional heteroscedastic (GARCH) models driven by asymmetric Laplace (AL) noise. The AL distribution plays, in the geometric-stable class, the analogous role played by the normal in the alpha-stable class, and has shown promise in the modelling of certain types of financial and engineering data. In the case of an ARMA model we derive the marginal distribution of the process, as well as its bivariate distribution when separated by a finite number of lags. The calculation of exact confidence bands for minimum mean-squared error linear predictors is shown to be straightforward. Conditional maximum likelihood-based inference is advocated, and corresponding asymptotic results are discussed. The models are particularly suited for processes that are skewed, peaked, and leptokurtic, but which appear to have some higher order moments. A case study of a fund of real estate returns reveals that AL noise models tend to deliver a superior fit with substantially fewer parameters than normal noise counterparts, and provide both a competitive fit and a greater degree of numerical stability with respect to other skewed distributions.
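
To make the model class concrete, here is a minimal sketch of simulating an ARMA(1,1) process driven by AL-type noise, again generated as a difference of two exponentials; the coefficients are illustrative, not fitted values from the paper's case study.

```python
# Minimal sketch: ARMA(1,1) with asymmetric-Laplace-type innovations.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(2)
eps = rng.exponential(1.0, 1_000) - rng.exponential(0.6, 1_000)  # skewed noise
# x_t = 0.7 * x_{t-1} + eps_t + 0.3 * eps_{t-1}
x = lfilter([1.0, 0.3], [1.0, -0.7], eps)
```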


Modelling and Simulation in Materials Science and Engineering | 2005

Statistical modelling of composition and processing parameters for alloy development

Alexandr Golodnikov; Yevgeny Macheret; A. Alexandre Trindade; Stan Uryasev; Grigoriy Zrazhevsky

We propose the use of regression models as a tool to reduce time and cost associated with the development and selection of new metallic alloys. A multiple regression model is developed which can accurately predict tensile yield strength of high strength low alloy steel based on its chemical composition and processing parameters. Quantile regression is used to model the fracture toughness response as measured by Charpy V-Notch (CVN) values, which exhibits substantial variability and is therefore not usefully modelled via standard regression with its focus on the mean. Using Monte Carlo simulation, we determine that the three CVN values corresponding to each steel specimen can be plausibly modelled as observations from the 20th, 50th and 80th percentiles of the CVN distribution. Separate quantile regression models fitted at each of these percentile levels prove sufficiently accurate for ranking steels and selecting the best combinations of composition and processing parameters.
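
The following minimal sketch fits separate quantile regressions at the three percentile levels with statsmodels; the covariate names, coefficients, and noise model are hypothetical stand-ins for the paper's composition and processing variables.

```python
# Minimal sketch: quantile regression at the 20th, 50th, and 80th percentiles.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({"carbon": rng.uniform(0.05, 0.20, 300),
                   "temp": rng.uniform(850, 950, 300)})
# Hypothetical CVN response with skewed (Gumbel) noise
df["cvn"] = 40 - 100 * df["carbon"] + 0.05 * df["temp"] + rng.gumbel(0, 5, 300)

fits = {q: smf.quantreg("cvn ~ carbon + temp", df).fit(q=q)
        for q in (0.2, 0.5, 0.8)}
print(fits[0.5].params)   # median-regression coefficients
```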


Medical Physics | 2006

CT volumetry of the skeletal tissues.

James M. Brindle; A. Alexandre Trindade; Jose C. Pichardo; Scott L. Myers; A.P. Shah; Wesley E. Bolch

Computed tomography (CT) is an important and widely used modality in the diagnosis and treatment of various cancers. In the field of molecular radiotherapy, the use of spongiosa volume (combined tissues of the bone marrow and bone trabeculae) has been suggested as a means to improve the patient-specificity of bone marrow dose estimates. The noninvasive estimation of an organ volume comes with some degree of error or variation from the true organ volume. The present study explores the ability to obtain estimates of spongiosa volume or its surrogate via manual image segmentation. The variation among different segmentation raters was explored and found not to be statistically significant (p value >0.05). Accuracy was assessed by having several raters manually segment a polyvinyl chloride (PVC) pipe with known volumes. Segmentation of the outer region of the PVC pipe resulted in mean percent errors as great as 15% while segmentation of the pipe's inner region resulted in mean percent errors within ∼5%. Differences between volumes estimated with the high-resolution CT data set (typical of ex vivo skeletal scans) and the low-resolution CT data set (typical of in vivo skeletal scans) were also explored using both patient CT images and a PVC pipe phantom. While a statistically significant difference (p value <0.002) between the high-resolution and low-resolution data sets was observed with excised femoral heads obtained following total hip arthroplasty, the mean difference between high-resolution and low-resolution data sets was found to be only 1.24 and 2.18 cm3 for spongiosa and cortical bone, respectively. With respect to differences observed with the PVC pipe, the variation between the high-resolution and low-resolution mean percent errors was as high as ∼20% for the outer region volume estimates and only as high as ∼6% for the inner region volume estimates. The findings from this study suggest that manual segmentation is a reasonably accurate and reliable means for the in vivo estimation of spongiosa volume. This work also provides a foundation for future studies where spongiosa volumes are estimated by various raters in more comprehensive CT data sets.


Journal of the American Statistical Association | 2013

Extending the State-Space Model to Accommodate Missing Values in Responses and Covariates

Arlene Naranjo; A. Alexandre Trindade; George Casella

This article proposes an extended state-space model for accommodating multivariate panel data. The novel aspect of this contribution is an adjustment to the classical model for multiple subjects that allows missingness in the covariates in addition to the responses. Missing covariate data are handled by a second state-space model nested inside the first to represent unobserved exogenous information. Relevant Kalman filter equations are derived, and explicit expressions are provided for both the E- and M-steps of an expectation-maximization (EM) algorithm, to obtain maximum (Gaussian) likelihood estimates of all model parameters. In the presence of missing data, the resulting EM algorithm becomes computationally intractable, but a simplification of the M-step leads to a new procedure that is shown to be an expectation/conditional maximization (ECM) algorithm under exogeneity of the covariates. Simulation studies reveal that the approach appears to be relatively robust to moderate percentages of missing data, even with fewer subjects and time points, and that estimates are generally consistent with the asymptotics. The methodology is applied to a dataset from a published panel study of elderly patients with impaired respiratory function. Forecasted values thus obtained may serve as an “early-warning” mechanism for identifying patients whose lung function is nearing critical levels. Supplementary materials for this article are available online.
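
For orientation, the sketch below implements the basic Kalman filter recursion that underlies the E-step of such an EM algorithm, for a univariate state; missing responses (coded as NaN) simply skip the update step. It omits the paper's nested covariate model, smoother, and M-step.

```python
# Minimal sketch: univariate Kalman filter with missing responses skipped.
# State: x_t = phi*x_{t-1} + w_t, w ~ N(0, q); obs: y_t = h*x_t + v_t, v ~ N(0, r)
import numpy as np

def kalman_filter(y, phi, q, h, r, x0=0.0, p0=1.0):
    x, p, out = x0, p0, []
    for yt in y:
        x, p = phi * x, phi * p * phi + q           # predict
        if not np.isnan(yt):                        # update unless missing
            k = p * h / (h * p * h + r)             # Kalman gain
            x, p = x + k * (yt - h * x), (1.0 - k * h) * p
        out.append(x)
    return np.array(out)

y = np.array([1.0, 1.2, np.nan, 1.5, 1.4])          # NaN = missing response
print(kalman_filter(y, phi=0.9, q=0.1, h=1.0, r=0.2))
```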


Conference on Decision and Control | 2014

Novel response surface methodologies with design of experiment for source localization in unknown spatial-temporal fields

Zhenyi Liu; Philip N. Smith; Trevor Park; A. Alexandre Trindade; Qing Hui

The spread of contaminant into the environment due to a high volume of leakage from point sources is a very real threat in the modern world. Since the contaminant can often be dangerous to human operators, multiple robotic agents equipped with suitable sensors are playing an increasingly important role in the so-called contaminant detection problem. This paper proposes a contaminant detection methodology that can not only locate the (possibly multiple) sources of the contaminant, but also estimate their intensity in an environment evolving over both space and time. The method can deal with measurement noise and does not need any gradient information.
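
To illustrate the gradient-free flavor of the problem (though not the paper's response-surface design), here is a minimal sketch that recovers a single source location and intensity from noisy sensor readings by direct search with Nelder-Mead; the Gaussian field model and noise level are assumptions made for the example.

```python
# Minimal sketch: gradient-free single-source localization via Nelder-Mead.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
true_src, true_intensity = np.array([3.0, -1.0]), 5.0
sensors = rng.uniform(-5, 5, size=(30, 2))          # sensor positions

def field(src, intensity, pts):
    # Assumed Gaussian concentration field around the source
    return intensity * np.exp(-np.sum((pts - src) ** 2, axis=1) / 4.0)

readings = field(true_src, true_intensity, sensors) + rng.normal(0, 0.05, 30)

def loss(theta):
    return np.sum((readings - field(theta[:2], theta[2], sensors)) ** 2)

est = minimize(loss, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
print(est.x)   # estimated [x, y, intensity]
```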


Journal of Data Science | 2006

Improved Tolerance Limits by Combining Analytical and Experimental Data: An Information Integration Methodology

A. Alexandre Trindade; Stan Uryasev

We propose a coherent methodology for integrating different sources of information on a response variable of interest, in order to accurately predict percentiles of its distribution. Under the assumption that one of the sources is more reliable than the other(s), the approach combines factors formed from the data into an additive linear regression model. Quantile regression, designed for quantifying the goodness of fit precisely at a desired quantile, is used as the optimality criterion in model-fitting. Asymptotic confidence interval construction methods for the percentiles are adopted to compute statistical tolerance limits for the response. The approach is demonstrated on a materials science case study that pools together information on failure load from physical tests and computer model predictions. A small simulation study assesses the precision of the inferences. The methodology gives plausible percentile estimates. Resulting tolerance limits are close to nominal coverage probability levels.
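
For context on the tolerance-limit step (not the paper's quantile-regression method), the sketch below computes a distribution-free lower confidence bound on the p-th percentile via order statistics, which serves as a one-sided tolerance limit; it uses the identity P(X_(j) <= x_p) = P(Binom(n, p) >= j).

```python
# Minimal sketch: nonparametric lower confidence bound on the p-th percentile.
import numpy as np
from scipy.stats import binom

def percentile_lower_bound(x, p=0.10, gamma=0.95):
    x = np.sort(np.asarray(x))
    n = len(x)
    for j in range(n, 0, -1):              # largest j with coverage >= gamma
        if binom.sf(j - 1, n, p) >= gamma: # P(Binom(n, p) >= j)
            return x[j - 1]                # j-th order statistic
    return None                            # n too small for this (p, gamma)

rng = np.random.default_rng(5)
print(percentile_lower_bound(rng.normal(100, 10, 200)))
```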


Infection Ecology & Epidemiology | 2013

Phylogenetic analysis of pbp genes in treponemes

Tejpreet Chadha; A. Alexandre Trindade

Background: β-Lactamases are the main cause of bacterial resistance to penicillin, cephalosporins, and related β-lactam compounds. The presence of the novel penicillin-binding protein (pbp) Tp47 in Treponema pallidum has been reported to be a well-known mechanism for turnover of β-lactam antibiotics. Although T. pallidum remains sensitive to penicillin, clinically significant resistance to macrolides has emerged in many developing countries. The genome sequence of T. pallidum has shown the presence of genes encoding pbp, but there are no current reports of the presence of mobile plasmids. Methods: Phylogenetic analysis is used to study the diversity of chromosomal pbp genes and their relatedness to Tp47 in Treponema species. Results: In our study, genes encoding penicillin-binding proteins that showed significant similarity to each other appeared in separate clusters. Conclusion: Tp47 showed no substantial similarity to other β-lactamases in treponemes. The relatedness of Treponema denticola to other treponemes, including T. pallidum, and the reported presence of natural mobile antibiotic determinants highlight the importance of investigating the diversity of pbp genes in Treponema species. This will lead to a greater understanding of its potential to develop additional antibiotic resistance via horizontal gene transfer that could seriously compromise the treatment and control of syphilis.


Archive | 2007

Estimating the Probability Distributions of Alloy Impact Toughness: A Constrained Quantile Regression Approach

Alexandr Golodnikov; Yevgeny Macheret; A. Alexandre Trindade; Stan Uryasev; Grigoriy Zrazhevsky

We extend our earlier work, Golodnikov et al. [3] and Golodnikov et al. [4], by estimating the entire probability distributions for the impact toughness characteristic of steels, as measured by Charpy V-Notch (CVN) at −84°C. Quantile regression, constrained to produce monotone quantile function and unimodal density function estimates, is used to construct the empirical quantiles as a function of various alloy chemical composition and processing variables. The estimated quantiles are used to produce an estimate of the underlying probability density function, rendered in the form of a histogram. The resulting CVN distributions are much more informative for alloy design than individual test data. Using the distributions to make decisions for selecting better alloys should lead to a more effective and comprehensive approach than the one based on the minimum of the three test values, as is commonly practiced in the industry.
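
The sketch below illustrates two ingredients the abstract describes, under simplifying assumptions: enforcing monotone quantile estimates by rearrangement (sorting), a simple alternative to the authors' constrained fit, and converting a set of quantile estimates into a piecewise-constant density of the histogram form.

```python
# Minimal sketch: monotone rearrangement and a histogram-style density
# recovered from quantile estimates (f ~= dq / dx between quantiles).
import numpy as np

def monotone_rearrange(pred):
    # pred: (n_quantile_levels, n_points), rows ordered by quantile level;
    # sorting down each column restores monotonicity in the level.
    return np.sort(pred, axis=0)

def density_from_quantiles(levels, xq):
    levels, xq = np.asarray(levels), np.sort(np.asarray(xq))
    return np.diff(levels) / np.diff(xq)   # histogram bar heights

# Hypothetical CVN quantile estimates at the 20th/50th/80th percentiles
print(density_from_quantiles([0.2, 0.5, 0.8], [12.0, 20.0, 31.0]))
```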

Collaboration


Dive into A. Alexandre Trindade's collaboration.

Top Co-Authors

Robert L. Paige

Missouri University of Science and Technology
