Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Xavier de Luna is active.

Publication


Featured research published by Xavier de Luna.


Journal of the American Geriatrics Society | 2012

Genetic and Lifestyle Predictors of 15‐Year Longitudinal Change in Episodic Memory

Maria Josefsson; Xavier de Luna; Sara Pudas; Lars-Göran Nilsson; Lars Nyberg

The objective was to reveal distinct longitudinal trajectories in episodic memory over 15 years and to identify demographic, lifestyle, health-related, and genetic predictors of stability or decline.


The Journal of Neuroscience | 2013

Brain Characteristics of Individuals Resisting Age-Related Cognitive Decline over Two Decades

Sara Pudas; Jonas Persson; Maria Josefsson; Xavier de Luna; Lars-Göran Nilsson; Lars Nyberg

Some elderly appear to resist age-related decline in cognitive functions, but the neural correlates of successful cognitive aging are not well known. Here, older human participants from a longitudinal study were classified as successful or average relative to the mean attrition-corrected cognitive development across 15–20 years in a population-based sample (n = 1561). Fifty-one successful elderly and 51 age-matched average elderly (mean age: 68.8 years) underwent functional magnetic resonance imaging while performing an episodic memory face–name paired-associates task. Successful older participants had higher BOLD signal during encoding than average participants, notably in the bilateral PFC and the left hippocampus (HC). The HC activation of the average, but not the successful, older group was lower than that of a young reference group (n = 45, mean age: 35.3 years). HC activation was correlated with task performance, thus likely contributing to the superior memory performance of successful older participants. The frontal BOLD response pattern might reflect individual differences present from young age. Additional analyses confirmed that both the initial cognitive level and the slope of cognitive change across the longitudinal measurement period contributed to the observed group differences in BOLD signal. Further, the differences between the older groups could not be accounted for by differences in brain structure. The current results suggest that one mechanism behind successful cognitive aging might be preservation of HC function combined with a high frontal responsivity. These findings highlight sources for heterogeneity in cognitive aging and may hold useful information for cognitive intervention studies.


Journal of Computational and Graphical Statistics | 2001

Robust Simulation-Based Estimation of ARMA Models

Xavier de Luna; Marc G. Genton

This article proposes a new approach to the robust estimation of a mixed autoregressive and moving average (ARMA) model. It is based on the indirect inference method originally proposed for models with an intractable likelihood function. The proposed estimation algorithm relies on an auxiliary autoregressive representation whose parameters are estimated first on the observed time series and then on data simulated from the ARMA model. To simulate data, the parameters of the ARMA model must be set; by varying them, a distance between the simulation-based and the observation-based auxiliary estimates is minimized. The minimizing argument then yields an estimator of the ARMA parameterization. This simulation-based estimation procedure inherits the properties of the auxiliary model estimator; for instance, robustness is achieved with GM estimators. An essential feature of the introduced estimator, compared with existing robust estimators for ARMA models, is its theoretical tractability, which allows us to show consistency and asymptotic normality. Moreover, it is possible to characterize the influence function and the breakdown point of the estimator. In a small-sample Monte Carlo study, the new estimator is found to perform well compared with existing procedures. Furthermore, with two real examples, we also compare the proposed inferential method with two different approaches based on outlier detection.
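
As a rough illustration of the estimation loop described in this abstract, the sketch below fits an ARMA(1,1) model by matching auxiliary AR(p) estimates computed on observed and simulated data. It uses a plain least-squares auxiliary fit and a grid search rather than the GM estimators and numerical optimisation of the paper; all function names (simulate_arma11, fit_ar, indirect_estimate) are hypothetical.

```python
# Minimal sketch of indirect inference for an ARMA(1,1) model.
import numpy as np

rng = np.random.default_rng(0)

def simulate_arma11(phi, theta, n, rng, burn=200):
    """Simulate y_t = phi*y_{t-1} + e_t + theta*e_{t-1}."""
    e = rng.standard_normal(n + burn)
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        y[t] = phi * y[t - 1] + e[t] + theta * e[t - 1]
    return y[burn:]

def fit_ar(y, p=5):
    """Auxiliary AR(p) estimate via least squares (the auxiliary model)."""
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta

def indirect_estimate(y_obs, p=5, grid=np.linspace(-0.8, 0.8, 17)):
    """Grid-search indirect inference: match auxiliary estimates on
    observed and simulated data."""
    beta_obs = fit_ar(y_obs, p)
    best, best_dist = None, np.inf
    for phi in grid:
        for theta in grid:
            y_sim = simulate_arma11(phi, theta, len(y_obs) * 5, rng)
            dist = np.sum((fit_ar(y_sim, p) - beta_obs) ** 2)
            if dist < best_dist:
                best, best_dist = (phi, theta), dist
    return best

y = simulate_arma11(0.6, 0.3, 500, rng)
print("indirect-inference estimate (phi, theta):", indirect_estimate(y))
```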


Scandinavian Journal of Statistics | 2003

Choosing a Model Selection Strategy

Xavier de Luna; Kostas Skouras

An important problem in statistical practice is the selection of a suitable statistical model. Several model selection strategies are available in the literature, with different asymptotic and small-sample properties depending on the characteristics of the data-generating mechanism. These characteristics are difficult to check in practice, and there is a need for a data-driven adaptive procedure to identify an appropriate model selection strategy for the data at hand. We call such an identification a model metaselection, and we base it on the analysis of recursive prediction residuals obtained from each strategy with increasing sample sizes. Graphical tools are proposed for studying these recursive residuals, and their use is illustrated on real and simulated data sets. When necessary, an automatic metaselection can be performed by simply accumulating predictive losses. Asymptotic and small-sample results are presented.
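
The automatic metaselection step, accumulating predictive losses, can be sketched as follows. The example assumes two candidate strategies (AIC and BIC choosing an autoregressive order) and squared one-step prediction loss; the strategy with the smaller accumulated recursive loss is retained. Function names are illustrative, not taken from the paper.

```python
# Minimal sketch of model metaselection by accumulated predictive loss.
import numpy as np

def fit_ar(y, p):
    """Least-squares AR(p) fit; returns coefficients and residual variance."""
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    resid = y[p:] - X @ beta
    return beta, np.mean(resid ** 2)

def select_order(y, criterion, max_p=6):
    """Model selection strategy: pick an AR order by AIC or BIC."""
    n = len(y)
    scores = []
    for p in range(1, max_p + 1):
        _, s2 = fit_ar(y, p)
        penalty = 2 * p if criterion == "aic" else p * np.log(n)
        scores.append(n * np.log(s2) + penalty)
    return 1 + int(np.argmin(scores))

def one_step_forecast(y, p):
    beta, _ = fit_ar(y, p)
    return beta @ y[-1:-p - 1:-1]

rng = np.random.default_rng(1)
y = rng.standard_normal(300).cumsum() * 0.1 + rng.standard_normal(300)

loss = {"aic": 0.0, "bic": 0.0}
for t in range(50, len(y) - 1):                      # increasing sample sizes
    for crit in loss:
        p = select_order(y[:t + 1], crit)            # strategy picks a model
        pred = one_step_forecast(y[:t + 1], p)       # recursive one-step forecast
        loss[crit] += (y[t + 1] - pred) ** 2         # accumulate predictive loss

print("accumulated losses:", loss,
      "-> metaselected strategy:", min(loss, key=loss.get))
```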


Journal of Nonparametric Statistics | 2006

Non-parametric adjustment for covariates when estimating a treatment effect

Eva Cantoni; Xavier de Luna

We consider a non-parametric model for estimating the effect of a binary treatment on an outcome variable while adjusting for an observed covariate. A naive procedure consists of performing two separate non-parametric regressions of the response on the covariate: one with the treated individuals and the other with the untreated. The treatment effect is then obtained by taking the difference between the two fitted regression functions. This article proposes a backfitting algorithm that uses all the data for the two non-parametric regressions above. We give finite-sample theoretical results showing that the resulting estimator of the treatment effect can have lower variance, an improvement that is not necessarily achieved at the cost of a larger bias. In all of the simulations performed, the mean squared error is substantially lower for the proposed backfitting estimator. When more than one covariate is observed, the backfitting estimator can still be applied by using the propensity score (the probability of being treated given the covariates). We illustrate the use of the backfitting estimator in a setting with several covariates, with data on a training program for individuals who have faced social and economic problems.
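
A toy comparison of the naive estimator and a backfitting estimator is sketched below. For brevity it assumes a constant treatment effect in a partially linear model, Y = m(X) + tau*T + noise, and a Nadaraya-Watson smoother; the estimator studied in the article is more general, so this only conveys the idea of re-using all the data in the smoothing step.

```python
# Minimal sketch: naive two-regression estimator vs. a simple backfitting scheme.
import numpy as np

def kernel_smooth(x_eval, x, y, h=0.3):
    """Nadaraya-Watson regression of y on x, evaluated at x_eval."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(2)
n = 400
x = rng.uniform(0, 1, n)
t = rng.binomial(1, 0.4 + 0.2 * x)          # treatment depends on the covariate
y = np.sin(2 * np.pi * x) + 1.0 * t + rng.standard_normal(n) * 0.5  # true effect 1

# Naive estimator: two separate regressions, difference of the fitted curves.
grid = np.linspace(0.05, 0.95, 50)
naive = np.mean(kernel_smooth(grid, x[t == 1], y[t == 1])
                - kernel_smooth(grid, x[t == 0], y[t == 0]))

# Backfitting: alternate between smoothing with *all* the data and updating tau.
tau = 0.0
for _ in range(20):
    m_hat = kernel_smooth(x, x, y - tau * t)         # uses treated and untreated
    resid = y - m_hat
    tau = resid[t == 1].mean() - resid[t == 0].mean()

print(f"naive estimate: {naive:.2f}, backfitting estimate: {tau:.2f}")
```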


Statistics & Probability Letters | 2000

Robust simulation-based estimation

Marc G. Genton; Xavier de Luna

The simulation-based inferential method called indirect inference was originally proposed for statistical models whose likelihood is difficult or even impossible to compute and/or maximize. In this paper, indirect estimation is proposed as a device for robustifying estimation in models where doing so is difficult or not possible with classical techniques such as M-estimators. We derive the influence function of the indirect estimator and present results on its gross-error sensitivity and asymptotic variance. Two examples from time series are used for illustration.
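
One way the auxiliary estimator can be robustified, in the spirit described above, is to replace a least-squares auxiliary fit with an M-type fit. The sketch below computes a Huber M-estimate of an AR(1) coefficient by iteratively reweighted least squares; it is a simplified stand-in for the GM estimators discussed in this line of work, and huber_ar1 is a hypothetical helper.

```python
# Minimal sketch: Huber M-estimation of an AR(1) coefficient via IRLS,
# compared with ordinary least squares in the presence of an additive outlier.
import numpy as np

def huber_ar1(y, c=1.345, n_iter=50):
    """AR(1) coefficient by iteratively reweighted least squares with Huber weights."""
    x, z = y[:-1], y[1:]
    phi = np.dot(x, z) / np.dot(x, x)                   # OLS starting value
    for _ in range(n_iter):
        r = z - phi * x
        s = np.median(np.abs(r)) / 0.6745 + 1e-12       # robust scale (MAD)
        u = r / s
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        phi = np.dot(w * x, z) / np.dot(w * x, x)
    return phi

rng = np.random.default_rng(3)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()
y[100] += 15.0                                          # inject an additive outlier

print("OLS AR(1):  ", np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1]))
print("Huber AR(1):", huber_ar1(y))
```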


Labour | 2014

Does Formal Education for Older Workers Increase Earnings? — Evidence Based on Rich Data and Long‐Term Follow‐Up

Anders Stenberg; Xavier de Luna; Olle Westerlund

Governments in Europe, Canada and the USA have expressed an ambition to stimulate education of older workers. In this paper, we analyse whether formal education for participants aged 42–55 at the time of enrolment in 1994–95 has effects on annual earnings. The analysis uses longitudinal population register data stretching from 1982 to 2007. The method is difference-in-differences propensity score matching based on a rich set of covariates, including indicators of health and labor market marginalization. Our findings underline the importance of long follow-up periods and imply positive effects for women, especially women with children, and no significant average earnings effects for men. These results differ from earlier studies but are robust to several alternative assumptions regarding unobservable characteristics. The data further indicate that the gender gap in our estimates may stem from differences in underlying reasons for enrolment.
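
The estimation strategy (propensity scores, matching, then a difference-in-differences contrast of earnings changes) can be sketched roughly as follows on simulated data. Variable names such as earnings_pre and earnings_post are illustrative only; the actual study uses register data and a much richer covariate set.

```python
# Minimal sketch of difference-in-differences propensity score matching.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 2000
X = rng.standard_normal((n, 3))                           # covariates
p = 1 / (1 + np.exp(-(X @ np.array([0.5, -0.3, 0.2]))))
T = rng.binomial(1, p)                                    # enrolment indicator
earnings_pre = 200 + 10 * X[:, 0] + rng.standard_normal(n) * 20
earnings_post = earnings_pre + 5 + 8 * T + rng.standard_normal(n) * 20  # true effect 8

# 1. Propensity scores from a logistic regression on the covariates.
ps = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]

# 2. Nearest-neighbour matching of each treated unit to a control on the score.
treated = np.where(T == 1)[0]
controls = np.where(T == 0)[0]
matched = controls[np.argmin(np.abs(ps[treated][:, None] - ps[controls][None, :]),
                             axis=1)]

# 3. Difference-in-differences on the matched sample.
did = np.mean((earnings_post[treated] - earnings_pre[treated])
              - (earnings_post[matched] - earnings_pre[matched]))
print(f"DiD matching estimate: {did:.1f}")
```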


Statistics and Computing | 2002

Simulation-based inference for simultaneous processes on regular lattices

Xavier de Luna; Marc G. Genton

The article proposes a simulation-based inferential method for simultaneous processes defined on a regular lattice. The focus is on spatio-temporal processes with a simultaneous component, that is, processes in which contemporaneous spatial neighbors are potential explanatory variables in the model. The new method has the advantage of being simpler to implement than maximum likelihood and allows us to propose a robust estimator. We give asymptotic properties and present a Monte Carlo study and an illustrative example.
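
The auxiliary regression underlying such an approach can be illustrated as follows: on a regular lattice, each interior cell is regressed on its contemporaneous spatial neighbors. The sketch constructs only the neighbor regressors and a least-squares auxiliary fit on white-noise data; it is not the full simulation-based estimator of the article.

```python
# Minimal sketch: contemporaneous-neighbor regressors on a regular 2D lattice.
import numpy as np

rng = np.random.default_rng(5)
z = rng.standard_normal((40, 40))          # one time slice on a 40x40 lattice

rows, cols = np.meshgrid(np.arange(1, 39), np.arange(1, 39), indexing="ij")
centre = z[rows, cols].ravel()
neighbors = np.column_stack([
    z[rows - 1, cols].ravel(),             # north neighbor
    z[rows + 1, cols].ravel(),             # south neighbor
    z[rows, cols - 1].ravel(),             # west neighbor
    z[rows, cols + 1].ravel(),             # east neighbor
])

# Auxiliary least-squares fit of each cell on its contemporaneous neighbors.
coef, *_ = np.linalg.lstsq(neighbors, centre, rcond=None)
print("auxiliary neighbor coefficients:", np.round(coef, 3))
```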


The Statistician | 2000

Prediction Intervals Based on Autoregression Forecasts

Xavier de Luna

The variability of parameter estimates is commonly neglected when constructing prediction intervals based on a parametric model for a time series. This practice is due to the complexity of conditioning the inference on information such as observed values of the underlying stochastic process. In this paper, conditional prediction intervals for autoregression forecasts are proposed whose simple implementation will hopefully enable wide use. A simulation study illustrates the improvement over classical intervals in terms of empirical coverage.
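
The issue raised in this abstract can be illustrated with a naive AR(1) one-step interval, which treats the estimated coefficient as known, alongside a simple simulation-based interval that propagates parameter uncertainty. This is only a sketch of the problem, not a reproduction of the conditional intervals proposed in the paper.

```python
# Minimal sketch: naive vs. simulation-based one-step prediction intervals for AR(1).
import numpy as np

rng = np.random.default_rng(6)
n, phi_true = 60, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.standard_normal()

x, z = y[:-1], y[1:]
phi_hat = np.dot(x, z) / np.dot(x, x)
resid = z - phi_hat * x
sigma = resid.std(ddof=1)

# Naive 95% interval: treats phi_hat as if it were known exactly.
point = phi_hat * y[-1]
naive = (point - 1.96 * sigma, point + 1.96 * sigma)

# Simulation-based interval: resample residuals, refit, and forecast repeatedly.
forecasts = []
for _ in range(2000):
    e = rng.choice(resid, size=n - 1, replace=True)
    yb = np.zeros(n)
    yb[0] = y[0]
    for t in range(1, n):
        yb[t] = phi_hat * yb[t - 1] + e[t - 1]
    xb, zb = yb[:-1], yb[1:]
    phib = np.dot(xb, zb) / np.dot(xb, xb)
    forecasts.append(phib * y[-1] + rng.choice(resid))
sim = tuple(np.percentile(forecasts, [2.5, 97.5]))

print("naive interval:", np.round(naive, 2), " simulation-based:", np.round(sim, 2))
```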


Journal of Causal Inference | 2014

Testing for the Unconfoundedness Assumption Using an Instrumental Assumption

Xavier de Luna; Per Johansson

The identification of average causal effects of a treatment in observational studies is typically based either on the unconfoundedness assumption (exogeneity of the treatment) or on the availability of an instrument. When available, instruments may also be used to test for the unconfoundedness assumption. In this paper, we present a set of assumptions on an instrumental variable which allow us to test for the unconfoundedness assumption, although they do not necessarily yield nonparametric identification of an average causal effect. We propose a test for the unconfoundedness assumption based on the instrumental assumptions introduced and give conditions under which the test has power. We perform a simulation study and apply the results to a case study where the interest lies in evaluating the effect of job practice on employment.

Collaboration


Dive into Xavier de Luna's collaborations.

Top Co-Authors

Marc G. Genton

King Abdullah University of Science and Technology
