Álvaro E. Faria
Open University
Publications
Featured research published by Álvaro E. Faria.
Conference on Software Maintenance and Reengineering | 2005
Andrea Capiluppi; Álvaro E. Faria; Juan F. Ramil
This paper explores the relationship between cumulative change and complexity in an evolving open source system. The study involves measurements at the function and file level. To measure cumulative change, the approach used a metric termed release-touches, which counts the number of releases in which a given file has been modified. Based on the value of this metric, we ranked the files and used the ranking to identify two groups: the more stable and the less stable parts of the source code. Complexity was measured using two derivatives of the McCabe index. Histograms and distributions were visually and statistically analyzed. The results empirically suggest that, at the file level, there are correlations between high cumulative change, large size and high complexity. This paper provides an approach for identifying which functions should be refactored first if one wishes to reduce the complexity of the system.
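The ranking step is simple to reproduce. Below is a minimal sketch of the release-touches metric and the stability ranking, assuming a hypothetical `releases` mapping from release tags to the sets of files modified in each release (e.g. extracted from a version-control log); the metric name comes from the abstract, everything else is illustrative.

```python
from collections import Counter

def release_touches(releases):
    """Count, for each file, the number of releases that modified it."""
    touches = Counter()
    for modified_files in releases.values():
        touches.update(modified_files)
    return touches

def rank_by_stability(releases):
    """Rank files from least stable (most release-touches) downwards."""
    touches = release_touches(releases)
    return sorted(touches.items(), key=lambda item: item[1], reverse=True)

# Hypothetical data: release tag -> files modified in that release.
releases = {
    "1.0": {"core.c", "util.c"},
    "1.1": {"core.c"},
    "1.2": {"core.c", "net.c"},
}
print(rank_by_stability(releases))  # core.c leads with 3 release-touches
```

Files at the top of the ranking are the candidates the paper's approach would flag for refactoring first.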
Journal of the Royal Statistical Society, Series B (Statistical Methodology) | 2000
Jim Q. Smith; Álvaro E. Faria
A supra-Bayesian (SB) wants to combine the information from a group of k experts to produce her distribution of a probability θ. Each expert gives his counts of what he believes are the numbers of successes and failures in a sequence of independent trials, each with probability θ of success. These counts, used as a surrogate for each expert's own individual probability assessment (together with his associated level of confidence in his estimate), allow the SB to build various plausible conjugate models. Such models reflect her beliefs about the reliability of different experts and take account of different possible patterns of overlap of information between them. Corresponding combination rules are then obtained, compared with other more established rules, and their properties examined.
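For intuition, the sketch below shows one naive conjugate pooling rule consistent with this setting, not the specific rules derived in the paper: each expert i reports counts (s_i, f_i), the SB down-weights them by an assumed reliability factor w_i in [0, 1], and updates a Beta prior on θ. The weights and prior hyperparameters a0, b0 are illustrative assumptions.

```python
def pooled_beta_posterior(counts, weights, a0=1.0, b0=1.0):
    """Return (a, b) of the Beta posterior on theta after weighted pooling.

    counts  : list of (successes, failures) reported by each expert
    weights : assumed reliability weight per expert (1 = fully trusted)
    a0, b0  : hyperparameters of the supra-Bayesian's Beta prior
    """
    a, b = a0, b0
    for (s, f), w in zip(counts, weights):
        a += w * s  # reliability-weighted success counts
        b += w * f  # reliability-weighted failure counts
    return a, b

# Two experts; the second is judged half as reliable (assumed weights).
a, b = pooled_beta_posterior([(8, 2), (3, 7)], weights=[1.0, 0.5])
print(f"posterior mean of theta: {a / (a + b):.3f}")
```

Note that this simple rule treats the experts' counts as independent; the models in the paper also account for overlap of information between experts, which such a rule ignores.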
Neurocomputing | 2016
J.M. Corrêa; Anselmo Chaves Neto; L.A. Teixeira Júnior; Edgar Manuel Carreño Franco; Álvaro E. Faria
It is well known that causal forecasting methods which include appropriately chosen Exogenous Variables (EVs) very often achieve better forecasting performance than univariate methods. In practice, however, EVs are usually difficult to obtain and in many cases are not available at all. In this paper, a new causal forecasting approach, called the Wavelet Auto-Regressive Integrated Moving Average with eXogenous variables and Generalized Auto-Regressive Conditional Heteroscedasticity (WARIMAX-GARCH) method, is proposed not only to improve predictive performance and accuracy but also to address, at least in part, the problem of unavailable EVs. Essentially, the WARIMAX-GARCH method obtains Wavelet EVs (WEVs) from ARIMAX-GARCH models applied to Wavelet Components (WCs) that are initially determined from the underlying time series. The WEVs are then treated by the WARIMAX-GARCH method as if they were conventional EVs. Like GARCH and ARIMA-GARCH models, the WARIMAX-GARCH method is suitable for time series exhibiting non-linear characteristics such as a conditional variance that depends on past values of the observed data. Unlike those models, however, it can explicitly model frequency-domain patterns in the series to help improve predictive performance. An application to a daily time series of dam displacement in Brazil shows that the WARIMAX-GARCH method markedly outperforms the ARIMA-GARCH method, as well as the (multi-layer perceptron) Artificial Neural Network (ANN) and its wavelet version, the Wavelet Artificial Neural Network (WANN) as in [1], on statistical measures for both in-sample and out-of-sample forecasting.
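As a structural illustration only, not the paper's estimation scheme, the sketch below builds wavelet components with PyWavelets, uses the detail components directly as exogenous regressors in an ARIMAX fit (the paper instead derives the WEVs from per-component ARIMAX-GARCH models), and then fits a GARCH(1,1) to the residuals. The wavelet, decomposition level, model orders and toy series are all assumed for illustration.

```python
import numpy as np
import pywt
from statsmodels.tsa.statespace.sarimax import SARIMAX
from arch import arch_model

def detail_components(y, wavelet="db4", level=2):
    """Reconstruct each wavelet detail level of y at the original length."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    details = []
    for i in range(1, len(coeffs)):  # index 0 holds the approximation
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        details.append(pywt.waverec(kept, wavelet)[: len(y)])
    return np.column_stack(details)

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(512))  # toy stand-in for the series

wevs = detail_components(y)              # wavelet exogenous variables
arimax = SARIMAX(y, exog=wevs, order=(1, 1, 1)).fit(disp=False)

# GARCH(1,1) on the ARIMAX residuals models the conditional variance.
garch = arch_model(arimax.resid, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch.summary())
```

The design point the abstract makes is visible in the pipeline's shape: the frequency-domain structure enters through the wavelet regressors, while the GARCH layer handles the conditional heteroscedasticity.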
Journal of Time Series Analysis | 2011
Swarup De; Álvaro E. Faria
Dynamic spatial Bayesian (DSB) models are proposed for the analytical modelling of radioactivity deposition after a nuclear accident. The proposed models extend the multivariate time-series dynamic linear models of West and Harrison (1997) to Markov random field processes. They combine the outputs from a long-range atmospheric dispersal model with measured data (and prior information) to provide improved deposition prediction in space and time. Two versions of a Gaussian DSB model were applied to the radioactivity deposition in Bavaria over a 15-day period during the Chernobyl nuclear accident. One version had fixed functional forms for its spatial variances and covariances, while the other allowed those to adapt and 'learn' from data in the conjugate Bayesian paradigm. There were two main sources of information about radioactivity deposition in our application: radioactivity measurements at a sparse set of 13 monitoring stations, and the numerical deposition evaluation of the atmospheric dispersal K-model at the points of a 64×64 regular grid. We analysed the one-step-ahead temporal predictions of those DSB models and show that the dispersal K-model tended in general to underestimate deposition levels at all times, while the DSB models corrected for this, although with different degrees of adjustment.
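A minimal scalar sketch of the West and Harrison forecast/update cycle at a single monitoring station is given below; the paper's DSB models extend this to spatial Markov random field processes over the whole grid. Treating the K-model output as a mean whose bias is learned from the measurements is an illustrative simplification, and all variances are assumed values.

```python
def bias_corrected_forecasts(k_model, measurements, W=0.5, V=1.0, C0=4.0):
    """One-step-ahead forecasts: K-model output plus a learned bias.

    k_model      : dispersal-model deposition estimates per time step
    measurements : station measurements per time step
    W, V, C0     : evolution, observation and initial prior variances
    """
    forecasts = []
    m, C = 0.0, C0                       # prior: no bias, vague variance
    for k, y in zip(k_model, measurements):
        a, R = m, C + W                  # evolve the bias (random walk)
        forecasts.append(k + a)          # one-step-ahead forecast
        e, Q = y - (k + a), R + V        # forecast error and its variance
        A = R / Q                        # adaptive coefficient (Kalman gain)
        m, C = a + A * e, (1 - A) * R    # posterior mean and variance
    return forecasts

# Usage: the filter pulls forecasts up where the K-model underestimates.
k_model = [2.0, 2.1, 2.0, 1.9]
obs     = [3.0, 3.2, 3.1, 3.3]
print(bias_corrected_forecasts(k_model, obs))
```

The adaptive coefficient plays the role of the 'learning' the abstract describes: the more the prior variance dominates the observation variance, the more strongly each measurement corrects the dispersal-model output.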
Annals of Statistics | 1997
Álvaro E. Faria; Jim Q. Smith
Journal of Forecasting | 2008
Álvaro E. Faria; Emmanuel Mubwandarikwa
Journal of Forecasting | 1995
Álvaro E. Faria; Reinaldo Castro Souza
Archive | 1996
Álvaro E. Faria; Jim Q. Smith
Archive | 2008
Álvaro E. Faria; Emmanuel Mubwandarikwa
Radiation Protection Dosimetry | 1997
Jim Q. Smith; Álvaro E. Faria; S. French; D. Ranyard; D. Vlesshhouwer; J. Bohunova; T. Duranova; M. Stubna; L. Dutton; C. Rojas; A. Sohier