

Publication


Featured research published by A. Ian McLeod.


International Journal of Forecasting | 1985

Forecasting monthly riverflow time series

Donald J. Noakes; A. Ian McLeod; Keith W. Hipel

Mean monthly flows from thirty rivers in North and South America are used to test the short-term forecasting ability of seasonal ARIMA, deseasonalized ARMA, and periodic autoregressive models. The series were split into two sections and the models were calibrated to the first portion of the data. The models were then used to generate one-step-ahead forecasts for the second portion of the data. Forecast performance is compared using various measures of accuracy. The results suggest that a periodic autoregressive model, identified by using the partial autocorrelation function, provided the most accurate forecasts.
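The evaluation design (calibrate on an initial segment, then score one-step-ahead forecasts on a holdout) is easy to mimic in R. A minimal sketch with a synthetic AR(1) series standing in for the monthly flows; the model and the 180/60 split are illustrative assumptions, not the paper's setup:

    set.seed(2)
    y <- arima.sim(list(ar = 0.6), n = 240)       # placeholder "monthly flow" series
    fit <- arima(y[1:180], order = c(1, 0, 0))    # calibrate on the first portion only
    phi <- coef(fit)["ar1"]; mu <- coef(fit)["intercept"]
    pred <- mu + phi * (y[180:239] - mu)          # one-step-ahead forecasts for t = 181..240
    sqrt(mean((y[181:240] - pred)^2))             # forecast RMSE on the second portion

The paper's best performer, a periodic autoregression, would instead fit a separate AR model to each calendar month before forecasting.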


Journal of the American Statistical Association | 1979

Distribution of the Residual Cross-Correlation in Univariate ARMA Time Series Models

A. Ian McLeod

Cross-correlations between univariate autoregressive moving average (ARMA) time series residuals are useful in the examination of relationships between time series (Pierce 1977a) and in the identification of dynamic regression models (Haugh and Box 1977). In this article, the asymptotic distribution of these residual cross-correlations is derived, and its application to the problem of testing for lagged relationships in the presence of instantaneous causality is discussed. Some results of a simulation study to investigate the accuracy of the asymptotic variances and covariances of the residual cross-correlations in finite samples are reported.
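In practice the diagnostic looks like this: cross-correlate the residuals of two univariate ARMA fits and compare against +/- 1.96/sqrt(n) reference bounds, the asymptotic standard error under no cross-correlation (the paper derives the full finite-sample covariance structure). A sketch with toy models of my choosing:

    set.seed(11)
    n <- 300
    x <- arima.sim(list(ar = 0.7), n = n)
    y <- arima.sim(list(ma = 0.4), n = n)
    rx <- residuals(arima(x, order = c(1, 0, 0)))   # ARMA residuals for each series
    ry <- residuals(arima(y, order = c(0, 0, 1)))
    ccf(rx, ry, lag.max = 12)   # residual cross-correlations; plot shows 1.96/sqrt(n) bounds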


Stroke | 2014

Secular Trends in Ischemic Stroke Subtypes and Stroke Risk Factors

Chrysi Bogiatzi; Daniel G. Hackam; A. Ian McLeod; J. David Spence

Background and Purpose: Early diagnosis and treatment of a stroke improves patient outcomes, and knowledge of the cause of the initial event is crucial to identification of the appropriate therapy to maximally reduce risk of recurrence. Assumptions based on historical frequency of ischemic subtypes may need revision if stroke subtypes are changing as a result of recent changes in therapy, such as increased use of statins.

Methods: We analyzed secular trends in stroke risk factors and ischemic stroke subtypes among patients with transient ischemic attack or minor or moderate stroke referred to an urgent transient ischemic attack clinic from 2002 to 2012.

Results: There was a significant decline in low-density lipoprotein cholesterol and blood pressure, associated with a significant decline in large artery stroke and small vessel stroke. The proportion of cardioembolic stroke increased from 26% in 2002 to 56% in 2012 (P<0.05 for trend). Trends remained significant after adjusting for population change.

Conclusions: With more intensive medical management in the community, a significant decrease in atherosclerotic risk factors was observed, with a significant decline in stroke/transient ischemic attack caused by large artery atherosclerosis and small vessel disease. As a result, cardioembolic stroke/transient ischemic attack has increased significantly. Our findings suggest that more intensive investigation for cardiac sources of embolism and greater use of anticoagulation may be warranted.
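The "P<0.05 for trend" claim is the kind of thing R's prop.trend.test checks. A toy sketch with hypothetical yearly counts (not the study's data; the counts and denominators below are made up for illustration):

    events <- c(26, 31, 35, 40, 44, 47, 50, 52, 55, 58, 61)  # hypothetical cardioembolic cases per year
    trials <- rep(100, 11)                                    # hypothetical yearly clinic totals
    prop.trend.test(events, trials)                           # chi-squared test for trend in proportions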


Journal of Time Series Analysis | 2012

Improved Multivariate Portmanteau Test

Esam Mahdi; A. Ian McLeod

A new portmanteau diagnostic test for vector autoregressive moving average (VARMA) models, based on the determinant of the standardized multivariate residual autocorrelations, is derived. The new test statistic may be considered an extension of the univariate portmanteau test statistic suggested by Peña and Rodríguez (2002). The asymptotic distribution of the test statistic is derived, as well as a chi-square approximation. However, the Monte Carlo test is recommended unless the series is very long. Extensive simulation experiments demonstrate the usefulness of this test as well as its improved power compared with the widely used earlier multivariate portmanteau diagnostic checks. Two illustrative applications are given.
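A sketch of the statistic's ingredients for a simulated bivariate VAR(1), assuming a log-determinant form of the standardized residual autocorrelation matrix; the scaling constant mirrors the univariate analogue and is illustrative, and in practice a Monte Carlo p-value (like the one sketched under the next paper) would be used:

    set.seed(1)
    n <- 300; k <- 2; m <- 10
    Phi <- matrix(c(0.5, 0.1, 0.0, 0.3), k, k)       # true VAR(1) coefficient matrix
    z <- matrix(0, n, k)
    for (t in 2:n) z[t, ] <- Phi %*% z[t - 1, ] + rnorm(k)
    fit <- ar(z, order.max = 1, aic = FALSE)         # fit a VAR(1)
    res <- na.omit(fit$resid)                        # residuals (initial rows are NA)
    A <- acf(res, lag.max = m, plot = FALSE)$acf     # residual correlation blocks R(0), ..., R(m)
    blk <- function(l) if (l >= 0) A[l + 1, , ] else t(A[-l + 1, , ])
    R <- matrix(0, (m + 1) * k, (m + 1) * k)         # block Toeplitz matrix [R(i - j)]
    for (i in 0:m) for (j in 0:m)
      R[i * k + 1:k, j * k + 1:k] <- blk(i - j)
    -3 * nrow(res) / (2 * m + 1) * log(det(R))       # log-determinant test statistic

Since det(R) lies in (0, 1], the statistic is nonnegative, with larger values signalling residual autocorrelation.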


Computational Statistics & Data Analysis | 2006

Improved Peña-Rodríguez portmanteau test

Jen-Wen Lin; A. Ian McLeod

Several problems with the diagnostic check suggested by Peña and Rodríguez [2002. A powerful portmanteau test of lack of fit for time series. J. Amer. Statist. Assoc. 97, 601-610] are noted and an improved Monte Carlo version of the test is suggested. It is shown that quite often the test statistic recommended by Peña and Rodríguez (2002) may not exist, and that the asymptotic distribution of the test statistic does not agree well with the suggested gamma approximation when the number of lags used by the test is small. It is also shown that the convergence of the test statistic to its asymptotic distribution may be quite slow when the series length is less than 1000, and so a Monte Carlo test is recommended. Simulation experiments suggest the Monte Carlo test is usually more powerful than the Peña-Rodríguez (2002) test and often much more powerful than the Ljung-Box portmanteau test. Two illustrative examples of enhanced diagnostic checking with the Monte Carlo test are given.
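A univariate sketch of the Monte Carlo recipe, assuming an AR(1) model and a simple log-determinant statistic in place of the exact Peña-Rodríguez/Lin-McLeod formulation: compute the statistic on the residuals, then recompute it on series simulated from the fitted model to obtain a simulation-based p-value.

    set.seed(42)
    x <- arima.sim(list(ar = 0.6), n = 200)
    m <- 10                                             # number of lags tested

    det_stat <- function(res, m) {
      r <- acf(res, lag.max = m, plot = FALSE)$acf[-1]  # r_1, ..., r_m
      -log(det(toeplitz(c(1, r))))                      # near 0 when residual autocorrelations are small
    }

    fit <- arima(x, order = c(1, 0, 0), include.mean = FALSE)
    obs <- det_stat(residuals(fit), m)

    B <- 200                                            # Monte Carlo replications
    null_stats <- replicate(B, {
      xs <- arima.sim(list(ar = coef(fit)["ar1"]), n = length(x), sd = sqrt(fit$sigma2))
      det_stat(residuals(arima(xs, order = c(1, 0, 0), include.mean = FALSE)), m)
    })
    (1 + sum(null_stats >= obs)) / (B + 1)              # Monte Carlo p-value

Because the null distribution is resampled from the fitted model, no appeal to the slow-converging asymptotic approximation is needed.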


Handbook of Statistics | 2012

Time Series Analysis with R

A. Ian McLeod; Hao Yu; Esam Mahdi

A brief overview of the R statistical computing and programming environment is given that explains why many time series researchers, in both applied and theoretical research, may find R useful. The core features of R for basic time series analysis are outlined. Some intermediate-level and advanced topics in time series analysis that are supported in R are discussed, including state-space models, structural change, generalized linear models, threshold models, neural nets, co-integration, GARCH, wavelets, and stochastic differential equations. Numerous examples of beautiful graphs constructed using R for time series are shown. R code for reproducing all the graphs and tables is given on the authors' homepage.
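For flavor, a few of the base-R facilities such a survey covers, shown on the classic AirPassengers series with the standard Box-Jenkins airline model (my choice of example, not necessarily the chapter's):

    y <- log(AirPassengers)                              # classic monthly series
    fit <- arima(y, order = c(0, 1, 1),
                 seasonal = list(order = c(0, 1, 1), period = 12))  # airline model
    tsdiag(fit)                                          # residual diagnostics
    predict(fit, n.ahead = 12)$pred                      # twelve-month-ahead forecasts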


Canadian Water Resources Journal | 2012

The Effects of Climate Change on Extreme Precipitation Events in the Upper Thames River Basin: A Comparison of Downscaling Approaches

Leanna M. M. King; Sarah Irwin; Rubaiya Sarwar; A. Ian McLeod; Slobodan P. Simonovic

Future changes in climatic conditions from increasing greenhouse gas concentrations will have a major impact on the hydrologic cycle. It is important to understand and predict future changes in temperature and precipitation in order to manage water resources effectively. Atmosphere-Ocean coupled Global Climate Models (AOGCMs) are widely used to predict the effects of greenhouse-gas forcing on global climate conditions. However, their spatial and temporal resolutions are quite coarse, so their outputs must be modified to represent local climate conditions. This process is called downscaling, and a variety of tools are available to achieve it. This study compares three downscaling approaches: the Statistical DownScaling Model (SDSM), the Long Ashton Research Station Weather Generator (LARS-WG), and the K-NN Weather Generator with Principal Component Analysis (WG-PCA). Each weather generator is used to simulate the historical climate of the Upper Thames River Basin in Ontario, Canada. Future climate conditions are simulated by LARS-WG and WG-PCA from six different AOGCMs, each with two to three emissions scenarios, for a total of 15 different models. In simulating historical climate variability, the models generally perform best at reproducing mean daily precipitation and total monthly precipitation. LARS-WG simulates precipitation events well but cannot reproduce the means and variances of the daily temperature series. SDSM adequately simulates both temperatures and precipitation events. WG-PCA reproduces daily temperatures very well but overestimates the occurrence of some extreme precipitation events. Results vary across the downscaled AOGCMs; however, the downscaling tools generally predict a rise in winter, spring and fall precipitation totals, as well as an overall increase in mean annual precipitation in future decades.
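To give a sense of the k-NN resampling idea behind weather generators like WG-PCA, here is a toy single-step sketch. It is entirely illustrative: random placeholder data, two principal components, and inverse-rank kernel weights; the published WG-PCA algorithm differs in its details.

    set.seed(7)
    wx <- matrix(rnorm(365 * 3), 365, 3,
                 dimnames = list(NULL, c("tmax", "tmin", "precip")))  # placeholder daily weather
    scores <- prcomp(wx, scale. = TRUE)$x[, 1:2]        # PCA-reduced daily weather states
    today <- scores[100, ]                              # current state
    d <- sqrt(rowSums(sweep(scores[-365, , drop = FALSE], 2, today)^2))
    k <- 5
    nbrs <- order(d)[1:k]                               # k nearest historical analogues
    w <- (1 / seq_len(k)) / sum(1 / seq_len(k))         # inverse-rank kernel weights
    pick <- sample(nbrs, 1, prob = w)                   # sample one analogue day
    wx[pick + 1, ]                                      # its successor becomes "tomorrow"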


International Journal of Forecasting | 1988

Forecasting annual geophysical time series

Donald J. Noakes; Keith W. Hipel; A. Ian McLeod; Carlos Jimenéz; Sidney Yakowitz

An important test of the adequacy of a stochastic model is its ability to forecast accurately. In hydrology, as in many other disciplines, the performance of a model in producing one-step-ahead forecasts is of particular interest. The ability of several stationary nonseasonal time series models to produce accurate forecasts is examined in this paper. Statistical tests are employed to determine whether the forecasts generated by a particular model are better than the forecasts produced by an alternative procedure. The results of the study indicate that, for the data sets examined, there is no significant difference in forecast performance between the nonseasonal autoregressive moving average model and a nonparametric regression model.
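Such comparisons boil down to paired tests on one-step forecast errors. A minimal sketch of the idea with placeholder error vectors; the paper's actual tests and models are not reproduced here:

    set.seed(3)
    e_arma <- rnorm(50, sd = 1.0)                       # placeholder ARMA one-step errors
    e_np   <- rnorm(50, sd = 1.1)                       # placeholder nonparametric errors
    wilcox.test(abs(e_arma), abs(e_np), paired = TRUE)  # paired test on absolute errors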


Environmental Management | 1978

Assessment of environmental impacts part two. Data collection

Dennis P. Lettenmaier; Keith W. Hipel; A. Ian McLeod

Intervention analysis is a relatively new branch of time series analysis. The power of this technique, which gives the probability that changes in mean level can be distinguished from natural data variability, is quite sensitive to the way the data are collected. The principal independent variables influenced by the data collection design are overall sample size, sampling frequency, and the relative length of record before the occurrence of the event (intervention) that is postulated to have caused a change in mean process level.

For three of the four models investigated, data should be collected so that the post-intervention record is substantially longer than the pre-intervention record. This is in conflict with the intuitive approach, which would be to collect equal amounts of data before and after the intervention. The threshold (minimum) level of change that can be detected is quite high unless sample sizes of at least 50 and preferably 100 are available; this minimum level is dependent on the complexity of the model required to describe the response of the process mean to the intervention. More complex models tend to require larger sample sizes for the same threshold detectable change level.

Uniformity of sampling frequency is a key consideration. Environmental data collection programs have not historically been oriented toward data analysis using time series techniques, thus eliminating a potentially powerful tool from use in many environmental assessment applications.
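The design questions studied here (how detection power depends on sample size and on the pre/post split) can be explored by simulation. A sketch assuming AR(1) noise, a pure step intervention, and a z-test on the step coefficient from arima(); the numbers are illustrative, not the paper's:

    set.seed(1)
    power_step <- function(n, frac_pre, delta, phi = 0.5, B = 200) {
      n1 <- round(n * frac_pre)                      # pre-intervention record length
      step <- c(rep(0, n1), rep(1, n - n1))          # step intervention indicator
      mean(replicate(B, {
        x <- arima.sim(list(ar = phi), n = n) + delta * step
        fit <- arima(x, order = c(1, 0, 0), xreg = step)
        j <- length(coef(fit))                       # the xreg coefficient is last
        abs(coef(fit)[j] / sqrt(fit$var.coef[j, j])) > 1.96
      }))
    }
    power_step(100, 0.5, delta = 1)                  # equal pre/post split
    power_step(100, 0.3, delta = 1)                  # longer post-intervention record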


Environmental Management | 1978

Assessment of environmental impacts part one. Intervention analysis

Keith W. Hipel; Dennis P. Lettenmaier; A. Ian McLeod

Intervention analysis is a rigorous statistical method for analyzing the effects of man-induced or natural changes on the environment. For instance, it may be necessary to determine whether a newly installed pollution control device significantly reduces the former mean level of a pollutant. By using intervention analysis, the actual change in pollutant levels can be statistically determined. Previously, no comprehensive method was available to assess changes in the environment. Intervention analysis is an advanced type of Box-Jenkins model. A general description of Box-Jenkins models and their extensions is given. Also, the importance of adhering to sound modeling principles when fitting a stochastic model to a time series is emphasized. Following a discussion of intervention models, three applications of intervention analysis to environmental problems are given. Two applications deal with the environmental effects of man-made projects, while the third example demonstrates how a forest fire can affect the flow regime of a river.
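As a concrete miniature of an intervention fit in R, consider the classic Nile flow series (shipped with base R), where the level drop around 1899 is the textbook step intervention; this is my example, not one of the paper's three case studies:

    step <- as.numeric(time(Nile) >= 1899)           # 0 before, 1 after the intervention
    fit <- arima(Nile, order = c(1, 0, 0), xreg = step)
    fit                                              # the xreg coefficient estimates the level shift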

Collaboration


Dive into A. Ian McLeod's collaborations.

Top Co-Authors

Evelyn Vingilis, University of Western Ontario
Donald J. Noakes, Thompson Rivers University
Chrysi Bogiatzi, University of Western Ontario
Daniel G. Hackam, University of Western Ontario
Hao Yu, University of Western Ontario
Ian B. MacNeill, University of Western Ontario
J. David Spence, Robarts Research Institute
Slobodan P. Simonovic, University of Western Ontario
Aizhan Meirambayeva, University of Western Ontario