Ray D. Nelson
Brigham Young University
Publications
Featured research published by Ray D. Nelson.
The American Statistician | 1998
Jerry L. Hintze; Ray D. Nelson
Many modifications build on Tukey's original box plot. A proposed further adaptation, the violin plot, pools the best statistical features of alternative graphical representations of batches of data. It adds the information available from local density estimates to the basic summary statistics inherent in box plots. This marriage of summary statistics and density shape into a single plot provides a useful tool for data analysis and exploration.
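For readers who want to see the idea in practice, the minimal sketch below draws box plots and violin plots side by side with matplotlib on simulated batches; it illustrates the plot type only and is not the authors' original implementation.

```python
# A minimal sketch: box plots versus violin plots on simulated batches.
# Data and styling are illustrative assumptions, not from the paper.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
batches = [
    rng.normal(0, 1, 200),                     # symmetric batch
    rng.exponential(1, 200),                   # skewed batch
    np.concatenate([rng.normal(-2, 0.5, 100),  # bimodal batch: invisible to a box plot,
                    rng.normal(2, 0.5, 100)]), # visible in the violin's density shape
]

fig, (ax_box, ax_violin) = plt.subplots(1, 2, figsize=(8, 3), sharey=True)
ax_box.boxplot(batches)                        # summary statistics only
ax_box.set_title("Box plots")
ax_violin.violinplot(batches, showmedians=True)  # adds local density estimates
ax_violin.set_title("Violin plots")
plt.tight_layout()
plt.show()
```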
IEEE Transactions on Neural Networks | 1997
James V. Hansen; Ray D. Nelson
Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods, which consequently overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.
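As a loose illustration of the pattern-finding role described above (not the analysts' actual models), the sketch below trains a small neural network on lagged values and month indicators of a simulated monthly revenue series and checks accuracy on a two-year holdout.

```python
# A hedged sketch: an MLP on lagged revenue plus month-of-year indicators,
# the kind of seasonal/cyclical pattern-finding complement described above.
# The series and network settings are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
months = np.arange(240)
revenue = 100 + 0.5 * months + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, 240)

# Features: previous 12 observations plus a one-hot month-of-year encoding.
X, y = [], []
for t in range(12, len(revenue)):
    X.append(np.concatenate([revenue[t - 12:t], np.eye(12)[t % 12]]))
    y.append(revenue[t])
X, y = np.array(X), np.array(y)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X[:-24], y[:-24])                     # hold out the last two years
print("holdout MAE:", np.abs(model.predict(X[-24:]) - y[-24:]).mean())
```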
computational intelligence | 1999
James V. Hansen; James B. McDonald; Ray D. Nelson
Neural networks whose architecture is determined by genetic algorithms outperform autoregressive integrated moving average forecasting models in six different time series examples. Refinements to the autoregressive integrated moving average model improve forecasting performance over standard ordinary least squares estimation by 8% to 13%. In contrast, neural networks achieve dramatic improvements of 10% to 40%. Additionally, neural networks give evidence of detecting patterns in data which remain hidden to the autoregression and moving average models. The consequent forecasting potential of neural networks makes them a very promising addition to the variety of techniques and methodologies used to anticipate future movements in time series.
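The toy sketch below conveys the flavor of evolutionary architecture selection: a small population of candidate hidden-layer configurations is scored by validation error, the fittest survive, and mutated copies replace the rest. It simplifies the paper's genetic algorithm considerably (no crossover, simulated data) and should be read as an assumption-laden illustration.

```python
# A simplified evolutionary search over MLP hidden-layer sizes for a
# forecasting task. Series, population size, and mutation rule are toy choices.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
series = np.sin(np.arange(300) / 6.0) + rng.normal(0, 0.1, 300)
X = np.array([series[t - 8:t] for t in range(8, len(series))])
y = series[8:]
X_tr, y_tr, X_val, y_val = X[:-50], y[:-50], X[-50:], y[-50:]

def fitness(hidden):
    """Validation MSE of an MLP with the candidate hidden-layer sizes."""
    net = MLPRegressor(hidden_layer_sizes=hidden, max_iter=1000, random_state=0)
    net.fit(X_tr, y_tr)
    return np.mean((net.predict(X_val) - y_val) ** 2)

# Initial population: one or two hidden layers with 2-31 units each.
population = [tuple(rng.integers(2, 32, size=rng.integers(1, 3))) for _ in range(6)]
for generation in range(3):
    parents = sorted(population, key=fitness)[:3]          # selection
    children = [tuple(max(2, n + int(rng.integers(-4, 5))) for n in p)
                for p in parents]                          # mutation (crossover omitted)
    population = parents + children
print("best architecture found:", min(population, key=fitness))
```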
Journal of the Operational Research Society | 2003
James V. Hansen; Ray D. Nelson
Operations and other business decisions often depend on accurate time-series forecasts. These time series usually consist of trend-cycle, seasonal, and irregular components. Existing methodologies attempt to first identify and then extrapolate these components to produce forecasts. The proposed process partners this decomposition procedure with neural network methodologies to combine the strengths of economics, statistics, and machine learning research. Stacked generalization first uses transformations and decomposition to pre-process a time series. Then a time-delay neural network receives the resulting components as inputs. The outputs of this neural network are then input to a backpropagation algorithm that synthesizes the processed components into a single forecast. Genetic algorithms guide the architecture selection for both the time-delay and backpropagation neural networks. The empirical examples used in this study reveal that the combination of transformation, feature extraction, and neural networks through stacked generalization gives more accurate forecasts than classical decomposition or ARIMA models. Scope and purpose: The research reported in this paper examines two concurrent issues. The first evaluates the performance of neural networks in forecasting time series. The second assesses the use of stacked generalization as a way of refining this process. The methodology is applied to four economic and business time series. Those studying time series and neural networks, particularly in terms of combining tools from the statistical community with neural network technology, will find this paper relevant.
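A rough sketch of such a stacked pipeline appears below, under simplifying assumptions: classical decomposition stands in for the paper's pre-processing, scikit-learn multilayer perceptrons stand in for the time-delay and combining networks, and the genetic architecture search is omitted.

```python
# Sketch of a two-level stacked forecasting pipeline on a simulated series.
# This is an illustration of the idea, not the paper's exact specification.
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
t = np.arange(360)
y = 50 + 0.2 * t + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1.5, 360)

# Pre-processing: split the series into trend, seasonal, and irregular parts.
dec = seasonal_decompose(y, period=12, extrapolate_trend="freq")
components = np.column_stack([dec.trend, dec.seasonal, dec.resid])

lags = 12
X0 = np.array([components[i - lags:i].ravel() for i in range(lags, len(y))])
target = y[lags:]

# Level 0: network mapping lagged components to the next observation.
level0 = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
level0.fit(X0[:-36], target[:-36])

# Level 1: combiner trained on level-0 output plus a simple summary of raw lags.
# (A full implementation would use out-of-fold level-0 predictions here.)
raw_summary = np.array([y[i - lags:i].mean() for i in range(lags, len(y))])
X1 = np.column_stack([level0.predict(X0), raw_summary])
level1 = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
level1.fit(X1[:-36], target[:-36])
print("holdout MAE:", np.abs(level1.predict(X1[-36:]) - target[-36:]).mean())
```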
Neurocomputing | 2002
James V. Hansen; Ray D. Nelson
Data mining is the search for valuable information in large volumes of data. Finding patterns in time series databases is important to a variety of applications, including stock market trading and budget forecasting. This paper reports on an extension of neural network methods for planning and budgeting in the State of Utah. In particular, historical time series are analyzed using stacked generalization, a methodology devised to aid in developing models that generalize well to future time periods. Stacked generalization is compared to ARIMA and to stand-alone neural networks. The results are consistent and suggest promise for the stacked generalization method in other time series domains.
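The comparison itself typically comes down to a forecast-accuracy measure on held-out periods. The snippet below computes mean absolute percentage error for three candidate forecasts; the numbers are placeholders, not results from the paper.

```python
# Comparing candidate forecasts with MAPE. All values are placeholders.
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100 * np.mean(np.abs((actual - forecast) / actual))

actual = np.array([102.0, 98.5, 110.2, 105.7])
candidates = {
    "ARIMA": np.array([100.1, 97.0, 108.0, 109.0]),
    "stand-alone NN": np.array([101.0, 99.9, 107.5, 107.0]),
    "stacked generalization": np.array([101.8, 98.0, 109.5, 106.3]),
}
for name, forecast in candidates.items():
    print(f"{name}: MAPE = {mape(actual, forecast):.2f}%")
```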
Journal of Experimental and Theoretical Artificial Intelligence | 2003
James V. Hansen; Ray D. Nelson
Time-series analysis is important to a wide range of disciplines transcending both the physical and social sciences. Statistical models have sound theoretical bases and have been successfully used in a number of problem domains. More recently, machine-learning models such as neural networks have been suggested as offering potential for time-series analysis. Results of neural network empirical testing have thus far been mixed. This paper proposes melding useful parameters from the statistical ARIMA model with neural networks of two types: multilayer perceptrons (MLPs) and radial basis functions (RBFs). Tests are run on a range of time-series problems that exhibit many common patterns encountered by analysts. The results suggest that hybrids of the type proposed may yield better outcomes than either model by itself.
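One way to realize such a hybrid, sketched below with illustrative choices rather than the paper's exact specification, is to feed an ARIMA model's one-step-ahead predictions into a neural network alongside raw lags (the RBF variant is omitted here).

```python
# A hedged hybrid sketch: ARIMA in-sample predictions become an extra input
# feature for an MLP forecaster. Series, ARIMA order, and lag length are
# illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
y = np.cumsum(rng.normal(0.1, 1.0, 400)) + 5 * np.sin(np.arange(400) / 10)

arima = ARIMA(y, order=(2, 1, 1)).fit()
arima_pred = arima.predict(start=0, end=len(y) - 1)   # one-step in-sample predictions

lags = 6
X = np.array([np.concatenate([y[t - lags:t], [arima_pred[t]]])
              for t in range(lags, len(y))])
target = y[lags:]

hybrid = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
hybrid.fit(X[:-40], target[:-40])
print("holdout MAE:", np.abs(hybrid.predict(X[-40:]) - target[-40:]).mean())
```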
Communications in Statistics-theory and Methods | 1989
James B. McDonald; Ray D. Nelson
The leptokurtosis of many security market return distributions can contaminate ordinary least squares estimates of the β coefficient of the market model. Partially adaptive estimation techniques accommodate the possibility of fat-tailed distributions. This methodology limits the influence of extremely large residuals and yields estimates that are both statistically and practically different from ordinary least squares.
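As a minimal illustration of the idea, the sketch below estimates the market-model α and β by maximum likelihood under a Student-t error density, one simple fat-tailed alternative to normality; the paper's partially adaptive estimators rest on a more general family, and the data here are simulated.

```python
# Market-model beta under a fat-tailed (Student-t) likelihood versus OLS.
# A sketch with simulated data; not the paper's estimator or data.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(5)
market = rng.normal(0.0, 1.0, 500)
returns = 0.1 + 1.2 * market + stats.t.rvs(df=3, size=500, random_state=6)  # fat-tailed noise

def neg_loglik(params):
    alpha, beta, log_sigma, log_nu = params
    resid = returns - alpha - beta * market
    # df kept above 2 so the error variance exists
    return -np.sum(stats.t.logpdf(resid, df=np.exp(log_nu) + 2, scale=np.exp(log_sigma)))

res = optimize.minimize(neg_loglik, x0=[0.0, 1.0, 0.0, 0.0], method="Nelder-Mead")
ols_beta = np.polyfit(market, returns, 1)[0]
print("OLS beta:", round(ols_beta, 3), " Student-t MLE beta:", round(res.x[1], 3))
```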
Journal of the Operational Research Society | 2006
James V. Hansen; James B. McDonald; Ray D. Nelson
The importance of predicting future values of a time-series transcends a range of disciplines. Economic and business time-series are typically characterized by trend, cycle, seasonal, and random components. Powerful methods have been developed to capture these components by specifying and estimating statistical models. These methods include exponential smoothing, autoregressive integrated moving average (ARIMA), and partially adaptive estimated ARIMA models. New research in pattern recognition through machine learning offers innovative methodologies that can improve forecasting performance. This paper presents a study of the comparative results of time-series analysis on nine problem domains, each of which exhibits differing time-series characteristics. Comparative analyses use ARIMA selection employing an intelligent agent, ARIMA estimation through partially adaptive methods, and support vector machines. The results find that support vector machines weakly dominate the other methods and achieve the best results in eight of nine different data sets.
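A brief sketch of support-vector forecasting on lagged inputs follows; the competing ARIMA variants are not reproduced here, and the series, lag length, and kernel settings are illustrative assumptions rather than those used in the study.

```python
# Support vector regression on lagged values of a simulated seasonal series.
# Settings are illustrative, not those of the paper.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
t = np.arange(300)
y = 20 + 0.1 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 300)

lags = 12
X = np.array([y[i - lags:i] for i in range(lags, len(y))])
target = y[lags:]

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
svr.fit(X[:-36], target[:-36])                  # hold out the last three years
print("holdout MAE:", np.abs(svr.predict(X[-36:]) - target[-36:]).mean())
```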
Communications in Statistics-theory and Methods | 1993
James B. McDonald; Ray D. Nelson
Leptokurtosis and skewness characterize the distributions of the returns for many financial instruments traded in security markets. These departures from normality can adversely affect the efficiency of least squares estimates of the βs in the single index or market model. The proposed new partially adaptive estimation techniques accommodate skewed and fat-tailed distributions. The empirical investigation, which is the first application of this procedure in regression models, reveals that both skewness and kurtosis can affect β estimates.
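The sketch below extends the earlier Student-t example to a skewed, fat-tailed error density (a Fernandez-Steel style skewed t, chosen here for convenience rather than taken from the paper) and again compares the resulting β with ordinary least squares on simulated data.

```python
# Market-model beta under a skewed, fat-tailed likelihood. Illustrative only.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(8)
market = rng.normal(0.0, 1.0, 500)
noise = stats.t.rvs(df=4, size=500, random_state=9)
returns = 0.05 + 0.9 * market + np.where(noise > 0, 1.5 * noise, noise)  # skewed, fat-tailed

def skew_t_logpdf(z, gamma, nu):
    """Log density of a Fernandez-Steel skewed t at standardized residual z."""
    scaled = np.where(z >= 0, z / gamma, z * gamma)
    return np.log(2.0 / (gamma + 1.0 / gamma)) + stats.t.logpdf(scaled, df=nu)

def neg_loglik(params):
    alpha, beta, log_sigma, log_gamma, log_nu = params
    sigma, gamma, nu = np.exp(log_sigma), np.exp(log_gamma), np.exp(log_nu) + 2
    z = (returns - alpha - beta * market) / sigma
    return -np.sum(skew_t_logpdf(z, gamma, nu) - np.log(sigma))

res = optimize.minimize(neg_loglik, x0=[0.0, 1.0, 0.0, 0.0, 1.0],
                        method="Nelder-Mead", options={"maxiter": 5000})
print("skewed-t MLE beta:", round(res.x[1], 3),
      " OLS beta:", round(np.polyfit(market, returns, 1)[0], 3))
```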
Public Finance Review | 2010
Gary C. Cornia; Scott D. Grimshaw; Ray D. Nelson; Lawrence C. Walters
Because retail sales taxes generate substantial revenue for many local governments, public officials contemplating differential local option tax rates must carefully assess the potential impacts of such rate differences on purchasing decisions. The authors use a unique pooled time series to examine these impacts and apply a methodology that permits an analysis of how sales tax rate differences across numerous consumer goods affect purchasing decisions. The results indicate that the response to sales tax rate differences depends on the general characteristics of the goods being purchased. A unique variable that controls for the distance to the next significant alternative for making a purchase also provides key insights. The observed significance of this variable and of its interaction with tax rates has important public policy implications.
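The specification described above can be illustrated, very loosely, with a pooled regression that interacts the tax rate with distance to the nearest purchasing alternative; the variable names and data below are fabricated for the example, and the paper's actual model is richer.

```python
# A hedged sketch of a pooled regression with a tax-rate x distance interaction.
# All variables and data are made up for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n = 400
df = pd.DataFrame({
    "tax_rate": rng.uniform(0.05, 0.08, n),        # local option sales tax rate
    "distance_miles": rng.uniform(1, 60, n),       # distance to nearest alternative
    "county": rng.integers(0, 20, n),              # pooled cross-section units
})
df["log_purchases"] = (3.0 - 8.0 * df.tax_rate + 0.002 * df.distance_miles
                       + 0.5 * df.tax_rate * df.distance_miles + rng.normal(0, 0.1, n))

# Main effects, their interaction, and county fixed effects.
model = smf.ols("log_purchases ~ tax_rate * distance_miles + C(county)", data=df).fit()
print(model.params[["tax_rate", "distance_miles", "tax_rate:distance_miles"]])
```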