Publications


Featured research published by Robert E. Dorsey.


Journal of Business & Economic Statistics | 1995

Genetic Algorithms for Estimation Problems With Multiple Optima, Nondifferentiability, and Other Irregular Features

Robert E. Dorsey; Walter J. Mayer

The genetic algorithm is examined as a method for solving optimization problems in econometric estimation. It does not restrict either the form or regularity of the objective function, allows a reasonably large parameter space, and does not rely on a point-to-point search. The performance is evaluated through two sets of experiments on standard test problems as well as econometric problems from the literature. First, alternative genetic algorithms that vary over mutation and crossover rates, population sizes, and other features are contrasted. Second, the genetic algorithm is compared to Nelder–Mead simplex, simulated annealing, adaptive random search, and MSCORE.
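The kind of derivative-free, population-based search the paper examines can be sketched in a few lines. This is an illustrative real-coded genetic algorithm with tournament selection, one-point crossover, and uniform-reset mutation, not the specific variant used in the paper; parameter values here are arbitrary defaults.

```python
import random

def genetic_optimize(f, bounds, pop_size=30, generations=200,
                     crossover_rate=0.9, mutation_rate=0.1, seed=0):
    """Minimize f over a box with a simple real-coded genetic algorithm.

    No derivatives of f are needed, and the population-based search does
    not proceed point to point, so irregular objectives are acceptable.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    best = min(pop, key=f)

    def select():
        # Tournament selection: the fitter of two random individuals wins.
        a, b = rng.choice(pop), rng.choice(pop)
        return a if f(a) < f(b) else b

    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate and dim > 1:
                cut = rng.randrange(1, dim)        # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Mutation: occasionally resample a coordinate within its bounds.
            child = [rng.uniform(lo, hi) if rng.random() < mutation_rate else x
                     for x, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = children
        gen_best = min(pop, key=f)
        if f(gen_best) < f(best):
            best = gen_best
    return best
```

Because fitness comparisons are the only use of the objective, the same loop handles nondifferentiable and multimodal functions that would defeat a gradient method.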


Decision Support Systems | 1998

Toward global optimization of neural networks: a comparison of the genetic algorithm and backpropagation

Randall S. Sexton; Robert E. Dorsey; John D. Johnson

The recent surge in activity of neural network research in business is not surprising, since the underlying functions controlling business data are generally unknown and the neural network offers a tool that can approximate the unknown function to any desired degree of accuracy. The vast majority of these studies rely on a gradient algorithm, typically a variation of backpropagation, to obtain the parameters (weights) of the model. The well-known limitations of gradient search techniques applied to complex nonlinear optimization problems such as artificial neural networks have often resulted in inconsistent and unpredictable performance. Many researchers have attempted to address these problems by imposing constraints on the search space or by restructuring the architecture of the neural network. In this paper we demonstrate that such constraints and restructuring are unnecessary if a sufficiently complex initial architecture and an appropriate global search algorithm are used. We further show that the genetic algorithm can not only serve as a global search algorithm but, by appropriately defining the objective function, can simultaneously achieve a parsimonious architecture. The value of using the genetic algorithm over backpropagation for neural network optimization is illustrated through a Monte Carlo study which compares each algorithm on in-sample, interpolation, and extrapolation data for seven test functions.
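The core idea of training a network without gradients is to treat the flat weight vector as a chromosome and let a GA evolve it against the training error. The sketch below is a minimal illustration under assumed settings (one input, three tanh hidden units, elitist selection, Gaussian mutation), not the architecture or GA configuration from the paper.

```python
import math, random

def forward(weights, x, hidden=3):
    """One-hidden-layer net (1 input): tanh hidden units, linear output."""
    w1 = weights[:2 * hidden]            # per hidden unit: input weight, bias
    w2 = weights[2 * hidden:]            # output weights plus output bias
    h = [math.tanh(w1[2 * j] * x + w1[2 * j + 1]) for j in range(hidden)]
    return sum(w2[j] * h[j] for j in range(hidden)) + w2[hidden]

def mse(weights, data):
    return sum((forward(weights, x) - y) ** 2 for x, y in data) / len(data)

def ga_train(data, hidden=3, pop_size=40, generations=300, seed=1):
    """Evolve the flat weight vector with a GA instead of gradient descent."""
    rng = random.Random(seed)
    n = 3 * hidden + 1                   # total number of weights
    pop = [[rng.uniform(-2, 2) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda w: mse(w, data))
        elite = scored[:pop_size // 4]   # elitism: keep the fittest quarter
        pop = list(elite)
        while len(pop) < pop_size:
            p1, p2 = rng.choice(elite), rng.choice(elite)
            cut = rng.randrange(1, n)    # one-point crossover
            child = [w + rng.gauss(0, 0.2) if rng.random() < 0.2 else w
                     for w in p1[:cut] + p2[cut:]]   # Gaussian mutation
        pop.append(child) if False else pop.append(child)
    return min(pop, key=lambda w: mse(w, data))
```

Only the fitness function touches the network, so the same loop works for any architecture; backpropagation, by contrast, requires the error surface to be differentiable in every weight.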


European Journal of Operational Research | 1999

Optimization of neural networks: A comparative analysis of the genetic algorithm and simulated annealing

Randall S. Sexton; Robert E. Dorsey; John D. Johnson

The escalation of neural network research in business has been brought about by the ability of neural networks, as a tool, to closely approximate unknown functions to any desired degree of accuracy. Although gradient-based search techniques such as back-propagation are currently the most widely used optimization techniques for training neural networks, it has been shown that these gradient techniques are severely limited in their ability to find global solutions. Global search techniques have been identified as a potential solution to this problem. In this paper we examine two well-known global search techniques, simulated annealing and the genetic algorithm, and compare their performance. A Monte Carlo study was conducted to test the appropriateness of these global search techniques for optimizing neural networks.
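Simulated annealing, the second global search technique compared here, differs from the GA in maintaining a single candidate rather than a population: uphill moves are accepted with a probability that shrinks as a "temperature" cools, which lets the search escape local minima. A minimal sketch with an assumed Gaussian proposal and geometric cooling schedule:

```python
import math, random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Minimize f by simulated annealing: accept a worse candidate with
    probability exp(-delta / T), so early on the search can climb out
    of local minima; as T cools the search becomes greedy."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.gauss(0, step) for xi in x]   # Gaussian proposal
        fc = f(cand)
        delta = fc - fx
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling                                   # geometric cooling
    return best, fbest
```

Like the GA, it needs only function evaluations, making it a natural competitor for training neural networks whose error surfaces are riddled with local minima.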


Decision Support Systems | 2000

Reliable classification using neural networks: a genetic algorithm and backpropagation comparison

Randall S. Sexton; Robert E. Dorsey

Although the genetic algorithm (GA) has been shown to be a superior neural network (NN) training method on computer-generated problems, its performance on real-world classification data sets is untested. To gain confidence that this alternative training technique is suitable for classification problems, a collection of 10 benchmark real-world data sets was used in an extensive Monte Carlo study that compares backpropagation (BP) with the GA for NN training. We find that the GA reliably outperforms the commonly used BP algorithm as an alternative NN training technique. While this does not prove that the GA will always dominate BP, this demonstrated reliability on real-world problems enables managers to use NNs trained with GAs as decision support tools with a greater degree of confidence.


Public Choice | 1992

The Voluntary Contributions Mechanism with Real Time Revisions

Robert E. Dorsey

Experimental results are presented showing the effects of allowing real time revisions of voluntary contributions for the provision of a public good. Four public good payoff functions are examined, each of which generates specific equilibria. Evidence of increased provision of the public good is demonstrated for: (i) the case in which revisions are limited to increases and a provision point exists, and also (ii) when there is a high initial marginal return from the public good.


Journal of Business Research | 2004

Using an artificial neural network trained with a genetic algorithm to model brand share

Kelly E Fish; John D. Johnson; Robert E. Dorsey; Jeffery G Blodgett

We introduce a new architectural approach to artificial neural network (ANN) choice modeling. The standard ANN design with a polychotomous situation requires an output variable for each alternative. We reconfigure our feedforward network to contain only one output node for a six-level choice problem and network performance improves considerably. We conclude that a simpler ANN architecture leads to better generalization in the case of multilevel choice. We then use a feedforward ANN trained with a genetic algorithm to model individual consumer choices and brand share in a retail coffee market. A well-known choice model is replicated while the computer-processing technique is altered from multinomial logit (MNL) to feedforward ANNs trained with the standard backpropagation algorithm and a genetic algorithm. The ANN trained with our genetic algorithm outperforms both MNL and the backpropagation trained ANN.
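The single-output reconfiguration requires some coding of the six alternatives onto one target. The abstract does not spell out the coding scheme; one plausible version, shown purely as an illustration, maps the levels to evenly spaced targets in [0, 1] and rounds the network's continuous output back to the nearest level.

```python
def encode_choice(level, n_levels=6):
    """Map a choice level 0..n_levels-1 to a single target in [0, 1].
    (Hypothetical encoding; the paper's actual scheme is not stated.)"""
    return level / (n_levels - 1)

def decode_output(y, n_levels=6):
    """Round the single network output back to the nearest choice level,
    clamping values that fall outside [0, 1]."""
    level = round(y * (n_levels - 1))
    return min(max(level, 0), n_levels - 1)
```

Under this scheme the network learns one regression target instead of six competing output nodes, which is one way a six-level choice problem can be collapsed to a single output.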


Decision Support Systems | 2004

Simultaneous optimization of neural network function and architecture algorithm

Randall S. Sexton; Robert E. Dorsey; Naheel A. Sikander

A major limitation to current artificial neural network (NN) research is the inability to adequately identify unnecessary weights in the solution. If a method were found that would allow unnecessary weights to be identified, decision-makers would gain crucial information about the problem at hand as well as benefit by having a network that was more effective and efficient. The Neural Network Simultaneous Optimization Algorithm (NNSOA) is proposed for supervised training in multilayer feedforward neural networks. We demonstrate with Monte Carlo studies that the NNSOA can be used to obtain both a global solution and simultaneously identify a parsimonious network structure.
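The mechanism for identifying unnecessary weights can be illustrated by the shape of the fitness function: training error plus a fixed cost for each weight the search keeps active. This is a hedged sketch of the general penalized-objective idea, with made-up penalty and tolerance values, not the NNSOA's actual objective.

```python
def parsimony_objective(sse, weights, penalty=0.01, tol=1e-3):
    """Penalized fitness: training error (sse) plus a fixed cost for every
    weight whose magnitude exceeds tol. Minimizing this rewards solutions
    that both fit the data and drive unneeded weights to zero, exposing
    a parsimonious network structure."""
    active = sum(1 for w in weights if abs(w) > tol)
    return sse + penalty * active
```

With such an objective, two weight vectors with identical training error are ranked by sparseness, so the global search simultaneously fits the function and prunes the architecture.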


Archive | 1994

The Use of Artificial Neural Networks for Estimation of Decision Surfaces in First Price Sealed Bid Auctions

Robert E. Dorsey; John D. Johnson; Mark Van Boening

Artificial neural networks, optimized using genetic algorithms, are used to estimate bid functions for first price sealed bid auctions. Data generated in experimental markets is used for two means of estimating the bid function. First, the neural network provides a best fit to the data, thus estimating the bid function that subjects were using. Alternative objective functions are used for the neural network to demonstrate the effect on the resultant bid function. Second, the neural network is optimized using profit maximization as the objective function to identify the optimal bid function given the bids of the experimental subjects.


Decision Support Systems | 1994

A decision support system for in-sample simultaneous equation systems forecasting using artificial neural systems

Louis E. Caporaletti; Robert E. Dorsey; John D. Johnson; William A. Powell

Decision support systems have been proposed for many forecasting applications. Unfortunately, no work has been done on the development of decision support systems for simultaneous equation system (SES) forecasting, a very complex and difficult forecasting problem. In this paper the applicability of an artificial intelligence technology, artificial neural systems, for decision support in SES forecasting is shown. The discussion is focused on the multi-layer feed-forward neural network (MLFFNN). Performance of the MLFFNN versus traditional methods of SES forecasting is evaluated by comparing their in-sample forecast accuracy in a Monte Carlo experiment and on Klein's Model I.


The Engineering Economist | 1993

Reputation, Information and Project Termination in Capital Budgeting

John Dobson; Robert E. Dorsey

The Net Present Value (NPV) rule of financial theory gives management a decisive criterion for choosing between abandonment and continuation of capital projects. There is extensive evidence, however, that management chooses to delay the abandonment of unprofitable projects. This paper attempts to explain management's reluctance to abide by the NPV criterion. The concept of a Reputation-Adjusted Net Present Value is introduced in an environment where management knows more about the true value of a project than do stakeholders. The model indicates that, in such an environment, the continuation of a negative-NPV project may maximize firm value.
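The NPV rule the paper departs from is standard: discount each period's cash flow and continue the project only if the sum is positive. A minimal sketch of that calculation (the reputation adjustment itself is not reproduced here):

```python
def npv(rate, cash_flows):
    """Net present value: cash_flows[0] occurs now (t = 0); each later
    flow is discounted by (1 + rate) per period."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
```

Under the plain rule, a project with `npv(...) < 0` should be abandoned; the paper's point is that a reputation-aware valuation can rationally reverse that decision.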

Collaboration


Dive into Robert E. Dorsey's collaborations.

Top Co-Authors

John D. Johnson
University of Mississippi

Walter J. Mayer
University of Mississippi

Bahram Alidaee
University of Mississippi

Haixin Hu
University of Mississippi

Hui-chen Wang
University of Mississippi

John Dobson
California Polytechnic State University

Kelly E Fish
Arkansas State University