Publication


Featured research published by Steven M. Corns.


IEEE Transactions on Evolutionary Computation | 2006

Graph-based evolutionary algorithms

Kenneth M. Bryden; Daniel Ashlock; Steven M. Corns; Stephen J. Willson

Evolutionary algorithms use crossover to combine information from pairs of solutions and use selection to retain the best solutions. Ideally, crossover takes distinct good features from each of the two structures involved. This process creates a conflict: progress results from crossing over structures with different features, but crossover produces new structures that are like their parents and so reduces the diversity on which it depends. As evolution continues, the algorithm searches a smaller and smaller portion of the search space. Mutation can help maintain diversity but is not a panacea for diversity loss. This paper explores evolutionary algorithms that use combinatorial graphs to limit possible crossover partners. These graphs limit the speed and manner in which information can spread, giving competing solutions time to mature. This use of graphs is a computationally inexpensive method of picking a global level of tradeoff between exploration and exploitation. The results of using 26 graphs with a diverse collection of graphical properties are presented. The test problems used are: one-max, the De Jong functions, the Griewangk function in three to seven dimensions, the self-avoiding random walk problem in 9, 12, 16, 20, 25, 30, and 36 dimensions, the plus-one-recall-store (PORS) problem with n = 15, 16, and 17, location of length-six one-error-correcting DNA barcodes, and solving a simple differential equation semi-symbolically. The choice of combinatorial graph has a significant effect on the time-to-solution. In the cases studied, the optimal choice of graph improved solution time as much as 63-fold, with typical impact being in the range of 15% to 100% variation. The graph yielding superior performance is found to be problem dependent. In general, the optimal graph diameter increases and the optimal average degree decreases with the complexity and difficulty of the fitness landscape.
The use of diverse graphs as population structures for a collection of problems also permits a classification of the problems. A phylogenetic analysis of the problems using normalized time to solution on each graph groups the numerical problems as a clade together with one-max; self-avoiding walks form a clade with the semi-symbolic differential equation solution; and the PORS and DNA barcode problems form a superclade with the numerical problems but are substantially distinct from them. This novel form of analysis has the potential to aid researchers choosing problems for a test suite.
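The core mechanism the abstract describes, restricting crossover partners to graph neighbours so that good solutions spread slowly, can be sketched as below. The ring graph, steady-state replacement rule, and all parameter values are illustrative assumptions, not the paper's experimental setup:

```python
import random

def graph_based_ea(fitness, n_bits=20, pop_size=32, events=2000, seed=0):
    """One-max solved with crossover restricted to graph neighbours.

    Each individual sits on a vertex of a ring graph and may only mate
    with its two neighbours, slowing the spread of genetic information.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    # Ring graph: vertex i is connected to i-1 and i+1 (mod pop_size).
    neighbours = {i: [(i - 1) % pop_size, (i + 1) % pop_size]
                  for i in range(pop_size)}
    for _ in range(events):
        i = rng.randrange(pop_size)        # pick a vertex...
        j = rng.choice(neighbours[i])      # ...and a mate from its neighbours
        cut = rng.randrange(1, n_bits)     # one-point crossover
        child = pop[i][:cut] + pop[j][cut:]
        child[rng.randrange(n_bits)] ^= 1  # point mutation
        # The child replaces the worse parent if it is at least as fit.
        worse = i if fitness(pop[i]) <= fitness(pop[j]) else j
        if fitness(child) >= fitness(pop[worse]):
            pop[worse] = child
    return max(pop, key=fitness)

best = graph_based_ea(sum)  # one-max: fitness is simply the number of ones
```

Swapping in a denser graph (more neighbours per vertex) shifts the balance toward exploitation; a sparser one preserves diversity longer, which is the tradeoff the paper quantifies.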


Risk Analysis | 2008

Risk-Based Analysis of the Danish Pork Salmonella Program: Past and Future

H. Scott Hurd; Claes Enøe; Lene Lund Sørensen; Henrik Wachman; Steven M. Corns; Kenneth M. Bryden; Matthias Grenier

The Danish pork Salmonella control program was initiated in 1993 in response to a prominent pork-related outbreak in Copenhagen. It involved improved efforts at slaughter hygiene (postharvest) and on-farm (preharvest) surveillance and control. After 10 years, 95 million Euros, significant reductions in seropositive herds, Salmonella positive carcasses, and pork-attributable human cases (PAHC), questions have arisen about how best to continue this program. The objective of this study was to provide some analysis and information to address these questions. The methods used include a computer simulation model constructed of a series of Excel workbooks, one for each simulated year and scenario (http://www.ifss.iastate/DanSalmRisk). Each workbook has three modules representing the key processes affecting risk: seropositive pigs leaving the farm (Production), carcass contamination after slaughter (Slaughter), and PAHC of Salmonella (Attribution). Parameter estimates are derived from an extensive farm-to-fork database collected by industry and government and managed by the Danish Zoonosis Centre (http://www.food.dtu.dk). Retrospective (1994-2003) and prospective (2004-2013) simulations were evaluated. The retrospective simulations showed that, except for the first few years (1994-1998), the on-farm program had minimal impact in reducing the number of positive carcasses and PAHC. Most of the reductions in PAHC up to 2003 were, according to this analysis, due to various improvements in abattoir processes. Prospective simulations showed that minimal reductions in human health risk (PAHC) could be achieved with on-farm programs alone. Carcass decontamination was shown as the most effective means of reducing human risk, reducing PAHC to about 10% of the simulated 2004 level.


Natural Hazards Review | 2015

Framework for Modeling Urban Restoration Resilience Time in the Aftermath of an Extreme Event

Suzanna K. Long; Thomas G. Shoberg; Steven M. Corns; Hector J. Carlo

The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when...


10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2004

Training Finite State Classifiers to Improve PCR Primer Design

Daniel Ashlock; Kenneth M. Bryden; Steven M. Corns; Tsui-Jung Wen

We present results on training finite state machines as classifiers for polymerase chain reaction primers. The goal is to decrease the number of primers that fail to amplify correctly. Finite state classifiers are trained with a novel evolutionary algorithm that uses an incremental fitness reward system and multi-population hybridization. The system presented here creates a post-production add-on to a standard primer picking program intended to compensate for organism and lab specific factors.
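The paper does not give the machine encoding; a minimal sketch of a finite state classifier over the DNA alphabet might look like the following, where the transition table, the +/-1 voting scheme, and the acceptance threshold are all hypothetical choices (in a real run, an evolutionary algorithm would tune the table rather than leave it random):

```python
import random

BASES = "ACGT"

def random_fsm(n_states, rng):
    """A Mealy-style machine over the DNA alphabet: each (state, base)
    pair maps to a next state and a +/-1 vote. Here the table is random;
    training would evolve these entries against known good/bad primers."""
    return {(s, b): (rng.randrange(n_states), rng.choice((-1, 1)))
            for s in range(n_states) for b in BASES}

def classify(fsm, primer):
    """Feed the primer through the machine one base at a time and
    accept it if the accumulated votes are positive."""
    state, score = 0, 0
    for base in primer:
        state, vote = fsm[(state, base)]
        score += vote
    return score > 0  # True = predicted to amplify correctly

rng = random.Random(1)
fsm = random_fsm(4, rng)
verdict = classify(fsm, "ACGTAGGCTTACG")
```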


Procedia Computer Science | 2011

Model Development of a Virtual Learning Environment to Enhance Lean Education

Akalpit Gadre; Elizabeth A. Cudney; Steven M. Corns

Modern day industry is becoming leaner by the day. This demands engineers with an in-depth understanding of lean philosophies. Current methods for teaching lean include hands-on projects and simulation. However, simulation games available in the market lack simplicity, ability to store the results, and modeling power. The goal of this research is to develop a virtual simulation platform which would enable students to perform various experiments by applying lean concepts. The design addresses these deficiencies through the use of VE-Suite, a virtual engineering software suite. The design includes user-friendly dialogue boxes, graphical models of machines, performance display gauges, and an editable layout. The platform uses laws of operations management such as Little's law, economic order quantity (EOQ) models, and cycle time. These laws enable students to implement various lean concepts such as pull system, just-in-time (JIT), single piece flow, single minute exchange of dies (SMED), kaizen, kanban, U-layout, by modifying the process parameters such as process times, setup times, layout, number, and placement of machines. The simulation begins with a traditional push type mass production line and the students improve the line by implementing lean techniques. Thus, students experience the advantages of lean in real time while facing the real-life problems encountered in implementing it.


Computational Intelligence in Bioinformatics and Computational Biology | 2010

Improved PCR design for mouse DNA by training finite state machines

Salik R. Yadav; Steven M. Corns

This project presents an updated method for classification of polymerase chain reaction primers in mice using finite state classifiers. This is done to compensate for many lab-, organism-, and chemical-specific factors that are costly. Using finite state classifiers can help decrease the number of primers that fail to amplify correctly. For training these classifiers, five different evolutionary algorithms that use an incremental fitness reward are used. Variations to the number of generations and the values in the fitness reward are examined, and the resulting designs are presented. By controlling the fitness reward correctly, there is a potential to develop classifiers with a high likelihood of accepting only good primers. The proposed tool can act as a post-production add-on to the standard primer picking algorithm for gene expression detection in mice to compensate for local factors that may induce errors.


World Congress on Computational Intelligence | 2008

Small population effects and hybridization

Daniel Ashlock; Kenneth M. Bryden; Steven M. Corns

This paper examines the confluence of two lines of research that seek to improve the performance of evolutionary computation systems through management of information flow. The first is hybridization; the second is using small population effects. Hybridization consists of restarting evolutionary algorithms with copies of best-of-population individuals drawn from many populations. Small population effects occur when an evolutionary algorithm's performance, either speed or probability of premature convergence, is improved by use of a very small population. This paper presents a structure for evolutionary computation called a blender which performs hybridization of many small populations. The blender algorithm is tested on the PORS and Tartarus tasks. Substantial and significant effects result from varying the size of the small populations used and from varying the frequency with which hybridization is performed. The major effect results from changing the frequency of hybridization; the impact of population size is more modest. The parameter settings which yield best performance of the blender algorithm are remarkably consistent across all seven sets of experiments performed. Blender performance is found to be superior to other algorithms for six cases of the PORS problem. For Tartarus, blender performs well, but not as well as the previous hybridization experiments that motivated its development.
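The blender structure the abstract describes, many tiny populations evolved independently, then periodically restarted from copies of best-of-population individuals, can be sketched as follows. One-max stands in for PORS/Tartarus, and every parameter value is an illustrative assumption:

```python
import random

def blender(fitness, n_pops=8, pop_size=4, epochs=10, gens_per_epoch=50,
            n_bits=16, seed=0):
    """Blender sketch: evolve many very small populations, then
    hybridize by restarting each one with copies of best-of-population
    individuals drawn from across all populations."""
    rng = random.Random(seed)
    def new_ind():
        return [rng.randint(0, 1) for _ in range(n_bits)]
    pops = [[new_ind() for _ in range(pop_size)] for _ in range(n_pops)]
    for _ in range(epochs):
        # Independent evolution phase for each small population.
        for pop in pops:
            for _ in range(gens_per_epoch):
                a, b = rng.sample(range(pop_size), 2)   # two parents
                cut = rng.randrange(1, n_bits)          # one-point crossover
                child = pop[a][:cut] + pop[b][cut:]
                child[rng.randrange(n_bits)] ^= 1       # point mutation
                w = min(range(pop_size), key=lambda i: fitness(pop[i]))
                if fitness(child) >= fitness(pop[w]):
                    pop[w] = child                      # replace the worst
        # Hybridization phase: restart from copies of the population bests.
        bests = [max(pop, key=fitness) for pop in pops]
        pops = [[list(rng.choice(bests)) for _ in range(pop_size)]
                for _ in range(n_pops)]
    return max((ind for pop in pops for ind in pop), key=fitness)

best = blender(sum)  # one-max: fitness is the number of ones
```

The two knobs the paper studies map directly to `gens_per_epoch` (hybridization frequency, the dominant effect) and `pop_size` (the more modest effect).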


IEEE International Conference on Evolutionary Computation | 2006

Improving Design Diversity Using Graph Based Evolutionary Algorithms

Steven M. Corns; Daniel Ashlock; Douglas S. McCorkle; Kenneth M. Bryden

Graph based evolutionary algorithms (GBEAs) have been shown to have superior performance to evolutionary algorithms on a variety of evolutionary computation test problems as well as on some engineering applications. One of the motivations for creating GBEAs was to produce a diversity of solutions with little additional computational cost. This paper tests that feature of GBEAs on three problems: a real-valued multi-modal function of varying dimension, the plus-one-recall-store (PORS) problem, and an applied engineering design problem. For all of the graphs studied the number of different solutions increased as the connectivity of the graph underlying the algorithm decreased. This indicates that the choice of graph can be used to control the diversity of solutions produced. The availability of multiple solutions is an asset in a product realization system, making it possible for an engineer to explore design alternatives.


Congress on Evolutionary Computation | 2012

An Exponential Moving Average algorithm

David Haynes; Steven M. Corns; Ganesh Kumar Venayagamoorthy

Techniques to reduce the search space when an optimizer seeks an optimal value are studied in this paper. A new mutation technique called the “Exponential Moving Average” algorithm (EMA) is introduced. The performance of EMA algorithms is compared to two other similar Computational Intelligence (CI) algorithms (an ordinary Evolutionary Algorithm (EA) and a “Mean-Variance Optimization” (MVO)) to solve a multi-dimensional problem which has a large search space. The classic Sudoku puzzle is chosen as the problem with a large search space.
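The abstract does not spell out the EMA operator's mechanics. One plausible reading, sketched below on the sphere function rather than the paper's Sudoku benchmark, is a mutation biased toward an exponential moving average of recently accepted solutions, which progressively shrinks the region being searched; the blend weights, noise scale, and acceptance rule are all assumptions:

```python
import random

def ema_mutate(x, ema, pull=0.5, sigma=0.3, rng=random):
    """Hypothetical EMA-guided mutation: blend the candidate toward the
    moving average of past accepted solutions, plus Gaussian noise."""
    return [(1 - pull) * xi + pull * ei + rng.gauss(0, sigma)
            for xi, ei in zip(x, ema)]

def update_ema(ema, best, alpha=0.3):
    # Standard EMA update: ema_i <- (1 - alpha) * ema_i + alpha * best_i
    return [(1 - alpha) * e + alpha * b for e, b in zip(ema, best)]

# Minimize the sphere function with a tiny (1+1)-style loop.
rng = random.Random(2)
sphere = lambda v: sum(t * t for t in v)
x = [rng.uniform(-5, 5) for _ in range(5)]
ema = list(x)
start = sphere(x)
for _ in range(500):
    cand = ema_mutate(x, ema, rng=rng)
    if sphere(cand) <= sphere(x):   # greedy acceptance
        x = cand
        ema = update_ema(ema, x)    # the average tracks accepted points
final = sphere(x)
```

Because acceptance is greedy, `final` can never exceed `start`; the EMA term concentrates mutations near the trajectory of accepted solutions, which is one way to "reduce the search space" as the abstract puts it.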


Congress on Evolutionary Computation | 2010

A comparative study of diversity in evolutionary algorithms

Jayakanth Jayachandran; Steven M. Corns

For many evolutionary algorithms a key obstacle to finding the global optimum is insufficient solution diversity, causing the algorithm to become mired in a local optimum. Solution diversity can be influenced by algorithm parameters including population size, mutation operator and diversity preservation techniques. This study examines the combined effect of population size, mutation value and the geography imposed by combinatorial graphs on a set of five standard evolutionary algorithm problems. A trade-off can be seen between the initial diversity provided by the population size, the introduction of new diversity from mutation, and the preservation of diversity by the combinatorial graph. With an appropriate fusion of these three factors a level of diversity can be achieved that decreases the time to find the global optimum.
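Comparisons like this need a concrete diversity measure. Two simple ones, the count of distinct genotypes and the Shannon entropy of the genotype distribution, are sketched below; these are common textbook measures, not necessarily the metrics used in the study:

```python
from collections import Counter
from math import log2

def genotype_diversity(pop):
    """Return (distinct genotype count, Shannon entropy in bits) for a
    population of bit-string individuals. Entropy is maximal when every
    genotype is unique and zero when the population has converged."""
    counts = Counter(tuple(ind) for ind in pop)
    n = len(pop)
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return len(counts), entropy

# A toy population: one genotype appears twice, two appear once.
pop = [[0, 1, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]]
distinct, h = genotype_diversity(pop)
# distinct == 3; entropy of the {1/2, 1/4, 1/4} distribution is 1.5 bits
```

Tracking such a measure over generations is how the trade-off between population size, mutation, and graph geography can be made visible.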

Collaboration


Dive into Steven M. Corns's collaborations.

Top Co-Authors

Elizabeth A. Cudney, Missouri University of Science and Technology
Cihan H. Dagli, Missouri University of Science and Technology
Ivan G. Guardiola, Missouri University of Science and Technology
Suzanna K. Long, Missouri University of Science and Technology
David Haynes, Missouri University of Science and Technology
Thomas G. Shoberg, United States Geological Survey
Hector J. Carlo, University of Puerto Rico at Mayagüez
Dinesh Kanigolla, Missouri University of Science and Technology