Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Charles Newton is active.

Publication


Featured research published by Charles Newton.


Congress on Evolutionary Computation | 2001

PDE: a Pareto-frontier differential evolution approach for multi-objective optimization problems

Hussein A. Abbass; Ruhul A. Sarker; Charles Newton

The use of evolutionary algorithms (EAs) to solve problems with multiple objectives (known as multi-objective optimization problems, MOPs) has attracted much attention. Being population-based approaches, EAs offer a means to find a group of Pareto-optimal solutions in a single run. Differential evolution (DE) is an EA that was developed to handle optimization problems over continuous domains. The objective of this paper is to introduce a novel Pareto-frontier differential evolution (PDE) algorithm to solve MOPs. On two standard test problems, the solutions provided by the proposed algorithm outperform those of the Strength Pareto Evolutionary Algorithm, one of the state-of-the-art evolutionary algorithms for solving MOPs.
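
The two building blocks the abstract relies on, Pareto dominance and the DE difference-vector mutation, can be sketched in a few lines. This is only an illustrative Python fragment (the function names, the scaling factor and the toy vectors are made up for this example), not the PDE algorithm itself:

```python
import random

def dominates(a, b):
    """Return True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def de_trial_vector(x1, x2, x3, F=0.5):
    """Classic DE/rand/1 mutation: perturb x1 with a scaled difference of two other members."""
    return [a + F * (b - c) for a, b, c in zip(x1, x2, x3)]

# Toy usage: keep only the non-dominated members of a small set of objective vectors.
population = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (0.5, 5.0)]
pareto_front = [p for p in population
                if not any(dominates(q, p) for q in population if q is not p)]
print(pareto_front)                                            # (3.0, 3.0) drops out: (2.0, 2.0) dominates it
print(de_trial_vector([0.0, 0.0], [1.0, 2.0], [0.5, 1.5]))     # -> [0.25, 0.25]
```

A full multi-objective DE would wrap these pieces in a selection and archiving loop, which the sketch deliberately omits.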


Online Information Review | 2002

Data Mining: A Heuristic Approach

Hussein A. Abbass; Charles Newton; Ruhul A. Sarker

From the Publisher: Real-life problems are known to be messy, dynamic and multi-objective, and involve high levels of uncertainty and constraints. Because traditional problem-solving methods are no longer capable of handling this level of complexity, heuristic search methods have attracted increasing attention in recent years for solving such problems. Inspired by nature, biology, statistical mechanics, physics and neuroscience, heuristic techniques are used to solve many problems where traditional methods have failed. Data Mining: A Heuristic Approach is a repository for the applications of these techniques in the area of data mining.


Computers & Operations Research | 2002

A new evolutionary approach to cutting stock problems with and without contiguity

Ko-Hsin Liang; Xin Yao; Charles Newton; David Hoffman

Evolutionary algorithms (EAs) have been applied successfully to many optimization problems in recent years. Genetic algorithms (GAs) and evolutionary programming (EP) are two different types of EAs. GAs use crossover as the primary search operator and mutation as a background operator, while EP uses mutation as the primary search operator and does not employ any crossover. This paper proposes a novel EP algorithm for cutting stock problems with and without contiguity. Two new mutation operators are proposed. Experimental studies have been carried out to examine the effectiveness of the EP algorithm. They show that EP can provide a simple yet more effective alternative to GAs in solving cutting stock problems with and without contiguity. The solutions found by EP are significantly better than (in most cases) or comparable to those found by GAs.
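
To make the GA/EP contrast concrete, here is a minimal mutation-only loop in the spirit of EP applied to a toy one-dimensional cutting stock instance. The swap mutation, the first-fit decoder and all of the data are illustrative stand-ins, not the two operators proposed in the paper:

```python
import random

def mutate_swap(order):
    """Illustrative permutation mutation: swap two randomly chosen pieces.
    (The paper proposes its own mutation operators; this swap is only a stand-in.)"""
    child = order[:]
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child

def stock_used(order, lengths, stock_len):
    """Decode a cutting order with a first-fit rule and count the stock lengths opened."""
    sheets, remaining = 0, 0
    for piece in order:
        if lengths[piece] > remaining:   # current stock length cannot take this piece
            sheets += 1
            remaining = stock_len
        remaining -= lengths[piece]
    return sheets

lengths = [3, 5, 2, 7, 4, 6, 1, 8]       # piece lengths to cut
stock_len = 10                           # length of each stock item
pop = [random.sample(range(len(lengths)), len(lengths)) for _ in range(20)]
for _ in range(100):                     # EP style: mutation only, no crossover
    offspring = [mutate_swap(p) for p in pop]
    pop = sorted(pop + offspring,
                 key=lambda o: stock_used(o, lengths, stock_len))[:len(pop)]
print(stock_used(pop[0], lengths, stock_len), "stock lengths used by the best order")
```

Truncation selection is used above for brevity; the point is simply that new orders come from mutation alone.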


Annual Conference on Computers | 2002

A genetic algorithm for solving economic lot size scheduling problem

Ruhul A. Sarker; Charles Newton

The purpose of this research is to determine an optimal batch size for a product and the purchasing policy for the associated raw materials. Like most practical situations, the manufacturing firm has limited storage space and a transportation fleet of known capacity. The mathematical formulation of the problem indicates that the model is a constrained nonlinear integer program. Considering the complexity of solving such a model, we investigate the use of genetic algorithms (GAs). We develop GA code with three different penalty functions commonly used for constrained optimization. The model is also solved using an existing commercial optimization package so that the solutions can be compared. Detailed computational results are presented.
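
The abstract's key ingredient, a penalty function that lets a GA handle the storage and fleet constraints, can be illustrated with a generic static penalty. The cost function, the constraint and the coefficient below are hypothetical; the three penalty schemes compared in the paper are not reproduced here:

```python
def penalised_cost(q, cost, constraints, penalty_coeff=1e4):
    """Generic static penalty: objective value plus a weighted sum of squared
    constraint violations (each constraint g must satisfy g(q) <= 0)."""
    violation = sum(max(0.0, g(q)) ** 2 for g in constraints)
    return cost(q) + penalty_coeff * violation

# Hypothetical batch-sizing example: setup + holding cost with a storage-space limit.
cost = lambda q: sum(100.0 / qi + 0.5 * qi for qi in q)    # per-product setup + holding
storage = lambda q: sum(2 * qi for qi in q) - 50           # each unit needs 2 units of space
print(penalised_cost([10, 15], cost, [storage]))           # feasible: no penalty added
print(penalised_cost([10, 30], cost, [storage]))           # infeasible: heavily penalised
```

Inside a GA, this penalised cost would simply replace the raw objective when ranking chromosomes.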


International Journal of Computational Intelligence and Applications | 2003

Evolutionary Optimization (EvOpt): A Brief Review and Analysis

Ruhul A. Sarker; Joarder Kamruzzaman; Charles Newton

Evolutionary Computation (EC) has attracted increasing attention in recent years as a powerful computational technique for solving many complex real-world problems. The Operations Research (OR)/Optimization community is divided on the acceptability of these techniques: one group accepts them as potential heuristics for solving complex problems, while the other rejects them on the basis of their weak mathematical foundations. In this paper, we discuss the reasons for using EC in optimization. A brief review of Evolutionary Algorithms (EAs) and their applications is provided. We also investigate the use of EAs for solving a two-stage transportation problem by designing a new algorithm. The computational results are analyzed and compared with conventional optimization techniques.


Applied Intelligence | 2001

Adapting Self-Adaptive Parameters in Evolutionary Algorithms

Ko-Hsin Liang; Xin Yao; Charles Newton

The lognormal self-adaptation has been used extensively in evolutionary programming (EP) and evolution strategies (ES) to adjust the search step size for each objective variable. However, it was discovered in our previous study (K.-H. Liang, X. Yao, Y. Liu, C. Newton, and D. Hoffman, in Evolutionary Programming VII: Proc. of the Seventh Annual Conference on Evolutionary Programming, vol. 1447, edited by V. Porto, N. Saravanan, D. Waagen, and A. Eiben, Lecture Notes in Computer Science, Springer: Berlin, pp. 291–300, 1998) that such self-adaptation may rapidly lead to a search step size that is far too small to explore the search space any further, so the search stagnates. This is called the loss of step size control. A lower bound on the search step size is necessary to avoid this problem. Unfortunately, the optimal setting of the lower bound is highly problem-dependent. This paper first analyzes, both theoretically and empirically, how step size control is lost. Two dynamic lower bound schemes are then proposed, which enable the EP algorithm to adjust the lower bound during evolution. Experimental results are presented to demonstrate the effectiveness and efficiency of the dynamic lower bound on a set of benchmark functions.
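
A compact sketch of the mechanism under discussion, assuming the standard lognormal update and a generic decaying floor; the paper's two dynamic lower-bound schemes are defined differently and are not reproduced here:

```python
import math
import random

N = 30                                       # problem dimension (arbitrary for this sketch)
TAU = 1.0 / math.sqrt(2.0 * math.sqrt(N))    # usual lognormal learning rates
TAU_PRIME = 1.0 / math.sqrt(2.0 * N)

def self_adaptive_mutation(x, sigma, lower_bound):
    """Lognormal self-adaptation with a floor on every step size.  Without the
    max(...) clamp, sigma can collapse towards zero and the search stagnates."""
    g = TAU_PRIME * random.gauss(0, 1)
    new_sigma = [max(lower_bound, s * math.exp(g + TAU * random.gauss(0, 1)))
                 for s in sigma]
    new_x = [xi + si * random.gauss(0, 1) for xi, si in zip(x, new_sigma)]
    return new_x, new_sigma

def decaying_lower_bound(generation, initial=1e-2, decay=0.99):
    """One plausible dynamic floor: protect exploration early, allow fine-tuning late.
    This is an illustrative rule, not either of the schemes proposed in the paper."""
    return initial * decay ** generation

# Example: one mutation at generation 40 with the decayed floor in effect.
x, sigma = self_adaptive_mutation([1.0] * N, [0.1] * N, decaying_lower_bound(40))
```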


Congress on Evolutionary Computation | 1999

Combining landscape approximation and local search in global optimization

Ko-Hsin Liang; Xin Yao; Charles Newton

Local search techniques have been applied in various global optimization methods. The effect of local search on the function landscape can make multimodal problems easier to solve. For evolutionary algorithms, reliance on step size control normally leaves individuals unable to escape from local optima during the final stage of the search. We propose an algorithm combining landscape approximation and local search (LALS), designed to tackle such difficult multimodal problems. We demonstrate that LALS can solve problems with very rough landscapes and that it has very good global reliability.
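
Only the local-search half of the idea is easy to show generically. The sketch below applies a simple coordinate-wise hill climb to a candidate solution on a rough test landscape (Rastrigin); the landscape-approximation component is the paper's contribution and is not reproduced here:

```python
import math
import random

def rastrigin(x):
    """Standard rough, highly multimodal test landscape."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def hill_climb(f, x, step=0.1, iters=200):
    """Simple coordinate-wise local search; in a memetic-style loop this refinement
    would be applied to individuals produced by the evolutionary operators."""
    best, best_f = x[:], f(x)
    for _ in range(iters):
        cand = best[:]
        i = random.randrange(len(cand))
        cand[i] += random.uniform(-step, step)
        if f(cand) < best_f:
            best, best_f = cand, f(cand)
    return best

start = [random.uniform(-5, 5) for _ in range(5)]
print(rastrigin(start), "->", rastrigin(hill_climb(rastrigin, start)))
```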


Archive | 2002

Heuristics and optimization for knowledge discovery

Hussein A. Abbass; Charles Newton; Ruhul A. Sarker

Contents include: Heuristic and Optimization for Knowledge Discovery; A Heuristic Algorithm for Feature Selection Based on Optimization Techniques; Cost-Sensitive Classification Using Decision Trees, Boosting and MetaCost; Heuristic Search-Based Stacking of Classifiers; Designing Component-Based Heuristic Search Engines for Knowledge Discovery; Clustering Mixed Incomplete Data; Bayesian Learning; The Role of Sampling in Data Mining; The Gamma Test; Neural Networks; How to Train Multilayer Perceptrons Efficiently with Large Data; Cluster Analysis of Marketing Data Examining On-Line Shopping; Heuristics in Medical Data Mining; Understanding Credit Card Users Behaviour; Heuristic Knowledge Discovery for Archaeological Data Using Cultural Algorithms and Rough Sets.


Evolutionary Programming | 1998

An Experimental Investigation of Self-Adaptation in Evolutionary Programming

Ko-Hsin Liang; Xin Yao; Yong Liu; Charles Newton; David Hoffman

Evolutionary programming (EP) has been widely used in numerical optimization in recent years. One of EP's key features is its self-adaptation scheme. In EP, mutation is typically the only operator used to generate new offspring. The mutation is often implemented by adding a random number from a certain distribution (e.g., Gaussian in the case of classical EP) to the parent. An important parameter of the Gaussian distribution is its standard deviation (or, equivalently, the variance). In the widely used self-adaptation scheme of EP, this parameter is evolved along with the objective variables rather than being fixed manually. This paper investigates empirically how well the self-adaptation scheme works on a set of benchmark functions. Some anomalies have been observed in the empirical studies, which demonstrate that the self-adaptation scheme may not work as well as hoped on some functions. An experimental evaluation of an existing simple fix to the problem is also carried out in this paper.
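
The scheme described here is easy to simulate. The toy (1+1) run below, which is not one of the paper's experiments, applies the usual lognormal self-adaptation on the step function; printing the mean step size alongside the objective makes it easy to check whether step size control is being lost in the sense discussed above:

```python
import math
import random

n = 10
tau, tau_p = 1 / math.sqrt(2 * math.sqrt(n)), 1 / math.sqrt(2 * n)
step_fn = lambda x: sum(math.floor(xi) ** 2 for xi in x)   # flat plateaus, no local gradient

x = [random.uniform(20, 30) for _ in range(n)]
sigma = [3.0] * n                                          # self-adapted standard deviations
for gen in range(300):
    g = tau_p * random.gauss(0, 1)
    new_sigma = [s * math.exp(g + tau * random.gauss(0, 1)) for s in sigma]
    new_x = [xi + si * random.gauss(0, 1) for xi, si in zip(x, new_sigma)]
    if step_fn(new_x) <= step_fn(x):                       # (1+1) selection: keep the better point
        x, sigma = new_x, new_sigma
    if gen % 50 == 0:
        print(gen, step_fn(x), sum(sigma) / n)             # objective value and mean step size
```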


International Transactions in Operational Research | 2001

Genetic Algorithms for Solving a Class of Constrained Nonlinear Integer Programs

Ruhul A. Sarker; Thomas Philip Runarsson; Charles Newton

We consider a class of constrained nonlinear integer programs, which arise in manufacturing batch-sizing problems with multiple raw materials. In this paper, we investigate the use of genetic algorithms (GAs) for solving these models. Both binary and real coded genetic algorithms with six different penalty functions are developed. The real coded genetic algorithm works well for all six penalty functions compared to binary coding. A new method to calculate the penalty coefficient is also discussed. Numerical examples are provided and computational experiences are discussed.
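
The binary versus real coding comparison comes down to how a chromosome segment is decoded into an integer decision variable. Both decoders below are generic textbook mappings, with a hypothetical batch-size bound of [1, 500]; they are not the exact encodings used in the paper:

```python
def decode_binary(bits, low, high):
    """Binary coding: interpret a fixed-length bit segment as an integer,
    then fold it into the feasible range [low, high]."""
    value = int("".join(str(b) for b in bits), 2)
    return low + value % (high - low + 1)

def decode_real(gene, low, high):
    """Real coding: evolve a floating-point gene and round/clamp it only
    when the constrained nonlinear integer objective is evaluated."""
    return min(high, max(low, round(gene)))

print(decode_binary([1, 0, 1, 1, 0, 1, 0, 1, 1], 1, 500))   # 9-bit segment -> 364
print(decode_real(137.6, 1, 500))                           # -> 138
```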

Collaboration


Dive into Charles Newton's collaborations.

Top Co-Authors

Ruhul A. Sarker (University of New South Wales)

Ko-Hsin Liang (University of New South Wales)

Xin Yao (University of Science and Technology)

Hussein A. Abbass (University of New South Wales)

David Hoffman (University of New South Wales)

Joarder Kamruzzaman (Federation University Australia)

Yong Liu (University of New South Wales)