Publication


Featured research published by Peter Grogono.


IEEE Transactions on Evolutionary Computation | 2010

Bi-Objective Multipopulation Genetic Algorithm for Multimodal Function Optimization

Jie Yao; Nawwaf N. Kharma; Peter Grogono

This paper describes the latest version of a bi-objective multipopulation genetic algorithm (BMPGA) aiming to locate all global and local optima on a real-valued differentiable multimodal landscape. The performance of BMPGA is compared against four multimodal GAs on five multimodal functions. BMPGA is distinguished by its use of two separate but complementary fitness objectives designed to enhance the diversity of the overall population and exploration of the search space. This is coupled with a multipopulation and clustering scheme, which focuses selection within the various sub-populations and results in effective identification and retention of the optima of the target functions as well as improved exploitation within promising areas. The results of the empirical comparison provide clear evidence that BMPGA is better than the other GAs in terms of overall effectiveness, applicability, and reliability. The practical value of BMPGA has already been demonstrated in applications to the detection of multiple ellipses and elliptic objects in microscopic imagery.
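
As a rough illustration of the general idea of combining a multipopulation scheme with two complementary objectives, the sketch below evolves clustered subpopulations on a one-dimensional multimodal function. The test function, the diversity objective, the weighting, and the crude clustering are assumptions chosen for illustration; they are not BMPGA's actual operators.

```python
# Illustrative sketch only: a multipopulation GA with two complementary
# objectives, loosely in the spirit of BMPGA. The objectives (raw fitness plus
# a distance-based diversity score), the clustering, and the 1-D test function
# are assumptions for illustration, not the paper's formulation.
import math
import random

def f(x):
    # Hypothetical multimodal test function with several local optima.
    return math.sin(5 * x) * (1 - abs(x) / 4)

def diversity(x, population):
    # Second objective: mean distance to the rest of the subpopulation,
    # rewarding individuals that keep the search spread out.
    return sum(abs(x - y) for y in population) / len(population)

def cluster(population, k):
    # Crude clustering: sort and split into k contiguous groups
    # (stands in for the paper's clustering scheme).
    ordered = sorted(population)
    size = max(1, len(ordered) // k)
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

def evolve(generations=50, pop_size=60, n_clusters=5):
    population = [random.uniform(-4, 4) for _ in range(pop_size)]
    for _ in range(generations):
        next_population = []
        for sub in cluster(population, n_clusters):
            # Rank each subpopulation by a weighted sum of the two objectives;
            # selection stays local to the subpopulation.
            ranked = sorted(sub, key=lambda x: f(x) + 0.1 * diversity(x, sub),
                            reverse=True)
            parents = ranked[: max(2, len(ranked) // 2)]
            for _ in range(len(sub)):
                if len(parents) >= 2:
                    a, b = random.sample(parents, 2)
                else:
                    a = b = parents[0]
                child = (a + b) / 2 + random.gauss(0, 0.05)  # blend crossover + mutation
                next_population.append(min(4, max(-4, child)))
        population = next_population
    return cluster(population, n_clusters)  # each cluster should sit near an optimum

if __name__ == "__main__":
    for i, sub in enumerate(evolve()):
        best = max(sub, key=f)
        print(f"cluster {i}: best x = {best:.3f}, f = {f(best):.3f}")
```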


Pattern Analysis and Applications | 2005

A multi-population genetic algorithm for robust and fast ellipse detection

Jie Yao; Nawwaf N. Kharma; Peter Grogono

This paper discusses a novel and effective technique for extracting multiple ellipses from an image, using a genetic algorithm with multiple populations (MPGA). MPGA evolves a number of subpopulations in parallel, each of which is clustered around an actual or perceived ellipse in the target image. The technique uses both evolution and clustering to direct the search for ellipses—full or partial. MPGA is explained in detail, and compared with both the widely used randomized Hough transform (RHT) and the sharing genetic algorithm (SGA). In thorough and fair experimental tests, using both synthetic and real-world images, MPGA exhibits solid advantages over RHT and SGA in terms of accuracy of recognition—even in the presence of noise and/or multiple imperfect ellipses in an image—and speed of computation.
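
To make the GA framing concrete, the sketch below shows one plausible fitness function for candidate ellipses: a candidate is scored by how many points sampled on its boundary coincide with edge pixels. The (cx, cy, a, b, theta) parameterization and the sampling-based score are assumptions for illustration, not MPGA's published operators.

```python
# Illustrative sketch only: a possible fitness function for GA-based ellipse
# detection. A candidate is (cx, cy, a, b, theta); fitness is the fraction of
# points sampled on the candidate's boundary that land on edge pixels.
import math

def ellipse_fitness(candidate, edge_pixels, samples=64):
    """edge_pixels: set of (x, y) integer coordinates from an edge detector."""
    cx, cy, a, b, theta = candidate
    hits = 0
    for i in range(samples):
        t = 2 * math.pi * i / samples
        # Point on the rotated ellipse boundary.
        x = cx + a * math.cos(t) * math.cos(theta) - b * math.sin(t) * math.sin(theta)
        y = cy + a * math.cos(t) * math.sin(theta) + b * math.sin(t) * math.cos(theta)
        if (round(x), round(y)) in edge_pixels:
            hits += 1
    return hits / samples  # 1.0 means every sampled boundary point is an edge pixel

# Example: edge pixels of an axis-aligned circle of radius 10 centred at (50, 50).
edges = {(round(50 + 10 * math.cos(2 * math.pi * t / 100)),
          round(50 + 10 * math.sin(2 * math.pi * t / 100))) for t in range(100)}
print(ellipse_fitness((50, 50, 10, 10, 0.0), edges))
```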


Expert Systems With Applications | 1990

Verifying, validating, and measuring the performance of expert systems

Ching Y. Suen; Peter Grogono; Rajjan Shinghal; François Coallier

The use of expert systems has increased rapidly during the last few years. There is a growing need for systematic and reliable techniques for evaluating both expert system shells and complete expert systems. In this paper, we discuss evaluation strategies from several points of view: classification, validation, verification, and performance analysis. We note that there are several respects in which expert system evaluation is similar to software evaluation in general and, consequently, that it may be possible to apply established software engineering techniques to expert system evaluation. In particular, formal analysis is replacing (or enhancing) traditional testing of conventional software. We believe that increasing formalization is an important trend and we indicate ways in which it could be carried further.


Expert Systems With Applications | 1991

Specifying an expert system

Aïda Batarekh; Alun D. Preece; Anne Bennett; Peter Grogono

The success of numerous expert systems in practical applications warrants a more formal approach to their development and evaluation. Reliability assurance of expert systems requires a methodology for the specification and evaluation of these systems. Expert systems are a new class of software system, but some traditional techniques of software development may be adapted to their construction. However, the specification of an expert system differs from that of a more traditional software program in that parts of the specification are permitted to be only partially described when development starts. Specifications have two important purposes: as contracts between suppliers and clients, and as blueprints for implementation. A specification consists of a problem specification and a solution specification. The problem specification plays the role of contract and states explicitly what the problem to be solved is, and the constraints that the final product must satisfy. The solution specification plays the role of blueprint and has two major aspects: analyzing how a human expert solves the problem, and proposing an equivalent automated solution. We propose an approach to the specification of expert systems that is flexible, yet rigorous enough to cover the important features of a wide range of potential expert system applications. We describe fully each of the components of an expert system specification and we relate specification to the issues of evaluation and maintenance of expert systems.


International Conference on Pattern Recognition | 2004

Fast robust GA-based ellipse detection

Jie Yao; Nawwaf N. Kharma; Peter Grogono

This paper discusses a novel and effective technique for extracting multiple ellipses from an image, using a multi-population genetic algorithm (MPGA). MPGA evolves a number of subpopulations in parallel, each of which is clustered around an actual or perceived ellipse. It utilizes both evolution and clustering to direct the search for ellipses - full or partial. MPGA is explained in detail, and compared with both the widely used randomized Hough transform (RHT) and the sharing genetic algorithm (SGA). In thorough and fair experimental tests, utilizing both synthetic and real-world images, MPGA exhibits solid advantages over RHT and SGA in terms of accuracy of recognition - even in the presence of noise and/or multiple imperfect ellipses - as well as speed of computation.


Automated Software Engineering | 2004

Using a genetic algorithm and formal concept analysis to generate branch coverage test data automatically

Susan Khor; Peter Grogono

Automatic test generators (ATGs) are an important support tool for large-scale software development. Contemporary ATGs include JTest, which does white-box testing down to the method level only and black-box testing if a specification exists, and AETG, which tests pairwise interactions among input variables. The first automatic test generation approaches were static, based on symbolic execution (Clarke, 1976). Korel suggested a dynamic approach to automatic test data generation using function minimization and directed search (Korel, 1990). A dynamic approach can handle array, pointer, function and other dynamic constructs more accurately than a static approach, but it may also be more expensive since the program under test is executed repeatedly. Subsequent ATGs explored the use of genetic algorithms (Jones et al., 1996; Michael et al., 2001; Pargas et al., 1999) and simulated annealing (Tracey et al., 1998). These ATGs address the problem of producing test data for low-level code coverage such as statement, branch and condition/decision, and depend on branch-function (Korel, 1990) style instrumentation (Jones et al., 1996; Michael et al., 2001) and/or the program graph (Jones et al., 1996; Pargas et al., 1999). Unlike previous work, our ATG, called genet, produces test data for branch coverage with simpler instrumentation than branch functions, does not use program graphs, and is programming-language independent. genet uses a genetic algorithm (GA) (Holland, 1975) to search for tests and formal concept analysis (FCA) (Ganter and Wille, 1999) to organize the relationships between tests and their execution traces. The combination of GA with FCA is novel. Further, genet extends the opportunistic approach of GADGET (Michael et al., 2001) by targeting several uncovered branches simultaneously. The relationships that genet learns provide useful insights for test selection, test maintenance and debugging.
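
The sketch below illustrates, under stated assumptions, the general shape of coverage-driven test generation: a GA searches over inputs to a toy instrumented program, fitness rewards tests that execute previously uncovered branches, and the test-to-branches incidence relation (the raw material of an FCA context) is recorded as the search proceeds. The toy program, the instrumentation, and the fitness definition are hypothetical and are not genet's actual design.

```python
# Illustrative sketch only: coverage-driven test generation with a GA, plus an
# FCA-style "context" relating tests to the branches they execute. The program
# under test, the branch instrumentation, and the fitness are assumptions.
import random

def classify(x, y, trace):
    # Toy program under test; trace records which branch was taken.
    if x > 0:
        trace.add("B1")
    else:
        trace.add("B2")
    if y % 2 == 0:
        trace.add("B3")
    else:
        trace.add("B4")

ALL_BRANCHES = {"B1", "B2", "B3", "B4"}

def run_test(test):
    trace = set()
    classify(*test, trace)
    return trace

def search(generations=30, pop_size=20):
    population = [(random.randint(-10, 10), random.randint(-10, 10))
                  for _ in range(pop_size)]
    covered, context = set(), {}   # context: test -> branches (the incidence relation)
    for _ in range(generations):
        scored = []
        for test in population:
            trace = run_test(test)
            context[test] = trace
            new = trace - covered  # reward tests that reach uncovered branches
            covered |= trace
            scored.append((len(new), test))
        if covered == ALL_BRANCHES:
            break
        scored.sort(reverse=True)
        parents = [t for _, t in scored[: pop_size // 2]]
        # Mutate the better half to produce the next generation.
        population = [(random.choice(parents)[0] + random.randint(-2, 2),
                       random.choice(parents)[1] + random.randint(-2, 2))
                      for _ in range(pop_size)]
    return covered, context

covered, context = search()
print("covered branches:", sorted(covered))
```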


Genetic and Evolutionary Computation Conference | 2007

Environment as a spatial constraint on the growth of structural form

Taras Kowaliw; Peter Grogono; Nawwaf N. Kharma

We explore the use of the developmental environment as a spatial constraint on a model of Artificial Embryogeny, applied to the growth of structural forms. A Deva model is used to translate genotype to phenotype, allowing a Genetic Algorithm to evolve Plane Trusses. Genomes are expressed in one of several developmental environments, and selected using a fitness function favouring stability, height, and distribution of pressure. Positive results are found in nearly all cases, demonstrating that environment can be used as an effective spatial constraint on development. Further experiments take genomes evolved in one environment and transplant them into different environments, or re-grow them at different phenotypic sizes. It is shown that while some genomes are highly specialized for the particular environment in which they evolved, others may be re-used in a different context without significant re-design, retaining the majority of their original utility. This strengthens the notion that growth via Artificial Embryogeny can be resistant to perturbations in environment, and that good designs may be re-used in a variety of contexts.
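
As a minimal illustration of an environment acting as a spatial constraint on growth, the sketch below grows cells upward from a seed row inside an environment mask and scores the result for height and support. The growth rule, the mask, and the fitness weights are assumptions for illustration; they are not the Deva model.

```python
# Illustrative sketch only: an environment mask constraining where growth may
# occur, with a fitness rewarding height and supported cells.
def grow(width, height, allowed):
    """allowed(x, y) -> bool is the environment; returns the set of grown cells."""
    cells = {(x, 0) for x in range(width) if allowed(x, 0)}  # seed row on the ground
    for y in range(1, height):
        for x in range(width):
            # A cell grows only if the environment permits it and it is supported below.
            if allowed(x, y) and (x, y - 1) in cells:
                cells.add((x, y))
    return cells

def fitness(cells):
    if not cells:
        return 0.0
    peak = max(y for _, y in cells)
    return peak + 0.1 * len(cells)  # favour tall, well-supported structures

# Two hypothetical environments: unconstrained, and one with a narrow gap at level 5.
def open_env(x, y):
    return True

def ceiling(x, y):
    return not (y == 5 and x != 3)  # only column 3 passes level 5

for name, env in [("open", open_env), ("ceiling", ceiling)]:
    grown = grow(width=8, height=10, allowed=env)
    print(name, "fitness:", fitness(grown))
```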


Working Conference on Reverse Engineering | 1995

Retrieving information from data flow diagrams

Gregory Butler; Peter Grogono; Rajjan Shinghal; Indra A. Tjandra

For reverse engineering, we need tools that can extract information from documents written before routine digital storage was feasible. Documents contain both text and diagrams; data flow diagrams play a prominent role in software documents. Using current techniques, it is possible to recover the information in a data flow diagram by scanning the printed document and processing the data obtained. Feature extraction and syntax analysis enable the construction of a validated, formalized model of the data flow diagram. Understanding, however, requires a semantic interpretation. We describe a semantic model for data flow diagrams using general techniques that can be applied to other kinds of diagram. We argue that a semantic model is an essential component of a document understanding system; without a semantic model, it is difficult or impossible to extract useful information from a document.
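
The sketch below gives one possible in-memory representation of a data flow diagram together with a single semantic check, to show the step beyond syntactic recovery that the paper argues for. The node kinds, the rules checked, and the example diagram are assumptions for illustration rather than the paper's semantic model.

```python
# Illustrative sketch only: a DFD as typed nodes and labelled flows, plus one
# semantic check that goes beyond syntax (a validated drawing can still be
# semantically ill-formed).
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    name: str
    kind: str  # "process", "store", or "external"

@dataclass(frozen=True)
class Flow:
    source: Node
    target: Node
    label: str

def check_semantics(nodes, flows):
    """Return a list of semantic problems (an empty list means the DFD passes)."""
    problems = []
    for n in nodes:
        if n.kind == "process":
            has_in = any(f.target == n for f in flows)
            has_out = any(f.source == n for f in flows)
            if not (has_in and has_out):
                problems.append(f"process '{n.name}' must have both inputs and outputs")
    for f in flows:
        # Data should not flow directly between two stores or two externals.
        if f.source.kind != "process" and f.target.kind != "process":
            problems.append(f"flow '{f.label}' bypasses every process")
    return problems

customer = Node("Customer", "external")
orders = Node("Orders", "store")
take_order = Node("Take order", "process")
flows = [Flow(customer, take_order, "order"), Flow(take_order, orders, "order record")]
print(check_semantics({customer, orders, take_order}, flows))  # [] means no problems found
```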


Genetic and Evolutionary Computation Conference | 2004

Bluenome: A Novel Developmental Model of Artificial Morphogenesis

Taras Kowaliw; Peter Grogono; Nawwaf N. Kharma

The Bluenome Model of Development is introduced: a developmental model of Artificial Morphogenesis, inspired by biological development and instantiating a subset of two-dimensional Cellular Automata. The Bluenome model is cast as a general model, one which generates organizational topologies for finite sets of component types, assuming only local interactions between components. Its key feature is that there exists no relation between genotypic complexity and phenotypic complexity, implying its potential application in high-dimensional evolutionary problems. Additionally, genomes from the Bluenome Model are shown to be capable of re-development in differing environments, retaining many relevant phenotypic properties.
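
As a toy illustration of a genome whose size is independent of the phenotype it grows, the sketch below uses a five-entry rule table as the genome and develops it, by purely local updates, on grids of two different sizes. The rule encoding and growth loop are assumptions for illustration and are not the Bluenome model itself.

```python
# Illustrative sketch only: a tiny genotype-to-phenotype mapping in the spirit
# of a developmental cellular automaton. The genome is a fixed-size rule table
# keyed on the number of occupied orthogonal neighbours, so the same genome can
# be re-grown on grids of different sizes.
def develop(genome, size, steps=10):
    """genome: dict mapping neighbour count (0..4) -> next cell state (0 or 1)."""
    grid = [[0] * size for _ in range(size)]
    grid[size // 2][size // 2] = 1  # single seed cell
    for _ in range(steps):
        new = [row[:] for row in grid]
        for y in range(size):
            for x in range(size):
                neighbours = sum(grid[(y + dy) % size][(x + dx) % size]
                                 for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)])
                # Local update: next state depends on the neighbour count;
                # isolated cells keep their state so the seed survives.
                new[y][x] = genome[neighbours] if neighbours else grid[y][x]
        grid = new
    return grid

genome = {0: 0, 1: 1, 2: 1, 3: 0, 4: 0}  # five-entry genome, usable at any grid size
for size in (9, 17):                     # same genome, two phenotype sizes
    cells = sum(sum(row) for row in develop(genome, size))
    print(f"grid {size}x{size}: {cells} occupied cells")
```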


SDL '97: Time for Testing - SDL, MSC and Trends | 1997

Deriving an SDL Specification with a Given Architecture from a Set of MSCs

Gabriel Robert; Ferhat Khendek; Peter Grogono

This chapter introduces a new synthesis approach that allows systematic derivation of Specification and Description Language (SDL) processes from a set of message sequence charts (MSCs). The approach takes into account the architecture of the target SDL specification and ensures, by construction, consistency between the SDL specification and the MSC specification. The SDL specification generated is free of deadlocks. The chapter explains that the development of distributed systems goes through many phases. The initial phase, requirement analysis and specification, determines the functional and nonfunctional requirements in a specification with a high level of abstraction. In the design phase, the system development starts with an abstract specification of the design, which is then refined step by step toward the implementation, which is finally tested before deployment. Formal description techniques play an increasingly important role in the development life cycle of distributed systems, especially telecommunication systems.
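
The sketch below conveys, under stated assumptions, the flavour of turning one process's event line in an MSC into a small state machine of the kind an SDL process body describes: each consumed or sent message becomes a transition between freshly numbered states. This naive per-process translation (with no architecture and no deadlock analysis) is not the chapter's synthesis method.

```python
# Illustrative sketch only: a naive translation of one MSC instance's ordered
# events into state-machine transitions. The event encoding and the example
# messages are hypothetical.
def msc_to_state_machine(events):
    """events: list of ("input"|"output", message) pairs for one MSC instance."""
    transitions = []
    state = 0
    for kind, message in events:
        # Each event advances the process to a fresh state.
        transitions.append((f"s{state}", kind, message, f"s{state + 1}"))
        state += 1
    return transitions

# One instance's events from a hypothetical connection-setup MSC.
events = [("input", "connect_req"), ("output", "connect_ack"), ("input", "data")]
for src, kind, msg, dst in msc_to_state_machine(events):
    print(f"{src} --{kind} {msg}--> {dst}")
```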
