Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Assem Kaylani is active.

Publication


Featured research published by Assem Kaylani.


Winter Simulation Conference | 2007

Supply chain simulation modeling made easy: an innovative approach

Dayana Cope; Mohamed Sam Fayez; Mansooreh Mollaghasemi; Assem Kaylani

Simulation modeling and analysis requires skill and scientific background to implement; both are vital for this powerful methodology to deliver value to the company adopting it. There are several ways to adopt and rely on simulation modeling for strategic and operational decision making, including hiring simulation engineers, building an internal simulation team, or contracting consultants. These practices differ in budget, time to implement, and returns. In this paper, an innovative approach is described that provides a simulation solution that is affordable and at the same time can be quickly implemented. It consists of a generic interface that captures the information and structure of the supply chain and then automatically generates simulation models. The user, who is not necessarily a simulation expert, can quickly move on to the analysis and evaluation of scenarios. The paper presents a case study where the approach was implemented to model, simulate, and analyze NASA's Space Exploration Supply Chain.


IEEE Transactions on Neural Networks | 2010

An Adaptive Multiobjective Approach to Evolving ART Architectures

Assem Kaylani; Michael Georgiopoulos; Mansooreh Mollaghasemi; Georgios C. Anagnostopoulos; Christopher Sentelle; Mingyu Zhong

In this paper, we present the evolution of adaptive resonance theory (ART) neural network architectures (classifiers) using a multiobjective optimization approach. In particular, we propose the use of a multiobjective evolutionary approach to simultaneously evolve the weights and the topology of three well-known ART architectures; fuzzy ARTMAP (FAM), ellipsoidal ARTMAP (EAM), and Gaussian ARTMAP (GAM). We refer to the resulting architectures as MO-GFAM, MO-GEAM, and MO-GGAM, and collectively as MO-GART. The major advantage of MO-GART is that it produces a number of solutions for the classification problem at hand that have different levels of merit [accuracy on unseen data (generalization) and size (number of categories created)]. MO-GART is shown to be more elegant (does not require user intervention to define the network parameters), more effective (of better accuracy and smaller size), and more efficient (faster to produce the solution networks) than other ART neural network architectures that have appeared in the literature. Furthermore, MO-GART is shown to be competitive with other popular classifiers, such as classification and regression tree (CART) and support vector machines (SVMs).
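As an illustrative aside (not code from the paper), the Pareto-based selection underlying this kind of multiobjective evolution can be sketched as follows. Each candidate classifier is reduced to a pair of objectives to minimize — here hypothetical `(error, size)` tuples standing in for generalization error and category count:

```python
# Minimal Pareto-front extraction over (error, size) pairs, the selection
# idea behind multiobjective evolution of classifiers. Both objectives
# are minimized; the data points below are made up for illustration.

def dominates(a, b):
    """True if candidate a is at least as good as b in both objectives
    and strictly better in at least one."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(population):
    """Return the non-dominated subset of the population."""
    return [p for p in population
            if not any(dominates(q, p) for q in population)]

candidates = [(0.10, 12), (0.08, 20), (0.15, 5), (0.08, 12), (0.20, 4)]
front = pareto_front(candidates)
# front keeps the accuracy/size trade-off curve: the "number of solutions
# with different levels of merit" the abstract refers to.
```

The evolved population's non-dominated members are exactly those trade-off solutions; a user then picks the accuracy/size balance appropriate for the application.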


Neural Networks | 2007

GFAM: Evolving Fuzzy ARTMAP neural networks

Ahmad Al-Daraiseh; Assem Kaylani; Michael Georgiopoulos; Mansooreh Mollaghasemi; Annie S. Wu; Georgios C. Anagnostopoulos

This paper focuses on the evolution of Fuzzy ARTMAP neural network classifiers, using genetic algorithms, with the objective of improving generalization performance (classification accuracy of the ART network on unseen test data) and alleviating the ART category proliferation problem (the problem of creating more than necessary ART network categories to solve a classification problem). We refer to the resulting architecture as GFAM. We demonstrate through extensive experimentation that GFAM exhibits good generalization and is of small size (creates few ART categories), while consuming reasonable computational effort. In a number of classification problems, GFAM produces the optimal classifier. Furthermore, we compare the performance of GFAM with other competitive ARTMAP classifiers that have appeared in the literature and addressed the category proliferation problem in ART. We illustrate that GFAM produces improved results over these architectures, as well as other competitive classifiers.


Neurocomputing | 2009

AG-ART: An adaptive approach to evolving ART architectures

Assem Kaylani; Michael Georgiopoulos; Mansooreh Mollaghasemi; Georgios C. Anagnostopoulos

This paper focuses on classification problems, and in particular on the evolution of ARTMAP architectures using genetic algorithms, with the objective of improving generalization performance and alleviating the adaptive resonance theory (ART) category proliferation problem. In a previous effort, we introduced evolutionary fuzzy ARTMAP (FAM), referred to as genetic Fuzzy ARTMAP (GFAM). In this paper we apply an improved genetic algorithm to FAM and extend these ideas to two other ART architectures: ellipsoidal ARTMAP (EAM) and Gaussian ARTMAP (GAM). One of the major advantages of the proposed improved genetic algorithm is that it adapts the GA parameters automatically, in a way that takes into consideration the intricacies of the classification problem under consideration. The resulting genetically engineered ART architectures are justifiably referred to as AG-FAM, AG-EAM and AG-GAM, or collectively as AG-ART (adaptive genetically engineered ART). We compare the performance (in terms of accuracy, size, and computational cost) of the AG-ART architectures with GFAM and other ART architectures that have appeared in the literature and attempted to solve the category proliferation problem. Our results demonstrate that AG-ART architectures exhibit better performance than their other ART counterparts (semi-supervised ART) and better performance than GFAM. We also compare AG-ART's performance to other related results published in the classification literature, and demonstrate that AG-ART architectures exhibit competitive generalization performance and, quite often, produce smaller classifiers in solving the same classification problems. We also show that AG-ART's performance gains are achieved within a reasonable computational budget.
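For a generic picture of what automatic parameter adaptation can look like, the sketch below uses the classic evolution-strategy device of letting each individual carry its own mutation step size, which is itself mutated before use. This is a standard self-adaptation idea, not the paper's algorithm; the toy objective and all constants are placeholders:

```python
import math
import random

# Self-adaptive mutation in a (1, lambda) evolution-strategy style:
# each child gets its own step size sigma, mutated log-normally from the
# parent's, so the search parameter evolves along with the solution.

TAU = 0.3  # learning rate for step-size adaptation (illustrative value)

def f(x):
    return x * x  # toy objective to minimize; optimum at 0

def evolve(generations=200, offspring=10):
    x, sigma = random.uniform(-5, 5), 1.0
    for _ in range(generations):
        children = []
        for _ in range(offspring):
            # adapt the strategy parameter first, then use it to mutate x
            s = sigma * math.exp(TAU * random.gauss(0, 1))
            children.append((x + s * random.gauss(0, 1), s))
        # comma selection: next parent is the best child
        x, sigma = min(children, key=lambda c: f(c[0]))
    return x

random.seed(1)
best = evolve()  # converges toward the optimum at 0
```

The point is that step sizes useful for the problem at hand survive selection along with good solutions, so no fixed mutation rate has to be supplied by the user.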


Journal of the Operational Research Society | 2008

A generic environment for modelling future launch operations—GEM-FLO: a success story in generic modelling

Assem Kaylani; Mansooreh Mollaghasemi; Dayana Cope; Sam Fayez; Ghaith Rabadi; Martin J. Steele

Several NASA programs have been established to study and improve the current launch capability to meet the need for more aggressive space exploration in the future. Numerous launch systems have been proposed by different government and commercial organizations with the potential goal of replacing the Space Shuttle. NASA must evaluate new designs and technologies with the objective of improving upon today's Shuttle cost, performance, and turnaround time, before the government or commercial organizations pursue the large undertaking of a new launch system. To address this issue, the Generic Simulation Environment for Modelling Future Launch Operations (GEM-FLO) was developed to accurately predict processing turnaround times and other effectiveness criteria and support making key business and program decisions. GEM-FLO utilizes a generic modelling paradigm to provide a single platform for modelling different designs, which helped significantly cut the cost of these studies. This paper documents a success story in generic simulation modelling.


International Symposium on Neural Networks | 2007

Genetic Optimization of ART Neural Network Architectures

Assem Kaylani; Ahmad Al-Daraiseh; Michael Georgiopoulos; Mansooreh Mollaghasemi; Georgios C. Anagnostopoulos; Annie S. Wu

This paper focuses on the evolution of ARTMAP architectures, using genetic algorithms, with the objective of improving generalization performance and alleviating the ART category proliferation problem. We refer to the resulting architectures as GFAM, GEAM, and GGAM. We demonstrate through extensive experimentation that evolved ARTMAP architectures exhibit good generalization and are of small size, while consuming reasonable computational effort to produce an optimal or a sub-optimal network. Furthermore, we compare the performance of GFAM, GEAM and GGAM with other competitive ARTMAP architectures that have appeared in the literature and addressed the category proliferation problem in ART. This comparison indicates that GFAM, GEAM and GGAM have superior performance (generalize better, are of smaller size, and require less computations) compared with other competitive ARTMAP architectures.


Congress on Evolutionary Computation | 2010

Multi-objective memetic evolution of ART-based classifiers

Rong Li; Timothy R. Mersch; Oriana X. Wen; Assem Kaylani; Georgios C. Anagnostopoulos

In this paper we present a novel framework for evolving ART-based classification models, which we refer to as MOME-ART. The new training framework aims to evolve populations of ART classifiers to optimize both their classification error and their structural complexity. Towards this end, it combines the use of interacting sub-populations, some traditional elements of genetic algorithms to evolve these populations and a simulated annealing process used for solution refinement to eventually give rise to a multi-objective, memetic evolutionary framework. In order to demonstrate its capabilities, we utilize the new framework to train populations of semi-supervised Fuzzy ARTMAP and compare them with similar networks trained via the recently published MO-GART framework, which has been shown as being very effective in yielding high-quality ART-based classifiers. The experimental results show clear advantages of MOME-ART in terms of Pareto Front quality and density, as well as parsimony properties of the resulting classifiers.
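The memetic pattern the abstract describes — genetic variation followed by simulated-annealing refinement of each offspring — can be sketched generically. This is purely illustrative of the GA + SA combination on a toy one-dimensional problem, not MOME-ART itself; the objective and all parameters are placeholders:

```python
import math
import random

# Toy memetic loop: mutate a population of real-valued genomes, then
# refine each offspring with a short simulated-annealing pass, and keep
# the best individuals (elitist selection over parents + offspring).

def f(x):
    return (x - 3.0) ** 2  # toy fitness to minimize; optimum at x = 3

def anneal(x, steps=50, temp=1.0, cooling=0.9):
    """Local refinement: accept worse moves with temperature-dependent
    probability, so refinement can escape shallow local structure."""
    for _ in range(steps):
        cand = x + random.gauss(0, 0.1)
        delta = f(cand) - f(x)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = cand
        temp *= cooling
    return x

def memetic_search(pop_size=10, generations=20):
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # genetic variation: mutate each current individual
        offspring = [x + random.gauss(0, 0.5) for x in pop]
        # memetic step: refine every offspring locally
        offspring = [anneal(x) for x in offspring]
        # elitist selection on the combined population
        pop = sorted(pop + offspring, key=f)[:pop_size]
    return min(pop, key=f)

random.seed(0)
best = memetic_search()  # approaches the optimum at x = 3
```

In MOME-ART the same division of labor applies at larger scale: global exploration comes from the evolutionary operators over sub-populations, while the annealing pass polishes individual ART classifiers.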


World Congress on Computational Intelligence | 2008

MO-GART: Multiobjective genetic ART architectures

Assem Kaylani; Michael Georgiopoulos; Mansooreh Mollaghasemi; Georgios C. Anagnostopoulos

In this work we present, for the first time, the evolution of ART Neural Network architectures (classifiers) using a multiobjective optimization approach. In particular, we propose the use of a multiobjective evolutionary approach to evolve simultaneously the weights, as well as the topology, of three well-known ART architectures: Fuzzy ARTMAP (FAM), Ellipsoidal ARTMAP (EAM) and Gaussian ARTMAP (GAM). We refer to the resulting architectures as MO-GFAM, MO-GEAM, and MO-GGAM, and collectively as MO-GART. The major advantage of MO-GART is that it produces a number of solutions for the classification problem at hand that have different levels of merit (accuracy on unseen data (generalization) and size (number of categories created)). MO-GART is shown to be more elegant (does not require user intervention to define the network parameters), more effective (of better accuracy and smaller size), and more efficient (faster to produce the solution networks) than other ART neural network architectures that have appeared in the literature.


Winter Simulation Conference | 2004

An integrated estimation and modeling environment for the design of the orbital space plane

Dayana Cope; Mansooreh Mollaghasemi; Assem Kaylani; Alex J. Ruiz-Torres; Martin J. Steele; Marcella L. Cowen

The development of simulation models can be time consuming and highly dependent on system data being widely available. When using simulation modeling to analyze future systems, system data may not be available for the system under study, and simulation results are often needed within a short time frame to support early system design efforts. This paper presents a parametric estimation/generic simulation integrated environment developed to facilitate the rapid development of valid simulation models for the Orbital Space Vehicle ground processing operations.


International Symposium on Neural Networks | 2011

Multi-objective evolutionary optimization of exemplar-based classifiers: A PNN test case

Talitha Rubio; Tiantian Zhang; Michael Georgiopoulos; Assem Kaylani

In this paper the major principles to effectively design a parameter-less, multi-objective evolutionary algorithm that optimizes a population of probabilistic neural network (PNN) classifier models are articulated; PNN is an example of an exemplar-based classifier. These design principles are extracted from experiences, discussed in this paper, that guided the creation of the parameter-less multi-objective evolutionary algorithm named MO-EPNN (multi-objective evolutionary probabilistic neural network). Furthermore, these design principles are corroborated by similar principles used for an earlier design of a parameter-less, multi-objective genetic algorithm used to optimize a population of ART (adaptive resonance theory) models, named MO-GART (multi-objective genetically optimized ART); the ART classifier model is another example of an exemplar-based classifier model. MO-EPNN's performance is compared to other popular classifier models, such as SVM (Support Vector Machines) and CART (Classification and Regression Trees), as well as to an alternate competitive method to genetically optimize the PNN. These comparisons indicate that MO-EPNN's performance (generalization on unseen data and size) compares favorably to the aforementioned classifier models and to the alternate genetically optimized PNN approach. The good performance of MO-EPNN, and the earlier reported good performance of MO-GART, both of which rely on the same design principles, gives credence to the principles delineated in this paper.

Collaboration


Dive into Assem Kaylani's collaborations.

Top Co-Authors

- Michael Georgiopoulos, University of Central Florida
- Ahmad Al-Daraiseh, University of Central Florida
- Annie S. Wu, University of Central Florida
- Christopher Sentelle, University of Central Florida
- Mingyu Zhong, University of Central Florida
- Mohamed Sam Fayez, University of Central Florida