Aboul Ella Hassanien
Cairo University
Publications
Featured research published by Aboul Ella Hassanien.
Sensors | 2011
Ashraf Darwish; Aboul Ella Hassanien
Wireless sensor network (WSN) technologies are considered one of the key research areas in computer science and the healthcare industry for improving quality of life. The purpose of this paper is to provide a snapshot of current developments and future directions of research on wearable and implantable body area network systems for continuous monitoring of patients. The paper explains the important role of body sensor networks in medicine: minimizing the need for caregivers and helping chronically ill and elderly people live independent lives while still receiving quality care. It provides several examples of state-of-the-art technology together with design considerations such as unobtrusiveness, scalability, energy efficiency, and security, and offers a comprehensive analysis of the benefits and drawbacks of these systems. Although they offer significant benefits, wearable and implantable body sensor networks still face major challenges and open research problems, which are investigated in this paper along with some proposed solutions.
Future Generation Computer Systems | 2010
Hongbo Liu; Ajith Abraham; Aboul Ella Hassanien
Grid computing is a computational framework used to meet growing computational demands. This paper introduces a novel approach based on Particle Swarm Optimization (PSO) for scheduling jobs on computational grids. The representations of particle position and velocity in conventional PSO are extended from real vectors to fuzzy matrices. The proposed approach dynamically generates an optimal schedule that completes the tasks in a minimum period of time while utilizing the resources efficiently. We compare the performance of the proposed PSO algorithm with Genetic Algorithm (GA) and Simulated Annealing (SA) approaches. Empirical results illustrate that an important advantage of the PSO algorithm is its speed of convergence and its ability to obtain feasible schedules faster.
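The fuzzy-matrix encoding described in the abstract can be illustrated with a short sketch: each particle's position is a job-by-node membership matrix that is renormalized after every velocity update and decoded by row-wise argmax into a schedule. The Python sketch below is only an illustration under assumed job lengths, node speeds, and PSO parameters; it is not the authors' implementation.

```python
# Minimal sketch of PSO-based grid job scheduling with a fuzzy position matrix.
# Job lengths, node speeds, and parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
job_len = np.array([10., 25., 40., 15., 30.])   # hypothetical job workloads
node_speed = np.array([1.0, 2.0, 1.5])          # hypothetical node speeds

def makespan(position):
    # Decode the fuzzy membership matrix: assign each job to the node with
    # the highest membership, then compute the longest node completion time.
    assign = position.argmax(axis=1)
    loads = np.zeros(len(node_speed))
    for j, n in enumerate(assign):
        loads[n] += job_len[j] / node_speed[n]
    return loads.max()

def normalize(position):
    # Keep each job's row a fuzzy membership vector (non-negative, summing to 1).
    position = np.clip(position, 1e-9, None)
    return position / position.sum(axis=-1, keepdims=True)

n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
X = normalize(rng.random((n_particles, len(job_len), len(node_speed))))
V = np.zeros_like(X)
P = X.copy()                                   # personal bests
p_cost = np.array([makespan(x) for x in X])
g = P[p_cost.argmin()].copy()                  # global best

for _ in range(n_iter):
    r1, r2 = rng.random(2)
    V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
    X = normalize(X + V)
    costs = np.array([makespan(x) for x in X])
    better = costs < p_cost
    P[better], p_cost[better] = X[better], costs[better]
    g = P[p_cost.argmin()].copy()

print("best makespan:", makespan(g), "assignment:", g.argmax(axis=1))
```

Fitness here is plain makespan; the paper also weighs resource utilization, which would simply replace the objective in `makespan`.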
international conference of the ieee engineering in medicine and biology society | 2009
Aboul Ella Hassanien; Ajith Abraham; James F. Peters; Gerald Schaefer; Christopher J. Henry
This paper presents a review of the current literature on rough-set- and near-set-based approaches to solving various problems in medical imaging such as medical image segmentation, object extraction, and image classification. Rough set frameworks hybridized with other computational intelligence technologies that include neural networks, particle swarm optimization, support vector machines, and fuzzy sets are also presented. In addition, a brief introduction to near sets and near images with an application to MRI images is given. Near sets offer a generalization of traditional rough set theory and a promising approach to solving the medical image correspondence problem as well as an approach to classifying perceptual objects by means of features in solving medical imaging problems. Other generalizations of rough sets such as neighborhood systems, shadowed sets, and tolerance spaces are also briefly considered in solving a variety of medical imaging problems. Challenges to be addressed and future directions of research are identified and an extensive bibliography is also included.
Image and Vision Computing | 2007
Aboul Ella Hassanien
This paper introduces a hybrid scheme that combines the advantages of fuzzy sets and rough sets in conjunction with statistical feature extraction techniques. Breast cancer imaging was chosen as the application, and the hybrid scheme was applied to assess its ability to classify breast cancer images into two outcomes: cancer or non-cancer. The introduced scheme starts with fuzzy image processing as a pre-processing step to enhance the contrast of the whole image, extract the region of interest, and then enhance the edges surrounding it. Features are subsequently extracted from the segmented regions of interest using the gray-level co-occurrence matrix. A rough set approach is then introduced to generate all reducts containing a minimal number of attributes, along with the corresponding rules. Finally, these rules are passed to a classifier to discriminate among the regions of interest and decide whether they are cancer or non-cancer. To measure similarity, a new rough set distance function is presented. The experimental results show that the hybrid scheme performs well, reaching over 98% overall accuracy with a minimal number of generated rules.
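The texture-feature step mentioned in the abstract is the standard gray-level co-occurrence matrix (GLCM). Below is a minimal Python sketch of extracting a few GLCM descriptors from a segmented region of interest using scikit-image; the random ROI and the chosen offsets and properties are illustrative assumptions, not the paper's exact feature set.

```python
# Minimal sketch of GLCM texture features for a segmented region of interest.
# Uses scikit-image (the functions are named greycomatrix/greycoprops in
# releases before 0.19). The ROI below is random placeholder data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

roi = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)

# Co-occurrence matrices for one-pixel offsets at four orientations.
glcm = graycomatrix(roi, distances=[1],
                    angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=256, symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)   # one averaged value per texture descriptor
```

In the paper's pipeline, vectors like these would then be discretized and fed to the rough set reduct and rule-generation stage.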
Neurocomputing | 2016
Eid Emary; Hossam M. Zawbaa; Aboul Ella Hassanien
In this work, a novel binary version of grey wolf optimization (GWO) is proposed and used to select optimal feature subsets for classification. The grey wolf optimizer is one of the latest bio-inspired optimization techniques, simulating the hunting process of grey wolves in nature. The binary version introduced here follows two different approaches. In the first, individual steps toward the first three best solutions are binarized and a stochastic crossover is performed among the three basic moves to obtain the updated binary grey wolf position. In the second, a sigmoidal function squashes the continuous updated position, which is then stochastically thresholded to obtain the updated binary position. The two approaches to binary grey wolf optimization (bGWO) are employed in the feature selection domain to find feature subsets that maximize classification accuracy while minimizing the number of selected features. The proposed binary versions were compared with two optimizers commonly used in this domain, namely the particle swarm optimizer and genetic algorithms. A set of assessment indicators is used to evaluate and compare the different methods over 18 datasets from the UCI repository. The results demonstrate the capability of the proposed binary grey wolf optimization (bGWO) to search the feature space for optimal feature combinations regardless of the initialization and the stochastic operators used.
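The second variant described above (sigmoid squashing followed by stochastic thresholding) can be sketched compactly. The Python sketch below uses a toy fitness function that trades a made-up classification error against subset size; the fitness weighting, data, and parameter values are assumptions for illustration, not the authors' evaluation protocol.

```python
# Minimal sketch of sigmoid-based binarization in a GWO feature-selection loop.
import numpy as np

rng = np.random.default_rng(1)

def sigmoid_binarize(continuous_pos):
    # Squash the continuous position, then threshold against a uniform draw.
    prob = 1.0 / (1.0 + np.exp(-continuous_pos))
    return (rng.random(prob.shape) < prob).astype(int)

def fitness(mask, error_of_subset, alpha=0.99):
    # Weighted trade-off: mostly classification error, slightly subset size.
    if mask.sum() == 0:
        return 1.0
    return alpha * error_of_subset(mask) + (1 - alpha) * mask.sum() / mask.size

# Toy error function: pretend only the first three features are informative.
toy_error = lambda m: 1.0 - (m[:3].sum() / 3.0) * 0.9

n_wolves, n_feat, n_iter = 8, 10, 50
pos = rng.normal(size=(n_wolves, n_feat))          # continuous wolf positions
for t in range(n_iter):
    masks = np.array([sigmoid_binarize(p) for p in pos])
    scores = np.array([fitness(m, toy_error) for m in masks])
    order = scores.argsort()
    alpha_w, beta_w, delta_w = pos[order[:3]]      # three best wolves
    a = 2 - 2 * t / n_iter                         # shrinking exploration term
    for i in range(n_wolves):
        step = np.zeros(n_feat)
        for leader in (alpha_w, beta_w, delta_w):
            A = a * (2 * rng.random(n_feat) - 1)
            C = 2 * rng.random(n_feat)
            step += leader - A * np.abs(C * leader - pos[i])
        pos[i] = step / 3.0                        # average move toward leaders

final_masks = np.array([sigmoid_binarize(p) for p in pos])
final_scores = np.array([fitness(m, toy_error) for m in final_masks])
print("selected feature mask:", final_masks[final_scores.argmin()])
```

In a real wrapper setting, `toy_error` would be replaced by the cross-validated error of a classifier trained on the masked feature columns.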
Archive | 2009
Ajith Abraham; Aboul Ella Hassanien; Patrick Siarry; Andries P. Engelbrecht
Learning methods and approximation algorithms are fundamental tools for dealing with computationally hard problems and with problems in which the input is gradually disclosed over time. Both kinds of problems have a large number of applications across a variety of fields, such as algorithmic game theory, approximation classes, coloring and partitioning, competitive analysis, computational finance, cuts and connectivity, geometric problems, inapproximability results, mechanism design, network design, packing and covering, paradigms for the design and analysis of approximation and online algorithms, randomization techniques, real-world applications, and scheduling problems. Recent years have witnessed a large number of interesting applications using various techniques of Computational Intelligence, such as rough sets, connectionist learning, fuzzy logic, evolutionary computing, artificial immune systems, swarm intelligence, reinforcement learning, and intelligent multimedia processing. In spite of numerous successful applications of Computational Intelligence in business and industry, it is sometimes difficult to explain the performance of these techniques and algorithms from a theoretical perspective. We therefore encouraged authors to present original ideas on incorporating different Computational Intelligence mechanisms into learning and approximation algorithms and their underlying processes.
soft computing | 2015
Ahmad Taher Azar; Aboul Ella Hassanien
Massive and complex data are generated every day in many fields. Complex data refer to data sets so large that conventional database management and data analysis tools are insufficient to deal with them. Managing and analyzing medical big data involve many different issues regarding structure, storage, and analysis. In this paper, a linguistic hedges neuro-fuzzy classifier with selected features (LHNFCSF) is presented for dimensionality reduction, feature selection, and classification. Four real-world data sets are used to demonstrate the performance of the proposed neuro-fuzzy classifier, and it is compared with other classifiers on different classification problems. The results indicate that applying LHNFCSF not only reduces the dimensionality of the problem but also improves classification performance by discarding redundant, noise-corrupted, or unimportant features. They further suggest that the proposed method can speed up the computation time of a learning algorithm and simplify the classification task.
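The core mechanism behind linguistic-hedge classifiers of this kind is that each fuzzy set carries a hedge exponent applied to its membership value; exponents near zero flatten the membership toward 1, so the corresponding feature contributes little and can be discarded. The short Python sketch below only illustrates the hedge operators themselves with assumed values; in the actual classifier the exponents are learned during training.

```python
# Minimal sketch of linguistic hedges acting on fuzzy membership values.
# p > 1 concentrates the fuzzy set ("very"), p < 1 dilates it ("more or less"),
# and p near 0 drives memberships toward 1 (feature effectively ignored).
import numpy as np

def hedged_membership(mu, p):
    return np.power(mu, p)

mu = np.array([0.2, 0.5, 0.9])             # memberships of one input value
print(hedged_membership(mu, 2.0))          # "very"          -> [0.04 0.25 0.81]
print(hedged_membership(mu, 0.5))          # "more or less"  -> [0.45 0.71 0.95]
print(hedged_membership(mu, 0.05))         # near-zero hedge -> all close to 1
```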
Journal of the Association for Information Science and Technology | 2004
Aboul Ella Hassanien
Rough set theory is a relatively new intelligent technique used in the discovery of data dependencies; it evaluates the importance of attributes, discovers the patterns of data, reduces all redundant objects and attributes, and seeks the minimum subset of attributes. Moreover, it is being used for the extraction of rules from databases. In this paper, we present a rough set approach to attribute reduction and generation of classification rules from a set of medical datasets. For this purpose, we first introduce a rough set reduction technique to find all reducts of the data that contain the minimal subset of attributes associated with a class label for classification. To evaluate the validity of the rules based on the approximation quality of the attributes, we introduce a statistical test to evaluate the significance of the rules. Experimental results from applying the rough set approach to the set of data samples are given and evaluated. In addition, the rough set classification accuracy is also compared to the well-known ID3 classifier algorithm. The study showed that the theory of rough sets is a useful tool for inductive learning and a valuable aid for building expert systems.
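The notions of attribute dependency and reducts mentioned in the abstract can be made concrete with a small sketch. The Python code below computes the rough-set dependency degree (the fraction of objects in the positive region) and runs a greedy, QuickReduct-style search for a reduct; the tiny decision table and the greedy heuristic are illustrative assumptions, not the authors' exact reduction technique.

```python
# Minimal sketch of rough-set attribute dependency and a greedy reduct search.
# rows: condition attribute values ..., last entry is the decision class.
table = [
    ("high", "yes", "sick"),
    ("high", "no",  "sick"),
    ("low",  "yes", "well"),
    ("low",  "no",  "well"),
    ("high", "yes", "sick"),
]
n_cond = 2

def dependency(attrs):
    # gamma(attrs) = |positive region| / |universe|: fraction of objects whose
    # attrs-equivalence class maps to a single decision value.
    blocks = {}
    for row in table:
        blocks.setdefault(tuple(row[a] for a in attrs), set()).add(row[-1])
    consistent = sum(
        1 for row in table
        if len(blocks[tuple(row[a] for a in attrs)]) == 1
    )
    return consistent / len(table)

# Greedy reduct: keep adding the attribute that raises dependency the most.
reduct, full = [], dependency(range(n_cond))
while dependency(reduct) < full:
    best = max((a for a in range(n_cond) if a not in reduct),
               key=lambda a: dependency(reduct + [a]))
    reduct.append(best)
print("reduct:", reduct, "dependency:", dependency(reduct))
```

Classification rules are then read off the equivalence classes induced by the reduct attributes, which is the rule-generation step the abstract refers to.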
Neural Computing and Applications | 2014
Hossam M. Moftah; Ahmad Taher Azar; Eiman Tamah Al-Shammari; Neveen I. Ghali; Aboul Ella Hassanien; Mahmoud Shoman
Image segmentation is vital for meaningful analysis and interpretation of the medical images. The most popular method for clustering is k-means clustering. This article presents a new approach intended to provide more reliable magnetic resonance (MR) breast image segmentation that is based on adaptation to identify target objects through an optimization methodology that maintains the optimum result during iterations. The proposed approach improves and enhances the effectiveness and efficiency of the traditional k-means clustering algorithm. The performance of the presented approach was evaluated using various tests and different MR breast images. The experimental results demonstrate that the overall accuracy provided by the proposed adaptive k-means approach is superior to the standard k-means clustering technique.
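As a baseline for the adaptive variant described above, intensity-based k-means segmentation of an MR slice can be sketched in a few lines. The sketch below uses scikit-learn's KMeans on a random placeholder image; multiple restarts (`n_init`) stand in, very loosely, for the paper's idea of retaining the best result across iterations.

```python
# Minimal sketch of intensity-based k-means segmentation of an image slice.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.random((128, 128))                 # placeholder MR slice in [0, 1]

pixels = image.reshape(-1, 1)                  # cluster on intensity only
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
segmented = labels.reshape(image.shape)        # each pixel tagged 0, 1, or 2

for k in range(3):
    print(f"cluster {k}: {(segmented == k).mean():.1%} of pixels")
```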
Applied Soft Computing | 2014
Aboul Ella Hassanien; Hossam M. Moftah; Ahmad Taher Azar; Mahmoud Shoman
This article introduces a hybrid approach that combines the advantages of fuzzy sets, ant-based clustering, and a multilayer perceptron neural network (MLPNN) classifier, in conjunction with a statistical feature extraction technique. Breast cancer MRI was chosen as the application, and the hybrid system was applied to assess its ability to classify breast images into two outcomes: benign or malignant. The introduced hybrid system starts with an algorithm based on type-II fuzzy sets to enhance the contrast of the input images. This is followed by an improved version of the classical ant-based clustering algorithm, called adaptive ant-based clustering, which identifies target objects through an optimization methodology that maintains the optimum result during iterations. More than twenty statistical features are then extracted and normalized. Finally, an MLPNN classifier is employed to evaluate the ability of the lesion descriptors to discriminate among regions of interest and determine whether a lesion is benign or malignant. To evaluate the performance of the presented approach, tests on different breast MRI images are reported. The experimental results show that adaptive ant-based segmentation is superior to the classical ant-based clustering technique, and the overall accuracy achieved by the hybrid technique confirms the effectiveness of the proposed system.
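The first stage of the pipeline, type-II fuzzy contrast enhancement, can be illustrated with a simple sketch. The particular construction below (lower and upper memberships obtained as powers of the fuzzified intensity, collapsed by averaging and passed through an intensification operator) is one common formulation chosen for illustration; it is an assumption, not the paper's exact algorithm, and the input image is a random placeholder.

```python
# Minimal sketch of a type-II fuzzy contrast enhancement step (assumed form).
import numpy as np

def type2_fuzzy_enhance(image, alpha=0.8):
    lo, hi = image.min(), image.max()
    mu = (image - lo) / (hi - lo + 1e-12)      # fuzzify intensities to [0, 1]
    lower, upper = mu ** (1.0 / alpha), mu ** alpha
    mu2 = (lower + upper) / 2.0                # collapse the type-II interval
    enhanced = np.where(mu2 <= 0.5,            # contrast intensification
                        2.0 * mu2 ** 2,
                        1.0 - 2.0 * (1.0 - mu2) ** 2)
    return lo + enhanced * (hi - lo)           # defuzzify back to gray levels

image = np.random.default_rng(0).random((64, 64)) * 255
print("intensity std before/after:",
      image.std(), type2_fuzzy_enhance(image).std())
```

The enhanced image would then feed the adaptive ant-based clustering stage, whose segments supply the regions from which the statistical features are computed.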