Mohamed Elhoseny
Mansoura University
Publications
Featured research published by Mohamed Elhoseny.
IEEE Communications Letters | 2015
Mohamed Elhoseny; Xiaohui Yuan; Zhengtao Yu; Cunli Mao; Hamdy K. Elminir; A. M. Riad
In a heterogeneous Wireless Sensor Network (WSN), factors such as initial energy and data processing capability greatly influence the network lifespan. Despite the success of various WSN clustering strategies, the sheer number of possible sensor clusters makes searching for an optimal network structure an open challenge. In this paper, we propose a Genetic Algorithm based method that optimizes heterogeneous sensor node clustering. Compared with five state-of-the-art methods, our proposed method greatly extends the network life; the average improvement over the second-best performance is 33.8% based on first-node-die and 13% based on last-node-die. The balanced energy consumption allows the sensor energy to deplete evenly, which greatly improves the network life. The computational efficiency of our method is comparable to the others: the overall average time across all experiments is 0.6 seconds with a standard deviation of 0.06.
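To illustrate the idea behind the abstract above, here is a minimal sketch of a GA that assigns heterogeneous sensor nodes to clusters so that per-cluster energy load is balanced. The node energies, GA parameters, and the variance-based fitness proxy are invented for illustration; the paper's actual chromosome encoding and fitness function are not reproduced here.

```python
import random

random.seed(42)

NUM_NODES = 20
NUM_CLUSTERS = 4
# Heterogeneous initial energies (joules), an illustrative assumption.
energy = [random.uniform(0.5, 2.0) for _ in range(NUM_NODES)]

def fitness(chrom):
    """Higher is better: penalize imbalance of total energy per cluster."""
    load = [0.0] * NUM_CLUSTERS
    for node, cluster in enumerate(chrom):
        load[cluster] += energy[node]
    mean = sum(load) / NUM_CLUSTERS
    variance = sum((l - mean) ** 2 for l in load) / NUM_CLUSTERS
    return -variance

def crossover(a, b):
    """One-point crossover of two cluster assignments."""
    cut = random.randrange(1, NUM_NODES)
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.05):
    """Reassign each node to a random cluster with small probability."""
    return [random.randrange(NUM_CLUSTERS) if random.random() < rate else g
            for g in chrom]

# Chromosome: index i holds the cluster of node i.
pop = [[random.randrange(NUM_CLUSTERS) for _ in range(NUM_NODES)]
       for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]  # keep the best assignments
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]

best = max(pop, key=fitness)
print(round(-fitness(best), 4))  # residual energy-load variance
```

In the paper's setting the fitness would instead model network lifetime directly; the skeleton of encode, evaluate, select, recombine stays the same.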
Cluster Computing | 2018
Alaa Tharwat; Mohamed Elhoseny; Aboul Ella Hassanien; Thomas Gabel; Arun Kumar
Path planning algorithms have been used in different applications with the aim of finding a suitable collision-free path that satisfies certain criteria, such as shortest path length and smoothness; thus, defining a suitable curve to describe the path is essential. The main goal of these algorithms is to find the shortest, smooth path between the starting and target points. This paper makes use of a Bézier curve-based model for path planning. The control points of the Bézier curve significantly influence the length and smoothness of the path. In this paper, a novel Chaotic Particle Swarm Optimization (CPSO) algorithm is proposed to optimize the control points of the Bézier curve, and the proposed algorithm comes in two variants: CPSO-I and CPSO-II. Using the chosen control points, the optimum smooth path that minimizes the total distance between the starting and ending points is selected. To evaluate the CPSO algorithm, the results of the CPSO-I and CPSO-II algorithms are compared with the standard PSO algorithm. The experimental results showed that the proposed algorithm is capable of finding the optimal path. Moreover, the CPSO algorithm was tested against different numbers of control points and obstacles, and it achieved competitive results.
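The core idea of a chaotic PSO is to draw the stochastic coefficients of the velocity update from a chaotic map rather than a uniform RNG. The sketch below uses the logistic map on a toy sphere objective; the objective, swarm size, and coefficients are illustrative stand-ins for the paper's Bézier control-point optimization, not its actual formulation.

```python
import random

random.seed(0)
DIM, SWARM, ITERS = 4, 15, 200
W, C1, C2 = 0.7, 1.5, 1.5

def objective(x):
    """Toy cost (sphere function), standing in for path length/smoothness."""
    return sum(v * v for v in x)

chaos = random.random()
def chaotic():
    """Logistic map x <- 4x(1-x): deterministic but chaotic in (0, 1)."""
    global chaos
    chaos = 4.0 * chaos * (1.0 - chaos)
    return chaos

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=objective)

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            # Chaotic draws replace the usual uniform random coefficients.
            vel[i][d] = (W * vel[i][d]
                         + C1 * chaotic() * (pbest[i][d] - pos[i][d])
                         + C2 * chaotic() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if objective(pos[i]) < objective(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=objective)

print(round(objective(gbest), 6))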
Expert Systems With Applications | 2017
Noura Metawa; M. Kabir Hassan; Mohamed Elhoseny
Bank lending decisions in credit crunch environments are a big challenge. This NP-hard optimization problem is solved using a proposed GA-based model. The proposed model is tested using two scenarios with simulated and real data; the real data is collected from Southern Louisiana Credit Union. The proposed model increased the bank profit and improved the system performance. To avoid the complexity and time consumption of traditional statistical and mathematical programming, intelligent techniques have gained great attention in different financial research areas, especially in optimizing banking decisions. However, choosing optimal bank lending decisions that maximize the bank profit in a credit crunch environment is still a big challenge. This paper therefore proposes an intelligent model based on the Genetic Algorithm (GA) to organize bank lending decisions in a highly competitive environment with a credit crunch constraint (GAMCC). GAMCC provides a framework to optimize bank objectives when constructing the loan portfolio, maximizing the bank profit and minimizing the probability of bank default in a search for a dynamic lending decision. Compared to state-of-the-art methods, GAMCC is a better intelligent tool that enables banks to reduce the loan screening time by 12% to 50%. Moreover, it greatly increases the bank profit by 3.9% to 8.1%.
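A loan-portfolio selection of this kind can be sketched as a knapsack-style GA: a binary chromosome marks which loans enter the portfolio, and the fitness trades expected interest against expected default loss under a budget constraint. All loan amounts, rates, default probabilities, and the budget below are invented numbers; GAMCC's actual objective and constraints are more elaborate.

```python
import random

random.seed(1)
# (amount, interest rate, estimated default probability) -- illustrative.
loans = [
    (120, 0.08, 0.02), (300, 0.12, 0.10), (80, 0.06, 0.01),
    (250, 0.10, 0.05), (150, 0.09, 0.03), (400, 0.15, 0.20),
]
BUDGET = 700

def fitness(bits):
    """Expected profit of the selected portfolio; infeasible gets -1."""
    amount = sum(a for b, (a, _, _) in zip(bits, loans) if b)
    if amount > BUDGET:
        return -1.0
    return sum(a * r * (1 - p) - a * p  # expected interest minus expected loss
               for b, (a, r, p) in zip(bits, loans) if b)

pop = [[random.randint(0, 1) for _ in loans] for _ in range(20)]
for _ in range(80):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:6]
    children = []
    for _ in range(14):
        a, b = random.sample(elite, 2)
        cut = random.randrange(1, len(loans))
        child = a[:cut] + b[cut:]
        if random.random() < 0.1:  # bit-flip mutation
            i = random.randrange(len(loans))
            child[i] ^= 1
        children.append(child)
    pop = elite + children

best = max(pop, key=fitness)
print(best, round(fitness(best), 2))
```

The real model additionally folds creditor ratings and screening time into the fitness; this sketch only shows the chromosome-and-constraint skeleton.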
Cluster Computing | 2018
Moein Sarvaghad-Moghaddam; Ali A. Orouji; Zeinab Ramezani; Mohamed Elhoseny; Ahmed Farouk; N. Arun Kumar
Progress in submicron semiconductor device technology has introduced short-channel and quantum effects whose nonlinear model equations lead to complicated and time-consuming calculations. To control these complexities and obtain the device characteristics from the device parameters, a faster method is needed. In this paper, a combinational algorithm is proposed for modelling the characteristics of a nano silicon-on-insulator metal–oxide–semiconductor field-effect transistor (SOI MOSFET). The proposed method reproduces the device characteristics with fewer input parameters. In this method, a combination of a genetic algorithm (GA) and an artificial neural network is used; a quantum evolutionary algorithm (QEA) is then employed in place of the GA for comparison and modification. Results show that the algorithm's accuracy on test data is 95% for the GA and 98% for the QEA, while the reduction in input parameters is 11% and 52% for the GA and QEA, respectively. The simulation results indicate that the implemented quantum evolutionary algorithm predicts device characteristics more effectively and accurately than the GA.
Journal of Computational Science | 2017
Mohamed Elhoseny; Alaa Tharwat; Aboul Ella Hassanien
Mobile robots have been used in different applications such as assembly, transportation, and manufacturing. Despite the great body of work on finding the optimum robot path, traditional path planning algorithms often assume that the environment is perfectly known and search for an optimal path composed of sharp turns and polygonal lines. This paper proposes an efficient Bezier curve based approach for path planning in a dynamic field using a Modified Genetic Algorithm (MGA). The proposed MGA boosts the diversity of the generated solutions of the standard GA, which increases its exploration capabilities. In our proposed method, the robot's path is dynamically decided based on the obstacles' locations. With the goal of minimizing the distance between the start point and the target point, the MGA searches for the most suitable points to serve as the control points of the Bezier curve. Using the chosen control points, the optimum smooth path that minimizes the total distance between the start and end points is selected. Our model was tested on environments with different scales, different numbers of obstacles, and six benchmark maps. As a result, the proposed method provides an efficient way to reduce robot energy consumption in harsh environments.
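The Bezier machinery such an optimizer searches over can be sketched directly: given four control points (start, two free interior points, target), evaluate the cubic curve and approximate its length, which is the main quantity the MGA would minimize. The sample control points below are invented for illustration.

```python
def bezier(p0, p1, p2, p3, t):
    """Cubic Bezier point at parameter t, via the Bernstein form."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return x, y

def path_length(ctrl, samples=200):
    """Polyline approximation of the curve length (the path cost)."""
    pts = [bezier(*ctrl, i / samples) for i in range(samples + 1)]
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

start, target = (0.0, 0.0), (10.0, 0.0)
# Two candidate control-point choices; an optimizer would prefer the first.
straightish = [start, (3.0, 0.5), (7.0, 0.5), target]
detour      = [start, (3.0, 5.0), (7.0, 5.0), target]
print(round(path_length(straightish), 3), round(path_length(detour), 3))
```

An MGA individual would encode only the two interior control points; obstacle clearance would enter the fitness as a penalty term alongside this length.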
international computer engineering conference | 2016
Noura Metawa; Mohamed Elhoseny; M. Kabir Hassan; Aboul Ella Hassanien
With the increasing impact of capital regulation on banks' financial decisions, especially in a competitive environment with credit constraints, comes the need for an optimal mechanism for bank lending decisions that maximizes the bank profit in a timely manner. In this context, we propose a self-organizing method for dynamically organizing bank lending decisions using a Genetic Algorithm (GA). Our proposed GA-based model provides a framework to optimize the bank's objectives when constructing the loan portfolio, maximizing the bank profit and minimizing the probability of bank default in a search for an optimal, dynamic lending decision. Multiple factors related to loan characteristics and creditor ratings are integrated into the GA chromosomes, and validation is performed to ensure an optimal decision. The GA uses random search to suggest the most appropriate design, and we use it to obtain the most efficient lending decision. The reason for choosing the GA is its convergence and its flexibility in solving multi-objective optimization problems such as credit assessment, portfolio optimization, and bank lending decisions.
Future Generation Computer Systems | 2018
Mohamed Elhoseny; Ahmed Abdelaziz; Ahmed S. Salama; A. M. Riad; Khan Muhammad; Arun Kumar Sangaiah
Over the last decade, there has been an increasing interest in big data research, especially for health services applications. The adoption of the cloud computing and Internet of Things (IoT) paradigms in the healthcare field can bring several opportunities to medical IT, and experts believe that it can significantly improve healthcare services and contribute to their continuous and systematic innovation in big data environments such as Industry 4.0 applications. However, the resources required to manage such data in a cloud-IoT environment are still a big challenge. Accordingly, this paper proposes a new model to optimize virtual machine (VM) selection in cloud-IoT health services applications to efficiently manage a large amount of data in integrated Industry 4.0. Industry 4.0 applications need to process and analyze big data, which comes from different sources such as sensor data, without human intervention. The proposed model aims to enhance the performance of healthcare systems by reducing the stakeholders' request execution time, optimizing the storage required for patients' big data, and providing a real-time data retrieval mechanism for those applications. The architecture of the proposed hybrid cloud-IoT consists of four main components: stakeholders' devices, stakeholders' requests (tasks), a cloud broker, and a network administrator. To optimize the VM selection, three well-known optimizers (Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Parallel Particle Swarm Optimization (PPSO)) are used to build the proposed model. To calculate the execution time of stakeholders' requests, the proposed fitness function is a composition of three important criteria: CPU utilization, turn-around time, and waiting time. A set of experiments was conducted to provide a comparative study of these three optimizers regarding the execution time, the data processing speed, and the system efficiency.
The proposed model is tested against the state-of-the-art method to evaluate its effectiveness. The results show that the proposed model outperforms the state-of-the-art models in total execution time at a rate of 50%. Also, the system efficiency regarding real-time data retrieval is significantly improved by 5.2%.
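The kind of fitness described above can be sketched as scoring a candidate task-to-VM assignment by CPU utilization, turn-around time, and waiting time. Task lengths, VM speeds, and the weighting are illustrative assumptions; exhaustive search stands in for GA/PSO/PPSO on this tiny instance.

```python
from itertools import product

TASKS = [400, 250, 700, 120, 530]  # task lengths (million instructions), assumed
VM_MIPS = [500, 300]               # VM speeds, assumed

def fitness(assign):
    """Lower is better: turn-around + waiting time, minus a utilization bonus."""
    busy = [0.0] * len(VM_MIPS)    # time each VM is occupied so far
    waiting = turnaround = 0.0
    for task, vm in zip(TASKS, assign):
        waiting += busy[vm]        # task waits behind earlier tasks on its VM
        busy[vm] += task / VM_MIPS[vm]
        turnaround += busy[vm]     # completion time of this task
    makespan = max(busy)
    utilization = sum(busy) / (makespan * len(VM_MIPS))
    return turnaround + waiting - 10.0 * utilization

# On 2^5 = 32 assignments we can enumerate; a real broker would run
# GA/PSO/PPSO over this same fitness instead.
best = min(product(range(len(VM_MIPS)), repeat=len(TASKS)), key=fitness)
print(best, round(fitness(best), 3))
```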
Journal of Intelligent and Fuzzy Systems | 2017
Mohamed Elhoseny; Abdulaziz Shehab; Xiaohui Yuan
Robots have recently gained great attention due to their potential to work in dynamic and complex environments with obstacles, which makes searching for an optimum path on-the-fly an open challenge. To address this problem, this paper proposes a Genetic Algorithm (GA) based path planning method for dynamic environments, called GADPP. The proposed method uses a Bezier curve to refine the final path according to the control points identified by GADPP. To update the path during movement, the robot receives a signal from a Base Station (BS) based on alerts periodically triggered by sensors. Compared to state-of-the-art methods, GADPP improves the performance of robot-based applications in terms of path length, path smoothness, and the time required to find the optimum path. The improvement in path length is between 6% and 48%, path smoothness is improved by 8% to 52%, and the time required to find the optimum path is reduced by 6% to 47%.
Computers & Electrical Engineering | 2017
Walaa Elsayed; Mohamed Elhoseny; Sahar F. Sabbeh; A. M. Riad
Wireless Sensor Networks have a wide variety of applications, and their nodes are prone to failure due to hardware faults or malicious attacks. A self-healing mechanism is used for fault detection, diagnosis, and healing; however, implementing the self-healing procedures at the cluster head affects the network performance. In this paper, we present a distributed self-healing approach that operates at both the node and cluster head levels. At the node level, battery, sensor, and receiver faults can be diagnosed, while at the cluster head level, transmitter and malfunctioning nodes can be detected and recovered. Compared to state-of-the-art methods, our model tolerates up to 67.3% of different hardware faults at the node level. Moreover, it achieved detection accuracy of up to 76.9% for sensor circuit faults, 52% for battery faults, and 71.96% for receiver faults. At the cluster head level, 75.7% of transmitter faults and 60% of microcontroller circuit faults are detected.
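Node-level self-diagnosis of the kind described can be sketched as simple local rules that classify a fault as battery, sensor, or receiver. The thresholds and readings below are hypothetical illustrations, not the paper's detection criteria.

```python
def diagnose(battery_v, sensor_readings, last_rx_age_s):
    """Return suspected node-level faults from local measurements.

    battery_v        -- supply voltage (V)
    sensor_readings  -- recent raw sensor samples
    last_rx_age_s    -- seconds since the last packet was received
    All thresholds are assumed values for illustration.
    """
    faults = []
    if battery_v < 2.7:                 # below an assumed safe supply voltage
        faults.append("battery")
    if len(set(sensor_readings)) == 1:  # stuck-at sensor output
        faults.append("sensor")
    if last_rx_age_s > 60:              # no packet received recently
        faults.append("receiver")
    return faults

print(diagnose(2.5, [17, 17, 17, 17], 5))    # battery fault + stuck sensor
print(diagnose(3.0, [17, 18, 17, 19], 120))  # receiver fault only
```

In the distributed scheme, each node would run rules like these locally and report to its cluster head, which in turn watches for transmitter and microcontroller faults across its members.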
Computers & Electrical Engineering | 2018
Rizk M. Rizk-Allah; Aboul Ella Hassanien; Mohamed Elhoseny
In this paper, a new compromise algorithm for the multi-objective transportation problem (MO-TP) is developed, inspired by Zimmermann's fuzzy programming and neutrosophic set terminology. The proposed neutrosophic compromise programming approach (NCPA) is characterized by assigning three membership functions to each objective, namely truth membership, indeterminacy membership, and falsity membership. With the membership functions for all objectives, a neutrosophic compromise programming model is constructed with the aim of finding the best compromise solution (BCS). This model can cover a wide spectrum of BCSs by controlling the membership functions interactively. The performance of the NCPA is validated by measuring the ranking degree using the TOPSIS approach. Illustrative examples are reported and compared with existing models in the literature. Based on the provided comparisons, the NCPA is superior to fuzzy and other approaches.
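For a minimization objective $Z_k$ with best value $L_k$ and worst value $U_k$, the truth and falsity memberships are commonly taken as complementary piecewise-linear functions of the objective value; one standard linear form (an illustration of the general construction, not necessarily the paper's exact definitions) is:

```latex
T_k(Z_k) =
\begin{cases}
1, & Z_k \le L_k,\\[4pt]
\dfrac{U_k - Z_k}{U_k - L_k}, & L_k < Z_k < U_k,\\[4pt]
0, & Z_k \ge U_k,
\end{cases}
\qquad
F_k(Z_k) =
\begin{cases}
0, & Z_k \le L_k,\\[4pt]
\dfrac{Z_k - L_k}{U_k - L_k}, & L_k < Z_k < U_k,\\[4pt]
1, & Z_k \ge U_k,
\end{cases}
```

with the indeterminacy membership $I_k$ taken as a similar piecewise-linear function over a narrower tolerance band inside $[L_k, U_k]$. The compromise model then maximizes truth, and minimizes indeterminacy and falsity, across all objectives simultaneously.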