Sa'ed Abed
Kuwait University
Publication
Featured research published by Sa'ed Abed.
Applied Soft Computing | 2016
Sa'ed Abed; Suood Abdulaziz Al-Roomi; Mohammad H. Alshayeji
Highlights: Proposing a work that is competitive with state-of-the-art optic disc detection methods. Describing a novel pre-processing method that improves optic disc detection accuracy. Presenting the novel use of four swarm intelligence algorithms (artificial bee colony, particle swarm optimization, bat algorithm, and cuckoo search) for optic disc detection. Providing an accuracy, consistency and speed comparison between five swarm algorithms. Providing high-performance parameter settings for swarm intelligence algorithms in optic disc detection and studying the effect of each parameter on accuracy.

Diabetic retinopathy affects the vision of a significant fraction of the population worldwide. Retinal fundus images are used to detect the condition before vision loss develops, enabling early medical intervention. Optic disc detection is an essential step in the automatic detection of the disease. Several techniques have been introduced in the literature to detect the optic disc, with differing speed, accuracy and consistency. Swarm intelligence, a class of nature-inspired algorithms, has been shown to offer clear advantages in speed and accuracy over traditional optic disc detection algorithms. We therefore further investigated and compared several swarm intelligence techniques. Our study focused on five popular swarm intelligence algorithms: artificial bee colony, particle swarm optimization, bat algorithm, cuckoo search and firefly algorithm. This work also features a novel pre-processing scheme that enhances the detection accuracy of the swarm techniques by making the optic disc region the brightest area of the image. The pre-processing involves multiple stages of background subtraction, median filtering and mean filtering, and is named Background Subtraction-based Optic Disc Detection (BSODD). The best result was obtained by combining our pre-processing technique with the firefly algorithm and its tuned parameters. The obtained accuracy was superior to the other tested algorithms and to published results in the literature. The accuracy of the firefly algorithm was 100%, 100%, 98.82% and 95% on the DRIVE, DiaRetDB1, DMED and STARE databases, respectively.
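A minimal sketch of the background-subtraction pre-processing idea described above, assuming OpenCV and a single-channel fundus image; the kernel sizes, the Gaussian background estimate, and the brute-force brightest-region search are illustrative stand-ins, not the authors' exact BSODD parameters or swarm search.

```python
import cv2
import numpy as np

def bsodd_preprocess(gray, bg_ksize=51, median_ksize=5, mean_ksize=9):
    """Illustrative background-subtraction pipeline: estimate the slowly
    varying background, subtract it, then median- and mean-filter so the
    bright optic-disc region dominates the grayscale range."""
    background = cv2.GaussianBlur(gray, (bg_ksize, bg_ksize), 0)
    subtracted = cv2.subtract(gray, background)               # suppress background
    filtered = cv2.medianBlur(subtracted, median_ksize)       # remove impulse noise
    smoothed = cv2.blur(filtered, (mean_ksize, mean_ksize))   # mean filtering
    return cv2.normalize(smoothed, None, 0, 255, cv2.NORM_MINMAX)

def brightest_region(img, win=15):
    """Brute-force stand-in for the swarm search: locate the window with
    the highest mean intensity (the candidate optic-disc centre)."""
    means = cv2.boxFilter(img.astype(np.float32), -1, (win, win))
    return np.unravel_index(np.argmax(means), means.shape)    # (row, col)
```

In the paper, a swarm algorithm such as firefly would perform this search far more cheaply than scanning every window.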
Computers & Electrical Engineering | 2018
Mohammad H. Alshayeji; Mohammad Al-Rousan; Hanem Ellethy; Sa'ed Abed
In this work, an efficient multiple sclerosis (MS) segmentation technique is proposed to simplify pre-processing steps and reduce processing time using heterogeneous single-channel magnetic resonance imaging (MRI). Spatial-filtering image mapping, a histogram reference image, and histogram matching are applied so that a local threshold can be derived for each image from a global threshold algorithm. Feature extraction is performed using mathematical and morphological operations, and a multilayer feed-forward neural network (MLFFNN) is used to identify MS tissues. Fluid-attenuated inversion recovery (FLAIR) series are used to build a faster system while maintaining reliability and accuracy. A sagittal (SAG) FLAIR-based system is proposed for the first time in MS detection systems, which reduces the number of images used and decreases the processing time by nearly one-third. Our detection system achieved a recognition rate of up to 98.5%. Moreover, a relatively high Dice coefficient (DC) value (0.71 ± 0.18) was observed when testing on new images.
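A minimal sketch of the histogram-matching-plus-thresholding idea described above, using scikit-image; the reference slice, Otsu's method as the global threshold, and the hyperintensity assumption are illustrative choices, not the paper's exact pipeline.

```python
from skimage.exposure import match_histograms
from skimage.filters import threshold_otsu

def lesion_candidates(flair_slice, reference_slice):
    """Match the slice to a reference histogram so a single global
    threshold behaves like a per-image local threshold, then keep the
    hyperintense (lesion-like) voxels. Candidate regions would next be
    described by morphological features and classified by an MLFFNN."""
    matched = match_histograms(flair_slice, reference_slice)
    thresh = threshold_otsu(matched)   # stand-in for the paper's global threshold
    return matched > thresh            # binary candidate mask
```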
Artificial Intelligence Review | 2018
Sahel Alouneh; Sa'ed Abed; Mohammad H. Al Shayeji; Raed Mesleh
Boolean satisfiability (SAT) has been studied intensively for the last twenty years, and advances now allow SAT solvers to be used in many applications, including formal verification of digital designs. However, the performance and capacity of SAT solvers are still limited. In practice, many existing applications use SAT solvers as black boxes: the problem is translated into a monolithic conjunctive normal form instance and handed to the solver, with no interaction between the application and the solver. This paper presents a comprehensive study and analysis of the latest developments in SAT solvers and of the approaches used in branching heuristics, Boolean constraint propagation and conflict analysis over the last two decades. In addition, the paper surveys the most effective techniques for using SAT solvers in verification, mainly model checking, to enhance solver performance, efficiency and productivity. Moreover, the paper presents the remarkable accomplishments and the main challenges facing SAT-solver techniques and contrasts the different techniques according to their efficiency, algorithms, usage and feasibility.
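A minimal Boolean constraint propagation (unit propagation) sketch over a DIMACS-style clause list; real solvers implement this with watched literals inside a conflict-driven clause learning loop, so this naive fixed-point iteration is only illustrative.

```python
def unit_propagate(clauses, assignment):
    """Naive unit propagation: repeatedly satisfy unit clauses until a
    conflict (falsified clause) or a fixed point is reached.
    clauses: list of lists of non-zero ints (DIMACS-style literals).
    assignment: dict mapping variable -> bool."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned, satisfied = [], False
            for lit in clause:
                var, want = abs(lit), lit > 0
                if var in assignment:
                    if assignment[var] == want:
                        satisfied = True
                        break
                else:
                    unassigned.append(lit)
            if satisfied:
                continue
            if not unassigned:
                return None                 # conflict: clause falsified
            if len(unassigned) == 1:        # unit clause forces its literal
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment

# Example: (x1 OR x2) AND (NOT x1) forces x1 = False, then x2 = True.
print(unit_propagate([[1, 2], [-1]], {}))   # {1: False, 2: True}
```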
international conference software and computer applications | 2017
Sa'ed Abed; Mohammad H. Alshayeji; Zahra'a Abdullah; Zainab AlSaeed
As networks-on-chip (NoCs) evolve, the reliability and performance of these systems are becoming increasingly critical requirements. Fault tolerance is an essential factor with a direct impact on system reliability, and many techniques have been developed to boost the fault tolerance of NoCs, implemented either at the routing algorithm level or at the architecture level. This paper analyzes previous work that enhances fault tolerance by modifying the router architecture. The Partial Virtual Sharing (PVS) architecture was modified to improve its fault tolerance capability. We propose a technique to implement fault tolerance at the input unit of the router architecture, and additional enhancements to implement fault tolerance at the output unit were also proposed and implemented. The reliability of the proposed design was evaluated and compared using the Mean Time Between Failures (MTBF) metric, and the design showed a remarkable improvement of 263.2% over existing approaches.
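A small illustration of how an MTBF-based comparison like the one described can be set up, assuming independent exponential failures so that the failure rates of the router's units add in series; the unit names and failure-rate values are purely hypothetical, not figures from the paper.

```python
def series_mtbf(failure_rates_per_hour):
    """MTBF of a series system with independent exponential failures:
    the system fails when any unit fails, so the rates add."""
    return 1.0 / sum(failure_rates_per_hour)

# Hypothetical failure rates for the input unit, crossbar, and output unit:
baseline = series_mtbf([2e-6, 1e-6, 2e-6])
# Hardening the input and output units lowers their effective failure
# rates (values are purely illustrative):
enhanced = series_mtbf([5e-7, 1e-6, 5e-7])
print(f"MTBF improvement: {(enhanced / baseline - 1) * 100:.1f}%")
```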
international conference on modeling simulation and applied optimization | 2017
Abdullah AlWatyan; Wesam Mater; Omar Almutairi; Mohammed Almutairi; Aisha Al-Noori; Sa'ed Abed
Secrecy in communication is critical for people who handle sensitive information. Steganography secures sensitive information by hiding it inside media files such as images; what makes the concept powerful is that the secret message is concealed within the media file and thus carried around invisibly. This paper proposes an automated method to secure a message using two levels of security. In the first level, the data is encrypted by our encryption method, developed in Java and named Character Bit Shuffler (CBS). In the second level, the encrypted data is hidden inside an image using the Least Significant Bit (LSB) technique, which changes only the last bits of the image pixels. The advantage of the LSB method is that it is simple and maintains the quality of the image. A 1-1-0 LSB scheme is used on 32-bit images in this paper. Experimental results measuring the image quality of the proposed approach are presented at the end of the paper.
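A minimal sketch of LSB embedding under a 1-1-0 pattern, read here as one bit in the red LSB and one in the green LSB of each pixel with the blue channel untouched; this interpretation, the Pillow-based implementation, and the omission of the CBS encryption stage are assumptions for illustration.

```python
from PIL import Image

def embed_lsb_110(cover_path, message: bytes, out_path):
    """Illustrative 1-1-0 LSB embedding: hide one bit in the red LSB and
    one in the green LSB of each pixel, leaving blue (and alpha) intact."""
    img = Image.open(cover_path).convert("RGBA")
    pixels = list(img.getdata())
    # Message bytes as a flat list of bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > 2 * len(pixels):
        raise ValueError("message too long for cover image")
    out, k = [], 0
    for r, g, b, a in pixels:
        if k < len(bits):
            r = (r & ~1) | bits[k]; k += 1
        if k < len(bits):
            g = (g & ~1) | bits[k]; k += 1
        out.append((r, g, b, a))
    img.putdata(out)
    img.save(out_path)
```

Because only the least significant bit of two channels changes, the per-pixel distortion is at most one intensity level per channel, which is why LSB schemes preserve image quality well.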
Security and Communication Networks | 2016
Sahel Alouneh; Sa'ed Abed; George Ghinea
The early days of voice over IP (VoIP) adoption were characterized by a lack of concern and awareness about security issues related to its use; service providers and users were mostly preoccupied with quality, functionality, and cost. Now that VoIP is a mainstream communication technology, security has become a major issue. This paper investigates the major security threats to VoIP communications and proposes a multipath routing solution, targeted especially at low-bandwidth networks. Results show that security affects VoIP quality, especially as the distance between communicating nodes and the packet size increase. Results also show that the proposed multipath solution significantly reduces packet loss and performs better than single-path routing in networks with low bandwidth capacities.
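A toy illustration of one simple multipath strategy, round-robin dispatch of packets over the available paths so that no single path carries the whole stream; the path names and the dispatch policy are illustrative assumptions and may differ from the paper's scheme.

```python
from itertools import cycle

def split_over_paths(packets, paths):
    """Round-robin dispatcher: alternate packets over the available paths
    so an eavesdropper on any single path sees only a fragment of the
    stream."""
    assignment = {p: [] for p in paths}
    for packet, path in zip(packets, cycle(paths)):
        assignment[path].append(packet)
    return assignment

print(split_over_paths(list(range(6)), ["path_A", "path_B", "path_C"]))
# {'path_A': [0, 3], 'path_B': [1, 4], 'path_C': [2, 5]}
```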
Security and Communication Networks | 2016
Mohammad H. Alshayeji; Suood Abdulaziz Al-Roomi; Sa'ed Abed
In this paper, we propose a novel least significant bit embedding approach that capitalizes on the skewed distribution of letter and word frequencies to achieve higher image capacity, quality, and security. We first conduct a study of character frequencies using a data set of 14.245 billion characters, and a Huffman code for each character is generated on the basis of its probability of occurrence. Furthermore, the top 100,000 most frequent words are transformed into a smaller ciphertext with a lower embedding cost. Our work demonstrates that recognizing characters and words by their frequency patterns and prioritizing them accordingly offers a greater prospect of reducing the overall cost of embedding. The proposed scheme significantly outperforms Lempel–Ziv–Welch compression, with an average of 45% fewer embedded bits. Moreover, the image quality is improved by a mean peak signal-to-noise ratio value of 6.9%. The proposed method also strengthens embedding security through a novel shuffling algorithm.
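A minimal sketch of Huffman code construction from character frequencies, so that frequent characters receive shorter bit strings before LSB embedding; the frequencies here come from the sample string rather than the paper's 14.245-billion-character corpus, and the word-level ciphertext stage is omitted.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build Huffman codes from observed character frequencies: the most
    frequent characters end up with the shortest bit strings."""
    heap = [[count, i, {ch: ""}] for i, (ch, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)            # two least frequent subtrees
        hi = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in lo[2].items()}
        merged.update({ch: "1" + code for ch, code in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], next_id, merged])
        next_id += 1
    return heap[0][2]

codes = huffman_codes("this is an example of huffman coding")
# Frequent characters such as ' ' receive shorter codes than rare ones,
# so encoding typical text yields fewer bits to embed.
```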
Iet Computers and Digital Techniques | 2016
Sa'ed Abed; Mohammad H. Alshayeji; Sari Sultan; Nesreen Mohammad
Effective cache memory design is an important aspect of computer architecture for improving system performance, and cache miss penalties have a drastic impact on that performance. In this work, we study, both experimentally and analytically, the effect on performance of reducing the number of comparisons needed to map a cache address. We propose a novel tag access scheme that uses a partial comparison unit (an n-bit comparator) together with multiple search methods inside the data cache to improve cache performance by reducing cache access time. Partial tag comparison (PTC) lets the cache compare the tag in multiple stages, starting with the least significant bits (LSBs). Useless tag comparisons and the number of tag bits compared can thus be effectively reduced, so the requested tag is reached faster and the cache hit time is reduced. Simulation results show that the proposed approach outperforms conventional mapping techniques: the PTC technique improves the hit time in 2-bank and 4-bank fully associative caches by 70–96% and 67–88%, respectively, over a cache with full tag comparison. Moreover, the proposed technique achieves the minimum hit time when using a hash searching method rather than linear or binary search.
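A small sketch of the multi-stage partial tag comparison idea, checking a few least-significant tag bits first and keeping only the ways that survive each stage; the stage widths, tag width, and example tags below are arbitrary illustrations, not the paper's configuration.

```python
def partial_tag_match(stored_tags, requested_tag, stage_bits=(4, 4, 8)):
    """Compare tags in stages starting from the LSBs; most mismatching
    ways are rejected early, before the full tag is examined."""
    candidates = list(range(len(stored_tags)))
    shift = 0
    for width in stage_bits:
        mask = ((1 << width) - 1) << shift
        candidates = [i for i in candidates
                      if (stored_tags[i] & mask) == (requested_tag & mask)]
        if not candidates:
            return None                     # early miss: no way survived
        shift += width
    return candidates[0] if len(candidates) == 1 else candidates

# Example with 16-bit tags over 4 ways: only way 0 survives the first stage.
print(partial_tag_match([0xBEEF, 0x1234, 0xBEE0, 0xABCD], 0xBEEF))
```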
Journal of Computer Applications in Technology | 2015
Sa'ed Abed; Ashraf Hasan Bqerat; Ahmad Al-Khasawneh; Ayoub Alsarhan; Ibrahim Obeidat
Distributed virtual memory (DVM) techniques aim to maximise throughput by exploiting the fact that memory-to-memory transfers are faster than memory-to-disk transfers. The distributed reversal cache (DRC) is a recently developed DVM technique that takes a new approach. In this paper, part of the memory of each master node is set aside to create the reversal cache, and its load is then distributed over all nodes to free more space in the master node. Furthermore, we address the outstanding issues of the reversal cache and modify it to overcome stalls that mainly affect the performance of the master node, by connecting the reversal cache technique with the DVM concept. Experimental results show that our technique outperforms other techniques, reducing page faults by 14% and improving thrashing behaviour by 27%.
Archive | 2006
Sa'ed Abed; Otmane Ait Mohamed