Hussein Al-Bahadili
Petra University
Publications
Featured research published by Hussein Al-Bahadili.
Computers & Mathematics With Applications | 2008
Hussein Al-Bahadili
This paper introduces a novel lossless binary data compression scheme based on the error-correcting Hamming codes, namely the HCDC scheme. In this scheme, the binary sequence to be compressed is divided into blocks of n bits each. To utilize the Hamming codes, each block is treated as a Hamming codeword consisting of p parity bits and d data bits (n=d+p). Each block is then tested to determine whether it is a valid or a non-valid Hamming codeword. For a valid block, only the d data bits preceded by a 1 are written to the compressed file, while for a non-valid block all n bits preceded by a 0 are written to the compressed file. These additional 1 and 0 flag bits distinguish the valid and the non-valid blocks during the decompression process. An analytical formula is derived for computing the compression ratio as a function of the block size and the fraction of valid data blocks in the sequence. The performance of the HCDC scheme is analyzed, and the results obtained are presented in tables and graphs. Finally, conclusions and recommendations for future work are given.
Computers & Mathematics With Applications | 2008
Hussein Al-Bahadili; Shakir M. Hussain
This paper presents a new and efficient data compression algorithm, namely the adaptive character wordlength (ACW) algorithm, which can be used as a complement to statistical compression techniques. In such techniques, the characters in the source file are converted to binary codes, where the most common characters in the file receive the shortest codes and the least common the longest; the codes are generated from the estimated probability of each character within the file. The binary-coded file is then compressed using an 8-bit character wordlength. In the new algorithm, an optimum character wordlength, b, is calculated, where b>8, so that the compression ratio is increased by a factor of b/8. To validate the algorithm, it is used as a complement to Huffman coding to compress a source file containing 10 characters with different probabilities, randomly distributed within the source file. The results obtained and the factors that affect the optimum value of b are discussed, and, finally, conclusions are presented.
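The statistical stage that ACW complements assigns shorter codes to more probable characters; Huffman coding, which the paper uses for validation, is the canonical example. The sketch below computes Huffman code lengths for a toy probability map (the ACW repacking step itself is not shown, since its exact procedure is not reproduced here).

```python
import heapq

def huffman_code_lengths(probs):
    """Return {symbol: code length in bits} via Huffman's algorithm."""
    # Heap entries: (probability, tie-breaker, symbols merged so far).
    heap = [(p, i, (sym,)) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:             # merging deepens every member by 1
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths
```

For probabilities {0.5, 0.25, 0.25} this yields code lengths {1, 2, 2}, i.e. the most common character gets the shortest code, exactly the property the abstract describes.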
International Journal of Business Data Communications and Networking | 2011
Hussein Al-Bahadili; Alia Sabri
In mobile ad hoc networks (MANETs), broadcasting is widely used in route discovery and other network services. The most widely used broadcasting algorithm is simple flooding, which produces a high number of redundant packet retransmissions, causing contention and collisions. Proper use of a dynamic probabilistic algorithm significantly reduces the number of retransmissions, which in turn reduces the chance of contention and collisions. In current dynamic probabilistic algorithms, the retransmission probability pt is formulated as a linear or non-linear function of a single variable, the number of first-hop neighbors k. However, such algorithms suffer in the presence of noise due to increasing packet loss. In this paper, the authors propose a new dynamic probabilistic algorithm in which pt is determined locally by the retransmitting nodes, considering both k and the noise level. This algorithm is referred to as the dynamic noise-dependent probabilistic (DNDP) algorithm. The performance of the DNDP algorithm is evaluated through simulations using the MANET simulator MANSim. The simulation results show that the DNDP algorithm achieves higher network reachability than the dynamic probabilistic algorithm at a reasonable increase in the number of retransmissions over a wide range of noise levels. The effects of node density and node speed on the performance of the DNDP algorithm are also investigated.
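The general shape of such a rule can be sketched as follows. This is a hedged illustration of the idea, not the paper's formula: pt falls as the neighbor count k grows (dense areas need fewer rebroadcasts) and is pushed back up as the packet-loss level rises; the parameters p_min, p_max, and k_ref are hypothetical.

```python
def retransmission_probability(k, loss, p_min=0.1, p_max=1.0, k_ref=10):
    """Illustrative dynamic noise-dependent retransmission probability.

    k    -- number of first-hop neighbors of the retransmitting node
    loss -- estimated packet-loss probability in [0, 1), a proxy for noise
    """
    # Base dynamic probability: fewer neighbors -> higher pt, clamped to range.
    base = max(p_min, min(p_max, k_ref / max(k, 1)))
    # Noise compensation: higher loss pushes pt back toward p_max.
    return min(p_max, base + (p_max - base) * loss)
```

With no noise a node with 20 neighbors rebroadcasts with probability 0.5, but at a 50% loss level the same node rebroadcasts with probability 0.75, recovering reachability at the cost of extra retransmissions, which mirrors the trade-off reported in the abstract.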
International Journal of Computers and Applications | 2010
Hussein Al-Bahadili
Abstract In this paper we propose and evaluate the performance of a new bit-level text compression scheme based on the Hamming-codes-based data compression (HCDC) algorithm. The scheme consists of six steps, some of which are applied repetitively to achieve a higher compression ratio. The repetition loops continue until inflation is detected, and the accumulated compression ratio is the product of the compression ratios of the individual loops; we therefore refer to the new scheme as HCDC(k), where k is the number of repetition loops. To enhance the compression power of the HCDC(k) scheme, a new adaptive encoding format is proposed in which a character is encoded to binary according to its probability. This encoding reduces the entropy of the binary sequence and therefore yields a higher compression ratio. A number of text files from standard corpora were compressed, and the results obtained demonstrate that the proposed scheme has higher compression power than many widely used compression algorithms and competitive performance with respect to state-of-the-art programs.
Archive | 2012
Hussein Al-Bahadili
Computer networks have become essential to the survival of businesses, organizations, and educational institutions, as the number of network users, services, and applications has increased alongside advancements in information technology. Given this, efforts have been put forward by researchers, designers, managers, analysts, and professionals to optimize network performance and satisfy the varied groups that have an interest in network design and implementation. Simulation in Computer Network Design and Modeling: Use and Analysis reviews methodologies in computer network simulation and modeling, illustrates the benefits of simulation in computer networks design, modeling, and analysis, and identifies the main issues that face efficient and effective computer network simulation. This reference will inform the work and research of academics, post-graduate students, developers, network designers, network analysts, telecommunication system designers, and others who are interested in using simulation in computer network design and modeling.
International Journal of Mobile Learning and Organisation | 2011
Ghassan Issa; Hussein Al-Bahadili; Maher Abuhamdeh
There has been an enormous increase in the use of mobile learning (m-learning) systems in many fields due to the tremendous advancement in information and communication technologies. Although many frameworks have been developed for identifying and categorising the different components of m-learning systems, most of them have limitations and drawbacks, and none supports quantitative assessment of the success factors (global weights) of the system criteria. In this paper, a new scalable hierarchical framework is developed, which identifies and categorises all components that may affect the development and deployment of cost-effective m-learning. Furthermore, due to the hierarchical structure of the framework, any of the analytic hierarchy process (AHP) techniques can be used to quantitatively estimate the success factors of the system criteria. To demonstrate the benefits and flexibility of the new framework, we develop an interactive software tool for computing the success factors of the different system criteria. The tool, referred to as SFacts, is used to compute success factors for different sets of preferences.
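One standard AHP technique for turning pairwise preferences into global weights is the geometric-mean approximation, sketched below. This is only an illustration of the kind of computation such a tool performs; the paper does not specify which AHP variant SFacts uses, and the example matrix is invented.

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix.

    pairwise[i][j] expresses how strongly criterion i is preferred over j
    (and pairwise[j][i] = 1 / pairwise[i][j]).
    """
    n = len(pairwise)
    # Geometric mean of each row, then normalize so the weights sum to 1.
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]
```

For a two-criterion example where the first criterion is preferred 3:1, `ahp_weights([[1.0, 3.0], [1/3, 1.0]])` yields weights of 0.75 and 0.25, which would play the role of the success factors (global weights) described above.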
International Journal of Interactive Mobile Technologies (ijim) | 2010
Ghassan Issa; Shakir M. Hussain; Hussein Al-Bahadili
This paper presents a description of an interactive satellite TV based mobile learning (STV-ML) framework, in which a satellite TV station is used as an integral part of a comprehensive interactive mobile learning (M-Learning) environment. The proposed framework assists in building a reliable, efficient, and cost-effective environment to meet the growing demands of M-Learning all over the world, especially in developing countries. It utilizes recent advances in satellite reception, broadcasting technologies, and interactive TV to facilitate the delivery of large volumes of learning material. This paper also proposes a simple and flexible three-phase implementation methodology, which includes constructing an earth station, expanding the broadcasting channels, and developing true user interactivity. The proposed framework and implementation methodology ensure the construction of a reliable and cost-effective M-Learning system that can be used efficiently and effectively by a wide range of users and educational institutions to deliver ubiquitous learning.
2013 1st International Conference & Exhibition on the Applications of Information Technology to Renewable Energy Processes and Systems | 2013
Hussein Al-Bahadili; Hadi Al-Saadi; Riyad Al-Sayed; M. Al-Sheikh Hasan
Photovoltaic (PV) solar panels exhibit non-linear current-voltage characteristics and, according to the maximum power transfer (MPT) theorem, can produce maximum power at only one particular operating point (OP), namely, when the source impedance matches the load impedance, a match that cannot be guaranteed spontaneously. Furthermore, the maximum power point (MPP) changes with temperature and light-intensity variations. Therefore, different algorithms have been developed for maximum power point tracking (MPPT) based on offline and online methods. Evaluating the performance of these algorithms for various PV systems operating under highly dynamic environments is essential to ensure reliable, efficient, cost-effective, and high-performance systems. One possible approach for system evaluation is computer simulation. This paper addresses the use of MATLAB software as a simulation tool for evaluating the performance of MPPT for PV systems.
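A classic online MPPT method of the kind such simulations evaluate is perturb-and-observe (P&O): perturb the operating voltage, observe the power, and reverse direction when power falls. The sketch below (in Python rather than MATLAB, purely for illustration) uses a toy single-peak P-V curve, not a real panel model; the curve parameters and step size are assumptions.

```python
def pv_power(v):
    """Toy P-V curve with a single maximum (illustrative stand-in only)."""
    return max(0.0, v * (3.0 - 0.008 * (v - 17.0) ** 2))

def perturb_and_observe(v=12.0, step=0.5, iterations=200):
    """Climb the P-V curve by repeatedly perturbing the operating voltage."""
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:
            direction = -direction  # power fell: reverse the perturbation
        p_prev = p
    return v
```

The operating point ends up oscillating within one step of the toy curve's maximum, which also illustrates P&O's well-known steady-state oscillation; this hunting behavior under changing irradiance is one reason simulation-based evaluation of MPPT algorithms matters.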
International Journal of Information and Communication Technology Education | 2014
Ghassan Issa; Shakir M. Hussain; Hussein Al-Bahadili
In an effort to enhance the learning process in higher education, a new model for competition-based learning (CBL) is presented. The new model combines two well-known learning approaches, namely project-based learning (PBL) and competitions, and is applied in a networked environment with emphasis on collective learning as well as collective outcomes. The model, referred to as CBL, provides educators with an alternative solution for overcoming many student deficiencies associated with traditional learning practices, such as lack of motivation, lack of self-esteem, insufficient practical and real-life experience, and inadequate teamwork practices. The CBL model makes a clear distinction between PBL, competitions, and CBL itself: it avoids the disadvantages of competitions while gaining the many benefits of PBL. The identifying features, components, and advantages of CBL are presented. An open-source learning management system (LMS), namely Moodle, is used to implement a networked environment supporting CBL.
International Journal of Cloud Applications and Computing archive | 2013
Hussein Al-Bahadili; Awad Al-Sabbah; Mohammed Abu Arqoub
Cloud computing IT infrastructure has the potential to be particularly suitable for collaborative commerce (c-commerce) applications, because it generally requires less effort and interference for development, customization, integration, operation, and maintenance than other traditional IT infrastructures (e.g., on-premises deployments and data centers). However, upgrading c-commerce applications running on traditional IT infrastructures to run efficiently on cloud computing infrastructure faces a number of challenges, chiefly the lack of an effective and reliable architectural model. This paper presents a new architectural model for developing cloud-computing-based c-commerce applications, denoted the cc-commerce model. The model is based on the standard cloud computing model and consists of six main components: client, provider, auditor, broker, security and privacy, and communications network. The new model is implemented in a simple and flexible Web-based test tool, namely the cc-commerce test (3CT) tool, which is used to evaluate the performance of the model by measuring the response times for four different configurations. Analysis of the obtained results demonstrates that the cc-commerce model can provide better response time than equivalent c-commerce models.