M. Z. Rehman
Universiti Tun Hussein Onn Malaysia
Publications
Featured research published by M. Z. Rehman.
International Conference on Software Engineering and Computer Systems | 2012
M. Z. Rehman; Nazri Mohd Nawi
The traditional gradient descent back-propagation neural network algorithm is widely used to solve many practical applications around the globe. Despite providing successful solutions, it suffers from slow convergence and can become stuck in local minima. Several modifications have been suggested to improve the convergence rate of the gradient descent back-propagation algorithm, such as careful selection of the initial weights and biases, the learning rate, momentum, network topology, the activation function, and the ‘gain’ value in the activation function. In one such variation, previous researchers demonstrated that in the feed-forward pass the slope of the activation function is directly influenced by the ‘gain’ parameter. This research proposes an algorithm that improves the performance of the back-propagation algorithm by adaptively changing the momentum value while keeping the ‘gain’ parameter fixed for all nodes in the neural network. The performance of the proposed method, the Gradient Descent Method with Adaptive Momentum (GDAM), is compared with the Gradient Descent Method with Adaptive Gain (GDM-AG) and Gradient Descent with Simple Momentum (GDM). The learning rate is kept fixed and the sigmoid activation function is used throughout the experiments. The efficiency of the proposed method is demonstrated by simulations on three classification problems. The results show that GDAM is far better than the previous methods, achieving an accuracy ratio of 1.0 on the classification problems, and can be used as an alternative training approach for BPNN.
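The core GDAM idea, a momentum coefficient that adapts during training while the sigmoid ‘gain’ stays fixed, can be summarised with a minimal Python sketch. The adaptation rule below (raise the momentum while the error falls, shrink it when the error rises) and all constants are illustrative assumptions rather than the paper’s exact update.

import numpy as np

def sigmoid(x, gain=1.0):
    """Sigmoid activation with a fixed 'gain' (slope) parameter."""
    return 1.0 / (1.0 + np.exp(-gain * x))

def gdam_step(w, grad, prev_delta, error, prev_error, lr=0.1, momentum=0.9):
    """One weight update of gradient descent with an adaptive momentum term.

    The momentum coefficient is nudged up when the error is falling and
    cut back when it is rising; the bounds and factors are illustrative.
    """
    if prev_error is not None:
        momentum = min(0.99, momentum * 1.05) if error < prev_error \
            else max(0.1, momentum * 0.7)
    delta = -lr * grad + momentum * prev_delta
    return w + delta, delta, momentum

Each training epoch would call gdam_step once per weight matrix, feeding the returned delta and momentum back into the next call.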
ICSS | 2014
Nazri Mohd Nawi; M. Z. Rehman; Abdullah Khan
Metaheuristic algorithms such as the Bat algorithm are becoming popular methods for solving hard optimization problems. This paper investigates the use of the Bat algorithm in combination with the back-propagation neural network (BPNN) algorithm to solve the local minima problem in the gradient descent trajectory and to increase the convergence rate. The performance of the proposed Bat-based Back-Propagation (Bat-BP) algorithm is compared with the Artificial Bee Colony BPNN algorithm (ABC-BP) and the plain BPNN algorithm. Specifically, OR and XOR datasets are used for training the network. The simulation results show that the computational efficiency of the BPNN training process is greatly enhanced when combined with the Bat algorithm.
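As a rough illustration of how a swarm method can drive BPNN training, the sketch below applies the standard Bat algorithm moves (frequency-tuned velocities, a local walk around the best bat, loudness and pulse-rate updates) to a flat weight vector whose fitness is the network error. Population size, frequency range, and the alpha/gamma constants are generic defaults, not values from the paper.

import numpy as np

def bat_optimize(loss, dim, n_bats=20, iters=100, f_min=0.0, f_max=2.0,
                 alpha=0.9, gamma=0.9, seed=0):
    """Minimise `loss` (e.g. BPNN mean squared error of a flat weight
    vector) with the Bat algorithm."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_bats, dim))      # bat positions = candidate weights
    v = np.zeros((n_bats, dim))                # bat velocities
    A = np.ones(n_bats)                        # loudness
    r = np.zeros(n_bats)                       # pulse emission rate
    fit = np.array([loss(b) for b in x])
    best_i = int(fit.argmin())
    best, best_fit = x[best_i].copy(), fit[best_i]

    for t in range(1, iters + 1):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] += (x[i] - best) * freq
            cand = x[i] + v[i]
            if rng.random() > r[i]:              # local random walk near the best bat
                cand = best + 0.01 * A.mean() * rng.standard_normal(dim)
            f_cand = loss(cand)
            if f_cand <= fit[i] and rng.random() < A[i]:
                x[i], fit[i] = cand, f_cand
                A[i] *= alpha                    # grow quieter after an improvement
                r[i] = 1.0 - np.exp(-gamma * t)  # emit pulses more often over time
            if f_cand < best_fit:
                best, best_fit = cand.copy(), f_cand
    return best, best_fit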
Mathematical Problems in Engineering | 2015
Nazri Mohd Nawi; Abdullah Khan; M. Z. Rehman; Haruna Chiroma; Tutut Herawan
Recurrent neural networks (RNN) have been widely used as a tool for data classification. Such a network can be trained with gradient descent back-propagation. However, traditional training algorithms have drawbacks such as slow convergence and no guarantee of finding the global minimum of the error function, since gradient descent may become stuck in local minima. As a solution, nature-inspired metaheuristic algorithms provide a derivative-free way to optimize complex problems. This paper proposes using the Cuckoo Search (CS) metaheuristic, inspired by the brood behavior of cuckoo birds, to train the Elman recurrent network (ERN) and the back-propagation Elman recurrent network (BPERN), achieving a fast convergence rate and avoiding the local minima problem. The proposed CSERN and CSBPERN algorithms are compared with the Artificial Bee Colony BP algorithm and other hybrid variants on selected benchmark classification problems. The simulation results show that the computational efficiency of the ERN and BPERN training process is greatly enhanced when coupled with the proposed hybrid method.
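For context, the Elman recurrent network referred to above feeds the previous hidden state back through context units at every time step. A minimal forward pass, with sigmoid outputs and assumed weight shapes, is sketched below; in the hybrids described here, the weights would be searched by Cuckoo Search rather than by gradients alone.

import numpy as np

def elman_forward(x_seq, W_in, W_rec, W_out, b_h, b_o):
    """Forward pass of an Elman recurrent network.

    Shapes assumed: W_in (hidden, inputs), W_rec (hidden, hidden),
    W_out (outputs, hidden); the context units carry the previous
    hidden state into the next time step."""
    h = np.zeros(W_rec.shape[0])                 # context units start at zero
    outputs = []
    for x in x_seq:
        h = np.tanh(W_in @ x + W_rec @ h + b_h)  # new hidden state
        outputs.append(1.0 / (1.0 + np.exp(-(W_out @ h + b_o))))  # sigmoid output
    return np.array(outputs)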
International Conference on Neural Information Processing | 2014
Nazri Mohd Nawi; Abdullah Khan; M. Z. Rehman; Maslina Abdul Aziz; Tutut Herawan; Jemal H. Abawajy
The Levenberg-Marquardt (LM) algorithm is one of the most effective algorithms for speeding up the convergence of Artificial Neural Networks (ANN) with Multilayer Perceptron (MLP) architectures. However, the LM algorithm suffers from entrapment in local minima. We therefore introduce several improvements to the Levenberg-Marquardt algorithm by training the ANN with a nature-inspired metaheuristic algorithm. This paper proposes a hybrid technique, Accelerated Particle Swarm Optimization with Levenberg-Marquardt (APSO_LM), to achieve a faster convergence rate and avoid the local minima problem. These techniques are chosen because they provide faster training for pattern recognition problems via numerical optimization. The performance of the proposed algorithm is evaluated on benchmark classification datasets, and the results are compared with the Artificial Bee Colony (ABC) Back-Propagation Neural Network (BPNN) algorithm and other hybrid variants. Based on the experimental results, the proposed APSO_LM algorithm demonstrates better performance than the existing algorithms in terms of convergence speed, Mean Squared Error (MSE), and accuracy.
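The “accelerated” part of APSO is that each particle is updated from the global best alone, with no per-particle velocity or personal best. A minimal sketch of that update is given below; the alpha and beta constants are generic choices, and in the hybrid the resulting candidates would then be refined by Levenberg-Marquardt.

import numpy as np

def apso_step(x, g_best, alpha=0.2, beta=0.5, rng=None):
    """One Accelerated PSO move on an array of particles `x`: pull every
    particle toward the global best and add a small random perturbation
    for exploration."""
    rng = rng or np.random.default_rng()
    return (1 - beta) * x + beta * g_best + alpha * rng.standard_normal(x.shape)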
Archive | 2014
Nazri Mohd Nawi; Abdullah Khan; M. Z. Rehman
Selecting the optimal topology of a neural network for a particular application is a difficult task. In the case of recurrent neural networks (RNN), most methods only consider topologies in which the neurons are fully connected. Moreover, recurrent neural network training algorithms have drawbacks such as getting stuck in local minima, slow convergence, and network stagnation. This paper proposes an improved recurrent neural network trained with the Cuckoo Search (CS) algorithm to achieve fast convergence and high accuracy. The performance of the proposed Cuckoo Search Recurrent Neural Network (CSRNN) algorithm is compared with Artificial Bee Colony (ABC) and similar hybrid variants. The simulation results show that the proposed CSRNN algorithm performs better than the other algorithms used in this study in terms of convergence rate and accuracy.
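A compact sketch of the Cuckoo Search loop used to explore the RNN weight space is given below: Lévy-flight moves around the best nest plus abandonment of a fraction pa of the worst nests each generation. The step size, population size, and initialisation range are illustrative, and the fitness `loss` stands for the network’s classification error evaluated on a flat weight vector.

import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5, rng=None):
    """Draw a Levy-distributed step via Mantegna's algorithm."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.standard_normal(dim) * sigma
    v = rng.standard_normal(dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(loss, dim, n_nests=25, iters=200, pa=0.25, step=0.01, seed=0):
    """Minimise `loss` with Levy flights plus abandonment of the worst nests."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(-1, 1, (n_nests, dim))
    fit = np.array([loss(n) for n in nests])
    for _ in range(iters):
        best = nests[fit.argmin()]
        for i in range(n_nests):
            cand = nests[i] + step * levy_step(dim, rng=rng) * (nests[i] - best)
            f_cand = loss(cand)
            if f_cand < fit[i]:                  # a better egg replaces the old one
                nests[i], fit[i] = cand, f_cand
        worst = fit.argsort()[-max(1, int(pa * n_nests)):]  # abandon the worst nests
        nests[worst] = rng.uniform(-1, 1, (len(worst), dim))
        fit[worst] = [loss(n) for n in nests[worst]]
    return nests[fit.argmin()], float(fit.min())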
DaEng | 2014
Nazri Mohd Nawi; Abdullah Khan; M. Z. Rehman
Nature-inspired metaheuristic algorithms provide a derivative-free way to optimize complex problems. The Cuckoo Search (CS) algorithm is one of the most recent additions to the family of nature-inspired optimization metaheuristics. Simple Recurrent Networks (SRN) were initially trained by Elman with the standard back-propagation (SBP) learning algorithm, which is limited and often takes an enormous amount of time to train even a moderately sized network; the complex error surface also makes many training algorithms prone to being trapped in local minima. This paper proposes a new metaheuristic-based Cuckoo Search Back-Propagation Recurrent Neural Network (CSBPRNN) algorithm. CSBPRNN uses Cuckoo Search to train the BPRNN in order to achieve a fast convergence rate and avoid the local minima problem. The performance of the proposed CSBPRNN is compared with the Artificial Bee Colony BP algorithm and other hybrid variants. Specifically, OR and XOR datasets are used. The simulation results show that the computational efficiency of the BP training process is greatly enhanced when coupled with the proposed hybrid method.
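Regardless of which metaheuristic does the searching, the common glue in these hybrids is a fitness function that decodes a flat candidate vector into network weights and returns the training error. A minimal version for a one-hidden-layer network is sketched below; the layer sizes and the tanh/sigmoid choice are assumptions for illustration, not the paper’s exact architecture.

import numpy as np

def unpack(theta, n_in, n_hid, n_out):
    """Reshape a flat candidate vector into weight matrices and biases."""
    i = 0
    def take(shape):
        nonlocal i
        size = int(np.prod(shape))
        block = theta[i:i + size].reshape(shape)
        i += size
        return block
    return take((n_hid, n_in)), take((n_hid,)), take((n_out, n_hid)), take((n_out,))

def mse_fitness(theta, X, y, n_in, n_hid, n_out):
    """Mean squared error of the decoded network on (X, y): the quantity
    the metaheuristic minimises."""
    W1, b1, W2, b2 = unpack(theta, n_in, n_hid, n_out)
    H = np.tanh(X @ W1.T + b1)
    P = 1.0 / (1.0 + np.exp(-(H @ W2.T + b2)))
    return float(np.mean((P - y) ** 2))

For the XOR dataset used here, X would have shape (4, 2), y shape (4, 1), and theta length n_hid * (n_in + 1) + n_out * (n_hid + 1).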
Applied Mechanics and Materials | 2013
Nazri Mohd Nawi; M. Z. Rehman; Mohd Imran Ghazali; Musli Nizam Yahya; Abdullah Khan
Noise-Induced Hearing Loss (NIHL) has become a major health threat to Malaysian industrial workers in recent years due to exposure to the high-frequency noise produced by heavy machinery. Many studies have been conducted to diagnose NIHL in industrial workers, but they have neglected factors that can play a major role in accelerating NIHL. In this paper, a new Hybrid Bat-BP algorithm, based on the combination of Bat-based metaheuristic optimization, a back-propagation neural network, and fuzzy logic, is proposed to diagnose NIHL in Malaysian industrial workers. The proposed Hybrid Bat-BP uses heat, body mass index (BMI), diabetes, and smoking, along with the long-established audiometric variables (i.e., age, frequency, and duration of exposure), to better predict NIHL in Malaysian workers. The results obtained with Hybrid Bat-BP will help identify and reduce the rate of NIHL among workers with high accuracy.
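The abstract does not give the fuzzy rules, so the sketch below is only a guess at how the listed risk factors might be assembled into one classifier input vector, with triangular memberships over exposure duration; every breakpoint and scaling constant is a hypothetical placeholder rather than a value from the paper.

import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def encode_worker(age, exposure_years, freq_khz, heat_c, bmi, diabetic, smoker):
    """Illustrative input vector: fuzzified exposure duration plus the
    remaining risk factors, scaled to roughly [0, 1]."""
    return np.array([age / 65.0, freq_khz / 8.0, heat_c / 45.0, bmi / 40.0,
                     float(diabetic), float(smoker),
                     tri(exposure_years, 0, 5, 15),     # short exposure
                     tri(exposure_years, 10, 20, 30),   # medium exposure
                     tri(exposure_years, 25, 35, 45)])  # long exposure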
International Conference on Mathematics, Engineering and Industrial Applications 2014 (ICoMEIA 2014) | 2015
Nazri Mohd Nawi; M. Z. Rehman; Abdullah Khan
Wolf Search (WS) is a heuristic-based optimization algorithm. Inspired by the preying and survival behavior of wolves, it is well suited to searching large candidate-solution spaces. This paper investigates the use of the WS algorithm in combination with the back-propagation neural network (BPNN) algorithm to overcome the local minima problem and to improve convergence in gradient descent. The performance of the proposed Wolf Search based Back-Propagation (WS-BP) algorithm is compared with the Artificial Bee Colony Back-Propagation (ABC-BP), Bat-based Back-Propagation (Bat-BP), and conventional BPNN algorithms. Specifically, OR and XOR datasets are used for training the network. The simulation results show that the WS-BP algorithm effectively avoids local minima and converges to the global minimum.
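A simplified view of the Wolf Search moves used to drive the weight search is sketched below: each wolf moves toward a better wolf inside its visual radius, otherwise prowls locally, and occasionally makes a long “escape” jump, which is what helps the hybrid leave local minima. All constants are generic choices rather than the paper’s settings.

import numpy as np

def wolf_search(loss, dim, n_wolves=15, iters=200, radius=0.5,
                step=0.2, escape_p=0.25, seed=0):
    """Minimise `loss` (e.g. BPNN error of a flat weight vector) with a
    simplified Wolf Search."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, (n_wolves, dim))
    fit = np.array([loss(w) for w in pos])
    for _ in range(iters):
        for i in range(n_wolves):
            dists = np.linalg.norm(pos - pos[i], axis=1)
            visible = np.where((dists < radius) & (fit < fit[i]))[0]
            if visible.size:                        # stalk a better wolf in sight
                j = visible[fit[visible].argmin()]
                cand = pos[i] + step * rng.random() * (pos[j] - pos[i])
            else:                                   # prowl around the current spot
                cand = pos[i] + 0.1 * step * rng.standard_normal(dim)
            if rng.random() < escape_p:             # escape jump out of the basin
                cand = cand + rng.uniform(-1, 1, dim)
            f_cand = loss(cand)
            if f_cand < fit[i]:
                pos[i], fit[i] = cand, f_cand
    return pos[fit.argmin()], float(fit.min())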
International Conference on Neural Information Processing | 2014
Nazri Mohd Nawi; Abdullah Khan; M. Z. Rehman; Maslina Abdul Aziz; Tutut Herawan; Jemal H. Abawajy
Metaheuristic algorithms are among the most popular methods for solving optimization problems. This paper presents a new hybrid approach comprising two nature-inspired metaheuristic algorithms, Cuckoo Search (CS) and Accelerated Particle Swarm Optimization (APSO), for training Artificial Neural Networks (ANN). In order to increase the probability of its eggs’ survival, the cuckoo bird migrates by traversing more of the search space, and it can find better solutions by performing Lévy flights combined with APSO. In the proposed Hybrid Accelerated Cuckoo Particle Swarm Optimization (HACPSO) algorithm, APSO provides the communication ability for the cuckoo birds, making them capable of searching for the best nest with the better solution. Experiments are carried out on benchmark datasets, and the performance of the proposed hybrid algorithm is compared with the Artificial Bee Colony (ABC) algorithm and similar hybrid variants. The results show that the proposed HACPSO algorithm performs better than the other algorithms in terms of convergence and accuracy.
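One way to picture the HACPSO move is as a single update that adds a heavy-tailed exploration jump (the cuckoo’s Lévy-style flight) to the APSO pull toward the global best, which supplies the “communication” between birds. The sketch below is only a schematic of that combination: a Cauchy draw stands in for a true Lévy flight, and the constants are arbitrary.

import numpy as np

def hacpso_move(x, g_best, alpha=0.2, beta=0.5, levy_scale=0.01, rng=None):
    """Schematic HACPSO-style update: APSO pull toward the global best
    plus a heavy-tailed exploration jump (Cauchy used as a stand-in for
    a Levy flight)."""
    rng = rng or np.random.default_rng()
    jump = levy_scale * rng.standard_cauchy(x.shape)   # exploration jump
    return (1 - beta) * x + beta * g_best + alpha * jump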
SCDM | 2014
Nazri Mohd Nawi; Abdullah Khan; M. Z. Rehman; Tutut Herawan; Mustafa Mat Deris
Recurrent Neural Networks (RNN) have local feedback loops inside the network which allow them to store previously seen patterns. Such networks can be trained with gradient descent back-propagation or with optimization techniques such as second-order methods. Levenberg-Marquardt has been used for network training, but this algorithm is still not guaranteed to find the global minimum of the error function. Nature-inspired metaheuristic algorithms provide a derivative-free way to optimize complex problems. This paper proposes using the Cuckoo Search (CS) metaheuristic to train a Levenberg-Marquardt Elman Network (LMEN), achieving a fast convergence rate and avoiding the local minima problem. The results of the proposed Cuckoo Search Levenberg-Marquardt Elman Network (CSLMEN) are compared with the Artificial Bee Colony BP algorithm and other hybrid variants. Specifically, the 7-bit parity and Iris classification datasets are used. The simulation results show that the computational efficiency of the proposed CSLMEN training process is greatly enhanced when coupled with the Cuckoo Search method.
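The Levenberg-Marquardt refinement that the Cuckoo Search candidates feed into is the standard damped Gauss-Newton update. A minimal sketch is given below, with J taken as the Jacobian of the network outputs with respect to the weights and e = target - output; with these conventions the step is added to the weights.

import numpy as np

def lm_step(w, J, e, mu=1e-2):
    """One Levenberg-Marquardt update of the flat weight vector `w`.

    J is the Jacobian of the network outputs w.r.t. the weights and
    e = target - output; the damping mu blends between Gauss-Newton
    (small mu) and plain gradient descent (large mu)."""
    H = J.T @ J + mu * np.eye(J.shape[1])   # damped approximate Hessian
    delta = np.linalg.solve(H, J.T @ e)     # solve (J^T J + mu I) d = J^T e
    return w + delta

In practice mu is decreased after a step that lowers the error and increased when the error grows, which is what gives Levenberg-Marquardt its fast yet stable convergence.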