Habib Dhahri
University of Sfax
Publications
Featured research published by Habib Dhahri.
international joint conference on neural networks | 2006
Habib Dhahri; Adel M. Alimi
We develop a modified differential evolution algorithm that produces radial basis function neural network controllers for chaotic systems. This method requires few control variables. We examine the result of applying the proposed algorithm to time series prediction, which illustrates the effectiveness of this technique. We apply this algorithm to several computational and real systems, including the Mackey-Glass time series, the Lorenz attractor, and experimental data obtained from the Henon map. Our experiments indicate that the structural differences between our approach and other methods in the literature make it particularly well suited to modeling chaotic time series data.
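As background for the approach above, the core of a standard DE/rand/1/bin loop can be sketched as follows; this is a generic illustration, not the authors' modified variant, and the RBF network encoding of the paper is not reproduced here:

```python
import random

def differential_evolution(fitness, bounds, pop_size=20, F=0.5, CR=0.9, generations=100):
    """Minimal DE/rand/1/bin sketch (generic, not the paper's modified variant)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [fitness(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation: donor vector built from three distinct individuals
            donor = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover between the target and the donor
            j_rand = random.randrange(dim)
            trial = [donor[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # Greedy selection: the trial replaces the target only if it is fitter
            s = fitness(trial)
            if s < scores[i]:
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy usage: minimise the sphere function in three dimensions
random.seed(0)
sol, val = differential_evolution(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)
```

The greedy selection step is what keeps the population monotonically improving, which is why DE needs few control variables (essentially F and CR) compared with many other evolutionary methods.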
Neurocomputing | 2013
Souhir Bouaziz; Habib Dhahri; Adel M. Alimi; Ajith Abraham
In this paper, a tree-based encoding method is introduced to represent the Beta basis function neural network. The proposed model, called Flexible Beta Basis Function Neural Tree (FBBFNT), can be created and optimized based on predefined Beta operator sets. A hybrid learning algorithm is used to evolve the FBBFNT model: the structure is developed using Extended Genetic Programming (EGP), and the Beta parameters and connection weights are optimized by the Opposition-based Particle Swarm Optimization algorithm (OPSO). The performance of the proposed method is evaluated on benchmark problems drawn from the control system and time series prediction areas and is compared with those of related methods.
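For readers unfamiliar with the Beta basis function that these trees are built from, a one-dimensional version is sketched below. The parameterisation (support endpoints x0 and x1, shape parameters p and q, and the centre formula) follows the form commonly attributed to Alimi's work; treat it as our reading rather than the paper's exact definition:

```python
def beta_basis(x, x0, x1, p, q):
    """One-dimensional Beta basis function sketch (assumed parameterisation):
    zero outside the support [x0, x1], peaking at 1 at the centre xc."""
    if not (x0 < x < x1):
        return 0.0
    # The centre where the function reaches its maximum of 1
    xc = (p * x1 + q * x0) / (p + q)
    return ((x - x0) / (xc - x0)) ** p * ((x1 - x) / (x1 - xc)) ** q

# With p = q the centre sits at the midpoint of the support
value_at_centre = beta_basis(0.5, 0.0, 1.0, 2, 2)
```

Unlike a Gaussian, this basis function has compact support and an asymmetry controlled independently by p and q, which is what makes evolving its four parameters per node worthwhile.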
Neurocomputing | 2012
Habib Dhahri; Adel M. Alimi; Ajith Abraham
This paper proposes a hierarchical multi-dimensional differential evolution (HMDDE) algorithm, an automatic computational framework for the optimization of the beta basis function neural network (BBFNN) in which the neural network architecture, connection weights, learning algorithm, and its parameters are adapted according to the problem. In the HMDDE-designed neural network, the number of individuals in the multi-dimensional population equals the number of beta neural networks. The population of HMDDE forms multiple beta networks with different structures at the higher level, and each individual of that population is optimized at a lower hierarchical level to improve its performance. For a beta neural network consisting of m neurons, n individuals (of different lengths) are formed at the upper level to optimize the structure of the beta neural network. At the lower level, the population of individuals with the same length optimizes the free parameters of the beta neural network. To evaluate the comparative performance, we used benchmark problems drawn from the system identification and time series prediction areas. Empirical results illustrate that HMDDE produces a better generalization performance.
international conference on neural information processing | 2012
Souhir Bouaziz; Habib Dhahri; Adel M. Alimi
In this paper, a new time-series forecasting model based on the Flexible Beta Operator Neural Tree (FBONT) is introduced. The FBONT model, which has a tree-structural representation, can be viewed as a special Beta basis function multi-layer neural network. Based on predefined Beta operator sets, the FBONT can be formed and optimized. The FBONT structure is developed using Extended Genetic Programming (EGP), and the Beta parameters and connection weights are optimized by the Particle Swarm Optimization algorithm (PSO). The performance of the proposed method is evaluated on time series forecasting problems and compared with those of related methods.
international symposium on neural networks | 2012
Habib Dhahri; Adel M. Alimi; Ajith Abraham
This paper presents an application of a swarm intelligence technique, namely the Artificial Bee Colony (ABC) algorithm, to the design of Beta Basis Function Neural Networks (BBFNN). The focus of this research is to investigate this population-based metaheuristic for optimizing the Beta neural network parameters. The proposed algorithm is used for the prediction of benchmark problems. Simulation examples are also given to compare the effectiveness of the model with other known methods in the literature. Empirical results reveal that the proposed ABC-BBFNN has impressive generalization ability.
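A compact sketch of the standard ABC metaheuristic is given below for orientation. It is illustrative only: the paper applies ABC to Beta network parameters, which this toy version does not model, and the phase sizes and limit value here are generic defaults, not the paper's settings:

```python
import random

def abc_optimize(fitness, bounds, n_sources=10, limit=20, cycles=100):
    """Generic Artificial Bee Colony sketch (not the paper's ABC-BBFNN setup)."""
    dim = len(bounds)
    foods = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_sources)]
    fits = [fitness(f) for f in foods]
    trials = [0] * n_sources

    def try_replace(i):
        # Perturb one dimension of source i toward/away from a random partner
        k = random.choice([j for j in range(n_sources) if j != i])
        d = random.randrange(dim)
        cand = foods[i][:]
        cand[d] += random.uniform(-1, 1) * (foods[i][d] - foods[k][d])
        lo, hi = bounds[d]
        cand[d] = min(max(cand[d], lo), hi)
        f = fitness(cand)
        if f < fits[i]:                      # greedy replacement
            foods[i], fits[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1

    b = min(range(n_sources), key=lambda i: fits[i])
    best_x, best_f = foods[b][:], fits[b]
    for _ in range(cycles):
        for i in range(n_sources):           # employed-bee phase
            try_replace(i)
        for _ in range(n_sources):           # onlooker bees favour fitter sources
            weights = [1.0 / (1.0 + f) for f in fits]
            try_replace(random.choices(range(n_sources), weights=weights)[0])
        for i in range(n_sources):           # remember the best source so far
            if fits[i] < best_f:
                best_x, best_f = foods[i][:], fits[i]
        for i in range(n_sources):           # scouts abandon exhausted sources
            if trials[i] > limit:
                foods[i] = [random.uniform(lo, hi) for lo, hi in bounds]
                fits[i], trials[i] = fitness(foods[i]), 0
    return best_x, best_f

# Toy usage: minimise the sphere function in two dimensions
random.seed(2)
sol, val = abc_optimize(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```

The scout phase is ABC's distinguishing feature: sources that fail to improve for `limit` attempts are discarded and re-seeded at random, which keeps the colony exploring.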
international symposium on neural networks | 2010
Habib Dhahri; Adel M. Alimi
Many methods for solving optimization problems, whether direct or indirect, rely upon gradient information and therefore may converge to a local optimum. Global optimization methods, such as evolutionary algorithms, overcome this problem, although these techniques are computationally expensive due to the slow nature of the evolutionary process. In this work, a new concept is investigated to accelerate particle swarm optimization. The opposition-based PSO uses the concept of the opposite number to create a new population during the learning process, improving the convergence rate and the generalization performance of the beta basis function neural network. The proposed algorithm uses a dichotomous (bisection) search to determine the target solution. A detailed performance comparison of OPSO-BBFNN with other learning algorithms is carried out on benchmark problems drawn from the regression and time series prediction areas. The results show that OPSO-BBFNN produces a better generalization performance.
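The opposite-number idea at the heart of opposition-based learning is simple enough to show directly. The sketch below applies it at swarm initialisation; it illustrates the concept only and is not the paper's full OPSO-BBFNN training scheme:

```python
import random

def opposite(position, bounds):
    """Opposition-based learning: the opposite of x in [a, b] is a + b - x."""
    return [lo + hi - x for x, (lo, hi) in zip(position, bounds)]

def opposition_init(fitness, bounds, pop_size):
    """Evaluate each random particle together with its opposite and keep the
    fitter of the pair (a sketch of the opposition idea, not the paper's
    full OPSO-BBFNN scheme)."""
    swarm = []
    for _ in range(pop_size):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        swarm.append(min(x, opposite(x, bounds), key=fitness))
    return swarm
```

The same pairing trick can also be applied periodically during the run, not only at initialisation, which is where the reported convergence-rate gains typically come from.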
international symposium on neural networks | 2008
Habib Dhahri; Adel M. Alimi; Fakhri Karray
Many methods for solving optimization problems, whether direct or indirect, rely upon gradient information and therefore may converge to a local optimum. Global optimization methods, such as evolutionary algorithms, overcome this problem. Constructing a quality BBF network for a specific application can be a time-consuming process, as the designer must select both a suitable set of inputs and a suitable BBF network structure; this work investigates how evolutionary methodologies can automate all or part of these steps. This study illustrates how a hybrid BBFN-PSO system can be constructed and applies the system to a number of datasets. The utility of the resulting BBFNs on these optimization problems is assessed, and the results from the BBFN-PSO hybrids are shown to be competitive against the best performance on these datasets using alternative optimization methodologies. The results show that, within these classes of evolutionary methods, particle swarm optimization algorithms are very robust, effective, and highly efficient in solving the studied class of optimization problems.
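For reference, the bare-bones global-best PSO loop that such hybrids build on is sketched below. In the paper each particle would encode Beta network parameters; here we simply minimise a toy function, and the coefficient values are conventional defaults, not the paper's:

```python
import random

def pso(fitness, bounds, pop_size=15, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Bare-bones global-best PSO sketch (particles would encode BBF network
    parameters in the paper's setting; here they are plain vectors)."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    vel = [[0.0] * dim for _ in range(pop_size)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(pop_size), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(pop_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + cognitive pull to personal best + social pull to global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy usage: minimise the sphere function in two dimensions
random.seed(1)
best, best_f = pso(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```

Because the update needs only function values, never gradients, the same loop trains network weights and shape parameters alike, which is what makes it attractive for BBF network design.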
congress on evolutionary computation | 2010
Habib Dhahri; Adel M. Alimi
Many methods for solving optimization problems, whether direct or indirect, rely upon gradient information and therefore may converge to a local optimum. Global optimization methods, such as evolutionary algorithms, overcome this problem, although these techniques are computationally expensive due to the slow nature of the evolutionary process. In this work, a new concept is investigated to accelerate differential evolution. The opposition-based DE uses the concept of the opposite number to create a new population during the learning process, improving the convergence rate and the generalization performance of the beta basis function neural network. The proposed algorithm uses a dichotomous (bisection) search to determine the target solution. A detailed performance comparison of ODE-BBFNN with other learning algorithms is carried out on benchmark problems drawn from the regression and time series prediction areas. The results show that ODE-BBFNN produces a better generalization performance.
world congress on computational intelligence | 2008
Habib Dhahri; Adel M. Alimi; Fakhri Karray
This paper proposes and describes an effective utilization of heuristic optimization. The focus of this research is a hybrid method combining two heuristic optimization techniques, differential evolution (DE) and particle swarm optimization (PSO), to train the beta basis function neural network (BBFNN). Denoted PSO-DE, this hybrid technique incorporates concepts from DE and PSO, creating individuals in a new generation not only by the crossover and mutation operations found in DE but also by the mechanisms of PSO. The results of various experimental studies using the Mackey-Glass time series prediction have demonstrated the superiority of the hybrid PSO-DE approach over the other four search techniques in terms of solution quality and convergence rates.
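One way such a blend can work, offspring built by DE mutation and crossover, then nudged toward the swarm's global best as PSO would, is sketched below. This is our reading of the abstract's description; the parameter names and the exact blend are assumptions, not the paper's PSO-DE update:

```python
import random

def pso_de_trial(pop, i, gbest, F=0.5, CR=0.9, c=1.5):
    """Build one offspring for individual i: DE mutation/crossover first,
    then a PSO-style attraction toward the global best (hypothetical blend)."""
    n, dim = len(pop), len(pop[0])
    a, b, k = random.sample([j for j in range(n) if j != i], 3)
    j_rand = random.randrange(dim)
    trial = []
    for d in range(dim):
        donor = pop[a][d] + F * (pop[b][d] - pop[k][d])                    # DE mutation
        x = donor if (random.random() < CR or d == j_rand) else pop[i][d]  # DE crossover
        x += c * random.random() * (gbest[d] - x)                          # PSO-style pull
        trial.append(x)
    return trial
```

A surrounding generation loop would evaluate each trial and keep it only if it beats its parent, exactly as in plain DE.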
NICSO | 2008
Habib Dhahri; Adel M. Alimi
In this paper, we propose a differential evolution algorithm based design for the beta basis function neural network. The differential evolution algorithm has been used in many practical cases and has demonstrated good convergence properties. Here, differential evolution is used to evolve the beta basis function neural network topology. Compared with the traditional genetic algorithm, the combined approach demonstrates clear advantages, including feasibility and simplicity of implementation. In the prediction of the Mackey-Glass chaotic time series, the networks designed by the proposed approach prove to be competitive with, or even superior to, the traditional learning algorithms for a multi-layer perceptron network and a radial basis function network. Therefore, designing a BBFNN can be considered as the solution of a two-level optimization problem.